We develop dimension-reduction-based methods for estimating a probability density when data are missing at random in the presence of covariables. In this paper, we propose two families of sufficient dimension reduction based nonparametric density estimators, obtained by modifying the regression calibration estimator and the inverse probability weighted estimator due to Wang (2008). The proposed methods overcome two challenges posed by high-dimensional covariates: model specification and the curse of dimensionality. The curse of dimensionality is overcome by replacing the covariables Xi in the regression calibration estimator and the inverse probability weighted estimator, respectively, with a root-n consistent estimator of a score S(Xi), i=1,2,…,n. Three different scores S(⋅) are obtained by dimension reduction techniques. With each of these three scores, both families of proposed estimators are shown to be asymptotically normal. Estimators based on the same score share the same asymptotic variance, while different scores yield different asymptotic variances. The two families of density estimators are compared across the different scores. Simulation studies demonstrate the strong performance of the proposed methods, and a real data analysis illustrates their use.
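As a rough illustration only (not the authors' estimators), the core idea of replacing a multivariate covariable Xi by a one-dimensional score S(Xi) inside an inverse probability weighted kernel density estimator can be sketched as follows. The score, the selection model, and the bandwidths below are all assumptions made for this toy example; in the paper the score is a root-n consistent estimate produced by a dimension reduction technique.

```python
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def ipw_density(y_grid, Y, delta, score, h_y=0.3, h_s=0.3):
    """Inverse probability weighted kernel density estimate of f_Y,
    with selection probabilities estimated from the 1-d score S(X)."""
    # Kernel regression of the missingness indicator delta on the score
    pi_hat = np.array([
        np.sum(gaussian_kernel((score - s) / h_s) * delta) /
        np.sum(gaussian_kernel((score - s) / h_s))
        for s in score
    ])
    pi_hat = np.clip(pi_hat, 0.05, 1.0)  # guard against tiny weights
    w = delta / pi_hat
    # Weighted kernel density estimate on the grid
    return np.array([
        np.mean(w * gaussian_kernel((y - Y) / h_y)) / h_y
        for y in y_grid
    ])

# Toy data: X is 5-dimensional, the score S(X) = X @ beta is one-dimensional
rng = np.random.default_rng(0)
n = 500
beta = np.array([1.0, 0.5, 0.0, 0.0, 0.0])  # assumed direction, for illustration
X = rng.normal(size=(n, 5))
S = X @ beta                      # in practice, a root-n consistent estimate
Y = S + rng.normal(size=n)
pi = 1 / (1 + np.exp(-(0.5 + 0.5 * S)))    # missing-at-random mechanism
delta = (rng.uniform(size=n) < pi).astype(float)

y_grid = np.linspace(-3.0, 3.0, 61)
f_hat = ipw_density(y_grid, Y, delta, S)
```

Because the smoothing in both the selection-probability estimate and the density estimate is over a one-dimensional score rather than the full covariate vector, the estimator avoids the curse of dimensionality.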