Approximating conditional density functions using dimension reduction

Jianqing Fan, Liang Peng, Qiwei Yao, Wenyang Zhang

Research output: Contribution to journal › Article › peer-review



We propose to approximate the conditional density function of a random variable Y given a dependent random d-vector X by that of Y given theta^tau X, where the unit vector theta is selected to minimize the average Kullback-Leibler discrepancy between the two conditional density functions. Our approach is nonparametric as far as the estimation of the conditional density functions is concerned. We show that this nonparametric estimator is asymptotically adaptive to the unknown index theta, in the sense that its first-order asymptotic mean squared error is the same as if theta were known. The proposed method is illustrated with both simulated and real-data examples.
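The idea in the abstract can be sketched numerically: estimate f(y | theta^tau x) with a kernel smoother and choose the unit vector theta that minimizes an empirical surrogate for the average Kullback-Leibler discrepancy, namely a leave-one-out negative log-likelihood. The sketch below is a minimal illustration, not the authors' estimator: the bandwidths, the Gaussian kernels, the grid search over angles (d = 2), and all function names are assumptions for the example.

```python
import numpy as np

def cond_density(u, y, U, Y, h_u=0.3, h_y=0.3):
    """Kernel estimate of f(y | u) from samples (U, Y).
    Gaussian kernels and fixed bandwidths are illustrative choices."""
    w = np.exp(-0.5 * ((U - u) / h_u) ** 2)        # weights by proximity in the index
    if w.sum() == 0.0:
        return 0.0
    ky = np.exp(-0.5 * ((Y - y) / h_y) ** 2) / (h_y * np.sqrt(2 * np.pi))
    return float((w * ky).sum() / w.sum())

def neg_log_likelihood(theta, X, Y):
    """Leave-one-out negative log-likelihood of f(y_i | theta^tau x_i);
    minimizing this is an empirical stand-in for minimizing the
    average Kullback-Leibler discrepancy over theta."""
    U = X @ theta
    n = len(Y)
    nll = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        f = cond_density(U[i], Y[i], U[mask], Y[mask])
        nll -= np.log(max(f, 1e-12))               # guard against log(0)
    return nll / n

def fit_index(X, Y, n_grid=60):
    """Grid search over unit vectors theta for d = 2 (angles in [0, pi))."""
    angles = np.linspace(0.0, np.pi, n_grid, endpoint=False)
    thetas = [np.array([np.cos(a), np.sin(a)]) for a in angles]
    return min(thetas, key=lambda t: neg_log_likelihood(t, X, Y))

# Toy example: Y depends on X only through the index (X1 + X2) / sqrt(2)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
true_theta = np.array([1.0, 1.0]) / np.sqrt(2)
Y = np.sin(X @ true_theta) + 0.2 * rng.normal(size=200)

theta_hat = fit_index(X, Y)
# |theta_hat . true_theta| near 1 indicates the index direction was recovered
print(abs(theta_hat @ true_theta))
```

The grid search is only workable for d = 2; in higher dimensions one would use a proper optimizer over the unit sphere, and data-driven bandwidth selection would replace the fixed values used here.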
Original language: English
Pages (from-to): 445-456
Number of pages: 12
Journal: Acta Mathematicae Applicatae Sinica - English Series
Issue number: 3
Publication status: Published - 2009


  • dimension reduction
  • Kullback-Leibler discrepancy
  • Shannon's entropy
  • nonparametric regression
  • local linear regression
  • conditional density function


