Approximating conditional density functions using dimension reduction

J Q Fan, L Peng, Q W Yao, Wen Yang Zhang

Research output: Contribution to journal › Article

7 Citations (Scopus)

Abstract

We propose to approximate the conditional density function of a random variable Y given a dependent random d-vector X by that of Y given θᵀX, where the unit vector θ is selected so that the average Kullback-Leibler discrepancy between the two conditional density functions is minimized. Our approach is nonparametric as far as the estimation of the conditional density functions is concerned. We show that the nonparametric estimator is asymptotically adaptive to the unknown index θ, in the sense that its first-order asymptotic mean squared error is the same as if θ were known. The proposed method is illustrated with both simulated and real-data examples.
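A minimal sketch of the single-index idea described in the abstract, assuming Gaussian kernels: a candidate unit vector θ is scored by how well the nonparametrically estimated density of Y given θᵀX fits the data, and the best θ is found by numerical optimization over the unit sphere. The paper's criterion (average Kullback-Leibler discrepancy) is approximated here by a leave-one-out negative log-likelihood, and a Nadaraya-Watson double-kernel estimator stands in for the authors' local linear estimator; the function name loo_neg_loglik and the fixed bandwidths are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def loo_neg_loglik(b, X, y, hz=0.3, hy=0.3):
    """Leave-one-out -log fhat(y_i | theta'X_i) with theta = b/||b||.

    Minimizing this is, up to an additive constant, minimizing an
    empirical estimate of the average Kullback-Leibler discrepancy.
    (Nadaraya-Watson substitute for the paper's local linear estimator.)
    """
    theta = b / np.linalg.norm(b)
    z = X @ theta                                   # projected index theta'X
    # Pairwise Gaussian kernel weights in the index z and in y (n x n)
    Kz = np.exp(-0.5 * ((z[:, None] - z[None, :]) / hz) ** 2)
    Ky = np.exp(-0.5 * ((y[:, None] - y[None, :]) / hy) ** 2) / (hy * np.sqrt(2 * np.pi))
    np.fill_diagonal(Kz, 0.0)                       # drop own observation (leave-one-out)
    num = (Kz * Ky).sum(axis=1)                     # joint part of the kernel density estimate
    den = np.maximum(Kz.sum(axis=1), 1e-12)         # marginal part; guard against division by zero
    f = np.maximum(num / den, 1e-12)                # fhat_{-i}(y_i | z_i)
    return -np.log(f).mean()

# Simulated example: Y depends on X only through a single index theta0'X.
rng = np.random.default_rng(0)
n, d = 400, 3
X = rng.normal(size=(n, d))
theta0 = np.array([0.8, 0.6, 0.0])                  # true unit-length index (simulation only)
y = np.sin(X @ theta0) + 0.3 * rng.normal(size=n)

res = minimize(loo_neg_loglik, x0=np.ones(d), args=(X, y), method="Nelder-Mead")
theta_hat = res.x / np.linalg.norm(res.x)
theta_hat *= np.sign(theta_hat[0])                  # theta and -theta are equivalent; fix the sign
print("estimated index:", np.round(theta_hat, 3))
```

Under these assumptions the recovered index should be close to theta0 up to sign; in the paper, the conditional density along the fitted index is then estimated with local linear (double-kernel) smoothing rather than the simpler estimator used in this sketch.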
Original language: English
Pages (from-to): 445-456
Number of pages: 12
Journal: Acta Mathematicae Applicatae Sinica-English Series
Volume: 25
Issue number: 3
DOIs: 10.1007/s10255-008-8815-1
Publication status: Published - 2009


Keywords

  • dimension reduction
  • Kullback-Leibler discrepancy
  • Shannon's entropy
  • nonparametric regression
  • local linear regression
  • conditional density function

Cite this

Fan, J. Q., Peng, L., Yao, Q. W., & Zhang, W. Y. (2009). Approximating conditional density functions using dimension reduction. Acta Mathematicae Applicatae Sinica-English Series, 25(3), 445-456. https://doi.org/10.1007/s10255-008-8815-1
@article{6d03adce9d924f8dacde3d823c6c5dd1,
title = "Approximating conditional density functions using dimension reduction",
abstract = "We propose to approximate the conditional density function of a random variable Y given a dependent random d-vector X by that of Y given θᵀX, where the unit vector θ is selected so that the average Kullback-Leibler discrepancy between the two conditional density functions is minimized. Our approach is nonparametric as far as the estimation of the conditional density functions is concerned. We show that the nonparametric estimator is asymptotically adaptive to the unknown index θ, in the sense that its first-order asymptotic mean squared error is the same as if θ were known. The proposed method is illustrated with both simulated and real-data examples.",
keywords = "dimension reduction, Kullback-Leibler discrepancy, Shannon's entropy, nonparametric regression, local linear regression, conditional density function",
author = "Fan, {J Q} and L Peng and Yao, {Q W} and Zhang, {Wen Yang}",
year = "2009",
doi = "10.1007/s10255-008-8815-1",
language = "English",
volume = "25",
pages = "445--456",
journal = "Acta Mathematicae Applicatae Sinica-English Series",
issn = "0168-9673",
publisher = "Springer Verlag",
number = "3",

}
