asm: Optimal convex M-estimation for linear regression via antitonic score matching

Oliver Feng (Developer), Yu-Chun Kao (Developer), Min Xu (Developer), Richard Samworth (Developer)

Research output: Non-textual form › Software

Abstract

Performs linear regression with respect to a data-driven convex loss function that is chosen to minimize the asymptotic covariance of the resulting M-estimator. The convex loss function is estimated in 5 steps: (1) form an initial OLS (ordinary least squares) or LAD (least absolute deviation) estimate of the regression coefficients; (2) use the resulting residuals to obtain a kernel estimator of the error density; (3) estimate the score function of the errors by differentiating the logarithm of the kernel density estimate; (4) compute the L2 projection of the estimated score function onto the set of decreasing functions; (5) take a negative antiderivative of the projected score function estimate. Newton's method (with Hessian modification) is then used to minimize the convex empirical risk function. Further details of the method are given in Feng et al. (2024).
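The five steps above can be sketched in Python under some simplifying assumptions: a Gaussian kernel density estimate stands in for the package's kernel estimator, the L2 projection onto decreasing functions is approximated by density-weighted isotonic regression on a grid, and BFGS replaces the Newton step with Hessian modification. The function `fit_asm` and all names below are illustrative, not the actual asm package API (the package itself is written in R).

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.optimize import minimize
from sklearn.isotonic import IsotonicRegression

def fit_asm(X, y, grid_size=512):
    """Illustrative antitonic-score-matching regression (not the asm R API)."""
    # Step 1: initial OLS estimate of the regression coefficients.
    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta0

    # Step 2: kernel estimate of the error density from the residuals.
    kde = gaussian_kde(resid)
    pad = 3 * resid.std()
    grid = np.linspace(resid.min() - pad, resid.max() + pad, grid_size)
    dens = kde(grid)

    # Step 3: score estimate psi = (log f)' by numerical differentiation.
    psi = np.gradient(np.log(dens), grid)

    # Step 4: project psi onto decreasing functions; the L2 projection
    # w.r.t. the estimated density is approximated here by
    # density-weighted isotonic regression on the grid.
    iso = IsotonicRegression(increasing=False)
    psi_dec = iso.fit_transform(grid, psi, sample_weight=dens)

    # Step 5: negative antiderivative of the projected score.  Since
    # psi_dec is decreasing, dloss is increasing, so the loss is convex.
    dloss = -psi_dec
    loss_vals = np.concatenate(
        ([0.0], np.cumsum((dloss[1:] + dloss[:-1]) / 2 * np.diff(grid)))
    )

    # Minimize the convex empirical risk (BFGS stands in for the Newton
    # method with Hessian modification described above).
    def risk(beta):
        return np.interp(y - X @ beta, grid, loss_vals).mean()

    def risk_grad(beta):
        return -X.T @ np.interp(y - X @ beta, grid, dloss) / len(y)

    return minimize(risk, beta0, jac=risk_grad, method="BFGS").x

# Toy example with heavy-tailed (Laplace) errors, where a data-driven
# convex loss should beat plain least squares.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.laplace(size=500)
beta_hat = fit_asm(X, y)
```

Tabulating the loss on a grid and interpolating keeps the sketch short; the actual package works with exact antiderivatives of the piecewise-constant projected score.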
Original language: English
Publisher: CRAN
Media of output: Online
DOIs
Publication status: Published - 11 May 2024

