Penalized likelihood estimation: Convergence under incorrect model
Chong Gu
Statistics & Probability Letters, 1998, vol. 36, issue 4, 359-364
Abstract:
The penalized likelihood method is among the most effective tools for nonparametric multivariate function estimation. Recently, a generic computation-oriented asymptotic theory has been developed in the density estimation setting and has been extended to other settings such as conditional density estimation, regression, and hazard rate estimation, under the assumption that the true function resides in a reproducing kernel Hilbert space in which the estimate is sought. In this article, we illustrate that the theory may remain valid, after appropriate modifications, even when the true function resides outside the function space under consideration. Through a certain moment identity, it is shown that the Kullback-Leibler projection of the true function onto the function space under consideration, if it exists, acts as the proxy for the true function as the destination of asymptotic convergence.
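To make the setting concrete, the following is a minimal sketch, in notation of our own choosing rather than the paper's, of the penalized likelihood density estimation problem and of the Kullback-Leibler projection the abstract refers to. Given i.i.d. observations X_1, ..., X_n with true density f_0 on a domain X, the log-density estimate is taken to minimize, over a reproducing kernel Hilbert space H with roughness penalty J and smoothing parameter lambda,

% Hypothetical notation for illustration; not reproduced from the paper.
\[
  \hat\eta = \operatorname*{arg\,min}_{\eta \in \mathcal{H}}
  \left\{ -\frac{1}{n}\sum_{i=1}^{n} \eta(X_i)
          + \log \int_{\mathcal{X}} e^{\eta(x)}\,dx
          + \frac{\lambda}{2} J(\eta) \right\}.
\]

When the true log-density eta_0 does not belong to H, the proxy target discussed in the abstract is the Kullback-Leibler projection

\[
  \eta^{*} = \operatorname*{arg\,min}_{\eta \in \mathcal{H}}
  \int_{\mathcal{X}} f_0(x)\,\log\frac{f_0(x)}{f_\eta(x)}\,dx,
  \qquad
  f_\eta(x) = \frac{e^{\eta(x)}}{\int_{\mathcal{X}} e^{\eta(u)}\,du},
\]

provided it exists, and the asymptotic convergence is then stated with eta^* in place of eta_0.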
Keywords: Density estimation; Hazard rate estimation; Kullback-Leibler; Regression
Date: 1998
Downloads: http://www.sciencedirect.com/science/article/pii/S0167-7152(97)00082-5 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:stapro:v:36:y:1998:i:4:p:359-364
Statistics & Probability Letters is currently edited by Somnath Datta and Hira L. Koul