Asymptotic properties of neural network sieve estimators
Xiaoxi Shen,
Chang Jiang,
Lyudmila Sakhanenko and
Qing Lu
Journal of Nonparametric Statistics, 2023, vol. 35, issue 4, 839-868
Abstract:
Neural networks have become one of the most widely used methods in machine learning and artificial intelligence. By the universal approximation theorem, a neural network with one hidden layer can approximate any continuous function on a compact set, provided the number of hidden units is sufficiently large. Statistically, a neural network falls within the nonlinear regression framework. However, when it is treated parametrically, its parameters are unidentifiable, which makes asymptotic properties difficult to derive. Instead, we consider the estimation problem in a nonparametric regression framework and use results from sieve estimation to establish the consistency, rates of convergence, and asymptotic normality of neural network sieve estimators. We also illustrate the validity of the theory via simulations.
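For intuition, below is a minimal sketch (not the authors' code) of a neural network sieve estimator in Python: a one-hidden-layer network whose number of hidden units r_n grows with the sample size n, so the approximating class expands as more data arrive. The growth rate n**(1/4), the sigmoid activation, and the use of scikit-learn's MLPRegressor are illustrative assumptions, not choices taken from the paper.

# Minimal sketch of a neural network sieve estimator (illustrative only).
# The sieve idea: fit a one-hidden-layer network whose width r_n grows
# slowly with n; the growth rate n**0.25 below is an assumption, not the
# rate derived in the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def fit_sieve(x, y):
    n = len(y)
    r_n = max(1, int(n ** 0.25))  # sieve dimension grows with sample size
    net = MLPRegressor(
        hidden_layer_sizes=(r_n,),
        activation="logistic",    # sigmoid units, as in the universal
        solver="lbfgs",           # approximation setting
        max_iter=5000,
    )
    return net.fit(x, y)

# Simulated nonparametric regression: y = f(x) + noise.
n = 500
x = rng.uniform(-2, 2, size=(n, 1))
y = np.sin(np.pi * x[:, 0]) + 0.1 * rng.normal(size=n)

model = fit_sieve(x, y)
grid = np.linspace(-2, 2, 9).reshape(-1, 1)
print(np.c_[grid, model.predict(grid)])  # estimated regression function

In sieve estimation generally, the growth rate of r_n trades off approximation error against estimation variance; balancing the two is what drives the convergence rates the paper establishes.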
Date: 2023
Downloads: http://hdl.handle.net/10.1080/10485252.2023.2209218 (text/html; full text restricted to subscribers)
Persistent link: https://EconPapers.repec.org/RePEc:taf:gnstxx:v:35:y:2023:i:4:p:839-868
Ordering information: This journal article can be ordered from
http://www.tandfonline.com/pricing/journal/GNST20
DOI: 10.1080/10485252.2023.2209218
Journal of Nonparametric Statistics is currently edited by Jun Shao