Mean field analysis of neural networks: A central limit theorem

Justin Sirignano and Konstantinos Spiliopoulos

Stochastic Processes and their Applications, 2020, vol. 130, issue 3, 1820-1852

Abstract: We rigorously prove a central limit theorem for neural network models with a single hidden layer. The central limit theorem is proven in the asymptotic regime of simultaneously (A) large numbers of hidden units and (B) large numbers of stochastic gradient descent training iterations. Our result describes the neural network’s fluctuations around its mean-field limit. The fluctuations have a Gaussian distribution and satisfy a stochastic partial differential equation. The proof relies upon weak convergence methods from stochastic analysis. In particular, we prove relative compactness for the sequence of processes and uniqueness of the limiting process in a suitable Sobolev space.
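As a rough guide to the scaling regime involved (the notation below is illustrative, chosen for this summary rather than quoted from the paper), the single-hidden-layer network is normalized by the number of hidden units N, the empirical measure of its parameters converges to a mean-field limit, and the central limit theorem concerns the rescaled fluctuations around that limit:

    % Illustrative sketch of the mean-field scaling (symbols assumed, not taken from the paper)
    \[
      g^N_\theta(x) \;=\; \frac{1}{N} \sum_{i=1}^{N} c^i \,\sigma\!\left(w^i \cdot x\right)
      \qquad \text{(network with } N \text{ hidden units)}
    \]
    \[
      \mu^N_t \;=\; \frac{1}{N} \sum_{i=1}^{N} \delta_{\bigl(c^i_{\lfloor Nt\rfloor},\; w^i_{\lfloor Nt\rfloor}\bigr)}
      \qquad \text{(empirical measure of the parameters after } \lfloor Nt\rfloor \text{ SGD steps)}
    \]
    \[
      \eta^N_t \;=\; \sqrt{N}\,\bigl(\mu^N_t - \bar\mu_t\bigr)
      \qquad \text{(fluctuations around the mean-field limit } \bar\mu_t\text{)}
    \]

In this notation, the paper's result is that \eta^N_t converges to a Gaussian process satisfying a stochastic partial differential equation.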

Date: 2020
References: View references in EconPapers; view the complete reference list from CitEc
Citations: View citations in EconPapers (3)

Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S0304414918306197
Full text for ScienceDirect subscribers only

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Export reference: BibTeX, RIS (EndNote, ProCite, RefMan), HTML/Text

Persistent link: https://EconPapers.repec.org/RePEc:eee:spapps:v:130:y:2020:i:3:p:1820-1852

Ordering information: This journal article can be ordered from
http://www.elsevier.com/wps/find/supportfaq.cws_home/regional
https://shop.elsevie ... _01_ooc_1&version=01

DOI: 10.1016/j.spa.2019.06.003

Access Statistics for this article

Stochastic Processes and their Applications is currently edited by T. Mikosch

More articles in Stochastic Processes and their Applications from Elsevier
Bibliographic data for series maintained by Catherine Liu.

 
Page updated 2025-03-19
Handle: RePEc:eee:spapps:v:130:y:2020:i:3:p:1820-1852