A generalization bound of deep neural networks for dependent data
Quan Huu Do, Binh T. Nguyen and Lam Si Tung Ho
Statistics & Probability Letters, 2024, vol. 208, issue C
Abstract:
Existing generalization bounds for deep neural networks require the data to be independent and identically distributed (i.i.d.). This assumption may not hold in real-life applications such as evolutionary biology, infectious disease epidemiology, and stock price prediction. This work establishes a generalization bound for feed-forward neural networks under non-stationary φ-mixing data.
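For context: the dependence notion named in the abstract, φ-mixing, has a standard textbook definition, reproduced here for convenience (it is not quoted from the paper, whose exact formulation may differ). For a sequence of random variables (X_t)_{t≥1}, write \mathcal{F}_1^k = \sigma(X_1, \dots, X_k) and \mathcal{F}_{k+n}^{\infty} = \sigma(X_{k+n}, X_{k+n+1}, \dots). The φ-mixing coefficients are

  \varphi(n) = \sup_{k \ge 1} \; \sup \left\{ \, \left| \mathbb{P}(B \mid A) - \mathbb{P}(B) \right| \; : \; A \in \mathcal{F}_1^k, \; \mathbb{P}(A) > 0, \; B \in \mathcal{F}_{k+n}^{\infty} \, \right\},

and the process is called φ-mixing when \varphi(n) \to 0 as n \to \infty. The i.i.d. setting assumed by earlier bounds is the degenerate case \varphi(n) = 0 for every n \ge 1, since independence gives \mathbb{P}(B \mid A) = \mathbb{P}(B).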
Keywords: Neural networks; Generalization bound; Non-stationary process; Mixing stochastic process
Date: 2024
Downloads: http://www.sciencedirect.com/science/article/pii/S0167715224000294 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:stapro:v:208:y:2024:i:c:s0167715224000294
DOI: 10.1016/j.spl.2024.110060
Statistics & Probability Letters is currently edited by Somnath Datta and Hira L. Koul
More articles in Statistics & Probability Letters from Elsevier
Bibliographic data for series maintained by Catherine Liu ().