Analytic Regularity and Approximation Limits of Coefficient-Constrained Shallow Networks

Jean-Gabriel Attali

Papers from arXiv.org

Abstract: We study approximation limits of single-hidden-layer neural networks with analytic activation functions under global coefficient constraints. Under uniform $\ell^1$ bounds, or more generally under sub-exponential growth of the coefficients, we show that such networks generate model classes with strong quantitative regularity, leading to uniform analyticity of the realized functions. As a consequence, up to an exponentially small residual term, the error of best network approximation on generic target functions is bounded from below by the error of best polynomial approximation. In particular, networks with analytic activations and controlled coefficients cannot outperform classical polynomial approximation rates on non-analytic targets. The underlying rigidity phenomenon extends to smooth but non-analytic activations satisfying Gevrey-type regularity assumptions, yielding sub-exponential variants of the approximation barrier. The analysis is entirely deterministic and relies on a comparison argument combined with classical Bernstein-type estimates; extensions to higher dimensions are also discussed.
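
A schematic reading of the central comparison, offered as a hedged sketch: the class $\mathcal{N}_n^{B}$, the polynomial degree $m$, and the constants $C, c > 0$ below are illustrative notation, not taken from the paper itself. Writing $E_m(g) := \inf_{\deg p \le m} \|g - p\|_\infty$ for the error of best polynomial approximation of degree $m$, the claimed lower bound would read

$$\inf_{f \in \mathcal{N}_n^{B}} \|g - f\|_\infty \;\ge\; E_m(g) - C e^{-c m},$$

where $\mathcal{N}_n^{B}$ denotes single-hidden-layer networks with analytic activation whose coefficients have $\ell^1$ norm at most $B$, and $g$ is a generic (non-analytic) target. Under the Gevrey-type assumptions, the exponentially small residual $C e^{-c m}$ would be replaced by a sub-exponential term, for instance of the form $C e^{-c m^{1/s}}$ for Gevrey order $s \ge 1$.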

Date: 2026-01

Downloads: http://arxiv.org/pdf/2601.04914 Latest version (application/pdf)

Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:2601.04914

Handle: RePEc:arx:papers:2601.04914