The estimation of simultaneous approximation order for neural networks
Fengjun Li and Zongben Xu
Chaos, Solitons & Fractals, 2008, vol. 36, issue 3, 572-580
Abstract:
A three-layer feedforward artificial neural network with trigonometric hidden-layer units is constructed. The essential order of approximation for the network, which can simultaneously approximate a function and its derivatives, is estimated, and a saturation theorem (characterizing the largest capacity of simultaneous approximation) is proved. These results precisely characterize the approximation ability of the network and the relationship among the rate of simultaneous approximation, the topology of the hidden layer, and the properties of the approximated functions.
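The construction described in the abstract can be illustrated with a minimal sketch (not the paper's exact construction or error bounds): a one-hidden-layer network whose hidden units are trigonometric, N(x) = Σ_k c_k·cos(w_k·x + b_k), with fixed frequencies and phases and only the output weights fitted. Because the derivative N′(x) shares the same output weights, a single fit approximates both f and f′ simultaneously. All function names and parameter choices below are illustrative assumptions.

```python
import numpy as np

def fit_trig_network(f, n=12, m=400):
    """Fit the output weights of a cosine-unit network to f on [-pi, pi].

    Hidden units are cos(k*x + b): phase b = 0 gives cos(k*x) and
    b = -pi/2 gives sin(k*x), so frequencies 0..n with both phases span a
    trigonometric basis of degree n. Only the output weights are solved for.
    """
    ks = np.concatenate(([0], np.repeat(np.arange(1, n + 1), 2)))
    bs = np.concatenate(([0.0], np.tile([0.0, -np.pi / 2], n)))
    x = np.linspace(-np.pi, np.pi, m)
    Phi = np.cos(np.outer(x, ks) + bs)          # hidden-layer outputs, (m, 2n+1)
    c, *_ = np.linalg.lstsq(Phi, f(x), rcond=None)
    return ks, bs, c

def network(x, ks, bs, c):
    """Network output N(x) = sum_k c_k * cos(k*x + b_k)."""
    return np.cos(np.outer(np.atleast_1d(x), ks) + bs) @ c

def network_deriv(x, ks, bs, c):
    """Exact derivative N'(x) = -sum_k c_k * k * sin(k*x + b_k)."""
    return (-np.sin(np.outer(np.atleast_1d(x), ks) + bs) * ks) @ c

# Example: approximate the smooth periodic target f(x) = exp(sin x)
# and check that the same weights also approximate its derivative.
f = lambda x: np.exp(np.sin(x))
df = lambda x: np.cos(x) * np.exp(np.sin(x))
ks, bs, c = fit_trig_network(f)
xs = np.linspace(-3, 3, 101)
err_f = np.max(np.abs(network(xs, ks, bs, c) - f(xs)))
err_df = np.max(np.abs(network_deriv(xs, ks, bs, c) - df(xs)))
```

For a smooth periodic target the error in both f and f′ decays rapidly as the number of hidden units grows; the paper's contribution is to make this rate (and its saturation limit) precise in terms of the hidden-layer topology and the smoothness of the target.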
Date: 2008
Downloads: http://www.sciencedirect.com/science/article/pii/S096007790700015X (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:chsofr:v:36:y:2008:i:3:p:572-580
DOI: 10.1016/j.chaos.2007.01.020
Chaos, Solitons & Fractals is currently edited by Stefano Boccaletti and Stelios Bekiros