On the overtraining phenomenon of backpropagation neural networks

S.G. Tzafestas, P.J. Dalianis and G. Anthopoulos

Mathematics and Computers in Simulation (MATCOM), 1996, vol. 40, issue 5, 507-521

Abstract: An important subject in establishing neural networks as a reliable tool is the study of their capabilities. In this paper, the relationships between network size, training set size and generalization capability are examined. The phenomenon of overtraining in backpropagation networks is discussed, and an extension to an existing algorithm is described. The extended algorithm uses a new energy function; its advantages, such as improved plasticity and performance, are explained along with its dynamic properties. The algorithm is applied to some common problems (XOR, numeric character recognition and function approximation), and simulation results are presented and discussed.
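The overtraining phenomenon the abstract refers to can be illustrated with a minimal sketch: an over-parameterized backpropagation network fitted to a few noisy samples of a target function (one of the paper's test settings is function approximation), while error on a held-out set is tracked. This is not the paper's extended algorithm or energy function; the architecture, learning rate, and data here are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy function-approximation task: 10 noisy samples of sin(x) for training,
# a clean dense grid for validation (illustrative data, not from the paper).
x_train = np.linspace(-np.pi, np.pi, 10).reshape(-1, 1)
y_train = np.sin(x_train) + 0.3 * rng.standard_normal(x_train.shape)
x_val = np.linspace(-np.pi, np.pi, 100).reshape(-1, 1)
y_val = np.sin(x_val)

# Over-parameterized 1-20-1 network with tanh hidden units.
W1 = 0.5 * rng.standard_normal((1, 20)); b1 = np.zeros(20)
W2 = 0.5 * rng.standard_normal((20, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

def mse(pred, y):
    return float(np.mean((pred - y) ** 2))

lr = 0.01
history = []  # (train MSE, validation MSE) per epoch
for epoch in range(5000):
    h, pred = forward(x_train)
    err = pred - y_train                     # dL/dpred, up to a constant factor
    # Backpropagation: gradients of the MSE w.r.t. each layer's parameters.
    gW2 = h.T @ err / len(x_train)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)       # chain rule through tanh
    gW1 = x_train.T @ dh / len(x_train)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
    history.append((mse(forward(x_train)[1], y_train),
                    mse(forward(x_val)[1], y_val)))

print(f"train MSE: {history[0][0]:.3f} -> {history[-1][0]:.3f}")
```

Training error falls steadily, since the network has more than enough free parameters to absorb the noise in the 10 training points; when training is pushed far enough, the validation curve typically stops improving and can turn upward, which is the overtraining signature and the usual justification for monitoring a held-out set.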

Date: 1996
Citations: 3 (in EconPapers)

Full text (ScienceDirect, subscribers only): http://www.sciencedirect.com/science/article/pii/0378475495000038


Persistent link: https://EconPapers.repec.org/RePEc:eee:matcom:v:40:y:1996:i:5:p:507-521

DOI: 10.1016/0378-4754(95)00003-8


Mathematics and Computers in Simulation (MATCOM) is published by Elsevier and currently edited by Robert Beauwens.

 
Handle: RePEc:eee:matcom:v:40:y:1996:i:5:p:507-521