Neural networks: a need for caution
B. Curry and P. Morgan
Omega, 1997, vol. 25, issue 1, 123-133
Abstract:
This paper deals with the computational aspects of neural networks. Specifically, it is suggested that the now-traditional method of backpropagation (BP) may not be the most appropriate basis for learning. The argument is based on the known deficiencies of gradient descent methods, of which BP is an application. Simulation results also suggest that improved performance may be obtained by employing direct optimization procedures such as the polytope algorithm. The main reason for such performance differences appears to be that the root-mean-square error surface contains narrow 'valleys' and other anomalies.
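The abstract's central claim can be illustrated with a small sketch (this is not a reproduction of the paper's own simulations, whose details are not given here): on a surface with a narrow curved valley, fixed-step gradient descent makes slow progress, while a direct polytope (Nelder-Mead simplex) search, here taken from SciPy, follows the valley without gradient information. The Rosenbrock function is used as a stand-in for the kind of anomalous error surface the abstract describes.

```python
# Illustrative sketch (assumed setup, not the paper's experiments):
# compare fixed-step gradient descent with the polytope (Nelder-Mead)
# method on the Rosenbrock function, whose minimum at (1, 1) sits at the
# end of a narrow curved valley.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(p):
    x, y = p
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

def rosenbrock_grad(p):
    x, y = p
    return np.array([
        -2 * (1 - x) - 400 * x * (y - x ** 2),
        200 * (y - x ** 2),
    ])

start = np.array([-1.2, 1.0])

# Plain gradient descent: the valley's steep walls force a small fixed
# step size, so movement along the valley floor is very slow.
p = start.copy()
for _ in range(2000):
    p = p - 1e-3 * rosenbrock_grad(p)
f_gd = rosenbrock(p)

# Direct (derivative-free) polytope search: the simplex adapts its shape
# to the valley and reaches the minimum without any gradient evaluations.
res = minimize(rosenbrock, start, method="Nelder-Mead")
f_nm = res.fun

print(f"gradient descent: f = {f_gd:.6f}")
print(f"Nelder-Mead:      f = {f_nm:.2e}")
```

On this surface the polytope run typically drives the objective far lower than the same budget of fixed-step gradient steps, consistent with the performance difference the abstract reports.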
Keywords: neural network; backpropagation; polytope; gradient descent; direct optimization
Date: 1997
Citations: 17 (in EconPapers)
Full text (ScienceDirect subscribers only): http://www.sciencedirect.com/science/article/pii/S0305-0483(96)00052-7
Persistent link: https://EconPapers.repec.org/RePEc:eee:jomega:v:25:y:1997:i:1:p:123-133
Omega is currently edited by B. Lev