Geometrical interpretation of the back-propagation algorithm for the perceptron
Marco Budinich and Edoardo Milotti
Physica A: Statistical Mechanics and its Applications, 1992, vol. 185, issue 1, 369-377
Abstract:
It is well known that the gradient descent algorithm works well for the perceptron when a solution to the perceptron problem exists, because the cost function has a simple shape, with just one minimum, in the conjugate weight-space. Working in the conjugate space, we show that if a perceptron solution does not exist, the cost function of a perceptron with d inputs and n patterns has on average O(n^d) relative minima (for large n). In this case, finding the best solution (the solution with the minimum number of errors) becomes a challenging problem for any local search algorithm.
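A note on the geometry behind the abstract's claim: in the conjugate space each of the n patterns defines a hyperplane, and n hyperplanes in general position partition the d-dimensional weight space into O(n^d) cells on which the error count is constant, so each cell can trap a local search. The following is a minimal, hypothetical Python/NumPy sketch (not code from the paper; the random pattern set, the naive perturbation search, and all parameter values are illustrative assumptions), showing how independent restarts of a local search on the error-count cost of a non-separable problem settle at different error levels:

    import numpy as np

    rng = np.random.default_rng(0)

    d, n = 2, 20                        # inputs and patterns
    X = rng.standard_normal((n, d))     # random pattern vectors
    y = rng.choice([-1, 1], size=n)     # random targets: almost surely non-separable

    def errors(w):
        """Error-count cost: number of patterns the weight vector w misclassifies."""
        return int(np.sum(np.sign(X @ w) != y))

    def local_descent(w, steps=2000, eta=0.1):
        """Naive local search: accept a random perturbation only if it lowers the cost."""
        e = errors(w)
        for _ in range(steps):
            w_new = w + eta * rng.standard_normal(d)
            e_new = errors(w_new)
            if e_new < e:
                w, e = w_new, e_new
        return w, e

    # Restart from many random points; the spread of final error counts hints
    # at the multiple relative minima the abstract describes.
    finals = [local_descent(rng.standard_normal(d))[1] for _ in range(50)]
    print("final error counts across restarts:", sorted(set(finals)))

On a separable pattern set the same search typically reaches zero errors, consistent with the single-minimum picture described above.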
Date: 1992
Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/0378437192904778
Full text for ScienceDirect subscribers only. The journal offers the option of making the article available online on ScienceDirect for a fee of $3,000.
Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.
Export reference: BibTeX, RIS (EndNote, ProCite, RefMan), HTML/Text
Persistent link: https://EconPapers.repec.org/RePEc:eee:phsmap:v:185:y:1992:i:1:p:369-377
DOI: 10.1016/0378-4371(92)90477-8
Physica A: Statistical Mechanics and its Applications is currently edited by K. A. Dawson, J. O. Indekeu, H. E. Stanley and C. Tsallis
More articles in Physica A: Statistical Mechanics and its Applications from Elsevier
Bibliographic data for series maintained by Catherine Liu.