Categorization in a layered neural network
J.A. Martins and
W.K. Theumann
Physica A: Statistical Mechanics and its Applications, 1998, vol. 253, issue 1, 38-56
Abstract:
The layered feedforward network of Domany, Meir and Kinzel is extended to investigate its ability to recognize any one of a macroscopic number of concepts (ancestors) when the network is trained on a finite number of examples (descendants) of each concept, by means of a generalized Hebbian learning rule between cells on two consecutive layers. Learning curves describing the generalization error are obtained, as well as phase diagrams and basins of attraction that exhibit the possible coexistence of a generalization phase, a retrieval phase of examples, and a spin-glass phase.
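The concept/example setup with a Hebbian rule between consecutive layers can be sketched roughly as follows. This is a minimal illustrative toy, not the paper's actual model: all parameter values (N, p, s, b) are assumptions, only a single pair of layers is simulated, and the update is taken at zero temperature.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200  # neurons per layer (illustrative size)
p = 3    # number of concepts (ancestors)
s = 5    # examples (descendants) per concept
b = 0.8  # correlation of each example with its concept

# Concepts: random binary (+/-1) patterns
concepts = rng.choice([-1, 1], size=(p, N))

# Examples: noisy copies of the concepts; each unit agrees with the
# concept with probability (1 + b) / 2
flip = rng.random((p, s, N)) < (1 - b) / 2
examples = np.where(flip, -concepts[:, None, :], concepts[:, None, :])

# Hebbian couplings between two consecutive layers, summed over all
# p*s examples: J_ij = (1/N) * sum_a xi_a_i * xi_a_j
flat = examples.reshape(p * s, N)
J = flat.T @ flat / N

# Zero-temperature update of the next layer with a concept as input,
# then measure the overlap of the output with that same concept;
# an overlap near 1 corresponds to the generalization phase
next_layer = np.sign(J @ concepts[0])
overlap = float(np.mean(next_layer * concepts[0]))
```

With a small number of examples per concept and a low load p·s/N, the output layer aligns strongly with the presented concept, which is the qualitative behaviour the generalization phase of the paper describes.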
Date: 1998
Downloads:
http://www.sciencedirect.com/science/article/pii/S0378437197006894
Full text for ScienceDirect subscribers only. The journal offers the option of making the article available online on ScienceDirect for a fee of $3,000.
Persistent link: https://EconPapers.repec.org/RePEc:eee:phsmap:v:253:y:1998:i:1:p:38-56
DOI: 10.1016/S0378-4371(97)00689-4
Physica A: Statistical Mechanics and its Applications is currently edited by K.A. Dawson, J.O. Indekeu, H.E. Stanley and C. Tsallis