Learning and retrieval in attractor neural networks with noise
R. Erichsen and W. K. Theumann
Physica A: Statistical Mechanics and its Applications, 1995, vol. 220, issue 3, 390-402
Abstract:
A recent study on noiseless learning and retrieval in attractor neural networks above saturation, by Griniasty and Gutfreund, is extended to take account of imperfect learning by means of a temperature β⁻¹ = T. Violations of the constraint imposed on the local stabilities are taken into account through various cost functions. The distribution of local stabilities and the fraction of errors during the learning stage are analysed. The retrieval dynamics of the sparsely connected network is studied, showing a high retrieval overlap in reduced retrieval regions for finite T.
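For orientation, the quantities named in the abstract can be written in the standard Gardner-style notation; the lines below are only a sketch under that assumption, and the symbols κ, V, J and the particular form of the cost function are illustrative, not quoted from the paper:

\Delta_i^{\mu} = \frac{\xi_i^{\mu}}{\left(\sum_{j\neq i} J_{ij}^{2}\right)^{1/2}} \sum_{j\neq i} J_{ij}\,\xi_j^{\mu} \qquad \text{(local stability of pattern } \mu \text{ at site } i\text{)}

E_i[J] = \sum_{\mu=1}^{p} V\!\left(\Delta_i^{\mu}\right), \qquad V(\Delta) = \Theta(\kappa-\Delta) \qquad \text{(one possible cost function counting violations of } \Delta_i^{\mu} \ge \kappa\text{)}

P[J] \propto e^{-\beta E_i[J]}, \qquad T = \beta^{-1} \qquad \text{(imperfect learning: couplings drawn from a Gibbs measure at temperature } T\text{)}

m^{\mu} = \frac{1}{N}\sum_{j=1}^{N} \xi_j^{\mu} S_j \qquad \text{(retrieval overlap between the network state } \{S_j\} \text{ and pattern } \mu\text{)}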
Date: 1995
Downloads: http://www.sciencedirect.com/science/article/pii/0378437195001827
Full text for ScienceDirect subscribers only. The journal offers the option of making the article available online on ScienceDirect for a fee of $3,000.
Persistent link: https://EconPapers.repec.org/RePEc:eee:phsmap:v:220:y:1995:i:3:p:390-402
DOI: 10.1016/0378-4371(95)00182-7
Physica A: Statistical Mechanics and its Applications is currently edited by K. A. Dawson, J. O. Indekeu, H. E. Stanley and C. Tsallis
More articles in Physica A: Statistical Mechanics and its Applications from Elsevier
Bibliographic data for series maintained by Catherine Liu.