EconPapers

Statistical mechanics of neural networks: what are the differences between wide and narrow basins?

K.Y.M. Wong and D. Sherrington

Physica A: Statistical Mechanics and its Applications, 1992, vol. 185, issue 1, 453-460

Abstract: We consider training noise in neural networks as a means of tuning the structure of retrieval basins, and study how learning and retrieval properties depend on it. The stability of the replica symmetric solution and the correlation in the weight space indicate that neural networks can be roughly classified into Hebbian-like and MSN-like (MSN denoting the maximally stable network). Re-entrant retrieval, noise robustness, selectivity, damage spreading and activity distribution all illustrate the differences in retrieval behaviours arising from the different basin structures.
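The abstract contrasts retrieval basins in Hebbian-like networks with those of the maximally stable network. The paper itself is an analytic replica-theory calculation; the following is only a minimal illustrative sketch, assuming a standard Hopfield model with Hebbian weights, of what "retrieval basin" means operationally: a state started inside a pattern's basin relaxes back to that pattern under zero-temperature dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 5  # neurons, stored patterns (loading alpha = P/N = 0.025)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weights: W_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, zero diagonal
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def retrieve(state, steps=20):
    """Synchronous zero-temperature dynamics: s <- sign(W s)."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt pattern 0 by flipping 10% of its spins, then retrieve:
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1

final = retrieve(cue)
overlap = final @ patterns[0] / N  # close to 1 if the cue was inside the basin
```

At this low loading the corrupted cue lies well inside the pattern's retrieval basin, so the overlap returns close to 1. The paper's actual subject — how training noise reshapes these basins and distinguishes Hebbian-like from MSN-like networks — requires the replica analysis and is not captured by this toy simulation.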

Date: 1992
References: View complete reference list from CitEc

Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/037843719290489D
Full text for ScienceDirect subscribers only. The journal offers the option of making the article openly available on ScienceDirect for a fee of $3,000.

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Persistent link: https://EconPapers.repec.org/RePEc:eee:phsmap:v:185:y:1992:i:1:p:453-460

DOI: 10.1016/0378-4371(92)90489-D

Physica A: Statistical Mechanics and its Applications is currently edited by K. A. Dawson, J. O. Indekeu, H.E. Stanley and C. Tsallis

More articles in Physica A: Statistical Mechanics and its Applications from Elsevier
Bibliographic data for series maintained by Catherine Liu.

 
Page updated 2025-03-19
Handle: RePEc:eee:phsmap:v:185:y:1992:i:1:p:453-460