Feed-forward chains of recurrent attractor neural networks with finite dilution near saturation

F.L. Metz and W.K. Theumann

Physica A: Statistical Mechanics and its Applications, 2006, vol. 368, issue 1, 273-286

Abstract: A stationary-state replica analysis for a dual neural network model that interpolates between a fully recurrent symmetric attractor network and a strictly feed-forward layered network, studied by Coolen and Viana, is extended in this work to account for finite dilution of the recurrent Hebbian interactions between binary Ising units within each layer. Gradual dilution is found to suppress part of the phase transitions that arise from the competition between recurrent and feed-forward operation modes of the network. Despite this, a long chain of layers still exhibits relatively good performance under finite dilution for a balanced ratio between inter-layer and intra-layer interactions.
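
The architecture described in the abstract lends itself to a small numerical illustration. The Python sketch below builds randomly diluted Hebbian couplings within each layer and feed-forward Hebbian couplings between consecutive layers, then propagates a noisy cue along the chain with simple zero-temperature dynamics. It is only a toy rendering of the model described above: the parameter names (N, P, L, the dilution probability c, the balance nu between operation modes) and the deterministic update rule are illustrative assumptions and do not reproduce the paper's replica analysis.

```python
import numpy as np

# Toy version of the dual network from the abstract: diluted recurrent Hebbian
# couplings inside each layer plus strictly feed-forward Hebbian couplings
# between consecutive layers.  All parameters below are illustrative choices.

rng = np.random.default_rng(0)

N = 200     # binary Ising units per layer
P = 10      # stored patterns per layer
L = 5       # layers in the chain
c = 0.5     # probability that an intra-layer Hebbian bond is kept (finite dilution)
nu = 0.5    # weight of feed-forward (inter-layer) vs. recurrent (intra-layer) fields

# Independent random +/-1 patterns for every layer: xi[l, mu, i].
xi = rng.choice([-1.0, 1.0], size=(L, P, N))

def recurrent_couplings(patterns, keep_prob):
    """Symmetric intra-layer Hebbian matrix with random symmetric dilution."""
    J = patterns.T @ patterns / N              # Hebb rule, shape (N, N)
    np.fill_diagonal(J, 0.0)
    upper = np.triu(rng.random((N, N)) < keep_prob, 1)
    mask = upper | upper.T                     # dilute bonds symmetrically
    return J * mask / keep_prob                # rescale to keep the mean coupling

def feedforward_couplings(next_patterns, prev_patterns):
    """Hebbian couplings transmitting pattern mu of layer l to layer l+1."""
    return next_patterns.T @ prev_patterns / N

J_intra = [recurrent_couplings(xi[l], c) for l in range(L)]
W_ff = [feedforward_couplings(xi[l + 1], xi[l]) for l in range(L - 1)]

# Propagate a noisy cue of pattern 0 along the chain at zero temperature:
# each layer is initialised by the feed-forward input from the previous layer
# and then relaxed under the combined recurrent + feed-forward local field.
S = xi[0, 0] * rng.choice([1.0, -1.0], size=N, p=[0.9, 0.1])   # ~10% flipped bits
for l in range(L):
    ff_field = np.zeros(N) if l == 0 else W_ff[l - 1] @ S_prev
    if l > 0:
        S = np.where(ff_field >= 0, 1.0, -1.0)
    for _ in range(20):                        # a few deterministic sweeps
        h = (1.0 - nu) * (J_intra[l] @ S) + nu * ff_field
        S = np.where(h >= 0, 1.0, -1.0)
    S_prev = S.copy()
    print(f"layer {l}: overlap with its pattern 0 = {S @ xi[l, 0] / N:+.3f}")
```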

Keywords: Layered recurrent neural networks; Dilution; Stationary states (search for similar items in EconPapers)
Date: 2006
References: View complete reference list from CitEc

Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S0378437106001026
Full text for ScienceDirect subscribers only. The journal offers the option of making the article available online on ScienceDirect for a fee of $3,000.

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:eee:phsmap:v:368:y:2006:i:1:p:273-286

DOI: 10.1016/j.physa.2005.11.049


Physica A: Statistical Mechanics and its Applications is currently edited by K.A. Dawson, J.O. Indekeu, H.E. Stanley and C. Tsallis

More articles in Physica A: Statistical Mechanics and its Applications from Elsevier
Bibliographic data for series maintained by Catherine Liu.

 
Handle: RePEc:eee:phsmap:v:368:y:2006:i:1:p:273-286