Evolving Computational Neural Networks Through Evolutionary Computation
Xin Yao (The University of Birmingham)
Chapter 3 in GeoComputational Modelling, 2001, pp. 35-70, Springer
Abstract:
Computational neural networks (CNNs) have been widely used in many application areas in recent years. Most applications use feedforward CNNs and the backpropagation (BP) training algorithm. There are numerous variants of the classical BP algorithm, as well as other training algorithms, but all of them assume a fixed CNN architecture: they train only the weights within that fixed architecture, which comprises both the connectivity and the node transfer functions (see also Chapter 8 in this volume). The problem of designing a near-optimal CNN architecture for an application remains unsolved. This is an important issue, because there is strong biological and engineering evidence that the function, i.e. the information-processing capability, of a CNN is determined by its architecture.
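The architecture-design problem described in the abstract can be illustrated with a toy example. The sketch below is not the chapter's actual algorithm; it is a hypothetical genetic algorithm in which each genome encodes only a hidden-layer size, and fitness is the XOR error after a short backpropagation run (loosely in the spirit of the "partial training" keyword). All function names, parameter values, and the XOR task are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the chapter's method): evolve the
# architecture (hidden-layer size) of a one-hidden-layer feedforward network
# with a simple genetic algorithm; fitness = -MSE on XOR after brief training.
import math
import random

random.seed(0)
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_and_score(hidden, epochs=400, lr=0.5):
    """Briefly train an MLP with `hidden` units on XOR; return -MSE (higher is better)."""
    w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(hidden)]  # incl. bias
    w2 = [random.uniform(-1, 1) for _ in range(hidden + 1)]                  # incl. bias
    for _ in range(epochs):
        for (x, t) in XOR:
            h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
            y = sigmoid(sum(w2[i] * h[i] for i in range(hidden)) + w2[-1])
            d_y = (y - t) * y * (1 - y)                 # output-layer delta
            for i in range(hidden):
                d_h = d_y * w2[i] * h[i] * (1 - h[i])   # hidden-layer delta
                w2[i] -= lr * d_y * h[i]
                w1[i][0] -= lr * d_h * x[0]
                w1[i][1] -= lr * d_h * x[1]
                w1[i][2] -= lr * d_h
            w2[-1] -= lr * d_y
    err = 0.0
    for (x, t) in XOR:
        h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
        y = sigmoid(sum(w2[i] * h[i] for i in range(hidden)) + w2[-1])
        err += (y - t) ** 2
    return -err / len(XOR)

def evolve(pop_size=6, generations=5):
    """Truncation-selection GA over hidden-layer sizes."""
    pop = [random.randint(1, 8) for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=train_and_score, reverse=True)
        parents = ranked[: pop_size // 2]
        children = [max(1, p + random.choice([-1, 0, 1])) for p in parents]  # mutate
        pop = parents + children
    return max(pop, key=train_and_score)

best = evolve()
print("best hidden-layer size:", best)
```

Because weights are re-initialised randomly at every evaluation, the fitness signal is noisy; the chapter's broader point is precisely that such architecture search, combined with weight training, is what evolutionary approaches address.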
Keywords: Hide Node; Good Individual; Parent Network; Minimum Error Rate; Partial Training
Date: 2001
Persistent link: https://EconPapers.repec.org/RePEc:spr:adspcp:978-3-662-04637-1_3
Ordering information: This item can be ordered from
http://www.springer.com/9783662046371
DOI: 10.1007/978-3-662-04637-1_3
More chapters in Advances in Spatial Science from Springer