EconPapers    

A new class of nonmonotone conjugate gradient training algorithms

Ioannis E. Livieris and Panagiotis Pintelas

Applied Mathematics and Computation, 2015, vol. 266, issue C, 404-413

Abstract: In this paper, we propose a new class of conjugate gradient algorithms for training neural networks, based on the modified nonmonotone scheme proposed by Shi and Wang (2011). The nonmonotone strategy enables the training algorithm to overcome the case where the sequence of iterates runs into the bottom of a curved narrow valley, a common occurrence in the neural network training process. Our proposed class of methods ensures sufficient descent, thereby avoiding the usual inefficient restarts, and is globally convergent under mild conditions. Our experimental results provide evidence that the proposed nonmonotone conjugate gradient training methods are efficient, outperforming classical methods and providing more stable, efficient and reliable learning.

Keywords: Artificial neural networks; Conjugate gradient algorithm; Nonmonotone line search; Global convergence
Date: 2015

Downloads: http://www.sciencedirect.com/science/article/pii/S0096300315006773 (full text for ScienceDirect subscribers only)



Persistent link: https://EconPapers.repec.org/RePEc:eee:apmaco:v:266:y:2015:i:c:p:404-413

DOI: 10.1016/j.amc.2015.05.053


Applied Mathematics and Computation is currently edited by Theodore Simos

More articles in Applied Mathematics and Computation from Elsevier
Bibliographic data for series maintained by Catherine Liu.

Page updated 2025-03-19
Handle: RePEc:eee:apmaco:v:266:y:2015:i:c:p:404-413