Robustness analysis of a hybrid of recursive neural dynamics for online matrix inversion
Ke Chen and Chenfu Yi
Applied Mathematics and Computation, 2016, vol. 273, issue C, 969-975
Abstract:
Encouraged by the superior convergence performance achieved by a recently proposed hybrid of recursive neural dynamics for online matrix inversion, we investigate its robustness properties in this paper in the presence of large model-implementation errors. Theoretical analysis shows that the perturbed dynamic system remains globally stable, with a tight steady-state bound on the solution error estimated. Moreover, this paper analyses the global exponential convergence rate and the finite convergence time with which such a hybrid dynamical model reaches a relatively loose solution-error bound. Computer simulation results substantiate our analysis of the perturbed hybrid neural dynamics for online matrix inversion under large implementation errors.
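As a rough illustration of the kind of dynamics the paper studies, the sketch below simulates a generic gradient-type recursive neural dynamics dX/dt = -gamma * A^T (A X - I) + Delta(t) for inverting a constant matrix A, where Delta(t) stands in for a bounded model-implementation error. This is a minimal sketch under stated assumptions: the particular model, the function name perturbed_neural_dynamics_inverse, and the parameters gamma, dt, steps and noise_level are illustrative choices, not the authors' exact hybrid model.

```python
import numpy as np

def perturbed_neural_dynamics_inverse(A, gamma=10.0, dt=1e-3, steps=5000,
                                      noise_level=0.0, rng=None):
    """Integrate dX/dt = -gamma * A^T (A X - I) + Delta(t) by Euler steps.

    Delta(t) is a random matrix with entries bounded by noise_level,
    modelling an implementation error; returns the final state X.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = A.shape[0]
    X = np.zeros((n, n))          # initial state X(0)
    I = np.eye(n)
    for _ in range(steps):
        E = A @ X - I             # solution error E(t) = A X(t) - I
        Delta = noise_level * rng.uniform(-1.0, 1.0, size=(n, n))
        X = X + dt * (-gamma * A.T @ E + Delta)   # perturbed Euler update
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4)) + 4.0 * np.eye(4)   # well-conditioned test matrix
    for eps in (0.0, 0.5):
        X = perturbed_neural_dynamics_inverse(A, noise_level=eps, rng=rng)
        err = np.linalg.norm(A @ X - np.eye(4))
        print(f"noise bound {eps}: steady-state error ||AX - I|| = {err:.2e}")
```

Running this with noise_level = 0 versus 0.5 shows the qualitative behaviour the abstract describes: the unperturbed flow drives ||AX - I|| essentially to zero, while the perturbed flow settles at a bounded, non-zero steady-state error.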
Keywords: Recursive neural dynamics; Online matrix inversion; Lyapunov stability theory; Steady-state error bound; Exponential convergence rate
Date: 2016
Full text (ScienceDirect subscribers only): http://www.sciencedirect.com/science/article/pii/S0096300315013685
Persistent link: https://EconPapers.repec.org/RePEc:eee:apmaco:v:273:y:2016:i:c:p:969-975
DOI: 10.1016/j.amc.2015.10.026
Applied Mathematics and Computation is currently edited by Theodore Simos