General Recurrent Neural Network for Solving Generalized Linear Matrix Equation
Zhan Li, Hong Cheng and Hongliang Guo
Complexity, 2017, vol. 2017, 1-7
Abstract:
This brief proposes a general framework of a nonlinear recurrent neural network for solving the generalized linear matrix equation (GLME) online, with a global convergence property. If a linear activation function is used, the neural state matrix of the recurrent neural network converges globally and exponentially to the unique theoretical solution of the GLME. In addition, compared with the linear activation function, two specific types of nonlinear activation functions are proposed for the general nonlinear recurrent neural network model to achieve superior convergence. Illustrative examples demonstrate the efficacy of the general nonlinear recurrent neural network model and its superior convergence when activated by these nonlinear activation functions.
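The record does not reproduce the paper's specific GLME form or network dynamics. Purely as a hedged illustration of the kind of model the abstract describes, the sketch below assumes the simplest GLME form A X B = C and a gradient-style recurrent dynamics dX/dt = -gamma * A^T F(A X B - C) B^T with an elementwise activation F, integrated by forward Euler. The function name solve_glme_rnn, the parameter values, and the choice of tanh as a nonlinear activation are illustrative assumptions, not the authors' model.

```python
import numpy as np

def solve_glme_rnn(A, B, C, activation=np.tanh, gamma=5.0, dt=1e-3, steps=20000):
    """Sketch of a recurrent-network-style solver for the matrix equation A X B = C.

    The neural state matrix X(t) is assumed to evolve along
        dX/dt = -gamma * A.T @ F(A @ X @ B - C) @ B.T,
    where F is an elementwise activation (the identity gives the
    linear-activation case; nonlinear choices such as tanh change the
    convergence behaviour). Integrated here with a forward-Euler scheme.
    """
    X = np.zeros((A.shape[1], B.shape[0]))        # initial neural state
    for _ in range(steps):
        E = A @ X @ B - C                         # residual (error) matrix
        X -= dt * gamma * (A.T @ activation(E) @ B.T)
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))
    B = rng.standard_normal((3, 3))
    X_true = rng.standard_normal((3, 3))
    C = A @ X_true @ B
    X = solve_glme_rnn(A, B, C, activation=lambda e: e)   # linear activation
    print(np.linalg.norm(A @ X @ B - C))           # residual norm of the computed state
```

With the identity activation, these dynamics are simply gradient flow on the squared Frobenius-norm residual, which is consistent with the exponential convergence claimed for the linear-activation case when the solution is unique.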
Date: 2017
Downloads: (external link)
http://downloads.hindawi.com/journals/8503/2017/9063762.pdf (application/pdf)
http://downloads.hindawi.com/journals/8503/2017/9063762.xml (text/xml)
Persistent link: https://EconPapers.repec.org/RePEc:hin:complx:9063762
DOI: 10.1155/2017/9063762