A Complex Neural Network Algorithm for Computing the Largest Real Part Eigenvalue and the corresponding Eigenvector of a Real Matrix
- DOI
- 10.2991/icsma-16.2016.100
- Keywords
- Complex neural network; real matrix; largest real part; eigenvalue; eigenvector
- Abstract
In this study, we propose a novel complex neural network algorithm that extends neural-network-based approaches, which asymptotically compute the largest or smallest eigenvalues and corresponding eigenvectors of real symmetric matrices, to the direct computation of the eigenvalue with the largest real part, and its corresponding eigenvector, of a general real matrix. The proposed neural network algorithm is described by a group of complex differential equations derived from the classical neural network model. The algorithm belongs to the class of continuous-time recurrent neural networks (RNNs); it processes information in parallel and asynchronously, and can therefore achieve high computing capability. This paper provides a rigorous mathematical proof of its convergence for real matrices, giving a clearer understanding of the network dynamics underlying the computation of the eigenvalue and eigenvector. The proposed approach has clear advantages, such as fast convergence and insensitivity to the initial value. Numerical examples show that the proposed algorithm performs well.
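The abstract does not reproduce the paper's differential equations. As an illustration only, the NumPy sketch below integrates one well-known continuous-time flow with the same goal, ż = Az − r(z)z with the complex Rayleigh quotient r(z) = zᴴAz / zᴴz: since the solution direction follows e^{At}z(0), the state aligns with the eigenvector whose eigenvalue has the largest real part (assumed simple and real here), and r(z) approaches that eigenvalue. The function name, test matrix, step size, and iteration count are illustrative assumptions, not the authors' model.

```python
# Minimal sketch (NOT the paper's exact equations): forward-Euler integration of
# the flow z'(t) = A z - r(z) z, where r(z) = (z^H A z) / (z^H z).
import numpy as np

def largest_real_part_eig(A, steps=50000, dt=1e-3, seed=0):
    """Estimate the eigenvalue of A with the largest real part via an ODE flow."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Random complex initial state (the "complex" network state vector).
    z = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    for _ in range(steps):
        r = np.vdot(z, A @ z) / np.vdot(z, z)  # complex Rayleigh quotient z^H A z / z^H z
        z = z + dt * (A @ z - r * z)           # Euler step of the flow
        z = z / np.linalg.norm(z)              # rescaling only, for numerical robustness
    r = np.vdot(z, A @ z) / np.vdot(z, z)
    return r, z

if __name__ == "__main__":
    # Non-symmetric test matrix; its largest-real-part eigenvalue (2 + sqrt(2)) is real and simple.
    A = np.array([[4.0, 1.0,  0.0],
                  [0.0, 1.0, -2.0],
                  [1.0, 0.0,  2.0]])
    lam, v = largest_real_part_eig(A)
    print("flow estimate of largest-real-part eigenvalue:", lam)
    print("all eigenvalues (numpy check):", np.linalg.eigvals(A))
```

This sketch only conveys the general idea of letting a continuous-time network state evolve until it aligns with the desired eigenvector; the paper's own complex RNN equations, convergence proof, and treatment of complex eigenvalues are given in the full text.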
- Copyright
- © 2016, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - Hang Tan
AU  - Xuesong Liang
AU  - Liping Wan
PY  - 2016/12
DA  - 2016/12
TI  - A Complex Neural Network Algorithm for Computing the Largest Real Part Eigenvalue and the corresponding Eigenvector of a Real Matrix
BT  - Proceedings of the 2016 4th International Conference on Sensors, Mechatronics and Automation (ICSMA 2016)
PB  - Atlantis Press
SP  - 577
EP  - 585
SN  - 1951-6851
UR  - https://doi.org/10.2991/icsma-16.2016.100
DO  - 10.2991/icsma-16.2016.100
ID  - Tan2016/12
ER  -