Abstract
New learning rules for computing eigenspaces and eigenvectors of symmetric and nonsymmetric matrices are proposed. By applying Liapunov stability theory, these systems are shown to be globally convergent. Properties of the systems' limiting solutions, as well as weighted versions of the rules, are also examined. The proposed systems may be viewed as generalizations of Oja's and Xu's principal subspace learning rules. Numerical examples illustrating the convergence behavior are also presented.
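The abstract cites Oja's principal subspace learning rule as the classical special case being generalized. A minimal sketch of that classical rule (not the paper's new systems) is shown below; the synthetic covariance, learning rate, and sample count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic data (assumption, not from the paper): a covariance
# whose top-2 eigenvalues dominate, so the principal subspace is well separated.
n, k = 4, 2
C = np.diag([5.0, 4.0, 0.5, 0.1])
X = rng.multivariate_normal(np.zeros(n), C, size=20000)

W = 0.1 * rng.standard_normal((n, k))  # random initial weight matrix
eta = 0.001                            # small constant learning rate (assumption)

# Oja's principal subspace rule: with output y = W^T x, update
#   W <- W + eta * (x y^T - W y y^T)
for x in X:
    y = W.T @ x
    W += eta * (np.outer(x, y) - W @ np.outer(y, y))

# The columns of W converge to an orthonormal basis of the principal
# (top-k eigenvalue) subspace of the data covariance.
print(np.round(W.T @ W, 2))  # approximately the 2x2 identity
```

Under these assumptions, the learned basis is approximately orthonormal and its columns lie almost entirely in the span of the top two eigenvectors of `C`.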
Original language | English (US) |
---|---|
Article number | 4253004 |
Pages (from-to) | 1779-1782 |
Number of pages | 4 |
Journal | Proceedings - IEEE International Symposium on Circuits and Systems |
DOIs | |
State | Published - 2007 |
Event | 2007 IEEE International Symposium on Circuits and Systems, ISCAS 2007 - New Orleans, LA, United States. Duration: May 27, 2007 → May 30, 2007 |
Keywords
- Global convergence
- Liapunov stability
- Minor components
- Oja's learning rule
- Principal components