Differential and geometric properties of Rayleigh quotients with applications

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, learning rules are proposed for the simultaneous computation of minor eigenvectors of a covariance matrix. To understand the optimality conditions of Rayleigh quotients, many interesting related identities and properties are derived. For example, it is shown that the Hessian matrix is singular at each critical point of the Rayleigh quotient. Based on these properties, MCA rules are derived by optimizing a weighted inverse Rayleigh quotient so that the optimal weights at equilibrium points are exactly the desired eigenvectors of the covariance matrix, rather than an arbitrary orthonormal basis of the minor subspace. Variations of the derived MCA learning rules are obtained by imposing orthogonal and quadratic constraints and by changes of variables. Some of the proposed algorithms can also perform PCA by merely changing the sign of the step-size.
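The sign-flip duality between MCA and PCA mentioned in the abstract can be illustrated with a generic gradient rule on the Rayleigh quotient r(w) = wᵀCw / wᵀw. The sketch below is not the paper's weighted-inverse-Rayleigh-quotient algorithm; it is a minimal assumed example showing how ascent (positive step-size) converges to the principal eigenvector while descent (negative step-size) converges to the minor eigenvector:

```python
import numpy as np

def rayleigh_quotient_rule(C, eta, n_iters=5000, seed=0):
    """Gradient rule on the Rayleigh quotient r(w) = w'Cw / w'w.

    A positive step-size eta performs ascent (principal eigenvector);
    a negative eta performs descent (minor eigenvector). This is a
    generic illustration, not the algorithm derived in the paper.
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(C.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iters):
        r = w @ C @ w                    # Rayleigh quotient (w has unit norm)
        w = w + eta * (C @ w - r * w)    # step along the Rayleigh-quotient gradient
        w /= np.linalg.norm(w)           # project back onto the unit sphere
    return w

# Example: covariance matrix with eigenvalues 3, 1, 0.2
C = np.diag([3.0, 1.0, 0.2])
w_pca = rayleigh_quotient_rule(C, eta=+0.1)   # principal direction (PCA)
w_mca = rayleigh_quotient_rule(C, eta=-0.1)   # minor direction (MCA)
```

Note that only the sign of `eta` changes between the two calls, mirroring the abstract's observation that some of the proposed algorithms switch between MCA and PCA by flipping the step-size sign.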

Original language: English (US)
Title of host publication: ISCAS 2006
Subtitle of host publication: 2006 IEEE International Symposium on Circuits and Systems, Proceedings
Pages: 4216-4219
Number of pages: 4
State: Published - Dec 1 2006
Event: ISCAS 2006: 2006 IEEE International Symposium on Circuits and Systems - Kos, Greece
Duration: May 21 2006 - May 24 2006

Publication series

Name: Proceedings - IEEE International Symposium on Circuits and Systems
ISSN (Print): 0271-4310

Other

Other: ISCAS 2006: 2006 IEEE International Symposium on Circuits and Systems
Country: Greece
City: Kos
Period: 5/21/06 - 5/24/06

Keywords

  • Adaptive learning algorithm
  • Extreme eigenvalues
  • Minor component analysis
  • Principal component analysis

