Higher order orthogonal iteration of tensors (HOOT) and its relation to PCA and GLRAM

Bernard N. Sheehan, Yousef Saad

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

23 Scopus citations

Abstract

This paper presents a unified view of a number of dimension reduction techniques under the common framework of tensors. Specifically, it is established that PCA, the recently introduced 2-D PCA, and the Generalized Low Rank Approximation of Matrices (GLRAM) are all special instances of the higher order orthogonal iteration of tensors (HOOT). The connection between these algorithms and HOOT has not been pointed out before in the literature. The pros and cons of these specializations versus HOOT are discussed.
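The higher order orthogonal iteration named in the abstract is usually known as HOOI: alternately re-fit one orthonormal factor matrix per tensor mode while holding the others fixed. The sketch below, assuming NumPy and a 3-way tensor, is illustrative only; the function names (`unfold`, `hooi`) and all details are this sketch's assumptions, not the paper's code.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move axis `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hooi(T, ranks, n_iter=50):
    """Higher order orthogonal iteration (sketch) for a 3-way tensor.

    Returns orthonormal factors [U1, U2, U3] and the core tensor G,
    so that T is approximated by G x_1 U1 x_2 U2 x_3 U3.
    """
    # Initialize each factor from the leading left singular vectors of the
    # mode-n unfolding (i.e., a truncated HOSVD).
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(3):
            # Project T onto the current subspaces of every mode except n ...
            Y = T
            for m in range(3):
                if m != n:
                    Y = np.moveaxis(np.tensordot(U[m].T, Y, axes=(1, m)), 0, m)
            # ... then refresh factor n from the dominant left subspace.
            U[n] = np.linalg.svd(unfold(Y, n),
                                 full_matrices=False)[0][:, :ranks[n]]
    # Core tensor: G = T x_1 U1^T x_2 U2^T x_3 U3^T.
    G = T
    for m in range(3):
        G = np.moveaxis(np.tensordot(U[m].T, G, axes=(1, m)), 0, m)
    return U, G
```

Restricting this iteration to a matrix (2-way tensor) with one factor fixed to the identity recovers PCA's truncated SVD, while the two-sided matrix case corresponds to GLRAM's alternating updates, which is the specialization relationship the abstract describes.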

Original language: English (US)
Title of host publication: Proceedings of the 7th SIAM International Conference on Data Mining
Pages: 355-365
Number of pages: 11
State: Published - Dec 1 2007
Event: 7th SIAM International Conference on Data Mining - Minneapolis, MN, United States
Duration: Apr 26 2007 - Apr 28 2007

Publication series

Name: Proceedings of the 7th SIAM International Conference on Data Mining

Other

Other: 7th SIAM International Conference on Data Mining
Country: United States
City: Minneapolis, MN
Period: 4/26/07 - 4/28/07

Keywords

  • Dimension reduction
  • GLRAM
  • HOOT
  • HOSVD
  • Principal component analysis
  • Tensor
