Kullback-Leibler principal component for tensors is not NP-hard

Kejun Huang, Nicholas D. Sidiropoulos

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We study the problem of nonnegative rank-one approximation of a nonnegative tensor and show that the globally optimal solution minimizing the generalized Kullback-Leibler (KL) divergence can be obtained efficiently, i.e., the problem is not NP-hard. The result holds for arbitrary nonnegative tensors with any number of modes (including two, i.e., matrices). We derive a closed-form expression for the KL principal component that is easy to compute and has an intuitive probabilistic interpretation. For generalized KL approximation with higher ranks, the problem is shown, for the first time, to be equivalent to multinomial latent variable modeling, and an iterative algorithm resembling expectation-maximization is derived. On the Iris dataset, we show how these results let us learn the model in an unsupervised manner and attain performance strikingly close to that of supervised methods.
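
For readers who want to experiment with the two results summarized above, a minimal NumPy sketch follows. This is not the authors' released code: the rank-one routine assumes the closed-form KL principal component is the tensor's total mass times the outer product of its normalized mode-wise marginals (consistent with the "product of marginals" probabilistic interpretation), and the rank-R routine implements standard EM updates for a multinomial latent variable (PLSA-style) model with column-stochastic factors. Function names and the numerical guard constants are illustrative.

```python
# Minimal sketches (not the authors' code); numpy only.
import numpy as np

def kl_principal_component(X):
    """Closed-form nonnegative rank-one approximation under generalized KL.

    Assumption: the optimum is the total mass of X times the outer product
    of its normalized mode-wise marginals. Returns (lam, factors) so the
    approximation is lam * outer(factors[0], ..., factors[N-1])."""
    X = np.asarray(X, dtype=float)
    total = X.sum()
    factors = []
    for n in range(X.ndim):
        axes = tuple(k for k in range(X.ndim) if k != n)
        factors.append(X.sum(axis=axes) / total)  # normalized mode-n marginal
    return total, factors

def em_kl_cp(X, R, iters=200, seed=0):
    """EM-style updates for a rank-R generalized-KL CP model of a 3-way
    tensor, viewed as a multinomial latent variable model: column-stochastic
    factors A, B, C and nonnegative component weights lam."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A, B, C = (rng.random((d, R)) for d in (I, J, K))
    A, B, C = A / A.sum(0), B / B.sum(0), C / C.sum(0)
    lam = np.full(R, X.sum() / R)
    for _ in range(iters):
        Xhat = np.einsum('r,ir,jr,kr->ijk', lam, A, B, C)
        ratio = X / np.maximum(Xhat, 1e-12)       # elementwise X / model
        # E- and M-steps folded together: reweighted sufficient statistics
        SA = lam * A * np.einsum('ijk,jr,kr->ir', ratio, B, C)
        SB = lam * B * np.einsum('ijk,ir,kr->jr', ratio, A, C)
        SC = lam * C * np.einsum('ijk,ir,jr->kr', ratio, A, B)
        lam = np.maximum(SA.sum(axis=0), 1e-12)   # expected mass per component
        A, B, C = SA / lam, SB / lam, SC / lam    # renormalize columns
    return lam, A, B, C

# Usage on a random nonnegative 3-way tensor:
X = np.random.default_rng(1).random((4, 5, 6))
lam, (a, b, c) = kl_principal_component(X)
approx1 = lam * np.einsum('i,j,k->ijk', a, b, c)
lamR, A, B, C = em_kl_cp(X, R=3)
approxR = np.einsum('r,ir,jr,kr->ijk', lamR, A, B, C)
```

Note that the rank-one solution uses only marginal sums, which is what makes it computable in closed form; the rank-R case requires the iterative reweighting above.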

Original language: English (US)
Title of host publication: Conference Record of 51st Asilomar Conference on Signals, Systems and Computers, ACSSC 2017
Editors: Michael B. Matthews
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 693-697
Number of pages: 5
ISBN (Electronic): 9781538618233
State: Published - Jul 2 2017
Event: 51st Asilomar Conference on Signals, Systems and Computers, ACSSC 2017 - Pacific Grove, United States
Duration: Oct 29 2017 - Nov 1 2017

Publication series

Name: Conference Record of 51st Asilomar Conference on Signals, Systems and Computers, ACSSC 2017
Volume: 2017-October

Bibliographical note

Funding Information:
This work is supported in part by NSF IIS-1247632, IIS-1447788, and IIS-1704074.

Publisher Copyright:
© 2017 IEEE.

Keywords

  • canonical polyadic decomposition
  • generalized Kullback-Leibler (KL) divergence
  • latent variable modeling
  • tensor factorization
