Stochastic Mirror Descent for Low-Rank Tensor Decomposition Under Non-Euclidean Losses

Wenqiang Pu, Shahana Ibrahim, Xiao Fu, Mingyi Hong

Research output: Contribution to journal › Article › peer-review


Abstract

This work considers low-rank canonical polyadic decomposition (CPD) under a class of non-Euclidean loss functions that frequently arise in statistical machine learning and signal processing. These loss functions are often used for certain types of tensor data, e.g., count and binary tensors, where the least squares loss is considered unnatural. Compared to the least squares loss, the non-Euclidean losses are generally more challenging to handle. Non-Euclidean CPD has attracted considerable interest, and a number of prior works exist. However, pressing computational and theoretical challenges, such as scalability and convergence issues, still remain. This work offers a unified stochastic algorithmic framework for large-scale CPD under a variety of non-Euclidean loss functions. Our key contribution is a flexible stochastic mirror descent framework built upon a tensor fiber sampling strategy. Leveraging the sampling scheme and the multilinear algebraic structure of low-rank tensors, the proposed lightweight algorithm ensures global convergence to a stationary point under reasonable conditions. Numerical results show that our framework attains promising non-Euclidean CPD performance. The proposed framework also exhibits substantial computational savings compared to state-of-the-art methods.
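The abstract does not spell out the update rule, so the following is a minimal sketch of the algorithmic idea under stated assumptions: a 3-way nonnegative count tensor, the generalized KL divergence as the loss, and an exponentiated-gradient update (mirror descent with the negative-entropy mirror map) computed from a randomly sampled batch of mode-n fibers. The function name `smd_kl_cpd`, the step size, and the batch size are illustrative choices and are not taken from the paper.

```python
import numpy as np


def khatri_rao(A, B):
    """Column-wise Khatri-Rao product of A (I x R) and B (J x R), giving an (I*J) x R matrix."""
    I, R = A.shape
    J, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)


def smd_kl_cpd(X, R, n_iters=300, batch_fibers=32, step=0.05, seed=0, eps=1e-12):
    """Fiber-sampled stochastic mirror descent for a rank-R nonnegative CPD of a
    3-way tensor X under the generalized KL divergence (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    # random positive initialization of the three factor matrices
    factors = [rng.uniform(0.1, 1.0, size=(d, R)) for d in (I, J, K)]

    # mode-n unfoldings: rows are indexed by mode n, so each COLUMN is a mode-n fiber
    unfold = [X.reshape(I, J * K),
              np.moveaxis(X, 1, 0).reshape(J, I * K),
              np.moveaxis(X, 2, 0).reshape(K, I * J)]

    for t in range(n_iters):
        n = t % 3                                    # cycle through the modes
        A = factors[n]
        others = [factors[m] for m in range(3) if m != n]
        H = khatri_rao(others[0], others[1])         # rows index the mode-n fibers
        # sample a small batch of mode-n fibers
        idx = rng.choice(H.shape[0], size=min(batch_fibers, H.shape[0]), replace=False)
        Xn = unfold[n][:, idx]                       # sampled fibers, I_n x |batch|
        M = A @ H[idx].T + eps                       # low-rank model on those fibers
        # stochastic gradient of the generalized KL loss w.r.t. the mode-n factor
        grad = (1.0 - Xn / M) @ H[idx]
        # mirror descent step with the negative-entropy mirror map:
        # an exponentiated-gradient update, so the factor stays positive
        factors[n] = np.maximum(A * np.exp(np.clip(-step * grad, -50.0, 50.0)), eps)

    return factors


# toy usage on a synthetic Poisson (count) tensor
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A0, B0, C0 = (rng.uniform(size=(d, 5)) for d in (30, 40, 50))
    X = rng.poisson(np.einsum("ir,jr,kr->ijk", A0, B0, C0)).astype(float)
    est = smd_kl_cpd(X, R=5)
    print([F.shape for F in est])   # [(30, 5), (40, 5), (50, 5)]
```

Cycling over the modes and clipping the exponent are pragmatic choices for this sketch; the paper's actual sampling schedule, step-size rule, and convergence conditions should be taken from the article itself.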

Original language: English (US)
Pages (from-to): 1803-1818
Number of pages: 16
Journal: IEEE Transactions on Signal Processing
Volume: 70
DOIs
State: Published - 2022

Bibliographical note

Funding Information:
The work of Wenqiang Pu was supported by the NSFC, China, under Grant 62101350. The work of Xiao Fu was supported in part by NSF under Grants ECCS-1808159 and IIS-1910118 and in part by ARO under Grant W911NF-19-1-0247. The work of Mingyi Hong was supported in part by NSF under Grant CIF-1910385 and in part by ARO under Grant W911NF-19-1-0247.

Publisher Copyright:
© 1991-2012 IEEE.

Keywords

  • KL-divergence
  • Tensor decomposition
  • mirror descent method
  • stochastic optimization
  • β-divergence
