Abstract
Canonical polyadic decomposition (CPD) has been a workhorse for multimodal data analytics. This work puts forth a stochastic algorithmic framework for CPD under the β-divergence, which is well-motivated in statistical learning, where the Euclidean distance is often not the preferred fit metric. Despite a series of prior works on this topic, pressing computational and theoretical challenges, e.g., scalability and convergence issues, remain. In this paper, a unified stochastic mirror descent framework is developed for large-scale β-divergence CPD. Our key contribution is the integrated design of a tensor fiber sampling strategy and a flexible stochastic Bregman divergence-based mirror descent iterative procedure, which significantly reduces the per-iteration computation and memory cost for various β. Leveraging the fiber sampling scheme and the multilinear algebraic structure of low-rank tensors, the proposed lightweight algorithm also ensures global convergence to a stationary point under mild conditions. Numerical results on synthetic and real data show that our framework attains significant computational savings compared with state-of-the-art methods.
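For context, the β-divergence referenced in the abstract is the standard family that interpolates between the Itakura-Saito divergence (β=0), generalized Kullback-Leibler divergence (β=1), and the (half) squared Euclidean distance (β=2). The sketch below implements only this standard loss, not the paper's sampling or mirror descent algorithm; the function name is illustrative.

```python
import numpy as np

def beta_divergence(x, y, beta):
    """Sum of elementwise beta-divergences d_beta(x | y).

    Special cases: beta=0 is Itakura-Saito, beta=1 is generalized
    Kullback-Leibler, beta=2 is half the squared Euclidean distance.
    Entries of x and y are assumed strictly positive for beta < 1.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if beta == 0:    # Itakura-Saito divergence
        return np.sum(x / y - np.log(x / y) - 1.0)
    if beta == 1:    # generalized Kullback-Leibler divergence
        return np.sum(x * np.log(x / y) - x + y)
    # general case, beta not in {0, 1}
    return np.sum((x**beta + (beta - 1.0) * y**beta
                   - beta * x * y**(beta - 1.0)) / (beta * (beta - 1.0)))
```

For example, `beta_divergence(2.0, 1.0, beta=2)` evaluates to `(4 + 1 - 4)/2 = 0.5`, i.e., half the squared distance between 2 and 1, and every member of the family vanishes when `x == y`.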
| Original language | English (US) |
|---|---|
| Pages (from-to) | 2925-2929 |
| Number of pages | 5 |
| Journal | ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings |
| Volume | 2021-June |
| DOIs | |
| State | Published - 2021 |
| Event | 2021 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2021 - Virtual, Toronto, Canada. Duration: Jun 6 2021 → Jun 11 2021 |
Bibliographical note
Funding Information: X. Fu is supported in part by NSF ECCS 1808159, IIS-1910118 and ARO award W911NF-19-1-0247. M. Hong is supported in part by NSF Award CIF-1910385 and ARO award W911NF-19-1-0247.
Publisher Copyright:
© 2021 IEEE.
Keywords
- β-divergence
- Mirror descent method
- Stochastic optimization
- Tensor decomposition