Parallel matrix factorization for low-rank tensor completion

Yangyang Xu, Ruru Hao, Wotao Yin, Zhixun Su

Research output: Contribution to journal › Article › peer-review

203 Scopus citations


Higher-order low-rank tensors naturally arise in many applications, including hyperspectral data recovery, video inpainting, and seismic data reconstruction. We propose a new model to recover a low-rank tensor by simultaneously performing low-rank matrix factorizations to all-mode matricizations of the underlying tensor. An alternating minimization algorithm is applied to solve the model, along with two adaptive rank-adjusting strategies when the exact rank is not known. Phase transition plots reveal that our algorithm can recover a variety of synthetic low-rank tensors from significantly fewer samples than the compared methods, which include a matrix completion method applied to tensor recovery and two state-of-the-art tensor completion methods. Further tests on real-world data show similar advantages. Although our model is non-convex, our algorithm performs consistently throughout the tests and gives better results than the compared methods, some of which are based on convex models. In addition, subsequence convergence of our algorithm can be established in the sense that any limit point of the iterates satisfies the KKT conditions.
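The approach described in the abstract can be sketched in a few lines of NumPy: factor each mode-n matricization of the estimate as a product of two small matrices via alternating least squares, then re-fold a weighted combination of the factorizations while enforcing the observed entries. This is a minimal illustrative sketch under assumed details (function names `tmac_like`, `unfold`, and `fold`, equal mode weights, and a fixed iteration count are our own choices), not the authors' implementation.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization: move axis `mode` to the front, then flatten."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of unfold: reshape and move the leading axis back to `mode`."""
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def tmac_like(M_obs, mask, ranks, alpha=None, iters=200):
    """Illustrative parallel matrix factorization for tensor completion.

    M_obs : tensor holding the observed entries (zeros elsewhere)
    mask  : boolean tensor marking observed positions
    ranks : assumed rank r_n for each mode (the paper also adapts these)
    alpha : nonnegative mode weights summing to one (uniform by default)
    """
    N, shape = M_obs.ndim, M_obs.shape
    if alpha is None:
        alpha = np.ones(N) / N
    rng = np.random.default_rng(0)
    Z = M_obs.copy()
    X = [rng.standard_normal((shape[n], ranks[n])) for n in range(N)]
    Y = [rng.standard_normal((ranks[n], unfold(Z, n).shape[1])) for n in range(N)]
    for _ in range(iters):
        Z_new = np.zeros(shape)
        for n in range(N):
            Zn = unfold(Z, n)
            # alternating least-squares updates of the mode-n factors
            X[n] = Zn @ np.linalg.pinv(Y[n])
            Y[n] = np.linalg.pinv(X[n]) @ Zn
            Z_new += alpha[n] * fold(X[n] @ Y[n], n, shape)
        # keep the known entries fixed; fill the rest from the factorizations
        Z = np.where(mask, M_obs, Z_new)
    return Z
```

On a small synthetic low-rank tensor with most entries observed and the correct ranks supplied, this sketch recovers the missing entries to good accuracy; the rank-adjusting strategies and convergence analysis in the paper address the realistic case where the ranks are unknown.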

Original language: English (US)
Pages (from-to): 601-624
Number of pages: 24
Journal: Inverse Problems and Imaging
Issue number: 2
State: Published - 2015

Bibliographical note

Publisher Copyright:
© 2015 American Institute of Mathematical Sciences.


Keywords

  • Alternating least squares
  • Higher-order tensor
  • Low-rank matrix completion
  • Low-rank tensor completion
  • Non-convex optimization


