On optimal low rank Tucker approximation for tensors: the case for an adjustable core size

Bilian Chen, Zhening Li, Shuzhong Zhang

Research output: Contribution to journal › Article › peer-review

9 Scopus citations

Abstract

Approximating high order tensors by low Tucker-rank tensors has applications in psychometrics, chemometrics, computer vision, biomedical informatics, and other fields. Traditionally, solution methods for finding a low Tucker-rank approximation presume that the size of the core tensor is specified in advance, which may not be a realistic assumption in many applications. In this paper we propose a new computational model in which the configuration and the size of the core become part of the decisions to be optimized. Our approach is based on the so-called maximum block improvement method for non-convex block optimization. Numerical tests on various real data sets from gene expression analysis and image compression are reported, which show promising performance of the proposed algorithms.
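For context, the conventional fixed-core-size problem the abstract contrasts with can be sketched via the truncated higher-order SVD (HOSVD), a standard heuristic for low Tucker-rank approximation. The sketch below, assuming NumPy, computes a Tucker approximation for a core size specified in advance; the paper's contribution is precisely to make that core size an adjustable decision variable (via maximum block improvement), which this sketch does not implement.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move axis `mode` to the front, then flatten."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_multiply(T, M, mode):
    """Multiply tensor T by matrix M along the given mode."""
    Tm = np.moveaxis(T, mode, 0)
    out = M @ Tm.reshape(Tm.shape[0], -1)
    return np.moveaxis(out.reshape((M.shape[0],) + Tm.shape[1:]), 0, mode)

def hosvd(T, ranks):
    """Truncated HOSVD with a prescribed core size `ranks`:
    each factor is the leading left singular vectors of one unfolding;
    the core is T contracted with the factor transposes."""
    factors = []
    for n, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, n), full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for n, U in enumerate(factors):
        core = mode_multiply(core, U.T, n)
    return core, factors

def reconstruct(core, factors):
    """Rebuild the full tensor from the core and factor matrices."""
    T = core
    for n, U in enumerate(factors):
        T = mode_multiply(T, U, n)
    return T

rng = np.random.default_rng(0)
T = rng.standard_normal((6, 7, 8))
core, factors = hosvd(T, (3, 3, 3))       # core size fixed in advance
approx = reconstruct(core, factors)
rel_err = np.linalg.norm(T - approx) / np.linalg.norm(T)
```

With the core size `(3, 3, 3)` fixed a priori as above, the quality of the approximation (`rel_err`) depends entirely on that choice; the adjustable-core-size model proposed in the paper optimizes over such configurations rather than fixing one.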

Original language: English (US)
Pages (from-to): 811-832
Number of pages: 22
Journal: Journal of Global Optimization
Volume: 62
Issue number: 4
DOIs
State: Published - Aug 25 2015

Keywords

  • Low-rank approximation
  • Maximum block improvement
  • Multiway array
  • Tucker decomposition

