Efficient algorithms for 'universally' constrained matrix and tensor factorization

Kejun Huang, Nicholas D. Sidiropoulos, Athanasios P. Liavas

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Scopus citations

Abstract

We propose a general algorithmic framework for constrained matrix and tensor factorization, which is widely used in unsupervised learning. The new framework is a hybrid between alternating optimization (AO) and the alternating direction method of multipliers (ADMM): each matrix factor is updated in turn, using ADMM. This combination can naturally accommodate a great variety of constraints on the factor matrices, hence the term 'universal'. Computation caching and warm start strategies are used to ensure that each update is evaluated efficiently, while the outer AO framework guarantees that the algorithm converges monotonically. Simulations on synthetic data show significantly improved performance relative to state-of-the-art algorithms.
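To illustrate the framework described above, here is a minimal sketch of the AO-ADMM idea applied to nonnegative matrix factorization. This is not the authors' implementation: the function names, the step-size choice `rho = trace(G)/k`, and the inner/outer iteration counts are illustrative assumptions. It shows the two ingredients highlighted in the abstract: computation caching (the Gram matrix `W.T @ W` and its Cholesky factor are computed once per factor update and reused across inner ADMM iterations) and warm starting (the primal and dual variables carry over between outer iterations).

```python
import numpy as np

def admm_factor_update(G, F, H, U, n_iters=5):
    """Approximately solve min_{H >= 0} ||Y - W @ H.T||_F^2 with a few ADMM steps.

    G = W.T @ W (k x k) and F = W.T @ Y (k x n) are cached by the caller.
    H (n x k) and U (n x k, scaled dual) are warm-started from the previous call.
    The step size rho = trace(G)/k is one common heuristic (an assumption here).
    """
    k = G.shape[0]
    rho = np.trace(G) / k
    # Cache the Cholesky factor of (G + rho*I): reused in every inner iteration.
    L = np.linalg.cholesky(G + rho * np.eye(k))
    for _ in range(n_iters):
        rhs = F + rho * (H + U).T                            # k x n
        Ht = np.linalg.solve(L.T, np.linalg.solve(L, rhs))   # unconstrained solve
        H = np.maximum(0.0, Ht.T - U)                        # prox: project onto H >= 0
        U = U + H - Ht.T                                     # scaled dual update
    return H, U

def ao_admm_nmf(Y, k, n_outer=100, seed=0):
    """Alternating optimization: update each factor of Y ~ W @ H.T in turn via ADMM."""
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    W, H = rng.random((m, k)), rng.random((n, k))
    UW, UH = np.zeros_like(W), np.zeros_like(H)   # dual variables, warm-started
    for _ in range(n_outer):
        H, UH = admm_factor_update(W.T @ W, W.T @ Y, H, UH)
        W, UW = admm_factor_update(H.T @ H, H.T @ Y.T, W, UW)
    return W, H
```

Swapping the nonnegativity projection (`np.maximum(0.0, ...)`) for another proximal operator is what makes the scheme "universal": any constraint or regularizer with a computable prox slots into the same inner loop, and the same idea extends to tensor factorization by replacing `Y` with a matricized tensor in each factor update.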

Original language: English (US)
Title of host publication: 2015 23rd European Signal Processing Conference, EUSIPCO 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2521-2525
Number of pages: 5
ISBN (Electronic): 9780992862633
DOIs
State: Published - Dec 22 2015
Event: 23rd European Signal Processing Conference, EUSIPCO 2015 - Nice, France
Duration: Aug 31 2015 - Sep 4 2015

Publication series

Name: 2015 23rd European Signal Processing Conference, EUSIPCO 2015

Other

Other: 23rd European Signal Processing Conference, EUSIPCO 2015
Country: France
City: Nice
Period: 8/31/15 - 9/4/15

Bibliographical note

Funding Information:
Supported in part by NSF IIS-1247632, IIS-1447788, and a UM Informatics Institute fellowship
