TY - JOUR
T1 - Joint Tensor Factorization and Outlying Slab Suppression With Applications
AU - Fu, Xiao
AU - Huang, Kejun
AU - Ma, Wing Kin
AU - Sidiropoulos, Nicholas D.
AU - Bro, Rasmus
N1 - Publisher Copyright:
© 2015 IEEE.
PY - 2015/12/1
Y1 - 2015/12/1
N2 - We consider factoring low-rank tensors in the presence of outlying slabs. This problem is important in practice, because data collected in many real-world applications, such as speech, fluorescence, and some social network data, fit this paradigm. Prior work tackles this problem by iteratively selecting a fixed number of slabs and fitting, a procedure that may not converge. We formulate this problem from a group-sparsity-promoting point of view, and propose an alternating optimization framework to handle the corresponding ℓp (0 < p ≤ 1) minimization-based low-rank tensor factorization problem. The proposed algorithm features per-iteration complexity similar to that of the plain trilinear alternating least squares (TALS) algorithm. Convergence of the proposed algorithm is also easy to analyze under the framework of alternating optimization and its variants. In addition, regularization and constraints can be easily incorporated to make use of a priori information on the latent loading factors. Simulations and real-data experiments on blind speech separation, fluorescence data analysis, and social network mining are used to showcase the effectiveness of the proposed algorithm.
AB - We consider factoring low-rank tensors in the presence of outlying slabs. This problem is important in practice, because data collected in many real-world applications, such as speech, fluorescence, and some social network data, fit this paradigm. Prior work tackles this problem by iteratively selecting a fixed number of slabs and fitting, a procedure that may not converge. We formulate this problem from a group-sparsity-promoting point of view, and propose an alternating optimization framework to handle the corresponding ℓp (0 < p ≤ 1) minimization-based low-rank tensor factorization problem. The proposed algorithm features per-iteration complexity similar to that of the plain trilinear alternating least squares (TALS) algorithm. Convergence of the proposed algorithm is also easy to analyze under the framework of alternating optimization and its variants. In addition, regularization and constraints can be easily incorporated to make use of a priori information on the latent loading factors. Simulations and real-data experiments on blind speech separation, fluorescence data analysis, and social network mining are used to showcase the effectiveness of the proposed algorithm.
KW - Canonical polyadic decomposition
KW - PARAFAC
KW - group sparsity
KW - iteratively reweighted
KW - outliers
KW - robustness
KW - tensor decomposition
UR - http://www.scopus.com/inward/record.url?scp=84959503792&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84959503792&partnerID=8YFLogxK
U2 - 10.1109/TSP.2015.2469642
DO - 10.1109/TSP.2015.2469642
M3 - Article
AN - SCOPUS:84959503792
SN - 1053-587X
VL - 63
SP - 6315
EP - 6328
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
IS - 23
M1 - 7208891
ER -