Matrix completion and extrapolation (MCEX) are addressed here over reproducing kernel Hilbert spaces (RKHSs) in order to account for prior information present in the available data. Aiming at a fast, low-complexity solver, the task is formulated as one of kernel ridge regression. The resulting MCEX algorithm also affords online implementation, and its class of kernel functions encompasses several existing approaches to MC with prior information. Numerical tests on synthetic and real datasets show that the novel approach is faster than widespread methods such as alternating least-squares (ALS) and stochastic gradient descent (SGD), and that it reduces the recovery error, especially when dealing with noisy data.
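To illustrate the core idea of casting matrix completion as kernel ridge regression, the following is a minimal sketch, not the authors' MCEX algorithm: each column of the matrix is treated as a function in an RKHS over the rows, a kernel matrix `K` encodes the prior information on the row side, and the missing entries of each column are extrapolated by the standard KRR closed form. The function name `krr_complete` and the regularization weight `lam` are illustrative choices, not notation from the paper.

```python
import numpy as np

def krr_complete(M, mask, K, lam=1e-2):
    """Fill in the missing entries of M column by column via kernel ridge regression.

    M    : (n, m) array with arbitrary values at unobserved positions.
    mask : (n, m) boolean array, True where an entry is observed.
    K    : (n, n) kernel matrix over the rows, encoding prior information.
    lam  : ridge regularization weight (illustrative default).
    """
    n, m = M.shape
    Mhat = np.zeros((n, m))
    for j in range(m):
        obs = np.where(mask[:, j])[0]
        if obs.size == 0:
            continue  # no observations in this column; leave it at zero
        # KRR coefficients: (K_oo + lam*I)^{-1} y_obs on the observed rows
        Koo = K[np.ix_(obs, obs)]
        alpha = np.linalg.solve(Koo + lam * np.eye(obs.size), M[obs, j])
        # Extrapolate the full column through the kernel
        Mhat[:, j] = K[:, obs] @ alpha
    return Mhat
```

When the rows correspond to points where a smooth kernel (e.g. Gaussian) is a good prior, columns generated in the span of `K` are recovered accurately from a fraction of their entries; the per-column solves are independent, which is what makes a fast, online-capable implementation plausible.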
Bibliographical note. Funding Information:
Manuscript received July 31, 2018; revised March 18, 2019; accepted July 24, 2019. Date of publication August 2, 2019; date of current version August 23, 2019. The associate editor coordinating the review of this manuscript and approving it for publication was Prof. Jarvis Haupt. This work was supported in part by the Ministerio de Economía y Competitividad of the Spanish Government and ERDF funds under Grants TEC2016-75067-C4-2-R and TEC2015-69648-REDC; in part by Catalan Government funds under Grant 2017 SGR 578 (AGAUR); and in part by NSF under Grants 1500713, 1514056, 1711471, and 1509040. (Corresponding author: Pere Giménez-Febrer.) P. Giménez-Febrer and A. Pagès-Zamora are with the SPCOM Group, Universitat Politècnica de Catalunya-Barcelona Tech, Barcelona 08034, Spain (e-mail: firstname.lastname@example.org; email@example.com).
© 2019 IEEE.
- Kernel ridge regression
- Matrix completion
- Online learning