Abstract
Prior information can be incorporated into matrix completion to improve estimation accuracy and to extrapolate the missing entries. Reproducing kernel Hilbert spaces provide tools to leverage this prior information and to derive more reliable algorithms. This paper analyzes the generalization error of such approaches, and presents numerical tests confirming the theoretical results.
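The abstract's idea can be illustrated with a minimal sketch: encode the prior similarity between rows and between columns as kernel matrices, and fit the completed matrix in the span of those kernels. All names, the RBF kernel choice, and the gradient-descent fit below are illustrative assumptions, not the paper's exact model or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: rows and columns come with side features, and an
# RBF kernel encodes the prior similarity (an assumption for this sketch).
n_r, n_c, d = 30, 25, 4
F_r = rng.normal(size=(n_r, d))   # row side features
F_c = rng.normal(size=(n_c, d))   # column side features

def rbf(F, gamma=0.5):
    """RBF (Gaussian) kernel matrix of the rows of F."""
    sq = np.sum(F**2, axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * F @ F.T
    return np.exp(-gamma * D)

K_r, K_c = rbf(F_r), rbf(F_c)

# Synthetic ground truth whose rows/columns lie in the kernels' span.
X_true = 0.01 * (K_r @ rng.normal(size=(n_r, n_c)) @ K_c)

# Observe 40% of the entries at random.
mask = rng.random((n_r, n_c)) < 0.4

# Fit X = K_r @ W @ K_c by gradient descent on the observed squared
# error plus an RKHS-norm penalty lam * tr(W.T @ K_r @ W @ K_c).
W = np.zeros((n_r, n_c))
lam, lr = 1e-2, 1e-4
for _ in range(500):
    X = K_r @ W @ K_c
    R = mask * (X - X_true)                    # residual on observed entries
    grad = K_r @ R @ K_c + lam * K_r @ W @ K_c
    W -= lr * grad

X_hat = K_r @ W @ K_c
# Relative error on the observed entries (W = 0 would give exactly 1.0).
err_obs = np.linalg.norm(mask * (X_hat - X_true)) / np.linalg.norm(mask * X_true)
```

Because the estimate is constrained to the kernels' span, `X_hat` also fills in the unobserved entries, which is the "extrapolation" role the prior information plays in the abstract.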
| Original language | English (US) |
| --- | --- |
| Article number | 8974415 |
| Pages (from-to) | 326-330 |
| Number of pages | 5 |
| Journal | IEEE Signal Processing Letters |
| Volume | 27 |
| DOIs | |
| State | Published - 2020 |
Bibliographical note
Funding Information: Manuscript received June 20, 2019; revised December 25, 2019; accepted January 21, 2020. Date of publication January 27, 2020; date of current version February 29, 2020. This work was supported in part by the ERDF funds TEC2013-41315-R and TEC2016-75067-C4-2, in part by the Catalan Government 2017 SGR 578, and in part by the NSF under Grant 1500713, Grant 1514056, Grant 1711471, and Grant 1509040. The associate editor coordinating the review of this manuscript and approving it for publication was Dr. Nancy F. Chen. (Corresponding author: Pere Giménez Febrer.) P. Giménez-Febrer and A. Pagès-Zamora are with the SPCOM Group, Universitat Politècnica de Catalunya-Barcelona Tech, 08034 Barcelona, Spain (e-mail: p.gimenez@upc.edu; alba.pages@upc.edu).
Publisher Copyright:
© 1994-2012 IEEE.
Keywords
- Matrix completion
- Rademacher complexity
- generalization error
- kernel regression