Generalization Error Bounds for Kernel Matrix Completion and Extrapolation

Pere Gimenez-Febrer, Alba Pages-Zamora, Georgios B. Giannakis

Research output: Contribution to journal › Article › peer-review



Prior information can be incorporated into matrix completion to improve estimation accuracy and to extrapolate the missing entries. Reproducing kernel Hilbert spaces provide tools to leverage such prior information and to derive more reliable algorithms. This paper analyzes the generalization error of these approaches and presents numerical tests confirming the theoretical results.
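As a rough illustration of the setting the abstract describes, the sketch below completes a partially observed matrix column by column using kernel ridge regression over the rows, with a row-similarity kernel encoding the prior information. This is a minimal sketch under assumed conventions (the function name, the regularization parameter `lam`, and the per-column regression scheme are illustrative choices), not the paper's exact algorithm.

```python
import numpy as np

def kernel_complete(X, mask, K, lam=1e-2):
    """Fill in the missing entries of X column by column.

    X    : (n, m) data matrix; unobserved entries may hold any value.
    mask : (n, m) boolean array, True where X is observed.
    K    : (n, n) kernel matrix encoding similarity between rows
           (the prior information), e.g. an RBF kernel on row features.
    lam  : ridge regularization parameter (illustrative default).
    """
    n, m = X.shape
    Xhat = X.astype(float).copy()
    for j in range(m):
        obs = np.flatnonzero(mask[:, j])
        if obs.size == 0:
            continue  # nothing observed in this column
        # Kernel ridge fit on the observed rows:
        # solve (K_oo + lam * I) alpha = x_o
        Koo = K[np.ix_(obs, obs)]
        alpha = np.linalg.solve(Koo + lam * np.eye(obs.size), X[obs, j])
        # Predict every row; the kernel extrapolates to unobserved rows
        Xhat[:, j] = K[:, obs] @ alpha
        # Keep the observed entries as given
        Xhat[obs, j] = X[obs, j]
    return Xhat
```

With a kernel that matches how the rows actually vary (e.g. an RBF kernel over smoothly varying row features), the regression can interpolate the masked entries and even extrapolate to fully unobserved rows, which is the scenario whose generalization error the paper bounds.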

Original language: English (US)
Article number: 8974415
Pages (from-to): 326-330
Number of pages: 5
Journal: IEEE Signal Processing Letters
State: Published - 2020

Bibliographical note

Funding Information:
Manuscript received June 20, 2019; revised December 25, 2019; accepted January 21, 2020. Date of publication January 27, 2020; date of current version February 29, 2020. This work was supported in part by the ERDF funds TEC2013-41315-R and TEC2016-75067-C4-2, in part by the Catalan Government 2017 SGR 578, and in part by the NSF under Grant 1500713, Grant 1514056, Grant 1711471, and Grant 1509040. The associate editor coordinating the review of this manuscript and approving it for publication was Dr. Nancy F. Chen. (Corresponding author: Pere Giménez Febrer.) P. Giménez-Febrer and A. Pagès-Zamora are with the SPCOM Group, Universitat Politècnica de Catalunya-Barcelona Tech, 08034 Barcelona, Spain (e-mail:;

Publisher Copyright:
© 1994-2012 IEEE.


Keywords:

  • Matrix completion
  • Rademacher complexity
  • generalization error
  • kernel regression


