Spectrally Sparse Nonparametric Regression via Elastic Net Regularized Smoothers

Research output: Contribution to journal › Article › peer-review

Abstract

Nonparametric regression frameworks, such as generalized additive models (GAMs) and smoothing spline analysis of variance (SSANOVA) models, extend the generalized linear model (GLM) by allowing for unknown functional relationships between an exponential family response variable and a collection of predictor variables. The unknown functional relationships are typically estimated using penalized likelihood estimation, which adds a roughness penalty to the (negative) log-likelihood function. In this article, I propose a spectral parameterization of a smoothing spline, which allows for an efficient application of Elastic Net regression to smooth and select eigenvectors of a kernel matrix. The classic (ridge regression) solution for a smoothing spline is a special case of the proposed kernel eigenvector smoothing and selection operator. Extensions for tensor product smoothers are developed for both the GAM and SSANOVA frameworks. Using simulated and real data examples, I demonstrate that the proposed approach offers practical and computational gains over typical approaches for fitting GAMs, SSANOVA models, and Elastic Net penalized GLMs. Supplementary materials for this article are available online.
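
The core idea — eigendecompose a kernel matrix, treat scaled eigenvectors as regression features, and fit them with an elastic net so that the ridge part smooths and the lasso part selects — can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the paper's code: the Gaussian kernel, the penalty scaling, and the coordinate-descent helper `elastic_net_cd` are all assumptions standing in for the spline reproducing kernel and solver used in the article.

```python
# Illustrative sketch (not the paper's implementation) of kernel eigenvector
# smoothing and selection: spectral features from a kernel matrix, fit by an
# elastic net via coordinate descent.
import numpy as np

rng = np.random.default_rng(0)
n = 60
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(n)

# Gaussian kernel as a stand-in for a smoothing-spline reproducing kernel
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.05)
evals, evecs = np.linalg.eigh(K)
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]
Z = evecs * np.sqrt(np.clip(evals, 0.0, None))  # scaled spectral features

def elastic_net_cd(Z, y, lam, alpha, n_iter=200):
    """Minimize (1/2n)||y - Zb||^2 + lam*(alpha*||b||_1 + (1-alpha)/2*||b||_2^2)
    by cyclic coordinate descent with soft-thresholding."""
    n, p = Z.shape
    b = np.zeros(p)
    col_sq = (Z ** 2).sum(axis=0) / n
    r = y - Z @ b
    for _ in range(n_iter):
        for j in range(p):
            r += Z[:, j] * b[j]                 # remove coordinate j from fit
            rho = Z[:, j] @ r / n
            b[j] = (np.sign(rho) * max(abs(rho) - lam * alpha, 0.0)
                    / (col_sq[j] + lam * (1 - alpha)))
            r -= Z[:, j] * b[j]                 # restore with updated b[j]
    return b

b = elastic_net_cd(Z, y, lam=0.01, alpha=0.5)
fit = Z @ b
print("selected eigenvectors:", int((b != 0).sum()), "of", len(b))
```

With `alpha=0` this reduces to a pure ridge penalty on the spectral coefficients, which is the classic smoothing-spline solution the abstract identifies as a special case; `alpha > 0` zeroes out eigenvectors whose spectral signal falls below the soft threshold.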

Original language: English (US)
Pages (from-to): 182-191
Number of pages: 10
Journal: Journal of Computational and Graphical Statistics
Volume: 30
Issue number: 1
DOIs
State: Published - Sep 29 2020

Bibliographical note

Publisher Copyright:
© 2020 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.

Keywords

  • Algorithms
  • Functional data
  • Generalized additive model
  • Regularization methods
  • Smoothing spline ANOVA
