Abstract
Sparse penalized quantile regression is a useful tool for variable selection, robust estimation, and heteroscedasticity detection in high-dimensional data analysis. Because the quantile regression loss function is nonsmooth, the computation of sparse penalized quantile regression has not yet been fully resolved in the literature. We introduce fast alternating direction method of multipliers (ADMM) algorithms for computing sparse penalized quantile regression and establish their convergence properties. Numerical examples demonstrate the competitive performance of our algorithm: it significantly outperforms several other fast solvers for high-dimensional penalized quantile regression. Supplementary materials for this article are available online.
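To illustrate the kind of computation the abstract describes, below is a minimal sketch of a proximal (linearized) ADMM iteration for the lasso-penalized quantile regression problem min_beta (1/n) sum_i rho_tau(y_i - x_i' beta) + lam * ||beta||_1, using the splitting z = y - X beta with constraint X beta + z = y. This is a generic sketch under standard assumptions, not necessarily the exact algorithm proposed in the article; the function names, the step-size choice `eta`, and the stopping rule are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding operator."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_check_loss(v, tau, alpha):
    """Elementwise proximal map of alpha * rho_tau, where
    rho_tau(u) = u * (tau - 1{u < 0}) is the quantile (check) loss."""
    return np.where(v > tau * alpha, v - tau * alpha,
                    np.where(v < -(1.0 - tau) * alpha,
                             v + (1.0 - tau) * alpha, 0.0))

def qr_lasso_admm(X, y, tau=0.5, lam=0.1, sigma=1.0, max_iter=500, tol=1e-6):
    """Proximal (linearized) ADMM sketch for lasso-penalized quantile regression:
    minimize (1/n) * sum_i rho_tau(y_i - x_i' beta) + lam * ||beta||_1,
    with z = y - X beta and the linear constraint X beta + z = y."""
    n, p = X.shape
    beta = np.zeros(p)
    z = y.copy()
    u = np.zeros(n)                           # scaled dual variable
    eta = sigma * np.linalg.norm(X, 2) ** 2   # >= sigma * lambda_max(X'X)
    for _ in range(max_iter):
        # beta-update: one proximal-gradient (soft-thresholding) step
        r = X @ beta + z - y + u
        beta_new = soft_threshold(beta - (sigma / eta) * (X.T @ r), lam / eta)
        # z-update: closed-form proximal map of the check loss
        v = y - X @ beta_new - u
        z_new = prox_check_loss(v, tau, 1.0 / (n * sigma))
        # dual update
        u = u + X @ beta_new + z_new - y
        converged = np.linalg.norm(beta_new - beta) < tol * (1.0 + np.linalg.norm(beta))
        beta, z = beta_new, z_new
        if converged:
            break
    return beta
```

The key design point is that the z-update exploits the closed-form proximal map of the check loss, which is how ADMM sidesteps the nonsmoothness of the quantile loss; nonconvex penalties such as SCAD or MCP can be handled by replacing the soft-thresholding step with the corresponding thresholding rule.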
Original language | English (US)
---|---
Pages (from-to) | 319-331
Number of pages | 13
Journal | Technometrics
Volume | 60
Issue number | 3
DOIs |
State | Published - Jul 3 2018
Bibliographical note
Publisher Copyright: © 2018 American Statistical Association and the American Society for Quality. © 2018 Yuwen Gu and Jun Fan.
Keywords
- Alternating direction method of multipliers
- Lasso
- Nonconvex penalty
- Quantile regression
- Variable selection