An Iterative Coordinate Descent Algorithm for High-Dimensional Nonconvex Penalized Quantile Regression

Bo Peng, Lan Wang

Research output: Contribution to journal › Article › peer-review

59 Scopus citations


Abstract

We propose and study a new iterative coordinate descent algorithm (QICD) for solving nonconvex penalized quantile regression in high dimension. By permitting different subsets of covariates to be relevant for modeling the response variable at different quantiles, nonconvex penalized quantile regression provides a flexible approach for modeling high-dimensional data with heterogeneity. Although its theory has been investigated recently, its computation remains highly challenging when p is large due to the nonsmoothness of the quantile loss function and the nonconvexity of the penalty function. Existing coordinate descent algorithms for penalized least-squares regression cannot be directly applied. We establish the convergence property of the proposed algorithm under some regularity conditions for a general class of nonconvex penalty functions including popular choices such as SCAD (smoothly clipped absolute deviation) and MCP (minimax concave penalty). Our Monte Carlo study confirms that QICD substantially improves the computational speed in the p ≫ n setting. We illustrate the application by analyzing a microarray dataset.
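As a rough illustration of the scheme the abstract describes (a majorization step that linearizes the nonconvex penalty into a weighted lasso, followed by coordinate descent in which each one-dimensional quantile-loss subproblem is solved exactly), here is a minimal Python sketch. It is not the authors' implementation: the exhaustive breakpoint scan, the SCAD constant a = 3.7, and all function and parameter names are our own assumptions for illustration.

```python
import numpy as np

def scad_derivative(t, lam, a=3.7):
    """Derivative p'_lambda(|t|) of the SCAD penalty (a = 3.7 is the
    conventional choice). Used as the lasso weight in the MM step."""
    t = np.abs(t)
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1.0))

def check_loss(u, tau):
    """Quantile (check) loss: sum of u * (tau - I(u < 0))."""
    return np.sum(u * (tau - (u < 0)))

def qicd(X, y, tau=0.5, lam=0.1, n_outer=20, n_inner=5):
    """Hypothetical MM + coordinate-descent sketch for SCAD-penalized
    quantile regression (no intercept; X assumed standardized)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_outer):
        # MM step: linearize the concave SCAD penalty around the current
        # beta, yielding a weighted-L1 surrogate that majorizes it.
        w = scad_derivative(beta, lam)
        for _ in range(n_inner):
            for j in range(p):
                xj = X[:, j]
                r = y - X @ beta + xj * beta[j]  # partial residual
                # The 1-D objective is piecewise linear and convex, so an
                # exact minimizer lies at a breakpoint r_i / x_ij or 0.
                nz = xj != 0
                cands = np.append(r[nz] / xj[nz], 0.0)
                obj = [check_loss(r - xj * b, tau) + n * w[j] * abs(b)
                       for b in cands]
                beta[j] = cands[int(np.argmin(obj))]
    return beta
```

Because the surrogate touches the true penalized objective at the current iterate, each outer iteration cannot increase the SCAD-penalized check loss. The breakpoint scan above costs O(n²) per coordinate; a serious solver would instead use a weighted-quantile-type update, which is part of what makes the actual QICD algorithm fast.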

Original language: English (US)
Pages (from-to): 676-694
Number of pages: 19
Journal: Journal of Computational and Graphical Statistics
Issue number: 3
State: Published - Jul 3, 2015

Bibliographical note

Publisher Copyright:
© 2015 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.


Keywords

  • Coordinate descent
  • High-dimensional data
  • MCP
  • Nonconvex penalty
  • Quantile regression
  • SCAD
  • Variable selection

