Abstract
This paper deals with the grouped variable selection problem. A widely used strategy is to augment the negative log-likelihood function with a sparsity-promoting penalty. Existing methods include the group Lasso, group SCAD, and group MCP. The group Lasso solves a convex optimization problem but suffers from underestimation bias. The group SCAD and group MCP avoid this estimation bias but require solving a nonconvex optimization problem that may be plagued by suboptimal local optima. In this work, we propose an alternative method based on the generalized minimax concave (GMC) penalty, which is a folded concave penalty that maintains the convexity of the objective function. We develop a new method for grouped variable selection in linear regression, the group GMC, that generalizes the strategy of the original GMC estimator. We present a primal-dual algorithm for computing the group GMC estimator and also prove properties of the solution path to guide its numerical computation and tuning parameter selection in practice. We establish error bounds for both the group GMC and original GMC estimators. A rich set of simulation studies and a real data application indicate that the proposed group GMC approach outperforms existing methods in several respects across a wide range of scenarios.
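For context (this sketch is not part of the published abstract): the grouped penalization framework the abstract refers to can be written, in standard notation assumed here rather than taken from the paper, as a penalized least-squares problem over prespecified coefficient groups.

```latex
% Group-penalized least squares. The coefficient vector \beta \in \mathbb{R}^p
% is partitioned into groups \beta_1, \dots, \beta_G; \lambda > 0 is the
% tuning parameter and p(\cdot) is a scalar penalty applied to group norms.
\min_{\beta \in \mathbb{R}^p} \; \frac{1}{2} \left\| y - X\beta \right\|_2^2
  \;+\; \lambda \sum_{g=1}^{G} p\!\left( \left\| \beta_g \right\|_2 \right)

% Group Lasso:        p(t) = t (convex, but shrinks large coefficients,
%                     causing the underestimation bias noted above).
% Group SCAD / MCP:   p is folded concave (reduced bias, but the overall
%                     objective is nonconvex).
% GMC-type penalties: the penalty term is built as a nonseparable, nonconvex
%                     construction designed so that the overall objective
%                     remains convex (convex-nonconvex penalization).
```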
| Original language | English (US) |
|---|---|
| Pages (from-to) | 2912-2961 |
| Number of pages | 50 |
| Journal | Electronic Journal of Statistics |
| Volume | 17 |
| Issue number | 2 |
| DOIs | |
| State | Published - 2023 |
Bibliographical note
Publisher Copyright: © 2023, Institute of Mathematical Statistics. All rights reserved.
Keywords
- Sparse linear regression
- convex optimization
- convex-nonconvex penalization
- high-dimensional data analysis