High-dimensional generalizations of asymmetric least squares regression and their applications

Yuwen Gu, Hui Zou

Research output: Contribution to journal › Article › peer-review



Asymmetric least squares regression is an important method with wide applications in statistics, econometrics and finance. Existing work on asymmetric least squares considers only the traditional low-dimension, large-sample setting. In this paper, we systematically study Sparse Asymmetric LEast Squares (SALES) regression under high dimensions, where the penalty functions include the Lasso and nonconvex penalties. We develop a unified efficient algorithm for fitting SALES and establish its theoretical properties. As an important application, SALES is used to detect heteroscedasticity in high-dimensional data. Another method for detecting heteroscedasticity is sparse quantile regression. However, both SALES and sparse quantile regression may fail to tell which variables are important for the conditional mean and which are important for the conditional scale/variance, especially when some variables are important for both the mean and the scale. To that end, we further propose COupled Sparse Asymmetric LEast Squares (COSALES) regression, which can be efficiently solved by an algorithm similar to that for SALES. We establish theoretical properties of COSALES. In particular, COSALES with the SCAD penalty or MCP is shown to consistently identify the two important subsets for the mean and scale simultaneously, even when the two subsets overlap. We demonstrate the empirical performance of SALES and COSALES on simulated and real data.
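The SALES objective combines the asymmetric squared (expectile) loss with a sparsity-inducing penalty. A minimal sketch of the Lasso-penalized case, solved here by plain proximal gradient descent with soft-thresholding, may help fix ideas; this is an illustrative solver with made-up function names and tuning values, not the unified algorithm developed in the paper:

```python
import numpy as np

def asym_sq_loss(r, tau):
    # Asymmetric squared (expectile) loss: residuals r get weight tau
    # when nonnegative and 1 - tau when negative; tau = 0.5 recovers
    # ordinary least squares.
    w = np.where(r >= 0, tau, 1 - tau)
    return np.mean(w * r ** 2)

def soft_threshold(z, lam):
    # Proximal operator of the L1 (Lasso) penalty.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def sales_lasso(X, y, tau=0.7, lam=0.1, lr=0.01, n_iter=2000):
    # Illustrative proximal gradient descent on the Lasso-penalized
    # asymmetric least squares objective
    #   (1/n) * sum_i |tau - 1{y_i - x_i'beta < 0}| (y_i - x_i'beta)^2
    #   + lam * ||beta||_1.
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        r = y - X @ beta
        w = np.where(r >= 0, tau, 1 - tau)
        grad = -2.0 / n * X.T @ (w * r)       # gradient of the smooth part
        beta = soft_threshold(beta - lr * grad, lr * lam)
    return beta
```

With `tau = 0.5` and `lam = 0` this reduces to gradient descent for ordinary least squares; varying `tau` targets different expectiles of the conditional distribution, which is what makes the method useful for detecting heteroscedasticity.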

Original language: English (US)
Pages (from-to): 2661-2694
Number of pages: 34
Journal: Annals of Statistics
Issue number: 6
State: Published - Dec 2016

Bibliographical note

Publisher Copyright:
© Institute of Mathematical Statistics, 2016.


  • Asymmetric least squares
  • High dimensions


