High-Dimensional Generalizations of Asymmetric Least Squares Regression and Their Applications
Also known as penalized expectile regression
By Yuwen Gu and Hui Zou
December 1, 2016
Gu, Y., & Zou, H. (2016). High-dimensional generalizations of asymmetric least squares regression and their applications. The Annals of Statistics, 44(6), 2661–2694.
Abstract
Asymmetric least squares regression is an important method with wide applications in statistics, econometrics, and finance. Existing work on asymmetric least squares considers only the traditional low-dimensional, large-sample setting. In this paper, we systematically study Sparse Asymmetric LEast Squares (SALES) regression in high dimensions, where the penalty functions include the Lasso and nonconvex penalties. We develop a unified efficient algorithm for fitting SALES and establish its theoretical properties. As an important application, SALES is used to detect heteroscedasticity in high-dimensional data. Another method for detecting heteroscedasticity is sparse quantile regression. However, both SALES and sparse quantile regression may fail to tell which variables are important for the conditional mean and which are important for the conditional scale/variance, especially when some variables are important for both. To address this, we further propose COupled Sparse Asymmetric LEast Squares (COSALES) regression, which can be efficiently solved by an algorithm similar to that for SALES. We establish theoretical properties of COSALES. In particular, COSALES with the SCAD penalty or MCP is shown to consistently identify the two important subsets for the mean and scale simultaneously, even when the two subsets overlap. We demonstrate the empirical performance of SALES and COSALES on simulated and real data.
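To make the abstract concrete, here is a minimal Python sketch of the asymmetric squared (expectile) loss and a Lasso-penalized fit via plain proximal gradient descent. The function names (`expectile_loss`, `sales_lasso`) and the optimizer are illustrative assumptions for exposition, not the unified algorithm developed in the paper.

```python
import numpy as np

def expectile_loss(r, tau):
    """Asymmetric squared (expectile) loss: residuals above zero are
    weighted by tau, residuals below zero by 1 - tau."""
    w = np.where(r >= 0, tau, 1 - tau)
    return w * r**2

def soft_threshold(z, lam):
    """Proximal operator of the Lasso (L1) penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def sales_lasso(X, y, tau=0.5, lam=0.1, lr=None, n_iter=500):
    """Lasso-penalized expectile regression by proximal gradient descent
    (a simple stand-in for the paper's fitting algorithm)."""
    n, p = X.shape
    if lr is None:
        # Step size from a curvature bound on the expectile loss.
        lr = 1.0 / (2 * max(tau, 1 - tau)
                    * np.linalg.eigvalsh(X.T @ X / n)[-1])
    beta = np.zeros(p)
    for _ in range(n_iter):
        r = y - X @ beta
        w = np.where(r >= 0, tau, 1 - tau)
        grad = -2 * X.T @ (w * r) / n   # gradient of the expectile loss
        beta = soft_threshold(beta - lr * grad, lr * lam)
    return beta
```

With `tau = 0.5` this reduces to Lasso-penalized least squares; varying `tau` across levels and comparing the fitted coefficient sets is the basic idea behind using SALES to detect heteroscedasticity.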