In this article, we propose a data-driven method, the generalized adaptive ridge (gar), for automatic yet adaptive regression shrinkage and selection. We show that, in theory, gar is equivalent to the adaptive lasso, adaptive ridge regression, or adaptive elastic net under appropriate conditions. Specifically, if the regression parameters truly admit a sparse representation, gar behaves like the recently proposed adaptive lasso (Zou, 2006) and is therefore able to identify relevant predictors consistently. If the regression parameters are not sparse, gar behaves like adaptive ridge regression, which is well known for its high prediction accuracy and its robustness to multicollinearity. If the predictor dimension is much larger than the sample size and the parameters are sparse, gar behaves like the adaptive elastic net, an extension of the elastic net newly suggested in this paper. Owing to this flexibility, gar's prediction accuracy is at least as good as that of each of these methods. Simulation results confirm its competitive performance.

TR Number: 
Junshan Qiu, Xiangrong Yin, & Hansheng Wang

To request a copy of this report, please email us. We will send you a PDF copy if one is available.