
Description

In this paper, we study the model selection property of the Elastic net. In the classical setting where $p$ (the number of predictors) and $q$ (the number of predictors with non-zero coefficients in the true linear model) are fixed, Yuan and Lin (2007) give a necessary and sufficient condition for the Elastic net to consistently select the true model, which we call the Elastic Irrepresentable Condition (EIC) in this paper. Here we study the general case where $p$, $q$, and $n$ all go to infinity. Assuming Gaussian noise, we give sufficient conditions on the scaling of $p$, $q$, and $n$ under which the EIC guarantees the Elastic net's model selection consistency. We show that for these conditions to hold, $n$ must grow at a rate faster than $q \log(p-q)$. For the classical case, when $p$ and $q$ are fixed, we also study the relationship between the EIC and the Irrepresentable Condition (IC), which is necessary and sufficient for the Lasso to select the true model. Through theoretical results and simulation studies, we provide insights into when and why the EIC is weaker than the IC and when the Elastic net can consistently select the true model even when the Lasso cannot.
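
For reference, a minimal sketch of the objects involved, using standard forms from the literature rather than the paper's exact notation: the Elastic net estimate solves

$$\hat{\beta} = \arg\min_{\beta} \; \|y - X\beta\|_2^2 + \lambda_1 \|\beta\|_1 + \lambda_2 \|\beta\|_2^2,$$

and, writing $C = \frac{1}{n} X^{\top} X$ in block form with $C_{11}$ the $q \times q$ block for the relevant predictors and $C_{21}$ the $(p-q) \times q$ cross block, the Lasso's Irrepresentable Condition requires, for some $\eta > 0$,

$$\left\| C_{21} C_{11}^{-1} \, \mathrm{sign}\big(\beta_{(1)}\big) \right\|_{\infty} \le 1 - \eta,$$

where $\beta_{(1)}$ collects the $q$ non-zero true coefficients. The EIC is the analogous condition with $C_{11}$ replaced by its ridge-regularized version $C_{11} + \frac{\lambda_2}{n} I$, together with a correspondingly adjusted sign term; since adding $\frac{\lambda_2}{n} I$ can improve the conditioning of $C_{11}$, this suggests why the EIC can hold where the IC fails. The precise statement of the EIC is the one given in the paper.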
