Estimation Consistency of the Group Lasso and its Applications

Han Liu
Machine Learning Department
Carnegie Mellon University
Pittsburgh, PA 15213

Jian Zhang
Department of Statistics
Purdue University
West Lafayette, IN 47907-2066

Abstract

We extend the ℓ2-consistency result of (Meinshausen and Yu, 2008) from the Lasso to the group Lasso. Our main theorem shows that the group Lasso achieves estimation consistency under a mild condition, and an asymptotic upper bound on the number of selected variables can be obtained. As a result, we can apply the nonnegative garrote procedure to the group Lasso result to obtain an estimator which is simultaneously estimation and variable selection consistent. In particular, our setting allows both the number of groups and the number of variables per group to increase, and is thus applicable to high-dimensional problems. We also provide an estimation consistency analysis for a version of the sparse additive models with increasing dimensions. Some finite-sample results are also reported.

1 Introduction

Recently, many regularization-based methods have been proposed for the purpose of variable selection in high-dimensional regression. The Lasso (Tibshirani, 1996; Chen et al., 1998) is the most popular one due to its computational feasibility and amenability to theoretical analysis. One well-known result is that the Lasso estimator is not variable selection consistent if the irrepresentable
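As a concrete illustration of the estimator being analyzed, the group Lasso solves a least-squares problem with a sum of groupwise ℓ2 norms as the penalty, which zeroes out entire groups of coefficients at once. The sketch below is a minimal proximal-gradient (ISTA) solver for this objective; it is illustrative only and not the authors' implementation, and the `group_lasso` function name, the `groups` index-list argument, and the fixed iteration count are assumptions made for this example.

```python
import numpy as np

def group_lasso(X, y, groups, lam, n_iter=500):
    """Proximal gradient descent for the group Lasso objective
        (1/2n) * ||y - X b||_2^2 + lam * sum_g sqrt(p_g) * ||b_g||_2,
    where p_g is the size of group g.  `groups` is a list of index
    arrays, one per group (an assumed interface, not from the paper)."""
    n, p = X.shape
    # step size 1/L, with L the Lipschitz constant of the smooth part
    lr = n / np.linalg.norm(X, 2) ** 2
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n        # gradient of the squared loss
        z = b - lr * grad                   # gradient step
        # proximal step: block soft-thresholding of each group's l2 norm
        for g in groups:
            t = lam * lr * np.sqrt(len(g))  # per-group threshold
            nz = np.linalg.norm(z[g])
            z[g] = 0.0 if nz <= t else (1 - t / nz) * z[g]
        b = z
    return b
```

On data where only one group of coefficients is truly nonzero, the block soft-thresholding step sets the coefficients of inactive groups exactly to zero, which is the groupwise selection behavior the paper's consistency results concern.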