We consider a multivariate finite mixture of Gaussian regression models for high-dimensional data, where the number of covariates and the size of the response may be much larger than the sample size. We provide an ℓ1-oracle inequality satisfied by the Lasso estimator with respect to the Kullback–Leibler loss. This result extends the ℓ1-oracle inequality established by Meynet in [ESAIM: PS 17 (2013) 650–671] to the multivariate case. We focus on the Lasso for its ℓ1-regularization properties rather than as a variable selection procedure.
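To make the setting concrete, here is a minimal sketch of the model and estimator studied, written in our own notation ($K$, $\pi_k$, $\beta_k$, $\Sigma_k$, $\lambda$ are not fixed by the abstract). In a finite mixture of $K$ multivariate Gaussian regressions, the conditional density of the response $y \in \mathbb{R}^q$ given the covariates $x \in \mathbb{R}^p$ is
$$ s_\xi(y \mid x) \;=\; \sum_{k=1}^{K} \pi_k \,\frac{1}{\sqrt{\det(2\pi\Sigma_k)}}\, \exp\!\Big(-\tfrac{1}{2}\,(y-\beta_k x)^{\top}\Sigma_k^{-1}(y-\beta_k x)\Big), $$
with mixture weights $\pi_k$, regression matrices $\beta_k \in \mathbb{R}^{q\times p}$ and covariance matrices $\Sigma_k$. The Lasso estimator penalizes the empirical negative log-likelihood by the ℓ1-norm of the regression coefficients,
$$ \hat{s}^{\mathrm{Lasso}}(\lambda) \;\in\; \operatorname*{arg\,min}_{s_\xi}\ \Big\{ -\frac{1}{n}\sum_{i=1}^{n}\log s_\xi(y_i \mid x_i) \;+\; \lambda \sum_{k=1}^{K}\sum_{j=1}^{q}\sum_{l=1}^{p}\big|[\beta_k]_{j,l}\big| \Big\}, $$
and the ℓ1-oracle inequality bounds the expected Kullback–Leibler risk of $\hat{s}^{\mathrm{Lasso}}(\lambda)$ by the best penalized approximation achievable over the model class, up to an additive remainder term.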
DOI: 10.1051/ps/2015011
Keywords: Finite mixture of multivariate regression model, Lasso, ℓ1-oracle inequality
@article{PS_2015__19__649_0, author = {Devijver, Emilie}, title = {An $\ell{}_{1}$-oracle inequality for the {Lasso} in multivariate finite mixture of multivariate {Gaussian} regression models}, journal = {ESAIM: Probability and Statistics}, pages = {649--670}, publisher = {EDP-Sciences}, volume = {19}, year = {2015}, doi = {10.1051/ps/2015011}, mrnumber = {3433431}, zbl = {1392.62179}, language = {en}, url = {http://www.numdam.org/articles/10.1051/ps/2015011/} }
Devijver, Emilie. An $\ell{}_{1}$-oracle inequality for the Lasso in multivariate finite mixture of multivariate Gaussian regression models. ESAIM: Probability and Statistics, Tome 19 (2015), pp. 649-670. doi : 10.1051/ps/2015011. http://www.numdam.org/articles/10.1051/ps/2015011/
P. Bickel, Y. Ritov and A. Tsybakov, Simultaneous analysis of Lasso and Dantzig selector. Ann. Stat. 37 (2009) 1705–1732. | MR | Zbl
S. Boucheron, G. Lugosi and P. Massart, Concentration Inequalities: A Nonasymptotic Theory of Independence. OUP, Oxford (2013). | MR | Zbl
S. Cohen and E. Le Pennec, Conditional density estimation by penalized likelihood model selection and applications. Research Report RR-7596 (2011).
B. Efron, T. Hastie, I. Johnstone and R. Tibshirani, Least angle regression. Ann. Stat. 32 (2004) 407–499. | MR | Zbl
P. Massart, Concentration inequalities and model selection. Vol. 33 of Lect. Notes Math. Springer, Saint-Flour, Cantal (2007). | MR | Zbl
P. Massart and C. Meynet, The Lasso as an ℓ1-ball model selection procedure. Electron. J. Stat. 5 (2011) 669–687. | MR | Zbl
G. McLachlan and D. Peel, Finite Mixture Models. Wiley Series in Probability and Statistics: Applied Probability and Statistics. Wiley (2004). | MR | Zbl
C. Meynet, An ℓ1-oracle inequality for the Lasso in finite mixture Gaussian regression models. ESAIM: PS 17 (2013) 650–671. | MR | Zbl
P. Rigollet and A. Tsybakov, Exponential screening and optimal rates of sparse estimation. Ann. Stat. 39 (2011) 731–771. | MR | Zbl
N. Städler, P. Bühlmann and S. van de Geer, ℓ1-penalization for mixture regression models. Test 19 (2010) 209–256. | MR | Zbl
R. Tibshirani, Regression shrinkage and selection via the lasso. J. R. Stat. Soc. Ser. B 58 (1996) 267–288. | MR | Zbl
S. van de Geer and P. Bühlmann, On the conditions used to prove oracle results for the Lasso. Electron. J. Stat. 3 (2009) 1360–1392. | MR | Zbl
S. van de Geer, P. Bühlmann and S. Zhou, The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso). Electron. J. Stat. 5 (2011) 688–749. | MR | Zbl
A.W. van der Vaart and J. Wellner, Weak Convergence and Empirical Processes: With Applications to Statistics. Springer Ser. Stat. Springer (1996). | MR | Zbl
V. Vapnik, Estimation of Dependences Based on Empirical Data. Springer Ser. Stat. Springer-Verlag, New York (1982). | MR | Zbl