Adaptive Sparse PLS for Logistic Regression

Tags

PhD, ABS4NGS, statistics, seminar

Venue

Statistics seminar, IMAG/SupAgro, Montpellier (France)

Authors

Ghislain Durif

Franck Picard

Sophie Lambert-Lacroix

Published

June 8, 2015

Summary

Over the past few years, data analysis has struggled with statistical issues related to the “curse of dimensionality”. In this context, i.e. when the number of variables is far larger than the number of observations in the sample, standard classification methods are inappropriate, calling for the development of specific methodologies. I will present a new approach suited to classification in the high-dimensional setting. It combines sparse Partial Least Squares (sparse PLS), which performs compression and variable selection, with Ridge-penalized logistic regression. In particular, we have developed an adaptive version of sparse PLS to improve the dimension-reduction process. I will illustrate the benefits of our approach with simulation results, in particular showing the accuracy of our method compared with other state-of-the-art approaches. The combination of the iterative optimization of logistic regression with sparse PLS in our procedure appears to ensure convergence and stability with respect to hyper-parameter tuning, unlike other classification methods based on sparse PLS. These results are confirmed on a real data set, using the expression levels of thousands of genes from fewer than three hundred patients to predict relapse in breast cancer.
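To give a flavour of the general idea, here is a minimal sketch (not the authors' implementation): a single sparse PLS component is obtained by soft-thresholding the usual PLS weight vector, and a Ridge-penalized logistic regression is then fitted on the resulting score. The simulated data, the thresholding rule, and the function name `sparse_pls_component` are illustrative assumptions only, kept deliberately simple compared with the adaptive, multi-component procedure discussed in the talk.

```python
# Conceptual sketch: sparse PLS compression + Ridge logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 100, 1000                        # n << p: high-dimensional setting
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:10] = 2.0                         # only 10 truly relevant variables
y = (X @ beta + rng.normal(size=n) > 0).astype(int)

def sparse_pls_component(X, y, sparsity=0.9):
    """One sparse PLS weight vector: soft-threshold the covariance X'(y - ybar)."""
    w = X.T @ (y - y.mean())
    lam = np.quantile(np.abs(w), sparsity)           # threshold level (assumed rule)
    w = np.sign(w) * np.maximum(np.abs(w) - lam, 0)  # soft-thresholding -> sparsity
    norm = np.linalg.norm(w)
    return w / norm if norm > 0 else w

w = sparse_pls_component(X, y)
scores = (X @ w).reshape(-1, 1)          # compressed predictor (one component)
selected = np.flatnonzero(w)             # variables kept by the sparse weights

# Ridge-penalized (L2) logistic regression on the sparse PLS score
clf = LogisticRegression(penalty="l2", C=1.0).fit(scores, y)
print(f"{selected.size} variables selected, "
      f"training accuracy = {clf.score(scores, y):.2f}")
```

In the talk, this two-step view is refined: the adaptive sparse PLS weights and the iteratively optimized logistic regression are tied together in a single procedure, which is what appears to give the method its stability with respect to hyper-parameter tuning.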