Jonas Wallin
Senior lecturer, Director of third cycle studies, Department of Statistics
Coordinate Descent for SLOPE
Author
- Jonas Wallin
Editor
- Francisco Ruiz
- Jennifer Dy
- Jan-Willem van de Meent
Summary, in English
The lasso is the most famous sparse regression and feature selection method. One reason for its popularity is the speed at which the underlying optimization problem can be solved. Sorted L-One Penalized Estimation (SLOPE) is a generalization of the lasso with appealing statistical properties. In spite of this, the method has not yet attracted widespread interest. A major reason is that current software packages for fitting SLOPE rely on algorithms that perform poorly in high dimensions. To tackle this issue, we propose a new fast algorithm for solving the SLOPE optimization problem, which combines proximal gradient descent and proximal coordinate descent steps. We provide new results on the directional derivative of the SLOPE penalty and its related SLOPE thresholding operator, and give convergence guarantees for the proposed solver. In extensive benchmarks on simulated and real data, we demonstrate our method's performance against a long list of competing algorithms.
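Both solver components mentioned in the summary rely on the proximal operator of the sorted-L1 (SLOPE) penalty. The sketch below is not the paper's hybrid implementation: it shows only the standard stack-based pool-adjacent-violators evaluation of that proximal operator, known from the SLOPE literature, wrapped in a plain proximal-gradient loop. The function names `prox_sorted_l1` and `slope_pgd` are illustrative, and `lam` is assumed sorted in nonincreasing order.

```python
import numpy as np

def prox_sorted_l1(v, lam):
    """Prox of the sorted-L1 penalty: argmin_x 0.5||x - v||^2 + sum_i lam_i |x|_(i).

    lam is assumed nonnegative and sorted in nonincreasing order.
    Uses the stack-based pool-adjacent-violators scheme.
    """
    sign = np.sign(v)
    a = np.abs(v)
    order = np.argsort(a)[::-1]          # indices sorting |v| in decreasing order
    z = a[order] - lam                   # shifted, sorted magnitudes
    # Merge adjacent blocks until block averages are strictly decreasing.
    blocks = []                          # each block: [start, end, average]
    for i, val in enumerate(z):
        blocks.append([i, i, val])
        while len(blocks) > 1 and blocks[-2][2] <= blocks[-1][2]:
            s2, e2, v2 = blocks.pop()
            s1, e1, v1 = blocks.pop()
            n1, n2 = e1 - s1 + 1, e2 - s2 + 1
            blocks.append([s1, e2, (n1 * v1 + n2 * v2) / (n1 + n2)])
    x_sorted = np.empty_like(z)
    for s, e, val in blocks:
        x_sorted[s:e + 1] = max(val, 0.0)  # clip at zero
    x = np.empty_like(x_sorted)
    x[order] = x_sorted                  # undo the sort
    return sign * x

def slope_pgd(X, y, lam, n_iter=500):
    """Proximal gradient descent for 0.5||y - X beta||^2 + sorted-L1 penalty.

    Only the proximal-gradient component is sketched here; the hybrid method
    in the summary additionally interleaves proximal coordinate descent steps.
    """
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2        # Lipschitz constant of the gradient
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)
        beta = prox_sorted_l1(beta - grad / L, lam / L)
    return beta
```

With all penalty weights equal, the operator reduces to ordinary soft thresholding; with decreasing weights it can tie coefficients together, which is the clustering behavior characteristic of SLOPE.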
Department/s
- Department of Statistics
Publishing year
2023
Language
English
Pages
4802-4821
Publication/Series
Proceedings of Machine Learning Research
Volume
206
Document type
Conference paper
Topic
- Probability Theory and Statistics
Conference name
26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023
Conference date
2023-04-25 - 2023-04-27
Conference place
Valencia, Spain
Status
Published