Strong screening rules for group-based SLOPE models
Author(s)
Feser, Fabio
Evangelou, Marina
Type
Conference Paper
Abstract
Tuning the regularization parameter in penalized regression models is an expensive task, requiring multiple models to be fit along a path of parameters. Strong screening rules
drastically reduce computational costs by lowering the dimensionality of the input prior to fitting. We develop strong screening rules for group-based Sorted L-One Penalized Estimation (SLOPE) models: Group SLOPE and Sparse-group SLOPE. The developed rules are applicable to the wider family of group-based OWL models, including OSCAR. Our
experiments on both synthetic and real data show that the screening rules significantly accelerate the fitting process. The screening rules make group SLOPE and
sparse-group SLOPE practical for high-dimensional datasets, particularly those encountered in genetics.
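The screening idea summarized in the abstract can be illustrated with the classical sequential strong rule for the lasso (Tibshirani et al., 2012), a simpler analogue of the group-based SLOPE rules developed in the paper. This is a hedged sketch, not the paper's method: the rule discards feature j at the new penalty level lam_new whenever the absolute correlation of its column with the residual at the previous solution falls below 2*lam_new - lam_prev.

```python
import numpy as np

def strong_rule_lasso(X, y, beta_prev, lam_prev, lam_new):
    """Sequential strong screening rule for the lasso.

    Keeps feature j only if |x_j^T r| >= 2*lam_new - lam_prev,
    where r is the residual at the previous solution. Discarded
    features are very likely inactive at lam_new, so the model at
    lam_new can be fit on a much smaller design matrix.
    """
    r = y - X @ beta_prev
    c = np.abs(X.T @ r)
    keep = c >= 2 * lam_new - lam_prev
    return np.flatnonzero(keep)

# Illustrative usage on random data: start from the null model at
# lambda_max (where all coefficients are zero) and screen for a
# slightly smaller penalty.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 200))
y = rng.standard_normal(50)
beta_prev = np.zeros(200)
lam_prev = np.max(np.abs(X.T @ y))  # lambda_max: null model is optimal
active = strong_rule_lasso(X, y, beta_prev, lam_prev, 0.9 * lam_prev)
print(f"{len(active)} of 200 features survive screening")
```

The group SLOPE and sparse-group SLOPE rules in the paper generalize this idea to sorted, group-wise penalty sequences, but the computational benefit is the same: the solver only ever sees the surviving columns, with a final check that no discarded feature violates the optimality conditions.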
Date Acceptance
2025-01-22
Citation
Proceedings of Machine Learning Research
ISSN
2640-3498
Publisher
ML Research Press
Journal / Book Title
Proceedings of Machine Learning Research
Copyright Statement
Subject to copyright. This paper is embargoed until publication.
Source
28th International Conference on Artificial Intelligence and Statistics (AISTATS)
Publication Status
Accepted
Start Date
2025-05-03
Finish Date
2025-05-05
Coverage Spatial
Mai Khao, Thailand