Paper accepted at ESANN 2026!

Published: 04/07/2026

Unlike classical pruning methods that rely on predefined sparsity scores, hand-crafted heuristics, or two-stage segmentation-classification pipelines, our end-to-end approach learns an optimal trade-off between token information density and model performance. It produces highly interpretable, learned pruning masks while preserving high classification accuracy.
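To make the idea concrete, here is a minimal, hypothetical sketch of the trade-off: a soft per-patch mask is trained by gradient descent on a loss that rewards keeping informative tokens while a patch-level sparsity regularizer penalizes mask density. The `importance` scores, the loss shape, and all variable names are illustrative assumptions, not the method from the paper.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy setup: 4 patch tokens. The hypothetical "importance" scores stand in
# for the token information density the real model would learn end to end.
importance = [0.9, 0.8, 0.05, 0.1]
logits = [0.0] * 4   # per-patch mask logits (the learned quantity here)
lam = 1.0            # weight of the patch-level sparsity regularizer
lr = 1.0

# Loss: -(sum_i m_i * importance_i) + lam * mean(m), with m_i = sigmoid(logit_i).
# We run plain gradient descent on the mask logits only (task model held fixed).
for _ in range(50):
    for i, s in enumerate(importance):
        m = sigmoid(logits[i])
        grad = (-s + lam / len(logits)) * m * (1.0 - m)
        logits[i] -= lr * grad

mask = [sigmoid(l) for l in logits]
keep = [m > 0.5 for m in mask]   # informative patches kept, the rest pruned
```

The two loss terms pull in opposite directions: the task term pushes mask values toward 1 for informative patches, while the sparsity term pushes every mask value toward 0, so only patches whose importance outweighs the regularizer survive.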

Tom Devynck, Bilal Faye, Djamel Bouchaffra, Nadjib Lazaar, Hanane Azzag, and Mustapha Lebbah. Energy-based dropout with patch-level regularization. In European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), Bruges, Belgium, April 22–24, 2026.