Efficient Machine Learning
Topics: Efficient Machine Learning, Pruning, Scalability
Modern machine learning models are massive and demand significant compute, memory, and energy. Our research aims to make these models faster, smaller, and greener using techniques such as pruning, quantization, distillation, approximate algorithms, and efficient architectures. Beyond compression methods, we also study how to evaluate model efficiency across deployment settings (e.g., cloud or edge). Together, these methods make ML models more sustainable and accessible.
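To give a flavor of one of these techniques, the sketch below shows generic unstructured magnitude pruning with PyTorch's built-in `torch.nn.utils.prune` API. It is a minimal, hypothetical illustration of pruning in general, not the method of any publication listed below; the toy model and the 30% sparsity level are arbitrary choices.

```python
# Minimal sketch of unstructured magnitude pruning (illustrative only,
# not the method from the publications below).
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical toy model, used purely for illustration.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Zero out the 30% of weights with the smallest L1 magnitude in each
# Linear layer (the 30% level is an arbitrary choice here).
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)

# Make the pruning permanent by removing the reparameterization masks.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.remove(module, "weight")

# Report the resulting sparsity over all parameters (incl. biases).
total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"Global sparsity: {zeros / total:.1%}")
```

After `prune.remove`, the zeros are baked into the weight tensors, so the model can be saved or deployed without any pruning-specific machinery.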
Selected Publications
- Xun Wang, John Rachwan, Stephan Günnemann, Bertrand Charpentier
Structurally Prune Anything: Any Architecture, Any Framework, Any Time
Preprint, 2024
- Johannes Getzner, Bertrand Charpentier, Stephan Günnemann
Accuracy is not the only Metric that matters: Estimating the Energy Consumption of Deep Learning Models
Tackling Climate Change with Machine Learning: Global Perspectives and Local Challenges Workshop (TCCML - ICLR), 2023
- Morgane Ayle, Bertrand Charpentier, John Rachwan, Daniel Zügner, Simon Geisler, Stephan Günnemann
On the Robustness and Anomaly Detection of Sparse Neural Networks
Sparsity in Neural Networks Workshop (SNN), 2022
- John Rachwan, Daniel Zügner, Bertrand Charpentier, Simon Geisler, Morgane Ayle, Stephan Günnemann
Winning the Lottery Ahead of Time: Efficient Early Network Pruning
International Conference on Machine Learning (ICML), 2022
- Marin Biloš, Johanna Sommer, Syama Sundar Rangapuram, Tim Januschowski, Stephan Günnemann
Neural Flows: Efficient Alternative to Neural ODEs
Neural Information Processing Systems (NeurIPS), 2021
- Johannes Gasteiger, Marten Lienen, Stephan Günnemann
Scalable Optimal Transport in High Dimensions for Graph Distances, Embedding Alignment, and More
International Conference on Machine Learning (ICML), 2021
- Marin Biloš, Stephan Günnemann
Scalable Normalizing Flows for Permutation Invariant Densities
International Conference on Machine Learning (ICML), 2021
- Oleksandr Shchur, Nicholas Gao, Marin Biloš, Stephan Günnemann
Fast and Flexible Temporal Point Processes with Triangular Maps
Neural Information Processing Systems (NeurIPS), 2020
- Aleksandar Bojchevski, Johannes Gasteiger, Stephan Günnemann
Efficient Robustness Certificates for Discrete Data: Sparsity-Aware Randomized Smoothing for Graphs, Images and More
International Conference on Machine Learning (ICML), 2020
- Aleksandar Bojchevski*, Johannes Gasteiger*, Bryan Perozzi, Amol Kapoor, Martin Blais, Benedek Rozemberczki, Michal Lukasik, Stephan Günnemann
Scaling Graph Neural Networks with Approximate PageRank
ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD), 2020