
News

Six papers (incl. 3 spotlights) accepted at ICLR 2022

21.01.2022


Our group has six papers accepted at the 2022 International Conference on Learning Representations (ICLR), three of them as spotlight papers. The topics cover various areas of graph learning (e.g., GNNs for quantum-mechanical calculations and for spatio-temporal forecasting, DAG learning for applications such as causal discovery, and hierarchical graph clustering) as well as robustness (of neural combinatorial optimization) and uncertainty estimation. Congratulations to all co-authors!

  • Nicholas Gao, Stephan Günnemann
    Ab-Initio Potential Energy Surfaces by Pairing GNNs with Neural Wave Functions 
    (selected for spotlight presentation)
    International Conference on Learning Representations (ICLR), 2022
    Summary: Recently, neural networks have succeeded at modeling wave functions of many-electron systems from first principles. However, these methods come with significant computational demands. In this work, we propose the Potential Energy Surface Network (PESNet) to accelerate the process of finding many high-accuracy solutions. While traditionally one has to solve separately for each configuration of a molecule, PESNet solves for many configurations simultaneously. It accomplishes this by deploying a graph neural network (GNN) on the molecular graph to reparametrize the neural wave function. Training in this way yields a model for a continuous subset of the potential energy surface. Furthermore, we guarantee certain generalization properties by incorporating all physical symmetries without overconstraining the model. In our experiments, PESNet solves many Schrödinger equations up to 40 times faster without loss of accuracy. (A toy code sketch of this idea follows after the publication list.)
     
  • Marten Lienen, Stephan Günnemann
    Learning the Dynamics of Physical Systems from Sparse Observations with Finite Element Networks 
    (selected for spotlight presentation)
    International Conference on Learning Representations (ICLR), 2022
    Summary: Partial differential equations (PDEs) have long been established as the preferred language for describing physical systems, and the scientific community has developed many methods to solve them and predict the future behavior of such systems. We propose a new method for spatio-temporal forecasting, Finite Element Networks, that combines graph neural networks, neural ordinary differential equations, and finite element methods, a classical approach to solving PDEs. The resulting model estimates the instantaneous effect of the unknown dynamics governing an observed system on each cell of a mesh over the spatial domain. Our model can incorporate prior knowledge via assumptions on the form of the unknown PDE, which induces a structural bias towards learning specific processes. Through this mechanism, we derive a transport variant of our model from the convection equation and show that it improves predictive accuracy on sea-surface temperature and gas-flow forecasting over baseline models representing a selection of spatio-temporal forecasting methods. A qualitative analysis shows that our model disentangles the data dynamics into their constituent parts, which makes it uniquely interpretable. (See the mesh-ODE sketch after the list.)
     
  • Bertrand Charpentier, Simon Kibler, Stephan Günnemann
    Differentiable DAG Sampling 
    International Conference on Learning Representations (ICLR), 2022
    Summary: Directed acyclic graphs (DAGs) are important mathematical objects in many machine learning tasks, including causal discovery. In this work, we propose DP-DAG, a new probabilistic model over DAGs capable of fast and differentiable sampling. DP-DAG samples a DAG by successively (1) sampling a linear ordering of the nodes and (2) sampling edges consistent with that ordering. We further propose VI-DP-DAG, a new method for DAG learning from observational data that combines DP-DAG with variational inference. In contrast to existing differentiable DAG learning approaches, VI-DP-DAG is guaranteed to output a valid DAG at any time during training and does not require a complex augmented Lagrangian optimization scheme. In our extensive experiments, VI-DP-DAG significantly improves DAG structure and causal mechanism learning while training faster than its competitors. (See the sampling sketch after the list.)
     
  • Daniel Zügner, Bertrand Charpentier, Morgane Ayle, Sascha Geringer, Stephan Günnemann 
    End-to-End Learning of Probabilistic Hierarchies on Graphs 
    International Conference on Learning Representations (ICLR), 2022
    Summary: Real-world graphs such as Web graphs, citation networks, flight networks, or biological networks show rich hierarchical structure. We propose a hierarchical clustering method for graphs that can discover such hierarchies. At the core of our method is a novel probabilistic continuous relaxation of tree-based hierarchies, which lets us compute quantities like ancestor or lowest-common-ancestor probabilities efficiently, in closed form, and in a differentiable way. This in turn enables us to optimize established quality metrics of hierarchical clustering using gradient descent. Our scalable method consistently outperforms traditional as well as recent deep-learning-based baselines on 12 real-world datasets. (See the ancestor-probability sketch after the list.)
     
  • Simon Geisler, Johanna Sommer, Jan Schuchardt, Aleksandar Bojchevski, Stephan Günnemann
    Generalization of Neural Combinatorial Solvers Through the Lens of Adversarial Robustness
    International Conference on Learning Representations (ICLR), 2022
    Summary: Deep learning for combinatorial optimization comes with the promise of replacing handcrafted heuristics with principled and optimized decisions. We study the adversarial robustness of such neural combinatorial solvers for the first time, which yields a more realistic evaluation of their local generalization capabilities. Our results show that these solvers are indeed susceptible to small perturbations of the problem instances. This raises the question of whether neural combinatorial solvers can yet deliver on this promise. (See the perturbation sketch after the list.)
     
  • Bertrand Charpentier, Oliver Borchert, Daniel Zügner, Simon Geisler, Stephan Günnemann 
    Natural Posterior Network: Deep Bayesian Predictive Uncertainty for Exponential Family Distributions
    (selected for spotlight presentation)
    International Conference on Learning Representations (ICLR), 2022
    Summary: Uncertainty awareness is crucial for developing reliable machine learning models. In this work, we propose the Natural Posterior Network (NatPN) for fast and high-quality uncertainty estimation for any task where the target distribution belongs to the exponential family. NatPN thus applies to both classification and general regression settings. Unlike many previous approaches, NatPN does not require out-of-distribution (OOD) data at training time. Instead, it leverages normalizing flows to fit a single density on a learned latent space. For any input sample, NatPN uses the predicted likelihood to perform a Bayesian update over the target distribution. Theoretically, NatPN assigns high uncertainty far away from the training data. Empirically, NatPN delivers highly competitive performance in calibration and OOD detection on classification, regression, and count-prediction tasks. (See the Bayesian-update sketch after the list.)

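To make the ideas above more tangible, the sketches below illustrate one core mechanism of each paper in strongly simplified form. They are editorial toy examples under stated assumptions, not the authors' implementations. First, the PESNet idea: a message-passing network over the molecular graph emits the parameters of the wave-function ansatz, so a single set of network weights covers a continuous range of geometries. The one-round "GNN", the Gaussian ansatz, and all constants here are illustrative stand-ins, not the actual PESNet architecture.

# Toy sketch of the PESNet idea: a small message-passing network over the
# molecular graph emits the parameters of a (very simplified) wave-function
# ansatz, so one set of network weights covers many molecular geometries.
import numpy as np

rng = np.random.default_rng(0)
W_msg = rng.normal(size=(4, 8)) * 0.1   # message MLP weights (toy)
W_out = rng.normal(size=(8, 2)) * 0.1   # readout to per-atom ansatz params

def gnn_params(coords, charges):
    """One round of message passing over the fully connected molecular graph."""
    n = len(coords)
    h = np.zeros((n, 8))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = np.linalg.norm(coords[i] - coords[j])
            feat = np.array([charges[i], charges[j], d, 1.0 / (d + 1e-6)])
            h[i] += np.tanh(feat @ W_msg)
    return h @ W_out   # per-atom (amplitude, log-width) ansatz parameters

def psi(electron_pos, coords, params):
    """Toy ansatz: sum of atom-centred Gaussians with GNN-predicted parameters."""
    amp, log_width = params[:, 0], params[:, 1]
    dists = np.linalg.norm(electron_pos - coords, axis=1)
    return np.sum(amp * np.exp(-dists**2 * np.exp(log_width)))

# The same network weights serve two different H2 geometries:
for bond_length in (0.7, 1.4):
    coords = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, bond_length]])
    params = gnn_params(coords, charges=np.array([1.0, 1.0]))
    print(bond_length, psi(np.array([0.0, 0.0, 0.35]), coords, params))
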
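For Finite Element Networks, the essential loop is: a learned function estimates the instantaneous rate of change of the state on each cell of a spatial mesh, and an ODE solver integrates that estimate forward in time. In this minimal sketch the "learned" dynamics are a hand-set diffusion-like stencil on a 1D chain mesh and the solver is explicit Euler, both purely illustrative.

# Toy sketch of the Finite Element Networks idea: estimate du/dt per mesh
# node via message passing, then roll the state forward with an ODE solver.
import numpy as np

def du_dt(u, edges, weight=0.5):
    """Aggregate neighbour differences per node (stand-in for the learned model)."""
    out = np.zeros_like(u)
    for i, j in edges:
        out[i] += weight * (u[j] - u[i])
        out[j] += weight * (u[i] - u[j])
    return out

# 1D chain mesh with 5 nodes and a hot spot in the middle
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
u = np.array([0.0, 0.0, 1.0, 0.0, 0.0])
dt = 0.1
for _ in range(20):                # explicit Euler integration
    u = u + dt * du_dt(u, edges)
print(u)                           # the initial bump has diffused across the mesh
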
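The DP-DAG sampling scheme is easy to state concretely: sample a linear ordering of the nodes, then sample edges only from earlier to later nodes in that ordering, which guarantees acyclicity by construction. The sketch below is a plain, non-differentiable illustration with toy uniform edge probabilities; the actual model makes both steps differentiable (e.g., via continuous relaxations), which this sketch does not attempt.

# Minimal sketch of two-step DAG sampling: (1) sample a linear ordering,
# (2) sample edges consistent with it. Every sample is acyclic by construction.
import numpy as np

def sample_dag(n_nodes, edge_prob, rng):
    order = rng.permutation(n_nodes)            # step 1: linear ordering
    adj = np.zeros((n_nodes, n_nodes), dtype=int)
    for pos_u in range(n_nodes):
        for pos_v in range(pos_u + 1, n_nodes):
            u, v = order[pos_u], order[pos_v]
            if rng.random() < edge_prob[u, v]:  # step 2: ordering-consistent edges
                adj[u, v] = 1
    return adj, order

rng = np.random.default_rng(0)
edge_prob = np.full((4, 4), 0.5)                # toy edge probabilities
adj, order = sample_dag(4, edge_prob, rng)
print(order)
print(adj)

Because every sampled edge respects the ordering, any sample is a valid DAG, which is exactly the validity guarantee the summary highlights.
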
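For the probabilistic hierarchies, here is one way closed-form, differentiable ancestor probabilities can arise; whether this matches the paper's exact parametrization is an assumption. If every node independently picks its parent among nodes of strictly higher index, the matrix P of parent probabilities is strictly upper triangular, and the ancestor-probability matrix satisfies the recursion A = P + PA, giving the closed form A = (I - P)^{-1} P.

# Hedged sketch: closed-form ancestor probabilities from parent probabilities.
import numpy as np

# Row v gives the distribution over v's parent; node 3 is the root.
P = np.array([
    [0.0, 0.6, 0.4, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.0, 0.0, 0.0, 1.0],
    [0.0, 0.0, 0.0, 0.0],
])
I = np.eye(4)
A = np.linalg.solve(I - P, P)   # A[v, u] = Pr(u is an ancestor of v)
print(np.round(A, 3))
# e.g. Pr(node 3 is an ancestor of node 0) = 0.6*0.5 + 0.6*0.5 + 0.4 = 1.0,
# as expected for the root; the whole computation is differentiable in P.
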
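The robustness study's core experiment can be mimicked with a toy: perturb a problem instance slightly and measure how much a solver's solution quality degrades relative to the true optimum. Here a greedy nearest-neighbour TSP heuristic stands in for a learned solver, and random search over small coordinate noise stands in for the paper's adversarial attacks; none of this is the paper's actual setup.

# Illustrative robustness check: how much does a toy solver's optimality gap
# grow under small perturbations of the problem instance?
import itertools
import numpy as np

def tour_len(cities, tour):
    return sum(np.linalg.norm(cities[tour[i]] - cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def greedy_solver(cities):
    """Nearest-neighbour heuristic, our stand-in for a learned solver."""
    tour, left = [0], set(range(1, len(cities)))
    while left:
        nxt = min(left, key=lambda j: np.linalg.norm(cities[tour[-1]] - cities[j]))
        tour.append(nxt)
        left.remove(nxt)
    return tour

def optimal_len(cities):
    return min(tour_len(cities, (0,) + p)
               for p in itertools.permutations(range(1, len(cities))))

rng = np.random.default_rng(0)
cities = rng.random((6, 2))
gap = lambda c: tour_len(c, greedy_solver(c)) / optimal_len(c) - 1.0
worst = max(gap(cities + rng.normal(scale=0.02, size=cities.shape))
            for _ in range(200))
print(f"clean gap: {gap(cities):.3f}, worst gap under small noise: {worst:.3f}")
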
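Finally, NatPN's input-dependent Bayesian update, sketched for the classification case: a density model on the latent space yields an evidence value n(z) that scales how far an uninformative Dirichlet prior is pulled towards the network's class prediction, so inputs far from the training data keep high uncertainty. The standard-normal "flow", the certainty budget, and the logits below are assumed toy values, not the paper's settings.

# Hedged sketch of a density-scaled Bayesian update in the spirit of NatPN.
import numpy as np

def natpn_update(z, class_logits, alpha_prior, certainty_budget=100.0):
    # stand-in for a normalizing-flow density on the latent space
    density = np.exp(-0.5 * np.sum(z**2)) / (2 * np.pi) ** (len(z) / 2)
    evidence = certainty_budget * density        # pseudo-count n(z)
    probs = np.exp(class_logits - class_logits.max())
    probs /= probs.sum()                         # predicted class distribution
    return alpha_prior + evidence * probs        # Dirichlet posterior parameters

alpha_prior = np.ones(3)                         # uninformative prior
logits = np.array([2.0, 0.1, -1.0])
for z in (np.zeros(2), np.full(2, 4.0)):         # in-distribution vs. far away
    alpha = natpn_update(z, logits, alpha_prior)
    print(np.round(alpha, 3), "-> mean:", np.round(alpha / alpha.sum(), 3))

Far from the training data the density, and hence the evidence, collapses, so the posterior stays near the prior: this is the mechanism behind "high uncertainty far away from training data" in the summary.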