
Marco Hövekenmeier: Natural Gradient Variational Inference with Empirical Bayes

SCCS Colloquium

In the field of Bayesian deep learning (BDL), Natural Gradient Variational Inference (NGVI) has emerged as a key method for approximate inference. This thesis investigates the integration of NGVI with empirical Bayes, also known as type-II maximum likelihood estimation, to enable automatic model selection by optimizing hyperparameters, specifically the prior precision, using the marginal likelihood. This contrasts with commonly used approaches such as cross-validation, where multiple models with different hyperparameter configurations are trained and model selection is performed post hoc.
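For orientation, a minimal sketch of the type-II maximum likelihood objective, assuming an isotropic Gaussian prior with precision δ over the network weights θ (the notation is chosen here for illustration, not taken from the thesis):

```latex
\[
  p(\mathcal{D}\mid\delta)
    = \int p(\mathcal{D}\mid\theta)\,
      \mathcal{N}\!\big(\theta \mid 0,\ \delta^{-1} I\big)\, d\theta,
  \qquad
  \delta^{\star} = \arg\max_{\delta}\ \log p(\mathcal{D}\mid\delta).
\]
```

Empirical Bayes thus turns hyperparameter selection into a continuous optimization problem that can run alongside training, rather than an outer loop over many retrained models.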

One major challenge in BDL is the intractability of the marginal likelihood. This work compares methods for approximating this quantity, notably the evidence lower bound (ELBO) in NGVI and online Laplace approximations to the marginal likelihood. The neural network Hessian plays a crucial role in both approaches and is approximated by the generalized Gauss-Newton (GGN) matrix. We place special emphasis on Kronecker-factored GGN approximations and compare the performance of the resulting algorithms against baselines using full or diagonal approximations, on illustrative toy datasets as well as on UCI regression and classification benchmarks.
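In the notation above, the two evidence approximations being compared can be written as the variational lower bound maximized by NGVI and the Laplace estimate built from the curvature at a mode θ* (again illustrative, with the GGN standing in for the Hessian):

```latex
\[
  \log p(\mathcal{D}\mid\delta)
    \;\geq\; \mathbb{E}_{q(\theta)}\!\big[\log p(\mathcal{D}\mid\theta)\big]
      - \mathrm{KL}\!\big(q(\theta)\,\|\,p(\theta\mid\delta)\big)
    \quad\text{(ELBO)},
\]
\[
  \log p(\mathcal{D}\mid\delta)
    \;\approx\; \log p\big(\mathcal{D}\mid\theta^{\star}\big)
      + \log p\big(\theta^{\star}\mid\delta\big)
      + \tfrac{d}{2}\log 2\pi
      - \tfrac{1}{2}\log\det\!\big(\mathrm{GGN}(\theta^{\star}) + \delta I\big)
    \quad\text{(Laplace)}.
\]
```

The NumPy sketch below makes the Laplace branch concrete on a linear-Gaussian toy problem, where the GGN is the exact Hessian and the approximation becomes exact; the function names and the isotropic-prior setup are assumptions for illustration, not code from the thesis. It also shows the eigenvalue identity that makes log-determinants of Kronecker-factored curvature cheap:

```python
import numpy as np

def log_evidence_laplace(nll_map, theta_map, ggn, delta):
    """Laplace estimate of log p(D | delta) for a prior N(0, delta^{-1} I),
    with the GGN standing in for the Hessian of the negative log joint.
    The (2*pi)^{d/2} factors of the prior and the Laplace integral cancel,
    leaving (up to a delta-independent likelihood constant):
      -nll(theta*) + d/2 * log(delta) - delta/2 * ||theta*||^2
      - 1/2 * logdet(GGN + delta * I).
    """
    d = theta_map.size
    _, logdet = np.linalg.slogdet(ggn + delta * np.eye(d))
    return (-nll_map + 0.5 * d * np.log(delta)
            - 0.5 * delta * (theta_map @ theta_map) - 0.5 * logdet)

def logdet_kron(A, G, delta):
    """logdet(kron(A, G) + delta * I) from the factor eigenvalues alone:
    the eigenvalues of a Kronecker product are all products a_i * g_j,
    so the log-determinant is sum_ij log(a_i * g_j + delta)."""
    a = np.linalg.eigvalsh(A)
    g = np.linalg.eigvalsh(G)
    return float(np.log(np.outer(a, g) + delta).sum())

# Linear-Gaussian toy data with unit observation noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)
ggn = X.T @ X  # GGN of the squared-error NLL; the exact Hessian here

# Scan the prior precision; in a full pipeline delta would instead be
# updated by gradient ascent on this objective during training.
for delta in [0.01, 0.1, 1.0, 10.0, 100.0]:
    theta_map = np.linalg.solve(ggn + delta * np.eye(3), X.T @ y)
    nll_map = 0.5 * np.sum((y - X @ theta_map) ** 2)
    print(f"delta = {delta:6.2f}   log evidence ~ "
          f"{log_evidence_laplace(nll_map, theta_map, ggn, delta):9.3f}")
```

For a layer whose GGN block is approximated as a Kronecker product of an m-by-m and an n-by-n factor, a helper like logdet_kron replaces the cubic-cost log-determinant of the full mn-by-mn block with two small eigendecompositions, which is what lets the Kronecker-factored variants scale.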

Master's thesis presentation. Marco is advised by Severin Reiz and Alexander Immer (ETH Zürich).