Previous talks at the SCCS Colloquium

Victor Armegioiu: Meta-Learning Better Bayesian Priors

A core issue of Bayesian neural networks (BNNs) is the choice of prior. For lack of better alternatives, one often uses a zero-centered spherical Gaussian over the network weights as the prior. However, this is a naive choice that conveys little useful inductive bias. An alternative to such a naive prior is to meta-learn the prior from a set of related datasets (as in PAC-Bayesian Meta-Learning [1]). Yet, finding an appropriate parametric family of priors in the high-dimensional weight space of neural networks is challenging. Our work explores the possibility of meta-learning alternative BNN priors, i.e., priors in function space or priors represented as a set of particles (similar to Particle-VI [2] posteriors).
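To make the contrast concrete, below is a minimal Python sketch (not the method from the talk): it compares the log-density of the standard zero-centered spherical Gaussian weight-space prior against one simple way a particle-based prior could be represented, namely a Gaussian kernel density over a set of weight-space particles. The function names, the kernel form, the bandwidth, and the randomly drawn "particles" are illustrative assumptions; in the actual work, the particles would be meta-learned from related datasets.

import numpy as np

def isotropic_gaussian_log_prior(w, sigma=1.0):
    # Log-density of the naive zero-centered spherical Gaussian prior
    # N(0, sigma^2 I) evaluated at a flattened weight vector w.
    d = w.size
    return -0.5 * (d * np.log(2 * np.pi * sigma**2) + np.dot(w, w) / sigma**2)

def particle_prior_log_density(w, particles, bandwidth=0.5):
    # One illustrative particle-based prior: a kernel density estimate
    # over weight-space particles (one Gaussian bump per particle).
    # `particles` has shape (n_particles, d); each row is a candidate
    # weight vector, here random, but in principle meta-learned.
    n, d = particles.shape
    sq_dists = np.sum((particles - w) ** 2, axis=1)  # squared distance to each particle
    log_kernels = (-0.5 * sq_dists / bandwidth**2
                   - 0.5 * d * np.log(2 * np.pi * bandwidth**2))
    # log-mean-exp over particles, computed stably
    m = log_kernels.max()
    return m + np.log(np.mean(np.exp(log_kernels - m)))

rng = np.random.default_rng(0)
d = 10                                # toy weight dimension
w = rng.normal(size=d)                # a candidate weight vector
particles = rng.normal(size=(5, d))   # stand-in for meta-learned particles

print("isotropic Gaussian log-prior:", isotropic_gaussian_log_prior(w))
print("particle-based log-prior:    ", particle_prior_log_density(w, particles))

The sketch only illustrates the representational difference: the Gaussian prior is a single fixed density, whereas the particle-based prior is defined by the particle set itself, which is what a meta-learning procedure could adapt.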

[1] Meta-Learning by Adjusting Priors Based on Extended PAC-Bayes Theory (https://arxiv.org/abs/1711.01244)
[2] A Unified Particle-Optimization Framework for Scalable Bayesian Sampling (https://arxiv.org/pdf/1805.11659.pdf)

Guided Research submission talk (Data Engineering and Analytics). Victor is advised by Severin Reiz and Jonas Rothfuss (ETH).