Previous talks at the SCCS Colloquium

Victor Armegioiu and Andreea Musat: Meta-Learning kernels with Hyperkernels and Implications on Bayesian Optimization

We introduce a more general framework for kernel learning in the context of Gaussian process optimization in the bandit setting. The typical GP optimization algorithm (GP-UCB) aims to find the optima of a hidden function, with no access to gradients, that is very costly to evaluate, using a minimal number of evaluations. The typical assumption on the function class is that the function is either a trajectory of a Gaussian process, or lies in a Reproducing Kernel Hilbert Space (RKHS) induced by some kernel. In either case, the underlying kernel operator determines the shape and smoothness of the function.
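A minimal sketch of the GP-UCB loop described above, using NumPy only. The RBF kernel, its lengthscale, and the `beta` exploration weight are illustrative assumptions, not the talk's learned (hyper)kernel:

```python
import numpy as np

# Hypothetical RBF kernel; in the talk's setting the kernel itself is learned.
def rbf(a, b, lengthscale=0.2):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, Xs, noise=1e-4):
    """Standard GP posterior mean and std at candidate points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    sol = np.linalg.solve(K, Ks)
    mu = sol.T @ y
    var = np.clip(np.diag(rbf(Xs, Xs)) - np.sum(Ks * sol, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def gp_ucb(f, candidates, n_iter=20, beta=2.0):
    """Each round, evaluate the candidate maximizing mean + beta * std."""
    X = np.array([candidates[0]])
    y = np.array([f(X[0])])
    for _ in range(n_iter):
        mu, sd = gp_posterior(X, y, candidates)
        x_next = candidates[np.argmax(mu + beta * sd)]
        X = np.append(X, x_next)
        y = np.append(y, f(x_next))
    return X[np.argmax(y)], np.max(y)
```

The upper confidence bound `mu + beta * sd` trades off exploitation (high posterior mean) against exploration (high posterior uncertainty), which is what keeps the number of costly evaluations small.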

Previous work assumes perfect a priori knowledge of the actual kernel and its parameters (lengthscale, smoothness, etc.), which is rarely the case in real-life scenarios.
We introduce a meta-learning algorithm that assumes nothing about the kernel class, and learns the kernel assuming access to some trajectories (i.e. functions) of the underlying generating process. Based on several results from spectral theory, we show that our algorithm converges in sublinear time, making it a useful generalization of GP-UCB-type algorithms.
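To illustrate the idea of learning a kernel from observed trajectories, here is a minimal sketch: a grid search over an RBF lengthscale that maximizes the GP log marginal likelihood averaged over several trajectories. This stands in for, and is much weaker than, the hyperkernel approach of the talk; the kernel family, grid, and noise level are all assumptions:

```python
import numpy as np

def rbf(a, b, ls):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def avg_log_marginal(ls, X, trajectories, noise=1e-4):
    """Average GP log marginal likelihood of the trajectories under lengthscale ls."""
    K = rbf(X, X, ls) + noise * np.eye(len(X))
    _, logdet = np.linalg.slogdet(K)
    Kinv = np.linalg.inv(K)
    lls = [-0.5 * y @ Kinv @ y - 0.5 * logdet for y in trajectories]
    return np.mean(lls)

def fit_lengthscale(X, trajectories, grid):
    """Hypothetical stand-in for the hyperkernel optimization: pick the best grid point."""
    scores = [avg_log_marginal(ls, X, trajectories) for ls in grid]
    return grid[int(np.argmax(scores))]
```

With enough trajectories of the generating process, the averaged marginal likelihood concentrates around the true kernel parameters; the learned kernel can then be plugged into a GP-UCB loop in place of a hand-picked one.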

Application project presentation. Victor and Andreea are advised by Severin Reiz, Hasan Ashraf, and Jonas Rothfuss (ETH LAS).