Dino Sejdinovic
Embeddings of probability distributions into reproducing kernel Hilbert spaces provide a useful framework for fully nonparametric hypothesis testing and for learning on distributional inputs. I will give an overview of this framework and describe its recent applications in the context of meta-learning. In particular, we will consider hyperparameter learning using Bayesian optimisation, which typically requires an initial exploration phase even when similar prior tasks have already been solved. We propose to transfer information across tasks via learned kernel-neural representations of the training datasets used in those tasks, leading to faster convergence than existing baselines. In addition, we will consider an application of kernel embeddings to conditional density estimation in the meta-learning setting.
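To make the embedding framework concrete, the following is a minimal sketch (not taken from the referenced papers) of an empirical kernel mean embedding and the resulting maximum mean discrepancy (MMD) statistic, which underlies the nonparametric two-sample tests mentioned above. The Gaussian kernel, the function names, and the bandwidth choice are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    # k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2))
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2 * X @ Y.T)
    return np.exp(-sq_dists / (2 * bandwidth**2))

def mmd2(X, Y, bandwidth=1.0):
    # Biased empirical estimate of the squared MMD: the RKHS distance
    # between the empirical kernel mean embeddings of the two samples,
    # || (1/n) sum_i k(x_i, .) - (1/m) sum_j k(y_j, .) ||_H^2.
    Kxx = gaussian_kernel(X, X, bandwidth)
    Kyy = gaussian_kernel(Y, Y, bandwidth)
    Kxy = gaussian_kernel(X, Y, bandwidth)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()

# Example: samples from two slightly different distributions; a larger
# MMD value indicates greater evidence that the distributions differ.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))
Y = rng.normal(0.5, 1.0, size=(200, 2))
print(mmd2(X, Y))
```

In the meta-learning applications above, such embeddings serve as fixed-size representations of entire training datasets, allowing information to be compared and transferred across tasks.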
H. C. L. Law, P. Zhao, L. Chan, J. Huang, and D. Sejdinovic, Hyperparameter Learning via Distributional Transfer, in NeurIPS 2019.
J.-F. Ton, L. Chan, Y. W. Teh, and D. Sejdinovic, Noise Contrastive Meta-Learning for Conditional Density Estimation using Kernel Mean Embeddings, arXiv:1906.02236, in NeurIPS 2019 Meta-Learning Workshop.