Kernel methods and graphical models are two important families of techniques for machine learning. Our community has witnessed many major but separate advances in the theory and applications of both subfields. For kernel methods, the advances include kernels on structured data, Hilbert-space embeddings of distributions, and applications of kernel methods to multiple kernel learning, transfer learning, and multi-task learning. For graphical models, the advances include variational inference, nonparametric Bayes techniques, and applications of graphical models to topic modeling, computational biology and social network problems.
This workshop addresses two main research questions: first, how may kernel methods be used to address difficult learning problems for graphical models, such as inference for multi-modal continuous distributions on many variables, and dealing with non-conjugate priors? And second, how might kernel methods be advanced by bringing in concepts from graphical models, for instance by incorporating sophisticated conditional independence structures, latent variables, and prior information?
Kernel algorithms have traditionally had the advantage of being solvable via convex optimization or eigenproblems, with strong statistical guarantees on convergence. The graphical model literature has focused on modelling complex dependence structures in a flexible way, although approximations may be required to make inference tractable. Can we develop a new set of methods that blend these strengths?
A number of publications have recently combined kernel and graphical model techniques, including kernel hidden Markov models, kernel belief propagation, the kernel Bayes rule, kernel topic models, kernel variational inference, kernel herding as Bayesian quadrature, kernel beta processes, and a connection between kernel k-means and Bayesian nonparametrics. Each of these results addresses a different inference task, and makes use of a different set of RKHS properties. We propose this workshop so as to "connect the dots" and develop a unified toolkit addressing a broad range of learning problems, to the mutual benefit of researchers in kernels and graphical models. The goals of the workshop are thus twofold: first, to provide an accessible review and synthesis of recent results combining graphical models and kernels; second, to provide a discussion forum for open problems and technical challenges.
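To make the central RKHS tool above concrete: a distribution is embedded in a Hilbert space via its kernel mean, estimated in practice by the empirical mean of kernel features, and the distance between two such embeddings is the maximum mean discrepancy (MMD). The sketch below is illustrative only (the function names and the Gaussian example data are our choices, not taken from any of the cited papers), using a Gaussian RBF kernel:

```python
import numpy as np

def rbf_kernel(X, Y, bandwidth=1.0):
    """Gaussian RBF kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2))."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-sq_dists / (2.0 * bandwidth**2))

def mmd_squared(X, Y, bandwidth=1.0):
    """Biased estimate of the squared MMD, i.e. the squared RKHS distance
    between the empirical mean embeddings of the two samples."""
    return (rbf_kernel(X, X, bandwidth).mean()
            - 2.0 * rbf_kernel(X, Y, bandwidth).mean()
            + rbf_kernel(Y, Y, bandwidth).mean())

# Illustrative data: two samples from the same Gaussian, and two from
# well-separated Gaussians. The MMD should be near zero in the first
# case and clearly larger in the second.
rng = np.random.default_rng(0)
same = mmd_squared(rng.normal(0, 1, (200, 1)), rng.normal(0, 1, (200, 1)))
diff = mmd_squared(rng.normal(0, 1, (200, 1)), rng.normal(3, 1, (200, 1)))
```

Because the biased estimate is a squared RKHS norm, `same` is nonnegative and should be much smaller than `diff`; this is the basic mechanism underlying the two-sample and inference applications discussed at the workshop.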
7:45- 8:00 - Overview by Organizers
8:00- 8:30 - Invited Talk: Kernel Bayes Rule (Kenji Fukumizu, The Institute of Statistical Mathematics)
8:30- 8:40 - Talk: Fastfood: Approximating Kernel Expansions in Loglinear Time (Alex Smola, Google Research)
8:40- 9:00 - Coffee Break
9:00- 9:30 - Invited Talk: Nonparametric Bayes and Kernel k-means (Brian Kulis, Ohio State University)
9:30-10:00 - Invited Talk: Kernel Beta Processes (Lawrence Carin, Duke University)
10:00-10:15 - Accepted Talk: Kernel Embeddings of Dirichlet Process Mixtures (Krikamol Muandet, Max Planck Institute for Biological Cybernetics)
10:15-16:00 - Break
16:00-16:10 - Accepted Talk: Kernel Methods for Learning Motion Patterns (Lachlan McCalman, University of Sydney)
16:10-16:20 - Accepted Talk: Kernels for Protein Structure Prediction (Narges Razavian, Carnegie Mellon University)
16:20-16:30 - Short Coffee Break
16:30-17:00 - Invited Talk: Kernel Topic Models (Thore Graepel, Microsoft Research Cambridge)
17:00-17:30 - Invited Talk: Nonparametric and Stochastic Variational Inference (Matt Hoffman, Adobe)
17:30-18:00 - Coffee Break
18:00-18:30 - Invited Talk: Determinantal Point Processes (Ben Taskar, University of Pennsylvania)
18:30-19:00 - Invited Talk: Connection between Kernel Embedding and Bayesian/Gaussian Process Methods (David Duvenaud, Cambridge University)
19:00-19:30 - Open Discussion on Current Challenges and Future Directions
[SBSGS10] Song, L. and Boots, B. and Siddiqi, S. and Gordon, G. and Smola, A., Hilbert space embeddings of hidden Markov models, ICML'10.
[RWDC11] Ren, L. and Wang, Y. and Dunson, D.B. and Carin, L., Kernel beta processes, NIPS'11.
[SGBLG11] Song, L. and Gretton, A. and Bickson, D. and Low, Y. and Guestrin, C., Kernel belief propagation, AISTATS'11.
[FSG11] Fukumizu, K. and Song, L. and Gretton, A., Kernel Bayes' rule, NIPS'11.
[HSHG12] Hennig, P. and Stern, D. and Herbrich, R. and Graepel, T., Kernel topic models, AISTATS'12.
[HD12] Huszar, F. and Duvenaud, D., Optimally-weighted herding is Bayesian quadrature, UAI'12.
[GHB12] Gershman, S. and Hoffman, M. and Blei, D., Nonparametric variational inference, ICML'12.
[KJ12] Kulis, B. and Jordan, M., Revisiting k-means: new algorithms via Bayesian nonparametrics, ICML'12.
[KT12] Kulesza, A. and Taskar, B., Determinantal point processes for machine learning, arXiv:1207.6083