Abstracts 2017-2018
17 July 2018, Vadim Strijov (Moscow Institute of Physics and Technology)
At 11:00 in room 406, IMAG building. Organised by LIG.
Bayesian model selection and multimodelling.
Abstract: Multimodelling for learning-to-learn, or meta-learning, is discussed. The talk defines Bayesian strategies for local and universal model selection and multimodelling, and discusses the principles of model selection. Multimodels are used when a sample cannot be described by a single model, which happens when feature weights depend on the feature values. Though a multimodel is an interpretable generalization of the single-model case, it can contain a large number of similar models. Pruning algorithms are constructed based on the suggested method for statistical model comparison.
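As background for the Bayesian model selection machinery mentioned above, the standard criterion (a textbook formulation, not specific to this talk) compares models through their posterior probabilities via the evidence:
\[
  p(M_k \mid \mathcal{D}) \;\propto\; p(\mathcal{D} \mid M_k)\, p(M_k),
  \qquad
  p(\mathcal{D} \mid M_k) = \int p(\mathcal{D} \mid \theta_k, M_k)\, p(\theta_k \mid M_k)\, \mathrm{d}\theta_k .
\]
A multimodel, in this terminology, combines several such models rather than committing to a single M_k.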
10 July 2018, Matteo Sesia (PhD Student, Stanford University)
At 11:00 in F107, Inria Montbonnot.
Gene Hunting with Knockoffs for Hidden Markov Models. (pdf) (arXiv) (BibTex) slides.
Abstract: Modern scientific studies often require the identification of a subset of relevant explanatory variables, in an attempt to understand an interesting phenomenon. Several statistical methods have been developed to automate this task, but only recently has the framework of model-free knockoffs provided a general solution that can perform variable selection under rigorous type-I error control, without relying on strong modeling assumptions. In this paper, we extend the methodology of model-free knockoffs to a rich family of problems where the distribution of the covariates can be described by a hidden Markov model (HMM). We develop an exact and efficient algorithm to sample knockoff copies of an HMM. We then argue that, combined with the knockoffs selective framework, these provide a natural and powerful tool for performing principled inference in genome-wide association studies with guaranteed FDR control. Finally, we apply our methodology to several datasets aimed at studying Crohn's disease and several continuous phenotypes, e.g. cholesterol levels.
Joint work with Chiara Sabatti and Emmanuel Candès.
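To illustrate where the FDR guarantee in the abstract comes from, here is a minimal sketch of the generic knockoff+ selection step (Barber and Candès, 2015). It is not the HMM knockoff sampler that is the paper's contribution; the function name and toy statistics below are our own illustrative assumptions.

```python
import numpy as np

def knockoff_plus_select(W, q=0.1):
    """Generic knockoff+ filter: given statistics W_j (large positive values
    suggest the original variable beats its knockoff copy), return the indices
    selected at nominal FDR level q."""
    candidates = np.sort(np.abs(W[W != 0]))
    for t in candidates:
        fdp_hat = (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t))
        if fdp_hat <= q:                      # smallest threshold passing the bound
            return np.where(W >= t)[0]
    return np.array([], dtype=int)            # nothing passes the threshold

# toy usage: 50 null statistics (roughly symmetric about 0) and 10 clear signals
rng = np.random.default_rng(0)
W = np.concatenate([rng.normal(0, 1, 50), rng.normal(5, 1, 10)])
print(knockoff_plus_select(W, q=0.1))
```

Once valid knockoff copies are available (here, sampled with the paper's HMM algorithm), statistics W_j computed from the augmented design plugged into this filter give finite-sample FDR control at level q.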
14 June 2018, Isadora Antoniano-Villalobos (Assistant Prof, Bocconi University, Milan, Italy)
At 14:00, salle 106, IMAG building (700 Avenue Centrale, Saint-Martin-d'Hères).
Bayesian estimation of probabilistic sensitivity measures for computer experiments. slides.
Abstract: Simulation-based experiments have become increasingly important for risk evaluation and decision-making in a broad range of applications in engineering, science and public policy. In the presence of uncertainty regarding the phenomenon under study and, in particular, the simulation model inputs, a probabilistic approach to sensitivity analysis becomes crucial. A number of global sensitivity measures have been proposed in the literature, together with estimation methods designed to work at relatively low computational costs. First in line is the one-sample or given-data approach, which relies on adequate partitions of the input space. We propose a Bayesian alternative for the estimation of several sensitivity measures which shows good performance on synthetic examples, especially for small sample sizes. Furthermore, we propose the use of a nonparametric approach for conditional density estimation which bypasses the need for pre-defined partitions, allowing the sharing of information across the entire input space through the underlying assumption of partial exchangeability. In both cases, the Bayesian paradigm ensures the quantification of the uncertainty in the estimation. Joint work with Emanuele Borgonovo and Xuefei Lu.
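To make the comparison concrete, here is a rough sketch of the partition-based "given-data" estimator of a first-order sensitivity index that the Bayesian approach above is contrasted with; the binning scheme and toy model are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def given_data_first_order(x, y, n_bins=20):
    """Partition-based ('given-data') estimate of the first-order sensitivity
    index of input x for output y: variance of within-bin means of y,
    normalised by the total variance of y."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    bins = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    nonempty = [b for b in range(n_bins) if np.any(bins == b)]
    bin_means = np.array([y[bins == b].mean() for b in nonempty])
    bin_weights = np.array([np.mean(bins == b) for b in nonempty])
    between_var = np.sum(bin_weights * (bin_means - y.mean()) ** 2)
    return between_var / y.var()

# toy usage on an Ishigami-like model y = sin(x1) + 7 sin(x2)^2
rng = np.random.default_rng(1)
x1, x2 = rng.uniform(-np.pi, np.pi, (2, 5000))
y = np.sin(x1) + 7 * np.sin(x2) ** 2
print(given_data_first_order(x1, y), given_data_first_order(x2, y))
```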
At 14:00 in F107, Inria Montbonnot.
Student’s t Source and Mixing Models for Multichannel Audio Source Separation. slides.
Abstract: This paper presents a Bayesian framework for under-determined audio source separation in multichannel reverberant mixtures. We model the source signals as Student's t latent random variables in a time-frequency domain. The specific structure of musical signals in this domain is exploited by means of a non-negative matrix factorization model. In contrast, we design the mixing model in the time domain. In addition to leading to an exact representation of the convolutive mixing process, this approach allows us to develop simple probabilistic priors for the mixing filters. Indeed, as those filters correspond to room responses, they exhibit a simple characteristic structure in the time domain that can be used to guide their estimation. We also rely on the Student's t distribution for modeling the impulse response of the mixing filters. From this model, we develop a variational inference algorithm in order to perform source separation. The experimental evaluation demonstrates the potential of this approach for separating multichannel reverberant mixtures.
Paper available at https://hal.inria.fr/hal-01584755v2/document.
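As a side note on one ingredient of the model, the abstract mentions a non-negative matrix factorization of the sources in the time-frequency domain. The snippet below is a generic multiplicative-update NMF for the (generalised) KL divergence, included only to illustrate the V ≈ WH structure; it is not the paper's Student's t model or its variational inference algorithm.

```python
import numpy as np

def nmf_multiplicative(V, rank=10, n_iter=200, eps=1e-12):
    """Plain NMF with multiplicative updates for the generalised KL divergence:
    approximate a non-negative time-frequency matrix V (F x N) as W @ H."""
    rng = np.random.default_rng(0)
    F, N = V.shape
    W = rng.random((F, rank)) + eps
    H = rng.random((rank, N)) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        W *= (V / WH) @ H.T / (np.ones_like(V) @ H.T + eps)
        WH = W @ H + eps
        H *= W.T @ (V / WH) / (W.T @ np.ones_like(V) + eps)
    return W, H

# toy usage: factorise a random "power spectrogram"
V = np.abs(np.random.default_rng(1).normal(size=(64, 100))) ** 2
W, H = nmf_multiplicative(V, rank=5)
print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))
```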
26 April 2018, Alisa Kirichenko (Postdoc, Machine Learning dept. of CWI, Amsterdam)
Alisa's talk will be part of the #RandomGraphTwitter workshop: webpage and twitter.
At 15:00 in Amphithéâtre, IMAG building (700 Avenue Centrale, Saint-Martin-d'Hères).
Function estimation on large graphs using Bayesian Laplacian regularization. slides.
Abstract: In recent years there has been substantial interest in high-dimensional estimation and prediction problems on large graphs. These can in many cases be viewed as high-dimensional or nonparametric regression or classification problems in which the goal is to learn a “smooth” function on a given graph. We present a mathematical framework that allows us to study the performance of nonparametric function estimation methods on large graphs, and we derive minimax convergence rates within this framework. We consider simple undirected graphs that satisfy an assumption on their “asymptotic geometry”, formulated in terms of the graph Laplacian. We also introduce a Sobolev-type smoothness condition on the target function, using the graph Laplacian to quantify smoothness. We then develop Bayesian procedures for the problems at hand and show how asymptotically optimal Bayesian regularization can be achieved under these conditions. The priors we study are randomly scaled Gaussians with precision operators involving the Laplacian of the graph.
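For intuition about Laplacian regularization on graphs, here is a small sketch of the fixed-scale penalised least squares estimator with penalty lambda * f' L^beta f, which is formally the posterior mean under a (possibly improper) Gaussian prior with precision proportional to L^beta and unit noise variance; the talk's randomly scaled priors go beyond this fixed-scale case, and the toy graph below is our own.

```python
import numpy as np

def laplacian_regularized_estimate(A, y, lam=1.0, power=1):
    """Estimate a 'smooth' function on a graph from noisy node observations y
    by penalised least squares:  argmin_f ||y - f||^2 + lam * f' L^power f,
    where L = D - A is the combinatorial Laplacian of adjacency matrix A."""
    L = np.diag(A.sum(axis=1)) - A
    P = np.linalg.matrix_power(L, power)
    return np.linalg.solve(np.eye(len(y)) + lam * P, y)

# toy usage: a path graph carrying a smooth signal observed with noise
n = 50
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
rng = np.random.default_rng(2)
truth = np.sin(np.linspace(0, 3, n))
y = truth + rng.normal(0, 0.3, n)
f_hat = laplacian_regularized_estimate(A, y, lam=5.0)
print(np.mean((f_hat - truth) ** 2), np.mean((y - truth) ** 2))
```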
At 14:00 in C208, Inria, Montbonnot.
Marta and Fei will present their PhD work:
- Marta: Bayesian Learning of the Mallows rank model. slides.
- Fei: Learning and Smoothing in Switching Markov Models with Copulas. slides
22 March 2018, Paul-Marie Grollemund (ATER, Montpellier)
At 14:00 in Room 106, IMAG building (700 Avenue Centrale, Saint-Martin-d'Hères).
Bayesian linear regression on functional data. slides.
1st March 2018, Riccardo Corradin (PhD student, Milan Biccoca & Trinity College Dublin)
At 14:00 in Room 106, IMAG building (700 Avenue Centrale, Saint-Martin-d'Hères).
Bayesian nonparametric methods for density estimation and clustering on the phase-space. slides.
8 February 2018, Éric Marchand (Prof, Université de Sherbrooke, on sabbatical at UGA)
At 14:00 in Room 106, IMAG building (700 Avenue Centrale, Saint-Martin-d'Hères).
Estimation with predictive densities: recent results. slides.
16 January 2018, Łukasz Rajkowski (PhD student, University of Warsaw)
At 11am, in F107, Inria Montbonnot. Buffet at noon, in A109.
Analysis of Mode A Posteriori in the Chinese Restaurant Process model. slides.
Presentation of a paper available on arXiv.
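For readers unfamiliar with the model in the title, the following toy snippet samples a partition from the Chinese Restaurant Process prior; it illustrates the generative model only, not the posterior-mode analysis of the talk.

```python
import numpy as np

def sample_crp(n, alpha=1.0, rng=None):
    """Draw a random partition of {0, ..., n-1} from the Chinese Restaurant
    Process with concentration alpha: customer i joins an existing table with
    probability proportional to its size, or opens a new table w.p. prop. alpha."""
    rng = rng if rng is not None else np.random.default_rng()
    tables = []                                    # tables[k] = customers at table k
    for i in range(n):
        weights = np.array([len(t) for t in tables] + [alpha], dtype=float)
        k = rng.choice(len(weights), p=weights / weights.sum())
        if k == len(tables):
            tables.append([i])                     # open a new table
        else:
            tables[k].append(i)
    return tables

print(sample_crp(20, alpha=1.0, rng=np.random.default_rng(3)))
```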
11 December 2017, Hongliang Lü (Postdoc, Inria)
At 11:30, in F107 at Inria Montbonnot.
Nonparametric Bayesian Image Segmentation. slides.
Presentation of the paper Nonparametric Bayesian Image Segmentation. P. Orbanz and J. M. Buhmann. International Journal of Computer Vision (IJCV), Vol. 77, 25-45, 2008. The paper and the associated (MATLAB) code can be found here: [PDF] [Journal] [Code]
15 November 2017, Julyan Arbel (Researcher, Inria)
At 10am, in F107 at Inria Montbonnot.
Approximate Bayesian computation (ABC).
In this session, we will cover two papers (one seminal, one preprint); a minimal rejection-ABC sketch follows the list:
- Fearnhead, P., & Prangle, D. (2012). Constructing summary statistics for approximate Bayesian computation: semi-automatic approximate Bayesian computation. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 74(3), 419-474. link to paper, link to presentation by the authors.
- Bernton, E., Jacob, P. E., Gerber, M., & Robert, C. P. (2017). Inference in generative models using the Wasserstein distance. arXiv preprint arXiv:1701.05146. link to paper, link to presentation by Xian.
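Both papers build on the basic rejection-ABC recipe, sketched below on a toy Gaussian example: Fearnhead and Prangle construct the summary statistics semi-automatically, while Bernton et al. replace summaries with a Wasserstein distance between empirical distributions. The helper names, toy prior and acceptance rule here are our own illustrative assumptions.

```python
import numpy as np

def rejection_abc(y_obs, prior_sampler, simulator, summary,
                  n_sims=10_000, accept_frac=0.01):
    """Basic rejection ABC: simulate from the prior predictive and keep the
    parameter draws whose simulated summaries fall closest to the observed ones."""
    s_obs = summary(y_obs)
    thetas = np.array([prior_sampler() for _ in range(n_sims)])
    dists = np.array([np.linalg.norm(summary(simulator(t)) - s_obs) for t in thetas])
    keep = dists.argsort()[: int(accept_frac * n_sims)]
    return thetas[keep]

# toy usage: infer the mean of a Gaussian with known unit variance
rng = np.random.default_rng(4)
y_obs = rng.normal(2.0, 1.0, size=100)
posterior_draws = rejection_abc(
    y_obs,
    prior_sampler=lambda: rng.normal(0.0, 5.0),               # N(0, 5^2) prior on the mean
    simulator=lambda theta: rng.normal(theta, 1.0, size=100),
    summary=lambda y: np.array([y.mean()]),                   # sufficient statistic here
)
print(posterior_draws.mean(), posterior_draws.std())
```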