IMAG Building, 700 avenue Centrale, Saint Martin d'Hères. The workshop will start on Thursday 19 at 9am, and end on Friday 20 at 4pm.
Misspecified models
Neural network related techniques
Advanced computational techniques
Dynamic models
10:00 F107, Inria Grenoble Rhône-Alpes (Montbonnot building)
Bayesian model selection and multimodelling
This talk discusses multimodelling for learning-to-learn, or meta-learning. It defines Bayesian strategies for local and universal model selection and multimodelling, and discusses the principles of model selection. Multimodels are used when a sample cannot be described by a single model, which happens when feature weights depend on the feature values. Although a multimodel is an interpretable generalization of the single-model case, it can contain a large number of similar models. Pruning algorithms are constructed based on the suggested method for statistical model comparison.
14:00 Room Mont Blanc - GIPSA-lab, seminar organized by GIPSA-lab
Sequential Monte-Carlo (slides)
Nicolas will give a DIS exceptional seminar on the 23rd of January 2020 at 2:00 pm in the room Mont Blanc. Nicolas is a professor of Statistics at ENSAE and has been very active on the topic of Sequential Monte-Carlo (also known as particle filtering). He will give an introductory seminar on this topic, as well as several applications to filtering problems in signal processing, ecology, finance, etc.
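As a primer for the seminar, here is a minimal sketch of the bootstrap particle filter, the simplest Sequential Monte-Carlo algorithm, applied to a toy linear-Gaussian state-space model. The model (an AR(1) latent state observed with Gaussian noise) and all parameter values are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def bootstrap_filter(y, n_particles=1000, sigma_x=1.0, sigma_y=1.0, rng=None):
    """Bootstrap particle filter for the toy model
    x_t = 0.9 * x_{t-1} + N(0, sigma_x^2),  y_t = x_t + N(0, sigma_y^2)."""
    rng = np.random.default_rng(rng)
    x = rng.normal(0.0, 1.0, n_particles)        # particles drawn from the prior
    means = []
    for obs in y:
        x = 0.9 * x + rng.normal(0.0, sigma_x, n_particles)  # propagate particles
        logw = -0.5 * ((obs - x) / sigma_y) ** 2             # log-likelihood weights
        w = np.exp(logw - logw.max())                        # stabilise, then normalise
        w /= w.sum()
        means.append(np.sum(w * x))                          # filtering mean estimate
        x = rng.choice(x, size=n_particles, p=w)             # multinomial resampling
    return np.array(means)
```

Each step propagates the particles through the state dynamics, reweights them by the observation likelihood, and resamples to avoid weight degeneracy; the weighted particle mean approximates the filtering expectation at each time step.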
16:00 Salle de réunion 1 - Batiment IMAG
Bayesian statistics in R (slides)
Bayesian multilevel models are increasingly used to overcome the limitations of frequentist approaches in the analysis of complex structured data. During this session, I will briefly introduce the logic of Bayesian inference and motivate the use of multilevel modelling. I will then show how Bayesian multilevel models can be fitted using the probabilistic programming language Stan and the R package brms (Bürkner, 2016). The brms package allows fitting complex nonlinear multilevel (aka 'mixed-effects') models using an understandable high-level formula syntax. I will demonstrate the use of brms with some general examples and discuss model comparison tools available within the package. Prior experience with data manipulation and linear models in R will be helpful.
14:00 Salle 106 - Batiment IMAG
A Bayesian non-parametric methodology for inferring grammar complexity
Based on a set of strings from a language, we wish to infer the complexity of the underlying grammar. To this end, we develop a methodology to choose between two classes of formal grammars in the Chomsky hierarchy: simple regular grammars and more complex context-free grammars. To do so, we introduce a probabilistic context-free grammar model in the form of a Hierarchical Dirichlet Process over rules expressed in Greibach Normal Form. In comparison to other representations, this has the advantage of nesting the regular class within the context-free class. We consider model comparison both by exploiting this nesting, and with Bayes' factors. The model is fit using a Sequential Monte Carlo method, implemented in the Birch probabilistic programming language. We apply this methodology to data collected from primates, for which the complexity of the grammar is a key question.
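The grammar model itself is too involved for a short snippet, but the Bayes-factor comparison it mentions can be illustrated on a toy conjugate problem. The sketch below (my own example, not from the talk) compares a fixed-parameter model against a more flexible one with the parameter integrated out, using exact Beta-Binomial marginal likelihoods.

```python
from math import comb, log, lgamma, exp

def log_beta(a, b):
    """log of the Beta function B(a, b)."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def log_marginal_m0(k, n, theta=0.5):
    """M0: binomial likelihood with the success probability fixed at theta."""
    return log(comb(n, k)) + k * log(theta) + (n - k) * log(1 - theta)

def log_marginal_m1(k, n, a=1.0, b=1.0):
    """M1: theta ~ Beta(a, b); the marginal likelihood integrates theta out."""
    return log(comb(n, k)) + log_beta(k + a, n - k + b) - log_beta(a, b)

k, n = 9, 10  # 9 successes in 10 trials
log_bf = log_marginal_m1(k, n) - log_marginal_m0(k, n)
print(exp(log_bf))  # ≈ 9.31, evidence favouring the flexible model
```

The same logic carries over to the grammar setting: the regular class is nested in the context-free class, and the Bayes factor weighs the extra flexibility of the larger class against the automatic Occam penalty built into the marginal likelihood.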
LIG seminar - 14:00 Salle 406 - Batiment IMAG
Some theory on Bayesian neural networks (slides)
In this talk, we first present seminal works at the basis of the theory of Bayesian neural networks. These include Radford Neal's result from the 90s on the connection between Gaussian processes and wide neural networks, and recent extensions of this result to deep neural networks. In a second part, we focus on understanding priors in Bayesian neural networks at the unit level. More specifically, we investigate deep Bayesian neural networks with Gaussian weight priors and a class of ReLU-like nonlinearities. We establish that the induced prior distribution on the units before and after activation becomes increasingly heavy-tailed with the depth of the layer.
Joint work with Mariia Vladimirova, Jakob Verbeek, Pablo Mesejo.
Link: http://proceedings.mlr.press/v97/vladimirova19a/vladimirova19a.pdf
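The heavier-tails-with-depth phenomenon can be observed in a quick simulation. The sketch below (my own illustration, not code from the paper; the width, weight variance, and fixed input are arbitrary assumptions) draws many independent Gaussian weight configurations for a ReLU network and estimates the excess kurtosis of one unit's pre-activation at layers 1 and 2 — layer 1 is exactly Gaussian, layer 2 already shows positive excess kurtosis.

```python
import numpy as np

def unit_preactivation_samples(depth, width=20, n_draws=10000, rng=0):
    """Sample the pre-activation of one unit at a given layer of a ReLU
    network with i.i.d. N(0, 1/width) weight priors and a fixed input."""
    rng = np.random.default_rng(rng)
    h = np.tile(np.ones(width), (n_draws, 1))  # fixed input, one row per weight draw
    for _ in range(depth):
        W = rng.normal(0.0, 1.0 / np.sqrt(width), (n_draws, width, width))
        h = np.einsum('nij,nj->ni', W, np.maximum(h, 0.0))  # ReLU then linear layer
    return h[:, 0]                             # pre-activation of one unit

def excess_kurtosis(z):
    """Sample excess kurtosis: 0 for a Gaussian, positive for heavier tails."""
    z = (z - z.mean()) / z.std()
    return (z ** 4).mean() - 3.0
```

Running `excess_kurtosis(unit_preactivation_samples(d))` for increasing `d` gives growing positive values, consistent with the talk's claim that unit priors become increasingly heavy-tailed with depth.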