Past sessions 2011-2012

Thursday June 21
15:00 - 17:00
Ecole des Mines, L218

Mehryar Mohri / Courant Institute of Mathematical Sciences, NYU
Adapting to a Non-Ideal World

Learning theory and algorithms were originally developed for an idealized world in which several critical assumptions about the distributions and the sampling hold. Real-world data sets quite often do not meet these conditions: the distribution from which training and test points are drawn may differ, and the distributions may further drift over time.
These problems are not just second-order effects, and addressing them is not merely a matter of slightly improving the performance of some learning algorithms: ignoring them can lead to dramatically poor results, as is easy to verify empirically.
This talk presents a series of theoretical and algorithmic solutions to address these issues. It also reports empirical results in support of these algorithms in several natural scenarios.
The talk includes joint work with Corinna Cortes, Yishay Mansour, and Andres Munoz.

Monday June 11
13:30 - 14:30
Ecole normale supérieure / room W

Magalie Fromont (ENSAI)
Kernel-based comparison tests: a non-asymptotic bootstrap approach
[Link to the paper]

Thursday May 24
15:00 - 17:00
Ecole des Mines, L118

Eric Moulines / Telecom ParisTech
New trends in state-space models

Monday March 26
13:30 - 14:30
Ecole normale supérieure / room W 

Pierre Alquier / Université Paris Diderot and CREST, ENSAE
Model selection for autoregression
[Links to the corresponding papers: article 1, article 2]

Thursday March 22
15:00 - 17:00
Ecole des Mines / room V106B

Gérard Biau / Paris 6 and ENS Paris
Random forests
Random forests are a scheme proposed by Leo Breiman in the 2000s for building a predictor ensemble from a set of decision trees that grow in randomly selected subspaces of the data. Despite growing interest and practical use, there has been little exploration of the statistical properties of random forests, and little is known about the mathematical forces driving the algorithm. In this talk, we will discuss an in-depth analysis of a random forests model suggested by Breiman in 2004, which is very close to the original algorithm. We show in particular that the procedure is consistent and adapts to sparsity, in the sense that its rate of convergence depends only on the number of strong features and not on how many noise variables are present.
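
As a rough illustration of the scheme described above (trees grown on bootstrap samples in randomly selected feature subspaces, aggregated by majority vote), here is a minimal pure-Python sketch using decision stumps as the base trees. The function names and the toy setup are our own illustrative choices, not Breiman's algorithm or the model analyzed in the talk.

```python
import random
from collections import Counter

def fit_stump(X, y, feat_idx):
    """Fit a one-level decision tree, trying only the given features
    and choosing the (feature, threshold) split with fewest errors."""
    best = None
    for j in feat_idx:
        for t in sorted(set(x[j] for x in X)):
            left = [yi for x, yi in zip(X, y) if x[j] <= t]
            right = [yi for x, yi in zip(X, y) if x[j] > t]
            pred_l = Counter(left).most_common(1)[0][0]
            pred_r = Counter(right).most_common(1)[0][0] if right else pred_l
            preds = [pred_l] * len(left) + [pred_r] * len(right)
            err = sum(p != yi for p, yi in zip(preds, left + right))
            if best is None or err < best[0]:
                best = (err, j, t, pred_l, pred_r)
    _, j, t, pl, pr = best
    return lambda x: pl if x[j] <= t else pr

def fit_forest(X, y, n_trees=50, n_feats=2, seed=0):
    """Grow each tree on a bootstrap sample of the data, restricted to a
    randomly selected subspace of the features."""
    rng = random.Random(seed)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        Xb = [X[i] for i in idx]
        yb = [y[i] for i in idx]
        feats = rng.sample(range(len(X[0])), n_feats)
        trees.append(fit_stump(Xb, yb, feats))
    return trees

def forest_predict(trees, x):
    """Aggregate the individual trees by majority vote."""
    return Counter(tree(x) for tree in trees).most_common(1)[0][0]
```

In the spirit of the sparsity result mentioned in the abstract, one can try this on a toy data set where the label depends on a single strong feature while the remaining features are pure noise: the vote is dominated by the trees whose random subspace contains the strong feature.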

Anne-Claire Haury / Mines ParisTech and Institut Curie
Feature selection with random forests, application to gene network reconstruction

Monday March 12
13:30 - 14:30
Ecole normale supérieure / room W 

Gábor Lugosi / ICREA and Universitat Pompeu Fabra
Detection of correlations in a multivariate sample [link to the paper]

Thursday February 16
15:00 - 17:00
Ecole des Mines, room L218

NIPS debriefing
- Emilie Kaufmann: Improved algorithms for linear bandits, by Abbasi-Yadkori, Pál and Szepesvári
- Pierre Chiche: Distributed delayed stochastic optimization, by Agarwal and Duchi
- Toby Hocking: Crowdclustering, by Gomes, Welinder, Krause and Perona

Monday February 13
13:30 - 14:30
Ecole normale supérieure / room W 

Robin Ryder / Université Paris Dauphine
Learning the phylogeny of the diversification of natural languages by Bayesian methods

Monday January 16
13:30 - 14:30
Ecole normale supérieure / room W 

Jean-Baptiste Monnier / Université Paris Diderot 
Supervised binary classification under a margin assumption, and random-design regression via localized projections onto a multiresolution analysis

Thursday January 5
15:00 - 17:00
Ecole des Mines, room L106

Alexandre d'Aspremont (Ecole Polytechnique)
Semidefinite Programming with Applications in Geometry and Machine Learning
This tutorial will start with a very brief primer on semidefinite programming, followed by a discussion of some recent applications to geometric problems arising in statistics, graph theory, etc. A second part will then focus on applications of these techniques to performance measures for dictionary matrices in compressed sensing.

Monday November 14
13:30 - 14:30
Ecole normale supérieure / room *U or V*, level -2 under the main hall of the maths department

Stéphane Gaïffas / Université Pierre et Marie Curie
Oracle inequalities for high-dimensional matrix prediction, with applications to matrix completion

[Papers on which the talk is based: first, second]

Thursday November 3
15:00 - 17:00
Ecole des Mines, room L213

Francis Bach [INRIA / ENS]
Non-asymptotic analysis of stochastic approximation algorithms for machine learning

Monday October 10
13:30 - 14:30
Ecole normale supérieure / room W 

Robin Genuer [Université Bordeaux Segalen]
Analysis of the bias of purely random forests

Thursday September 29
16:00 - 18:00
Ecole des Mines, L108

ICML'11 debriefing, with talks by
- Sylvain Robbiano: Surrogate losses and regret bounds for cost-sensitive classification with example-dependent cost, by Scott
- Simon Lacoste-Julien: Support Vector Machines as Probabilistic Models, by Franc et al.
- Nicolas Le Roux: Bayesian Learning via Stochastic Gradient Langevin Dynamics, by Welling and Teh

Monday September 19
13:30 - 15:00
Ecole normale supérieure / room W 

ALT'11 briefing, with talks by
- Alexandra Carpentier: Upper-Confidence-Bound Algorithms for Active Learning in Multi-Armed Bandits
- Aurélien Garivier: On Upper-Confidence Bound Policies for Switching Bandit Problems
- Antoine Salomon: Deviations of Stochastic Bandit Regret
- Gilles Stoltz: Lipschitz Bandits without the Lipschitz Constant