Session of 26 November 2012

Monday, 26 November 2012

Organizers: Liliane Bel and Vincent Rivoirard

14h00 Guillem Rigaill (INRA - Université Evry)

Title: Exact posterior distributions and model selection criteria for multiple change-point detection problems

(joint work with E. Lebarbier and S. Robin)

Abstract: In segmentation problems, inference on change-point positions and model selection are two difficult issues due to the discrete nature of change-points. In a Bayesian context, we derive exact, explicit and tractable formulae for the posterior distribution of quantities such as the number of change-points or their positions. We also demonstrate that several classical Bayesian model selection criteria can be computed exactly. All these results rest on an efficient strategy for exploring the whole segmentation space, which is very large. We illustrate the methodology on both simulated data and a comparative genomic hybridization profile.
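The exact summation over the segmentation space can be sketched on a toy model. The code below is only an illustration under assumptions not stated in the abstract (i.i.d. Gaussian segments with known variance, a conjugate N(0, tau2) prior on each segment mean, and a factorized per-segment prior penalty of 1/n^2), not the authors' implementation: it computes the exact posterior over the number of segments by dynamic programming in O(K_max n^2).

```python
import numpy as np

def log_segment_marginal(y, i, j, sigma2=1.0, tau2=25.0):
    """Log marginal likelihood of y[i:j] as one segment: y_t ~ N(mu, sigma2)
    i.i.d., with mu ~ N(0, tau2) integrated out (conjugate, closed form)."""
    seg = y[i:j]
    L = len(seg)
    S, Q = seg.sum(), (seg ** 2).sum()
    A = L / sigma2 + 1.0 / tau2          # posterior precision of mu
    B = S / sigma2
    return (-0.5 * L * np.log(2 * np.pi * sigma2)
            - 0.5 * np.log(tau2 * A)
            + B ** 2 / (2 * A)
            - Q / (2 * sigma2))

def posterior_number_of_segments(y, K_max):
    """Exact posterior P(K | y) for K = 1..K_max segments, obtained by
    summing over every segmentation with a dynamic programme, O(K_max n^2).
    The prior factorizes over segments; each segment costs a factor 1/n^2,
    an illustrative choice that penalizes over-segmentation."""
    n = len(y)
    log_c = -2.0 * np.log(n)             # per-segment prior factor
    # log_w[i, j] = log weight of a single segment covering y[i:j]
    log_w = np.full((n + 1, n + 1), -np.inf)
    for i in range(n):
        for j in range(i + 1, n + 1):
            log_w[i, j] = log_c + log_segment_marginal(y, i, j)
    # F[k, j] = log of the summed weight of all segmentations of y[:j]
    # into exactly k segments
    F = np.full((K_max + 1, n + 1), -np.inf)
    F[0, 0] = 0.0
    for k in range(1, K_max + 1):
        for j in range(k, n + 1):
            # the last segment is y[i:j]; sum over its start i (log-sum-exp)
            terms = F[k - 1, k - 1:j] + log_w[k - 1:j, j]
            m = terms.max()
            F[k, j] = m + np.log(np.exp(terms - m).sum())
    post = np.exp(F[1:, n] - F[1:, n].max())
    return post / post.sum()
```

On a toy series with one clear change, the posterior concentrates on K = 2; the same tables F also yield exact posteriors on change-point positions, as in the abstract.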

15h00 Gabriel Peyré (CNRS - Université Paris-Dauphine)

Title: Robust Sparse Analysis Regularization

(joint work with S. Vaiter, C. Dossal, J. Fadili)

Abstract: In this talk I will detail several key properties of L1-analysis regularization for the resolution of linear inverse problems [5,6]. With the notable exception of [1,3,7], most previous theoretical works consider sparse synthesis priors, where sparsity is measured as the L1 norm of the coefficients that synthesize the signal in a given dictionary; see for instance [2,4]. In contrast, the more general analysis regularization minimizes the L1 norm of the correlations between the signal and the atoms of the dictionary. The corresponding variational problem includes several well-known regularizations, such as the discrete total variation, the fused lasso, and sparse correlation with translation-invariant wavelets. I will first study the variations of the solution with respect to the observations and the regularization parameter, which enables the computation of the degrees-of-freedom estimator [6] (a result also obtained independently in [8]). I will then give a sufficient condition ensuring that a signal is the unique solution of the analysis regularization when there is no noise in the observations [5]. The same criterion ensures the robustness of the sparse analysis solution to a small noise in the observations. Lastly, I will define a stronger condition that ensures robustness to an arbitrary bounded noise. In the special case of synthesis regularization, our contributions recover already-known results [2,4], which are hence generalized to the analysis setting. I will illustrate these theoretical results on practical examples to study the robustness of the total variation, fused lasso, and translation-invariant wavelet regularizations.
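To make the analysis prior concrete: the discrete total variation mentioned in the abstract corresponds to taking the analysis operator D to be finite differences. The following sketch (an illustration, not the speaker's code) solves the 1-D TV denoising problem min_x 0.5*||y - x||^2 + lam*||Dx||_1 by projected gradient on its dual, a standard approach; the step size and iteration count are ad hoc choices.

```python
import numpy as np

def D(x):
    """Analysis operator: forward finite differences."""
    return np.diff(x)

def Dt(z):
    """Adjoint of D."""
    return np.concatenate(([-z[0]], z[:-1] - z[1:], [z[-1]]))

def tv_denoise(y, lam, n_iter=2000):
    """1-D total variation denoising:
         min_x 0.5*||y - x||^2 + lam*||D x||_1,
    solved via the dual problem
         min_{||z||_inf <= lam} 0.5*||y - Dt(z)||^2,   x = y - Dt(z),
    by projected gradient (Chambolle-style)."""
    z = np.zeros(len(y) - 1)
    t = 0.25                   # step size <= 1 / ||D Dt|| = 1/4
    for _ in range(n_iter):
        # gradient step on the dual objective, then project onto the box
        z = np.clip(z + t * D(y - Dt(z)), -lam, lam)
    return y - Dt(z)
```

Because the dual constraint set is the box ||z||_inf <= lam, each iteration is a gradient step followed by a clip; the recovered x is close to piecewise constant, i.e. Dx is sparse, which is exactly the structure that analysis regularization promotes.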

16h00 Thierry Dumont (ID Services - Université Paris Sud)

Title: Nonparametric estimation in hidden Markov models.

Abstract: In this talk, we consider a hidden Markov model arising from work on indoor geolocation. In this model, the data Y1,...,Yn are noisy observations of f(X1),...,f(Xn), where X is an unobserved Markov chain and f is an unknown function. The goal of the talk is the nonparametric estimation of this unknown function. We will dwell in particular on the identifiability of the model, and will then study an estimator of f built solely from the observations Y1,...,Yn.
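The observation model described above is easy to simulate, which helps fix ideas. The sketch below only illustrates the setting: the random-walk chain on [0, 1], the sine function standing in for the unknown f, and the noise level are all assumptions for the example, not details from the talk.

```python
import numpy as np

def simulate_hmm(n, rng):
    """Simulate the model of the talk: a hidden Markov chain X (here a toy
    clipped random walk on [0, 1] -- the motivating chain is an indoor
    position process), an unknown function f, and noisy observations
    Y_i = f(X_i) + eps_i. Only the Y_i would be available to the statistician."""
    f = lambda x: np.sin(2 * np.pi * x)   # stands in for the unknown f
    X = np.empty(n)
    X[0] = rng.uniform()
    for i in range(1, n):
        # Markov dynamics: Gaussian step, clipped to stay in [0, 1]
        X[i] = np.clip(X[i - 1] + rng.normal(0.0, 0.05), 0.0, 1.0)
    Y = f(X) + rng.normal(0.0, 0.1, n)    # noisy observations of f(X)
    return X, Y
```

An estimator of f would then have to be built from Y alone, which is precisely what makes the identifiability question raised in the talk nontrivial.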

***************************************************************************************************************

Bibliography for Gabriel Peyré's talk:

[1] E. Candes, Y.C. Eldar, D. Needell, and P. Randall. Compressed sensing with coherent and redundant dictionaries. Applied and Computational Harmonic Analysis, 31(1):59–73, 2010.

[2] J.J. Fuchs. On sparse representations in arbitrary redundant bases. IEEE Transactions on Information Theory, 50(6):1341–1344, 2004.

[3] S. Nam, M.E. Davies, M. Elad, and R. Gribonval. The cosparse analysis model and algorithms. Preprint arXiv:1106.4987, 2011.


[4] J.A. Tropp. Just relax: Convex programming methods for identifying sparse signals in noise. IEEE Transactions on Information Theory, 52 (3):1030–1051, 2006.

[5] S. Vaiter, G. Peyré, C. Dossal, and J. Fadili. Robust sparse analysis regularization. Preprint arXiv:1109.6222v1, 2011.

[6] S. Vaiter, C. Deledalle, G. Peyré, J. Fadili, and C. Dossal. Local behavior of sparse analysis regularization: Applications to risk estimation. Preprint arXiv:1204.3212, 2012.

[7] M. Grasmair. Linear convergence rates for Tikhonov regularization with positively homogeneous functionals. Inverse Problems, 27(7):075014, 2011.

[8] R. Tibshirani and J. Taylor. Degrees of freedom in lasso problems. Annals of Statistics, 40(2):1198–1232, 2012.