2-Day iTWIST'18 Doctoral School

CIRM, Marseille, France. 19 - 20 November, 2018

News:

  • 06/12/2019: The 2020 edition of iTWIST will take place in Nantes, France from June 24th to 26th, 2020. A 2-day doctoral school will precede the workshop, on Monday 22nd and Tuesday 23rd June, 2020. More information can be found on the iTWIST 2020 website: https://itwist20.ls2n.fr/


  • 21/8/18: Detailed doctoral school program is now online (see below) (update 13/9/18: schedule added).
  • 22/6/18: Application form is closed.
  • 23/4/18: Opening of the application form.
  • 15/1/18: iTWIST'18 doctoral school "Call for participants" is now available.

For this edition, iTWIST'18 will organize a 2-day doctoral school preceding the workshop, on Monday 19th and Tuesday 20th November, 2018 at CIRM, Marseille, France.

The objective of the doctoral school is to give PhD students and postdoctoral fellows the opportunity to learn some of the theoretical and applied concepts that will be discussed in the workshop. The school consists of four courses, taught in English, held just before the workshop. Note that the doctoral school is also open to interested master's students.

Doctoral school program

“Flavors of compressive sensing” (3h)

by Simon Foucart (Texas A&M University, USA)

Abstract: Compressive Sensing has attracted a lot of attention in science and engineering, because it revealed the theoretical possibility of acquiring structured high-dimensional objects using much less information than previously expected, and more importantly because it provided practical procedures to do so. The foundations of the field rely on an elegant mathematical theory that has matured over the past few years. The goal of these two lectures is to highlight the main aspects of this theory.

In particular, the following items will be discussed:

    • the exact reconstruction of sparse vectors;
    • the stable and robust reconstruction of nearly sparse vectors acquired with measurement errors;
    • the basic reconstruction algorithms (convex programming, greedy algorithms, thresholding-based strategies);
    • the restricted isometry property;
    • random measurement matrices;
    • optimality of uniform recovery results.
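
As a toy illustration of one of the greedy algorithms mentioned above, here is a minimal NumPy sketch of orthogonal matching pursuit applied to a random Gaussian sensing matrix. The problem sizes and the ±1 values of the nonzero entries are illustrative choices for this page, not material taken from the course.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy setup: recover an s-sparse vector x in R^n from m < n linear measurements y = A x,
    # with a Gaussian measurement matrix (a classical random construction related to the RIP).
    n, m, s = 256, 100, 5
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    support = rng.choice(n, size=s, replace=False)
    x_true[support] = rng.choice([-1.0, 1.0], size=s)
    y = A @ x_true

    # Orthogonal matching pursuit: greedily select the column most correlated with the
    # current residual, then re-fit the selected columns by least squares.
    def omp(A, y, s):
        residual = y.copy()
        selected = []
        for _ in range(s):
            j = int(np.argmax(np.abs(A.T @ residual)))
            selected.append(j)
            coef, *_ = np.linalg.lstsq(A[:, selected], y, rcond=None)
            residual = y - A[:, selected] @ coef
        x = np.zeros(A.shape[1])
        x[selected] = coef
        return x

    x_hat = omp(A, y, s)
    print("relative reconstruction error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))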

“Computational Imaging with Convex and Non-Convex Optimization” (4h)

by Ulugbek Kamilov (Washington University in St. Louis, USA)

Covered topics:

    • proximal algorithms in imaging (nonconvex, convex, and strongly convex)
    • stochastic optimization (SGD, stochastic extensions of proximal algorithms)
    • model-based imaging with plug-and-play priors
    • applications to tomographic microscopy
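
To give a flavour of the proximal algorithms listed above, the following sketch applies the proximal gradient method (ISTA) with an ℓ1 prior, whose proximal operator is soft-thresholding, to a small synthetic linear inverse problem. The regularization weight, problem sizes and iteration count are arbitrary toy choices.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy linear inverse problem y = A x + noise with a sparsity-promoting l1 prior,
    # solved by the proximal gradient method (ISTA):
    #   x <- prox_{lam * t * ||.||_1}( x - t * A^T (A x - y) )
    m, n = 100, 256
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=10, replace=False)] = rng.standard_normal(10)
    y = A @ x_true + 0.01 * rng.standard_normal(m)

    lam = 0.05                               # regularization weight (illustrative value)
    t = 1.0 / np.linalg.norm(A, 2) ** 2      # step size 1/L, L = Lipschitz constant of the data-fit gradient

    def soft_threshold(z, tau):
        # Proximal operator of tau * ||.||_1 (soft-thresholding).
        return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

    x = np.zeros(n)
    for _ in range(500):
        x = soft_threshold(x - t * A.T @ (A @ x - y), lam * t)

    # The remaining error reflects both the noise and the shrinkage bias of the l1 penalty.
    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))

Accelerated and stochastic variants of this iteration, as well as plug-and-play schemes where the proximal step is replaced by a denoiser, are among the topics of the course.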

“Optimization strategies for fast inverse problems under sparsity constraints (with some applications in neuroimaging)” (4h)

by Alexandre Gramfort (Inria, Université Paris-Saclay, France)

In this course, Alexandre Gramfort will:

    • Motivate the use of sparse regularizations in the context of neuroscience applications;
    • Review some results on coordinate descent methods, starting from better-known iterative algorithms such as proximal or projected gradient descent;
    • Cover the dual formulation underlying Lasso-type solvers and explain how it can be used to control optimality and to derive accelerations with screening rules and working-set methods;
    • Show how such methods can be efficiently implemented in Python using Numba or Cython (as is done in the scikit-learn software).
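
As a small preview of the coordinate descent material, here is a plain-NumPy sketch of cyclic coordinate descent for the Lasso, checked against scikit-learn's own coordinate-descent solver. The data, regularization strength and iteration count are arbitrary toy choices.

    import numpy as np
    from sklearn.linear_model import Lasso   # scikit-learn's coordinate-descent Lasso solver

    rng = np.random.default_rng(2)
    n_samples, n_features = 60, 100
    X = rng.standard_normal((n_samples, n_features))
    w_true = np.zeros(n_features)
    w_true[:5] = rng.standard_normal(5)
    y = X @ w_true + 0.01 * rng.standard_normal(n_samples)
    alpha = 0.1                              # regularization strength (illustrative value)

    def lasso_cd(X, y, alpha, n_iter=200):
        # Cyclic coordinate descent for min_w  1/(2 n) ||y - X w||^2 + alpha ||w||_1
        # (the same objective scaling as sklearn.linear_model.Lasso).
        n, p = X.shape
        w = np.zeros(p)
        residual = y.copy()                  # residual for w = 0
        col_sq = (X ** 2).sum(axis=0)
        for _ in range(n_iter):
            for j in range(p):
                if col_sq[j] == 0.0:
                    continue
                residual += X[:, j] * w[j]   # remove feature j's current contribution
                rho = X[:, j] @ residual / n
                w[j] = np.sign(rho) * max(abs(rho) - alpha, 0.0) / (col_sq[j] / n)
                residual -= X[:, j] * w[j]   # put it back with the updated weight
        return w

    w_cd = lasso_cd(X, y, alpha)
    w_sk = Lasso(alpha=alpha, fit_intercept=False).fit(X, y).coef_
    print("max |difference| vs scikit-learn:", np.max(np.abs(w_cd - w_sk)))

A loop-heavy function like lasso_cd is exactly the kind of code that benefits from Numba's just-in-time compilation or a Cython rewrite, which is the implementation aspect covered in the last bullet.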

“Quantized compressed sensing and related data embeddings” (3h)

by Laurent Jacques (UCLouvain, Belgium)

In this course we will explore the interplay of compressive sensing theory, as introduced by Simon Foucart in this doctoral school, with the quantization that is unavoidable in any sensing procedure, that is, the analog-to-digital conversion performed in actual sensing devices so that recorded data can be efficiently transmitted, stored or processed. This interaction leads to interesting mathematical questions in high-dimensional geometry, for instance the study of certain embedding properties of (1-bit) quantized random projections, i.e., the preservation of pairwise vector distances in the quantized, projected domain up to controllable distortions.

In particular, this course will cover the following aspects:

    • Early attempts to combine CS and quantization;
    • Principles of memoryless scalar quantization: 1-bit and multi-bit;
    • Consistent reconstruction methods, in quantization theory and in quantized compressive sensing;
    • 1-bit Compressive Sensing: compatible sensing matrices, reconstruction algorithms and guarantees;
    • Multi-bit Quantized Compressive Sensing and embeddings: the benefit of dithering;
    • Overview of other quantization methods, e.g., noise shaping quantization and Sigma-Delta QCS.
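
As a toy illustration of the 1-bit setting, here is a short NumPy sketch of binary iterative hard thresholding (BIHT), one reconstruction algorithm for 1-bit compressive sensing. The problem sizes, ±1 nonzero values and step size are heuristic choices made for this example, not recommendations from the course.

    import numpy as np

    rng = np.random.default_rng(3)

    # 1-bit compressive sensing: only the signs y = sign(A x) are observed, so x can be
    # recovered only up to its norm; we therefore work with a unit-norm sparse vector.
    n, m, s = 100, 400, 5
    A = rng.standard_normal((m, n))
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=s, replace=False)] = rng.choice([-1.0, 1.0], size=s)
    x_true /= np.linalg.norm(x_true)
    y = np.sign(A @ x_true)

    def hard_threshold(z, s):
        # Keep the s largest entries of z in magnitude, zero out the rest.
        out = np.zeros_like(z)
        idx = np.argpartition(np.abs(z), -s)[-s:]
        out[idx] = z[idx]
        return out

    # BIHT: gradient-like step on the sign consistency, followed by hard thresholding
    # and re-projection onto the unit sphere.
    x = np.zeros(n)
    tau = 1.0 / m                            # step size (heuristic choice)
    for _ in range(100):
        x = hard_threshold(x + tau * A.T @ (y - np.sign(A @ x)), s)
        norm = np.linalg.norm(x)
        if norm > 0:
            x /= norm

    angle = np.degrees(np.arccos(np.clip(x @ x_true, -1.0, 1.0)))
    print("angle between estimate and ground truth (degrees):", angle)

Dithering (adding a known random offset before quantization), mentioned in the list above, is one way to restore the amplitude information that plain 1-bit measurements discard.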

Important: required numerical tools for the doctoral school:

  • The course on “Optimization strategies for fast inverse problems under sparsity constraints (with some applications in neuroimaging)” by A. Gramfort will be illustrated with examples and code developed in Python 3.6. It is therefore very important that you bring a working laptop on which you have installed the following software:
    • a working Python 3.6 installation (e.g., through the multi-platform Anaconda Python distribution);
    • the following Python packages (normally installed by Anaconda, but please verify this): numpy, scipy, scikit-learn, jupyter, matplotlib, and ideally also numba.
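
The short snippet below (not part of the organizers' instructions) can be used to check that these packages are importable in your environment; note that scikit-learn is imported under the module name sklearn.

    # Quick environment check for the packages requested above (run with Python 3.6).
    import importlib

    for name in ["numpy", "scipy", "sklearn", "jupyter", "matplotlib", "numba"]:
        try:
            module = importlib.import_module(name)
            print(f"{name:10s} OK   (version {getattr(module, '__version__', 'unknown')})")
        except ImportError:
            print(f"{name:10s} MISSING")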

The application form is now closed.