Sparse learning
"We consider it a good principle to explain the phenomena by the simplest hypothesis possible" - Ptolemy
Learning sparse models: theory and applications
from system identification to neural networks
One-day workshop
Wednesday, 16 September, 2020
Organized by Sophie M. Fosson and Diego Regruto - Dipartimento di Automatica e Informatica, Politecnico di Torino, Italy
Supported by IEEE CSS Italy Chapter and IEEE CSS Technical Committee on System Identification and Adaptive Control (TC-SIAC)
In data-driven science, it is fundamental to extract the essential information from data, while avoiding redundancy, over-fitting, and unnecessarily high complexity. To this end, sparse optimization is used to learn parsimonious models: this is "sparse learning".
In recent years, the theory of sparse optimization has been developed within the signal processing community, in particular in the context of compressed sensing. Today, it is applied in several areas, ranging from linear and non-linear system identification to neural networks and deep learning.
The aim of this workshop is to bring together researchers from the signal processing, system identification, and machine learning communities to discuss new challenges in sparse learning, with particular attention to the application of recent theoretical results to real-world problems.
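To make the idea concrete, here is a minimal sketch of sparse learning as Lasso regression, min_x 0.5·||Ax − y||² + λ·||x||₁, solved by iterative soft-thresholding (ISTA). The matrix, signal, and parameter values below are purely illustrative assumptions of ours, not taken from any of the talks.

```python
def soft_threshold(v, t):
    """Proximal map of t*|.|: shrink v toward zero by t."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def ista(A, y, lam, step, iters):
    """Proximal-gradient (ISTA) iterations for the Lasso."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # Gradient of the least-squares term: A^T (A x - y).
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # A gradient step followed by soft-thresholding promotes sparsity.
        x = [soft_threshold(x[j] - step * g[j], step * lam) for j in range(n)]
    return x

# Underdetermined toy problem: 4 measurements, 5 unknowns, sparse truth.
A = [[1.0, 0.0, 0.0, 0.0, 0.0],
     [0.0, 1.0, 0.0, 0.0, 0.0],
     [0.0, 0.0, 1.0, 1.0, 0.0],
     [0.0, 0.0, 0.0, 1.0, 1.0]]
x_true = [3.0, 0.0, 0.0, -2.0, 0.0]
y = [sum(a * b for a, b in zip(row, x_true)) for row in A]

# step <= 1/||A^T A|| (largest eigenvalue is 3 here) ensures convergence.
x_hat = ista(A, y, lam=0.1, step=0.3, iters=5000)
```

Because the thresholding step sets small coefficients exactly to zero, the estimate is sparse: only the two truly active entries survive, up to the well-known shrinkage bias of the l1 penalty.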
Program:
- 9:00 a.m. "An introduction to sparse learning" by Sophie M. Fosson - Politecnico di Torino (Torino, Italy)
- 9:30 a.m. "On the sparse estimation approach to change point detection" by Cristian Rojas - KTH (Stockholm, Sweden)
- 10:15 a.m. "Incoherent measurement matrices in compressed sensing and applications to 5G communications" by Cristian Rusu - IIT (Genova, Italy)
- 11:10 a.m. "Iterative regularization: general convex case" by Cesare Molinari - IIT (Genova, Italy)
- 11:55 a.m. "Regularized and Distributionally Robust Data-Enabled Predictive Control" by Jeremy Coulson - ETH (Zürich, Switzerland)
- 2:15 p.m. "Incorporating a priori physical constraints in the sparse identification of nonlinear dynamical systems" by Jean-Christophe Loiseau - ENSAM (Paris, France)
- 3:00 p.m. "Learning hidden influences in sparse social networks" by Chiara Ravazzi - CNR-IEIIT (Torino, Italy)
- 3:45 p.m. "Adaptive-rate reconstruction of sparse time-varying signals: applications in compressive background subtraction" by João Mota - Heriot-Watt University (Edinburgh, UK)
- 4:40 p.m. "Learning sparse neural topologies with a structure" by Attilio Fiandrotti and Enzo Tartaglione - Università di Torino (Italy)
- 5:25 p.m. "Sparse Neural Networks and the Lottery Ticket Hypothesis" by Giulia Fracastoro - Politecnico di Torino (Italy)
Detailed information:
THANKS TO ALL THE ATTENDEES!
More than 100 people attended the workshop. We thank everyone for their interest in and enthusiasm for sparse learning. The workshop has been a good starting point for future discussions and collaborations. Stay tuned!