**Compressive Sensing Videos**

List of videos relevant to Compressive Sensing featured in Nuit Blanche.

Where is my popcorn?

Scalable Tensor Factorizations with Incomplete Data can be found here (or downloaded here, 197 MB); the videos are also here.
- An Overview of Compressed Sensing and Sparse Signal Recovery via L1 Minimization by Emmanuel Candes (part 2 of the tutorial is the Compressed Sensing presentation)
- Geometric Methods and Manifold Learning by Mikhail Belkin and Partha Niyogi
- Theory, Methods and Applications of Active Learning by Rui Castro and Robert Nowak.
- Sparse Representations from Inverse Problems to Pattern Recognition by Stéphane Mallat.
- Matrix Completion via Convex Optimization: Theory and Algorithms by Emmanuel Candes
- Learning Dictionaries for Image Analysis and Sensing by Guillermo Sapiro
- Vision and Hodge Theory by Stephen Smale (although there is no video yet).
- Multiscale Geometry and Harmonic Analysis of Data Bases by Ronald Coifman
- Nonlinear Dimension Reduction by Spectral Connectivity Analysis and Diffusion Coarse-Graining by Ann B Lee.
- What Do Unique Games, Structural Biology and the Low-Rank Matrix Completion Problem Have In Common by Amit Singer.
Svetlana Avramov-Zamurovic has a new video introducing Compressive Sensing, but it is in Serbian. The slides are here, whereas the video is here (the discussion turns to CS about 30 minutes into the video).

The Sparsity in Machine Learning meeting 2009 featured several presentations related to Compressive Sensing:
- Phase transitions phenomenon in Compressed Sensing by Jared Tanner
- Fast methods for sparse recovery: alternatives to L1 by Mike Davies
- Algorithmic Strategies for Non-convex Optimization in Sparse Learning by Tong Zhang
- Distilled Sensing: Active sensing for sparse recovery by Rui Castro
Svetlana Avramov-Zamurovic has some new videos of the Compressive Sensing tutorial she is giving at USNA.

*******

Compressive Sensing Workshop organized at Duke, February 25 & 26, 2009.

Autonomous Geometric Precision Error Estimation in Low-level Computer Vision Tasks, a work by Andres Corrada-Emmanuel and Howard Schultz, presented by John Paisley from Duke.

A short Introduction to Compressed Sensing by Emmanuel Candes, a video made at ITA 08, can be watched at scivee.tv.

"Sparse Representations: From Source Separation to Compressed Sensing" in a video (ram) and in an audio-only format. The accompanying slides are here.
- Talk 1. Computational Imaging (ram)
- Talk 2. Geometric Optics and Tomography (ram)
- Talk 3. Diffraction and Optical Elements (ram)
- Talk 4. Lecture Holography (ram)
- Talk 5. Lenses, imaging and MTF (ram)
- Talk 6. Wavefront coding and the impulse response (ram)
- Talk 7. Interferometry and the van Cittert-Zernike Theorem (ram)
- Talk 8. OCT and spatial/spectral and temporal degrees of freedom (ram)
- Talk 9. Spectroscopy and Spectral Imaging (ram)
- Talk 10. Coded aperture spectroscopy and spectral tomography (ram)
- An introduction to transform coding, Lecture 1 Slides (pdf)
- Compressive sensing for time signals: Analog to information conversion, Lecture 2 Slides (pdf), Talks(A/V) (ram)
- Compressive sensing for detection and classification problems, Lecture 3 Slides (pdf), Talks(A/V) (ram)
- Multi-signal, distributed compressive sensing, Lecture 4 Slides (pdf), Talks(A/V) (ram)
- Compressive imaging with a single pixel camera, Lecture 5 Slides (pdf), Talks(A/V) (ram)
- Sparsity, Talks(A/V) (ram)
- After a rapid and glossy introduction to compressive sampling (or compressed sensing, as it is also called), the lecture will introduce sparsity as a key modeling tool and review the crucial role played by sparsity in various areas such as data compression, statistical estimation and scientific computing.
- Sparsity and the l1 norm, Talks(A/V) (ram)
- In many applications, one often has fewer equations than unknowns. While this seems hopeless, we will show that the premise that the object we wish to recover is sparse or compressible radically changes the problem, making the search for solutions feasible. This lecture discusses the importance of the l1-norm as a sparsity promoting functional and will go through a series of examples touching on many areas of data processing.
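The idea in this abstract can be sketched numerically: minimizing the l1 norm subject to underdetermined linear constraints is a linear program, solvable with an off-the-shelf solver. The dimensions, seed, and variable names below are illustrative, not taken from the lecture.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 50, 20, 3                      # unknowns, equations, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # fewer equations than unknowns
b = A @ x_true

# Basis pursuit: min ||x||_1 subject to Ax = b, rewritten as a linear
# program with x = u - v, u >= 0, v >= 0, so ||x||_1 = sum(u) + sum(v).
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b, bounds=(0, None))
x_hat = res.x[:n] - res.x[n:]
print(np.linalg.norm(x_hat - x_true))    # typically near zero for sparse x_true
```

With only 20 equations for 50 unknowns the system is hopelessly underdetermined in general, but the l1 objective singles out the sparse solution, which is exactly the point of the lecture.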
- Compressive sampling: sparsity and incoherence , Talks(A/V) (ram)
- Compressed sensing essentially relies on two tenets: the first is that the object we wish to recover is compressible in the sense that it has a sparse expansion in a set of basis functions; the second is that the measurements we make (the sensing waveforms) must be incoherent with these basis functions. This lecture will introduce key results in the field, such as a new kind of sampling theorem which states that one can sample a spectrally sparse signal at a rate close to the information rate, and this without information loss.
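A tiny illustration of the incoherence tenet (a sketch, not from the lecture): the spike basis and the Fourier basis form a maximally incoherent pair, which is why Fourier-domain measurements work well for signals that are sparse in time.

```python
import numpy as np

n = 64
F = np.fft.fft(np.eye(n)) / np.sqrt(n)   # orthonormal Fourier (sinusoid) basis
spikes = np.eye(n)                        # spike (canonical) basis
# Mutual coherence: mu = sqrt(n) * max_{i,j} |<spike_i, fourier_j>|.
# Every inner product has magnitude 1/sqrt(n), so mu = 1, the minimum possible.
mu = np.sqrt(n) * np.max(np.abs(spikes @ F))
print(mu)
```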
- The uniform uncertainty principle, Talks(A/V) (ram)
- We introduce a strong form of uncertainty relation and discuss its fundamental role in the theory of compressive sampling. We give examples of random sensing matrices obeying this strong uncertainty principle; e.g. Gaussian matrices.
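A quick empirical check of this concentration phenomenon (an illustrative sketch; the dimensions are arbitrary): a column-normalized Gaussian matrix nearly preserves the norm of random sparse vectors, which is the content of the strong uncertainty principle for such ensembles.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k, trials = 200, 80, 5, 1000
A = rng.standard_normal((m, n)) / np.sqrt(m)   # normalized Gaussian sensing matrix

ratios = []
for _ in range(trials):
    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    ratios.append(np.linalg.norm(A @ x) / np.linalg.norm(x))
ratios = np.array(ratios)
print(ratios.min(), ratios.max())   # both concentrate near 1
```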
- The role of probability in compressive sampling, Talks(A/V) (ram)
- This lecture will discuss the crucial role played by probability in compressive sampling; we will discuss techniques for obtaining nonasymptotic results about extremal eigenvalues of random matrices. Of special interest is the role played by high-dimensional convex geometry and techniques from geometric functional analysis such as Rudelson's selection lemma, as well as powerful results in the probabilistic theory of Banach spaces such as Talagrand's concentration inequality.
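The flavor of such nonasymptotic results can be seen numerically (an illustrative sketch): for an m x n matrix with i.i.d. N(0, 1/m) entries, the extremal singular values concentrate near 1 plus or minus sqrt(n/m).

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 2000, 200
A = rng.standard_normal((m, n)) / np.sqrt(m)
s = np.linalg.svd(A, compute_uv=False)        # singular values of A
lo, hi = 1 - np.sqrt(n / m), 1 + np.sqrt(n / m)
print(s.min(), s.max(), (lo, hi))             # extremes hug the predicted edges
```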
- Robust compressive sampling and connections with statistics, Talks(A/V) (ram)
- We show that compressive sampling is, perhaps surprisingly, robust vis-à-vis modeling and measurement errors.
- Robust compressive sampling and connections with statistics (continued) Talks(A/V) (ram)
- We show that accurate estimation from noisy undersampled data is sometimes possible and connect our results with a large literature in statistics concerned with high dimensionality; that is, situations in which the number of observations is less than the number of parameters.
- Connections with information and coding theory Talks(A/V) (ram)
- We morph compressive sampling into an error correcting code, and explore the implications of this sampling theory for lossy compression and some of its relationship with universal source coding.
- Modern convex optimization Talks(A/V) (ram)
- We will survey the literature on interior point methods which are very efficient numerical algorithms for solving large scale convex optimization problems.
- Applications, experiments and open problems Talks(A/V) (ram)
- We discuss several applications of compressive sampling in the area of analog-to-digital conversion and biomedical imaging and review some numerical experiments in new directions. We conclude by exposing the participants to some important open problems.
- Signal encoding, Lecture 1 Slides (pdf), Talks(A/V) (ram)
- Shannon-Nyquist Theory, Pulse Code Modulation, Sigma-Delta Modulation, Kolmogorov entropy, optimal encoding.
- Compression, Lecture 2 Slides (pdf), Talks(A/V) (ram)
- Best k-term approximation for bases and dictionaries, decay rates, approximation classes, application to image compression via wavelet decompositions.
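Best k-term approximation is easy to state concretely (an illustrative sketch): keep the k largest-magnitude coefficients and zero the rest; for coefficients with power-law decay the approximation error falls off at a matching rate.

```python
import numpy as np

def best_k_term(x, k):
    """Keep the k largest-magnitude coefficients; zero the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

# A compressible coefficient sequence decaying like i^(-1.5)
x = np.arange(1, 101) ** -1.5
for k in (5, 10, 20):
    print(k, np.linalg.norm(x - best_k_term(x, k)))   # error shrinks as k grows
```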
- Discrete compressed sensing, Lecture 3 Slides (pdf), Talks(A/V) (ram)
- The problem, best matrices for classes, Gelfand widths and their connection to compressed sensing.
- The restricted isometry property (RIP), Talks(A/V) (ram)
- Performance of compressed sensing under RIP.
- Construction of CS matrices with best RIP, Talks(A/V) (ram)
- Bernoulli and Gaussian random variables.
- Performance of CS matrices revisited, Lecture 6 Slides (pdf), Talks(A/V) (ram)
- Proofs of the Kashin-Gluskin theorems.
- Performance in probability, Lecture 7 Slides (pdf), Talks(A/V) (ram)
- Examples of performance for Gaussian and Bernoulli ensembles.
- Decoders, Lecture 8 Slides (pdf), Talks(A/V) (ram)
- l1 minimization, greedy algorithms, iterated least squares.
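Of the decoders listed, greedy algorithms are the simplest to sketch. Below is a minimal Orthogonal Matching Pursuit, assuming the sparsity level k is known; the dimensions and names are illustrative, not from the lecture.

```python
import numpy as np

def omp(A, b, k):
    """Orthogonal Matching Pursuit: greedily add the column most correlated
    with the residual, then re-fit on the chosen support by least squares."""
    m, n = A.shape
    support, r = [], b.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ r)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        r = b - A[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x

rng = np.random.default_rng(2)
n, m, k = 100, 40, 4
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x_hat = omp(A, A @ x_true, k)
print(np.linalg.norm(x_hat - x_true))   # near zero when the support is found
```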
- Performance of iterated least squares, Paper (pdf), Talks(A/V) (ram)
- Convergence and exponential convergence.
- Deterministic constructions of CS Matrices, Lecture 10 Slides (pdf), Talks(A/V) (ram)
- Constructions from finite fields, circulant matrices.
- Algorithms for Compressed Sensing I, Slides (pdf), Talks(A/V) (ram)
- What algorithmic problem do we mean by Compressed Sensing? There are a variety of alternatives, each with different algorithmic solutions (both theoretical and practical). I will discuss some of the different types of results from the combinatorial to the probabilistic.
- Algorithms for Compressed Sensing II, Lecture notes (pdf), Talks(A/V) (ram)
- What do these algorithms all have in common? What are the common goals of the problems and how do they achieve them? I will discuss several known techniques and open problems.
- Welcome and introduction, Talks(A/V) (ram)
- Introduction to MRI, Slides (pdf), Slides (ppt), Talks(A/V) (ram)
Short presentations by participants (A/V) (ram):
- Short presentation by participants 1 (ram)
- Short presentation by participants 2 (ram)
- Short presentation by participants 3 (ram)
- Short presentation by participants 4 (ram)
- Short presentation by participants 5 (ram)
Video on Integration of Sensing and Processing (December 05-09, 2005) by Robert Nowak on Active learning vs. compressed sensing. Richard Baraniuk has done two talks: one at Rice and the other at IMA.