Groups of 3-4 members; no group may have fewer than 3.
Every group member should clearly state their role in the project.
October 8: Project Summary due
October 29: Spotlight Presentations
December 1-3: Final Presentations
End of RRR Week: Final Project Report due
Here are a few tentative topics, along with some resources. Students may instead choose their own project, provided it has a substantial connection to information theory; in particular, your own research can serve as the class project if it has such ties.
Sparse Graph Codes in various applications (some parts will be covered in lecture):
Speeding Up Machine Learning: Speeding Up Distributed Machine Learning Using Codes; Kangwook Lee et al.
Private information retrieval and private stream search (cryptography applications): Efficient Private Information Retrieval Over Unsynchronized Databases; G. Fanti and K. Ramchandran. Private Stream Search at the same communication cost as a regular search: Role of LDPC codes; M. Finiasz and K. Ramchandran.
Fast and efficient compressed sensing based on sparse-graph codes for wavelet-based sparsity (MRI applications): Fast and Efficient Sparse 2D Discrete Fourier Transform using Sparse-Graph Codes; F. Ong et al. Sparse MRI: The application of compressed sensing for rapid MR imaging; M. Lustig et al.
Codes for approximate learning algorithms (such as matrix completion and other "randomized linear algebra" applications): Low Rank Approximation using Error Correcting Coding Matrices; S. Ubaru et al. Relative-Error CUR Matrix Decompositions; P. Drineas et al.
Codes for compressed phase retrieval (optics applications): PhaseCode: Fast and Efficient Compressive Phase Retrieval based on Sparse-Graph-Codes; R. Pedarsani et al.
Mentors: Amirali Aghazadeh, Swanand Kadhe
Applications of Sparse Graph Codes to huge datasets (with applications in biology):
Resource: A. Aghazadeh, Orhan Ocal, and Kannan Ramchandran, CRISPRLand: Interpretable Large-Scale Inference of DNA Repair Landscape Based on a Spectral Approach, Intelligent Systems for Molecular Biology (ISMB), July 2020.
SPRIGHT: A Fast and Robust Framework for Sparse Walsh-Hadamard Transform; X. Li et al. (a sketch of the standard dense transform follows this topic)
Mentor: Amirali Aghazadeh
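As a point of reference for the topic above: SPRIGHT recovers a sparse Walsh-Hadamard spectrum with sub-linear samples and time, whereas the standard dense transform costs O(N log N). Below is a minimal sketch of that dense baseline (the function name is our own, not code from the paper):

```python
def fwht(a):
    """In-place fast Walsh-Hadamard transform (unnormalized).
    `a` is a mutable sequence of numbers; len(a) must be a power of two.
    Cost is O(N log N), the baseline that sparse methods improve on."""
    n = len(a)
    assert n and n & (n - 1) == 0, "length must be a power of two"
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                # butterfly: (x, y) -> (x + y, x - y)
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
        h *= 2
    return a
```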
Polar Codes: theory and applications (a toy encoder sketch follows this topic):
Resource: Efficient design and decoding of polar codes; P. Trifonov
How to construct polar codes; I. Tal and A. Vardy
Performance of polar codes for channel and source coding; N. Hussami et al.
Straggler Resilient Serverless Computing Based on Polar Codes; B. Bartan and M. Pilanci
Mentors: Swanand Kadhe, Vipul Gupta
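To make the encoding side of the topic above concrete, here is a hedged toy sketch of Arikan's polar transform x = u F^{⊗n} over GF(2), with F = [[1, 0], [1, 1]]; frozen-bit selection and successive-cancellation decoding (the substance of the references above) are omitted:

```python
import numpy as np

def polar_transform(u):
    """Toy polar encoder: returns x = u F^{(tensor) n} over GF(2),
    where F = [[1, 0], [1, 1]] and len(u) is a power of two.
    No frozen bits, code construction, or bit-reversal permutation."""
    x = np.asarray(u, dtype=int) % 2   # copy into a GF(2) vector
    n = len(x)
    assert n and n & (n - 1) == 0, "length must be a power of two"
    step = 1
    while step < n:
        for i in range(0, n, 2 * step):
            # butterfly over GF(2): left half += right half (mod 2)
            x[i:i + step] = (x[i:i + step] + x[i + step:i + 2 * step]) % 2
        step *= 2
    return x

print(polar_transform([1, 0, 1, 1]))   # -> [1 1 0 1]
```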
Coded Computing in Serverless Systems (a toy straggler-mitigation example follows this topic):
Resource: OverSketched Newton: Fast Convex Optimization for Serverless Systems; V. Gupta et al.
Serverless Straggler Mitigation using Local Error-Correcting Codes; V. Gupta et al.
Mentor: Vipul Gupta
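To make the straggler-mitigation idea in the papers above concrete, here is a hedged toy example (ours, not the papers' actual schemes): split A into two row blocks, add one parity block, and A @ x becomes recoverable from any two of the three worker outputs:

```python
import numpy as np

# Toy coded matrix-vector multiplication with one redundant worker.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
x = rng.standard_normal(3)

A1, A2 = A[:2], A[2:]
tasks = [A1, A2, A1 + A2]          # three workers; the third is parity
results = [T @ x for T in tasks]   # in a serverless system these run remotely

# Suppose worker 0 straggles: its share is decodable from the other two.
y_top = results[2] - results[1]    # (A1 + A2) @ x - A2 @ x = A1 @ x
y = np.concatenate([y_top, results[1]])
assert np.allclose(y, A @ x)       # full product recovered without worker 0
```

The references above develop far more scalable versions of this idea.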
Further Investigation into GANs (the original minimax objective is recalled after this topic):
Resource: The original Goodfellow paper: https://arxiv.org/abs/1406.2661
Wasserstein GAN: https://arxiv.org/abs/1701.07875
A survey paper on GANs: https://arxiv.org/abs/1906.01529
Mentors: Kuan-Yun Lee, Nikita Dhawan
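For reference alongside the links above, the two-player objective from the original Goodfellow et al. paper is the minimax game

```latex
\min_G \max_D \;
\mathbb{E}_{x \sim p_{\mathrm{data}}}\!\big[\log D(x)\big]
+ \mathbb{E}_{z \sim p_z}\!\big[\log\big(1 - D(G(z))\big)\big],
```

where D is the discriminator, G is the generator, and p_z is the noise prior; Wasserstein GAN replaces this objective with an optimal-transport distance.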
Auto-encoders for Learning Generative Models (the variational lower bound is recalled after this topic):
Resource: Kingma, D., Welling, M. (2013). Auto-Encoding Variational Bayes.
Zhao, S., Song, J., Ermon, S. (2019). InfoVAE: Balancing Learning and Inference in Variational Autoencoders
Mentors: Kuan-Yun Lee, Nikita Dhawan
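For reference, the variational lower bound (ELBO) maximized in the Kingma-Welling paper above is

```latex
\log p_\theta(x) \;\ge\;
\mathbb{E}_{q_\phi(z \mid x)}\!\big[\log p_\theta(x \mid z)\big]
- \mathrm{KL}\!\big(q_\phi(z \mid x) \,\big\|\, p(z)\big),
```

with encoder q_φ, decoder p_θ, and prior p(z); the gap in the inequality is exactly KL(q_φ(z|x) || p_θ(z|x)).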
The Information Bottleneck Principle (the objective is stated after this topic):
Resource: Papers by Naftali Tishby (here), and the follow-ups.
A theory paper on information bottleneck (here)
Mentor: Kuan-Yun Lee
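For reference, the information bottleneck seeks a representation T of X that is maximally compressed yet still informative about a target Y, via the Lagrangian

```latex
\min_{p(t \mid x)} \; I(X; T) - \beta\, I(T; Y),
```

where β ≥ 0 trades off compression of X against the information retained about Y.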
Information Theory meets Online Learning and Bandits:
Multiplicative weights update for prediction and multi-armed bandit problems; lower bounds; universal portfolios via multiplicative weights (here, lecture notes). A minimal sketch of the update follows this topic.
Book on online learning: Prediction, Learning, and Games; N. Cesa-Bianchi and G. Lugosi
Book on bandits: Bandit Algorithms; Tor Lattimore and Csaba Szepesvári
Mentor: Avishek Ghosh
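As mentioned in the topic above, here is a minimal sketch of the multiplicative weights (Hedge) update for prediction with expert advice; the function name and interface are ours. With a suitable learning rate (roughly η ~ sqrt(ln n / T)), its regret against the best expert is O(sqrt(T ln n)):

```python
import numpy as np

def hedge(loss_matrix, eta):
    """Multiplicative weights (Hedge) for prediction with expert advice.
    loss_matrix[t, i] is the loss of expert i in round t, assumed in [0, 1].
    Returns the learner's total expected loss over all T rounds."""
    T, n = loss_matrix.shape
    w = np.ones(n)                            # one weight per expert
    total = 0.0
    for t in range(T):
        p = w / w.sum()                       # play a distribution over experts
        total += p @ loss_matrix[t]           # expected loss this round
        w *= np.exp(-eta * loss_matrix[t])    # downweight experts that did badly
    return total
```

The bandit version (Exp3) applies the same update to importance-weighted loss estimates, since only the played arm's loss is observed.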
Capacity of Neural Nets:
Resource: Capacity of a neuron: the chapter "Capacity of a Single Neuron" in Information Theory, Inference, and Learning Algorithms; David J.C. MacKay (Cover's counting bound is recalled after this topic)
Capacity of feed-forward neural nets (here)
Mentors: Avishek Ghosh, Nived Rajaraman
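The "capacity of a neuron" discussion in MacKay's chapter rests on Cover's function-counting theorem: a linear threshold unit with d weights, applied to N points in general position, can realize

```latex
C(N, d) \;=\; 2 \sum_{k=0}^{d-1} \binom{N-1}{k}
```

of the 2^N possible labelings; the sharp transition at N = 2d is the source of the informal "two bits per weight" capacity figure.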
Information Theoretic Lower Bounds in Statistical Learning (Fano's inequality is recalled after this topic):
Statistics: Chapter 15 of High-Dimensional Statistics; Martin J. Wainwright. Also see the lecture notes (1, 2)
Bandits: Chapters 14-17 of Bandit Algorithms by Tor Lattimore and Csaba Szepesvári
Mentor: Avishek Ghosh
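Most of the minimax lower bounds in these references reduce to Fano's inequality: for any estimator of X from Y, the error probability P_e satisfies

```latex
h_2(P_e) + P_e \log\big(|\mathcal{X}| - 1\big) \;\ge\; H(X \mid Y),
```

where h_2 is the binary entropy function. For X uniform over M hypotheses this yields the commonly used form P_e ≥ 1 − (I(X;Y) + log 2) / log M.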
The Entropy Method in Discrepancy Theory:
The entropy method is a powerful technique for proving existential results in combinatorics. Most famously, Spencer used it in "Six Standard Deviations Suffice" to show that moderately sized set systems have low discrepancy (the precise statement is recalled after this topic). In this project, you will review the entropy method along with its applications.
Resource: "Six Standard Deviations Suffice"; Joel Spencer
The Discrepancy Method: Randomness and Complexity; Bernard Chazelle
Constructive Discrepancy Minimization by Walking on the Edges; S. Lovett and R. Meka (here)
Mentor: Nived Rajaraman
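For concreteness, the main theorem of Spencer's paper cited above: for any sets S_1, ..., S_n ⊆ {1, ..., n} there exists a coloring χ: {1, ..., n} → {−1, +1} such that

```latex
\max_{1 \le i \le n} \Big|\sum_{j \in S_i} \chi(j)\Big| \;\le\; 6\sqrt{n},
```

improving on the O(sqrt(n log n)) discrepancy achieved by a uniformly random coloring.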
Community Detection