Email Communications
Historical email announcements are posted here for the benefit of new/potential members.
June 27, 2022, 11:30 AM
[ML Theory Reading Group] Meeting Announcement 06/28/2022
Since last week's presentation ran over, Matt Raymond will continue presenting "Automatic Symmetry Discovery with Lie Algebra Convolutional Network" tomorrow:
Existing equivariant neural networks require prior knowledge of the symmetry group and discretization for continuous groups. We propose to work with Lie algebras (infinitesimal generators) instead of Lie groups. Our model, the Lie algebra convolutional network (L-conv) can automatically discover symmetries and does not require discretization of the group. We show that L-conv can serve as a building block to construct any group equivariant feedforward architecture. Both CNNs and Graph Convolutional Networks can be expressed as L-conv with appropriate groups. We discover direct connections between L-conv and physics: (1) group invariant loss generalizes field theory (2) Euler-Lagrange equation measures the robustness, and (3) equivariance leads to conservation laws and Noether current. These connections open up new avenues for designing more general equivariant networks and applying them to important problems in physical sciences.
We will be meeting at 12 PM ET in our usual Zoom room (here).
If you have a paper that you think is interesting or deserves more exposure, consider signing up here to present in one of the coming weeks!
Thanks,
Matt Raymond
June 20, 2022, 11:40 PM
[ML Theory Reading Group] Meeting Announcement 06/21/2022
This week, Matt Raymond will be presenting "Automatic Symmetry Discovery with Lie Algebra Convolutional Network":
Existing equivariant neural networks require prior knowledge of the symmetry group and discretization for continuous groups. We propose to work with Lie algebras (infinitesimal generators) instead of Lie groups. Our model, the Lie algebra convolutional network (L-conv) can automatically discover symmetries and does not require discretization of the group. We show that L-conv can serve as a building block to construct any group equivariant feedforward architecture. Both CNNs and Graph Convolutional Networks can be expressed as L-conv with appropriate groups. We discover direct connections between L-conv and physics: (1) group invariant loss generalizes field theory (2) Euler-Lagrange equation measures the robustness, and (3) equivariance leads to conservation laws and Noether current. These connections open up new avenues for designing more general equivariant networks and applying them to important problems in physical sciences.
We will be meeting at 12 PM ET in our usual Zoom room.
If you have a paper that you think is interesting or deserves more exposure, consider signing up here to present in one of the coming weeks!
Thanks,
Matt Raymond
June 13, 2022, 9:43 PM
[ML Theory Reading Group] Meeting Announcement 06/14/2022
All,
We do not have a presenter for tomorrow, so we will be skipping this week. However, next week I will be presenting "Automatic Symmetry Discovery with Lie Algebra Convolutional Network":
Existing equivariant neural networks require prior knowledge of the symmetry group and discretization for continuous groups. We propose to work with Lie algebras (infinitesimal generators) instead of Lie groups. Our model, the Lie algebra convolutional network (L-conv) can automatically discover symmetries and does not require discretization of the group. We show that L-conv can serve as a building block to construct any group equivariant feedforward architecture. Both CNNs and Graph Convolutional Networks can be expressed as L-conv with appropriate groups. We discover direct connections between L-conv and physics: (1) group invariant loss generalizes field theory (2) Euler-Lagrange equation measures the robustness, and (3) equivariance leads to conservation laws and Noether current. These connections open up new avenues for designing more general equivariant networks and applying them to important problems in physical sciences.
If you have a paper that you think is interesting or deserves more exposure, consider signing up here to present in one of the coming weeks!
Thanks,
Matt Raymond
June 7, 2022, 10:26 AM
[ML Theory Reading Group] Meeting Announcement 06/07/2022
All,
Apologies for the late announcement. We do not have a presenter for today, so we will not be meeting. I hope to present next week, but we're still looking for presenters for the weeks after that. You can sign up using this Google Sheets spreadsheet.
Thanks,
Matt Raymond
May 31, 2022, 7:49 AM
[ML Theory Reading Group] Meeting Announcement 05/31/2022
All,
Today at 12 PM ET, Jianxin Zhang will be presenting "Learning from Label Proportions by Learning from Label Noise" in this Zoom room:
Learning from label proportions (LLP) is a weakly supervised classification problem where data points are grouped into bags, and the label proportions within each bag are observed instead of the instance-level labels. The task is to learn a classifier to predict the individual labels of future individual instances. Prior work on LLP for multi-class data has yet to develop a theoretically grounded algorithm. In this work, we provide a theoretically grounded approach to LLP based on a reduction to learning with label noise, using the forward correction (FC) loss of Patrini et al. [20]. We establish an excess risk bound and generalization error analysis for our approach, while also extending the theory of the FC loss which may be of independent interest. Our approach demonstrates improved empirical performance in deep learning scenarios across multiple datasets and architectures, compared to the leading existing methods.
We currently do not have anyone signed up to present next week. Presentations are vital to the success of a reading group! If you have a paper that you're reading, or a publication that you've recently submitted, please consider signing up to present next week using this Google Sheets spreadsheet.
Thanks,
Matt Raymond
May 23, 2022, 8:46 AM
[ML Theory Reading Group] Meeting Announcement 05/24/2022
All,
Unfortunately, we do not have a presenter for tomorrow, so we will be skipping this week as well. Next week, Jianxin Zhang will be presenting "Learning from Label Proportions by Learning from Label Noise":
Learning from label proportions (LLP) is a weakly supervised classification problem where data points are grouped into bags, and the label proportions within each bag are observed instead of the instance-level labels. The task is to learn a classifier to predict the individual labels of future individual instances. Prior work on LLP for multi-class data has yet to develop a theoretically grounded algorithm. In this work, we provide a theoretically grounded approach to LLP based on a reduction to learning with label noise, using the forward correction (FC) loss of Patrini et al. [20]. We establish an excess risk bound and generalization error analysis for our approach, while also extending the theory of the FC loss which may be of independent interest. Our approach demonstrates improved empirical performance in deep learning scenarios across multiple datasets and architectures, compared to the leading existing methods.
Thanks,
Matt Raymond
May 16, 2022, 3:06 PM
[ML Theory Reading Group] Meeting Announcement 05/17/2022
All,
Unfortunately, we do not have a presenter for tomorrow, so we will be skipping this week. We also need a presenter for next week, so please sign up if you have a paper you're willing to present.
We will also be inviting faculty members and external presenters from NeurIPS 2022 to present in the upcoming weeks. I'll keep you posted as things develop.
Thanks,
Matt Raymond
May 9, 2022, 1:25 PM
[ML Theory Reading Group] Meeting Announcement 05/10/2022
All,
This Tuesday at 1 PM ET, Matt Raymond will be presenting "Denoising Diffusion Probabilistic Models" by Ho et al.:
We present high quality image synthesis results using diffusion probabilistic models, a class of latent variable models inspired by considerations from nonequilibrium thermodynamics. Our best results are obtained by training on a weighted variational bound designed according to a novel connection between diffusion probabilistic models and denoising score matching with Langevin dynamics, and our models naturally admit a progressive lossy decompression scheme that can be interpreted as a generalization of autoregressive decoding. On the unconditional CIFAR10 dataset, we obtain an Inception score of 9.46 and a state-of-the-art FID score of 3.17. On 256x256 LSUN, we obtain sample quality similar to ProgressiveGAN.
Additionally, per the survey results, we will be moving the meeting time to Tuesdays at 12 PM ET starting next week. All meetings (starting this week) will be held in this Zoom room, which is also on our meeting calendar.
We currently do not have anyone signed up to present next week. Presentations are vital to the success of a reading group! If you have a paper that you're reading, or a publication that you've recently submitted, please consider signing up to present next week using this Google Sheets spreadsheet.
Thanks,
Matt Raymond
May 3, 2022, 5:37 PM
ML Theory Reading Group: Into the Summer and Beyond
Hi all,
I've bolded the important points for your convenience.
In case you missed Kevin's email, I will be coordinating the ML theory reading group during the summer (and hopefully beyond).
Now that we've all had some time to recover from finals, I think we should start the group back up for the summer. If you plan on attending the reading group during the summer and/or fall, please fill out this (anonymous) survey so we can estimate how many people will attend. We will keep the same schedule as last semester (1 PM, starting 05/10/22) unless the survey indicates a different preference.
As last semester, the focus of the reading group will be on developing our own presentation skills and on sharing research that we find theoretically significant (or clever). We will not be recording the sessions, so this is a good opportunity to present work that is not yet ready for public release. Additionally, starting this summer, we will be inviting outside researchers to present to the reading group. We already have a short list assembled, but feel free to add suggestions when completing the survey. We will begin inviting external speakers 2-3 weeks after the reading group starts, to ensure that turnout is not too low for invited talks.
Volunteering for presentation slots is critical to the survival of a reading group, especially one as new as ours. Even if you do not have enough material for a full hour, consider presenting for part of one. If possible, please invite other students to join the reading group so that we reach critical mass.
Thank you all, and feel free to email me if you have any questions/concerns/comments! I look forward to seeing you all next week.
Matt Raymond
MSc CSE '22
PhD ECE '26 (anticipated)