References

Below is a non-exhaustive list of relevant works you can consider for the workshop:

1. Knoblauch, Jeremias, Hisham Husain, and Tom Diethe. "Optimal continual learning has perfect memory and is NP-hard." ICML (2020).

2. Pan, Pingbo, et al. "Continual deep learning by functional regularisation of memorable past." NeurIPS (2020).

3. Borsos, Zalán, Mojmír Mutný, and Andreas Krause. "Coresets via bilevel optimization for continual learning and streaming." NeurIPS (2020).

4. Bennani, Mehdi Abbana, and Masashi Sugiyama. "Generalisation guarantees for continual learning with orthogonal gradient descent." arXiv preprint arXiv:2006.11942 (2020).

5. Doan, Thang, et al. "A Theoretical Analysis of Catastrophic Forgetting through the NTK Overlap Matrix." AISTATS (2021).

6. Yin, Dong, Mehrdad Farajtabar, and Ang Li. "SOLA: Continual Learning with Second-Order Loss Approximation." arXiv preprint arXiv:2006.10974 (2020).

7. Benzing, Frederik. "Understanding Regularisation Methods for Continual Learning." arXiv preprint arXiv:2006.06357 (2020).

8. Krishnan, R., and Prasanna Balaprakash. "Meta Continual Learning via Dynamic Programming." arXiv preprint arXiv:2008.02219 (2020).

9. Budden, David, et al. "Gaussian Gated Linear Networks." NeurIPS (2020).

10. Veness, Joel, et al. "Online learning with gated linear networks." arXiv preprint arXiv:1712.01897 (2017).

11. Zhuang, Zhenxun, et al. "No-regret non-convex online meta-learning." ICASSP (2020).

12. Loo, Noel, Siddharth Swaroop, and Richard E. Turner. "Generalized Variational Continual Learning." ICLR (2021).

13. Xie, Zeke, et al. "Artificial Neural Variability for Deep Learning: On Overfitting, Noise Memorization, and Catastrophic Forgetting." arXiv preprint arXiv:2011.06220 (2020).