Continual Learning: Surveys
A comprehensive survey of continual learning: Theory, method and application
Embracing Change: Continual Learning in Deep Neural Networks
Towards Continual Reinforcement Learning: A Review and Perspectives
Continual Learning with Deep Architectures (tutorial by Irina Rish and Vincenzo Lomonaco, ICML 2021)
Continual learning: A comparative study on how to defy forgetting in classification tasks
Class-incremental learning: survey and performance evaluation
Never-Ending Learning (tutorial by Tom Mitchell and Partha Talukdar, ICML 2019)
Continual Learning @ Scale
Continual Pre-Training of Large Language Models: How to re-warm your model? (ICML ES-FoMo workshop 2023)
Fine-tuned Language Models are Continual Learners (aka Continual T0) (EMNLP 2022)
Effect of scale on catastrophic forgetting in neural networks (ICLR 2022)
An Empirical Study of Catastrophic Forgetting in Large Language Models During Continual Fine-tuning
Intelligent Learning Rate Distribution to Reduce Catastrophic Forgetting in Transformers
Improving language models fine-tuning with representation consistency targets
Foundational Models for Continual Learning: An Empirical Study of Latent Replay
Effects of Model and Prior Learning Scale on Catastrophic Forgetting
Don’t Stop Learning: Towards Continual Learning for the CLIP Model
Continual Learning in NLP
Continual Lifelong Learning in Natural Language Processing: A Survey (COLING 2020)
Drinking from a Firehose: Continual Learning with Web-scale Natural Language (TPAMI 2023)
Pretrained Language Model in Continual Learning: A Comparative Study (ICLR 2022)
Memorization Without Overfitting: Analyzing the Training Dynamics of Large Language Models
LAMOL: LAnguage MOdeling for Lifelong Language Learning (ICLR 2020)
TemporalWiki: A Lifelong Benchmark for Training and Evaluating Ever-Evolving Language Models (2022)
Towards Continual Knowledge Learning of Language Models
Continual Pre-training of Language Models (ICLR 2023)
Episodic memory in lifelong language learning (NeurIPS 2019)
Recall and learn: Fine-tuning deep pretrained language models with less forgetting
Lifelong Pretraining: Continually Adapting Language Models to Emerging Corpora
Achieving Forgetting Prevention and Knowledge Transfer in Continual Learning (NeurIPS 2021)
On Anytime Learning at Macroscale (CoLLas 2022)
Misc
Training a Helpful and Harmless Assistant with Reinforcement Learning from Human Feedback
Continual Pre-Training Mitigates Forgetting in Language and Vision