Continual learning refers to the machine learning paradigm concerned with adaptive algorithms that learn from a continuous stream of input data becoming progressively available over time. The key characteristic of such a continual learner is the ability to learn new tasks without compromising previously acquired knowledge, i.e., without catastrophic forgetting. This is a major challenge for current neural networks because of the stability-plasticity dilemma: on the one hand, a neural network has to adapt to new input data from non-stationary distributions; on the other hand, large weight changes destroy existing knowledge that is encoded in its weights. Thus, there is a trade-off between stability and plasticity.
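The following minimal sketch (a toy illustration, not taken from the cited papers; all names and hyperparameters are assumptions) demonstrates catastrophic forgetting: a tiny logistic-regression model is trained on one task and then on a second task with a flipped class layout, after which its accuracy on the first task collapses because the new gradients overwrite the old weights.

import numpy as np

rng = np.random.default_rng(0)

def make_task(center):
    """Two Gaussian blobs; the class layout depends on `center`."""
    X0 = rng.normal(loc=-center, scale=1.0, size=(200, 2))
    X1 = rng.normal(loc=+center, scale=1.0, size=(200, 2))
    X = np.vstack([X0, X1])
    y = np.array([0] * 200 + [1] * 200)
    return X, y

def train(w, b, X, y, lr=0.1, epochs=50):
    """Full-batch gradient descent on the logistic loss; weights are overwritten in place."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

def accuracy(w, b, X, y):
    return np.mean(((X @ w + b) > 0) == y)

# Task A and task B have opposite class layouts (a non-stationary distribution).
XA, yA = make_task(np.array([2.0, 0.0]))
XB, yB = make_task(np.array([-2.0, 0.0]))

w, b = np.zeros(2), 0.0
w, b = train(w, b, XA, yA)
print("task A accuracy after training on A:", accuracy(w, b, XA, yA))
w, b = train(w, b, XB, yB)
print("task A accuracy after training on B:", accuracy(w, b, XA, yA))  # drops sharply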
Continual learning is inherently an incremental process, without a sharp distinction between a training phase and an application phase. A further characteristic is forward and backward transfer: the model can leverage previously acquired knowledge to solve new, weakly related tasks; conversely, new knowledge may improve the performance on old tasks. However, constraints are imposed on the model's capacity to process and store previous knowledge. For example, if the model is based on a neural network, then the maximum number of nodes is fixed to prevent indefinite growth.
The self-organizing incremental neural network (SOINN) is designed for online learning. Learning in SOINN bears similarities to grow-when-required networks and growing neural gas. We recently developed SOINN+, the next generation of SOINN, for unsupervised learning from noisy data streams [1,2]. SOINN+ "forgets gracefully": previously acquired network structures are not destroyed by sudden concept drift in the data stream.
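The core growing mechanism can be sketched as follows. This is a simplified, growing-neural-gas-style caricature, not the actual SOINN+ algorithm of [1], which additionally manages edges, node ages, adaptive similarity thresholds, and noise removal; the fixed threshold below is an illustrative assumption.

import numpy as np

class GrowingNodeSketch:
    def __init__(self, threshold=1.0):
        self.nodes = []          # list of prototype vectors
        self.threshold = threshold

    def update(self, x, lr=0.1):
        x = np.asarray(x, dtype=float)
        if len(self.nodes) < 2:
            self.nodes.append(x.copy())   # bootstrap the network
            return
        # Find the winner (nearest node) for the new input.
        dists = [np.linalg.norm(x - n) for n in self.nodes]
        winner = int(np.argmin(dists))
        if dists[winner] > self.threshold:
            # Input lies in an unrepresented region: grow a new node instead
            # of dragging existing prototypes away (stability).
            self.nodes.append(x.copy())
        else:
            # Input is well represented: adapt the winner slightly (plasticity).
            self.nodes[winner] += lr * (x - self.nodes[winner])

Growing a new node for inputs that fall outside the represented regions, instead of dragging existing prototypes toward them, is what allows such a network to remain stable on old data while staying plastic for new data.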
In our latest work [3], we extended SOINN+ to continual supervised learning. The new model, called GSOINN+, can learn a sequence of unrelated classification tasks without catastrophic forgetting.
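To illustrate the continual supervised learning protocol, the sketch below (again an illustrative assumption, not the GSOINN+ algorithm of [3]) uses a nearest-prototype classifier that only ever adds prototypes: learning task B cannot overwrite the prototypes of task A, so accuracy on task A is retained. It reuses XA, yA, XB, yB from the first sketch and assumes a task-incremental setting in which the task identity is known at test time.

prototypes, labels = [], []

def fit_task(X, y, task_id):
    # One mean prototype per class; SOINN-style models would grow many nodes.
    for c in np.unique(y):
        prototypes.append(X[y == c].mean(axis=0))
        labels.append((task_id, c))

def predict(X, task_id):
    # Classify by the nearest prototype belonging to the queried task.
    idx = [i for i, (t, _) in enumerate(labels) if t == task_id]
    P = np.stack([prototypes[i] for i in idx])
    nearest = np.argmin(((X[:, None, :] - P[None]) ** 2).sum(-1), axis=1)
    return np.array([labels[idx[i]][1] for i in nearest])

fit_task(XA, yA, task_id=0)
fit_task(XB, yB, task_id=1)   # learning task B leaves task A's prototypes intact
print("task A accuracy after learning both tasks:",
      np.mean(predict(XA, task_id=0) == yA))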
References
[1] Wiwatcharakoses C. and Berrar D. (2020). SOINN+, a self-organizing incremental neural network for unsupervised learning from noisy data streams. Expert Systems with Applications.
[2] Wiwatcharakoses C. and Berrar D. (2019). Self-organizing incremental neural networks for continual learning. Proceedings of the 28th International Joint Conference on Artificial Intelligence (IJCAI 2019), Macao, China, pp. 6476-6477.
[3] Wiwatcharakoses C. and Berrar D. (2019). A self-organizing incremental neural network for continual supervised learning. Expert Systems with Applications.