Fusion ART is a multi-channel neural network architecture based on ART (Adaptive Resonance Theory). It is designed to learn cognitive nodes, each encoding the multi-modal representation of a memory chunk across multiple pattern channels, in response to a continual stream of incoming patterns. Whereas the original ART performs unsupervised learning of recognition categories from input patterns, fusion ART learns multi-channel associative mappings across multimodal pattern channels in an online and incremental manner. Fusion ART with a single input channel reduces to the original ART model; with two or more pattern channels, it extends unsupervised learning to supervised learning, semi-supervised learning, multimodal learning, reinforcement learning, and sequence learning. The knowledge learned by fusion ART can also be interpreted as various types of memory systems. More details about the fusion ART model can be found in the following references.
Ah-Hwee Tan, Budhitama Subagdja, Di Wang and Lei Meng. Self-organizing Neural Networks for Universal Learning and Multimodal Memory Encoding. Neural Networks, 120 (2019) 58-73. [PDF]
Ah-Hwee Tan, Gail A. Carpenter and Stephen Grossberg. Intelligence Through Interaction: Towards a Unified Theory for Learning. In: Advances in Neural Networks. Lecture Notes in Computer Science, vol 4491 (2007). Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72383-7_128
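The core fusion ART cycle described above (multi-channel category choice, per-channel vigilance matching, and learning by fuzzy AND) can be illustrated with a minimal, self-contained sketch. All class and parameter names below are illustrative assumptions for this sketch only; they are not the API of the downloadable fusionART.py.

```python
import numpy as np

def fuzzy_and(a, b):
    # Fuzzy AND: element-wise minimum.
    return np.minimum(a, b)

class MiniFusionART:
    """Illustrative multi-channel fusion ART sketch (not fusionART.py)."""
    def __init__(self, n_channels, alpha=0.1, beta=1.0, rho=0.9):
        self.alpha = alpha                         # choice parameter
        self.beta = beta                           # learning rate
        self.gamma = [1.0 / n_channels] * n_channels  # channel contributions
        self.rho = [rho] * n_channels              # per-channel vigilance
        self.weights = []                          # one list of channel vectors per category

    def _choice(self, x, w):
        # Choice value: gamma-weighted sum over channels of |x ^ w| / (alpha + |w|).
        return sum(g * fuzzy_and(xk, wk).sum() / (self.alpha + wk.sum())
                   for g, xk, wk in zip(self.gamma, x, w))

    def _match(self, x, w):
        # Resonance requires |x ^ w| / |x| >= rho in every channel.
        return all(fuzzy_and(xk, wk).sum() / max(xk.sum(), 1e-9) >= r
                   for xk, wk, r in zip(x, w, self.rho))

    def learn(self, x):
        # Search committed categories in descending order of choice value.
        order = sorted(range(len(self.weights)),
                       key=lambda j: self._choice(x, self.weights[j]),
                       reverse=True)
        for j in order:
            if self._match(x, self.weights[j]):
                # Resonance: update the template toward x ^ w.
                self.weights[j] = [self.beta * fuzzy_and(xk, wk) + (1 - self.beta) * wk
                                   for xk, wk in zip(x, self.weights[j])]
                return j
        # No category resonates: commit a new one encoding the input.
        self.weights.append([xk.copy() for xk in x])
        return len(self.weights) - 1
```

A pattern is presented as a list of per-channel numpy vectors; `learn` returns the index of the resonating (or newly committed) category, so repeated presentations of the same pattern select the same cognitive node.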
Fusion ART has been implemented as a framework in which developers can try different configurations of the multi-channel neural network for different purposes. The Python code for fusion ART consists of two files, which can be downloaded below.
fusionART.py is the main module containing the definition of fusion ART as a class.
ARTfunc.py is the library of supporting functions used by fusionART.py.
The fusion ART code depends on the numpy library. Please make sure that numpy is installed before using the code.
Examples are also provided as Jupyter notebook files below (Jupyter must be installed to run them).
basic-FusionART-example.ipynb shows the basics of how to create a fusion ART object, set up its parameters, and train it to store input patterns.
schemabased-FusionART-example.ipynb also shows how to create a fusion ART object and set up its parameters, but using the schema method to specify the name and attribute labels of every channel/field. The schema can also be used to automate required pre-processing in fusion ART, such as complement coding.
loading-FusionART-network.ipynb shows how to load a learned fusion ART model after it has been saved to a file (as shown in the previous two examples).
fusionART-IRIS-classification.ipynb shows the application of fusion ART to supervised learning for a classification task. The example uses the Iris flower dataset from the scikit-learn library in Python; scikit-learn must be installed to run it.
fusionART-2chn-mtrack-IRIS-classification.ipynb shows a supervised classification application similar to the Iris example above, but using a two-channel fusion ART instead of three channels. It also shows how to use the 'match tracking' feature of fusion ART, which automatically but transiently adjusts the vigilance parameter during learning.
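The pre-processing used in the examples above can be sketched in a few lines: complement coding keeps the total activation of an input vector constant, and a supervised pattern pairs a complement-coded feature channel with a one-hot class channel. The helper names here are hypothetical, for illustration only.

```python
import numpy as np

def complement_code(x):
    # Complement coding: append (1 - x) so the coded vector always sums
    # to the channel dimension, preventing category proliferation.
    x = np.asarray(x, dtype=float)
    return np.concatenate([x, 1.0 - x])

def one_hot(label, n_classes):
    # One-hot class vector for the label channel in supervised learning.
    v = np.zeros(n_classes)
    v[label] = 1.0
    return v

# A two-channel supervised pattern: features in one channel, class in the other.
features = complement_code([0.2, 0.8])  # -> [0.2, 0.8, 0.8, 0.2]
target = one_hot(1, 3)                  # -> [0., 1., 0.]
```

During prediction, only the feature channel is presented and the class is read out from the label channel of the resonating category; the schema method mentioned above can apply complement coding like this automatically.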
Built upon fusion ART, the Episodic Memory-Adaptive Resonance Theory (EM-ART) is a computational model of episodic memory. EM-ART stores events and episodes by combining two fusion ART networks: one for encoding events and the other for episodes. It can also be seen as a three-layer fusion ART model.
As an extension of fusion ART, the EM-ART model supports event encoding in the form of multimodal patterns. An episodic encoding scheme is introduced that allows temporal sequences of events to be learned and recognized. The model incorporates a memory search procedure to recall stored episodic traces in response to potentially imperfect search cues.
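The episodic encoding idea can be sketched as follows: a sequence of event categories is collapsed into a single graded-activation vector over the event layer, with more recent events receiving higher activations so that temporal order is preserved. The decay scheme below is a simple illustrative assumption, not necessarily the exact EM-ART formula; see the references for the actual model.

```python
import numpy as np

def encode_episode(event_ids, n_events, decay=0.1):
    """Encode a temporal sequence of event indices as one graded vector.

    The most recent event gets activation 1.0; each earlier event is
    reduced by `decay` per time step (illustrative scheme only).
    """
    v = np.zeros(n_events)
    T = len(event_ids)
    for t, e in enumerate(event_ids):
        v[e] = max(0.0, 1.0 - decay * (T - 1 - t))
    return v

# Sequence of events 2 -> 0 -> 3 over a repertoire of 5 event categories.
episode = encode_episode([2, 0, 3], n_events=5)
```

The episode field of the upper fusion ART network can then learn this vector as a single category, so a whole sequence is recognized in one resonance cycle.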
More details about the EM-ART model can be found in the following references.
Wenwen Wang, Budhitama Subagdja, Ah-Hwee Tan and Janusz A. Starzyk. Neural Modeling of Episodic Memory: Encoding, Retrieval, and Forgetting. IEEE Transactions on Neural Networks and Learning Systems, 23 (10) (2012) 1574-1586. [PDF]
Budhitama Subagdja and Ah-Hwee Tan. Neural Modeling of Sequential Inferences and Learning over Episodic Memory. Neurocomputing, 161 (2015) 229-242. [PDF]
EM-ART can be implemented using the fusion ART framework as a building block. Examples of how to create EM-ART and use it for episodic memory tasks are provided below as Jupyter notebook files. The example code requires the fusion ART files to be present in the same folder. Note that in this version of the code, the "forgetting" feature mentioned in the paper above is not yet included.
EM-ART-complemented-fields.ipynb shows how EM-ART can be constructed by stacking two fusion ARTs together. It also shows how EM-ART can be used to encode and store episodes, and how to recall episodes using partial sequence cues.
EM-ART-schema-buffer.ipynb is similar to the above example of EM-ART construction and use. The difference is the use of the schema method in fusion ART to define the episode buffer that transiently stores the sequence of events.
loading-EM-ART-network.ipynb shows how to load a learned EM-ART model after it has been saved to a file (as shown in the previous two examples).
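The partial-cue recall used in the episode examples above can be sketched with the standard fuzzy-ART match ratio: a (possibly incomplete) cue vector is compared against each stored episode vector, and the best match above a vigilance threshold is retrieved. This is an illustrative sketch of cue-based retrieval under assumed names, not the actual EM-ART search code.

```python
import numpy as np

def recall(cue, stored, rho=0.5):
    """Return the index of the stored episode best matching the cue,
    or None if no episode reaches the vigilance threshold rho.

    The match ratio |cue ^ w| / |cue| tolerates missing elements in the
    cue, so a partial sequence can still retrieve the full episode.
    """
    best, best_m = None, -1.0
    for j, w in enumerate(stored):
        m = np.minimum(cue, w).sum() / max(cue.sum(), 1e-9)
        if m > best_m:
            best, best_m = j, m
    return best if best_m >= rho else None
```

An imperfect cue that overlaps only part of an episode's event pattern still scores a high match ratio against that episode, which is how EM-ART recalls a full episodic trace from a partial sequence cue.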