1. Self-organizing neural networks for universal learning and multimodal memory encoding.
- Author
- Tan AH, Subagdja B, Wang D, and Meng L
- Subjects
- Mathematical Concepts, Machine Learning
- Abstract
Learning and memory are two intertwined cognitive functions of the human brain. This paper shows how a family of biologically-inspired self-organizing neural networks, known as fusion Adaptive Resonance Theory (fusion ART), may provide a viable approach to realizing the learning and memory functions. Fusion ART extends the single-channel Adaptive Resonance Theory (ART) model to learn multimodal pattern associative mappings. As a natural extension of ART, various forms of fusion ART have been developed for a myriad of learning paradigms, ranging from unsupervised learning to supervised learning, semi-supervised learning, multimodal learning, reinforcement learning, and sequence learning. In addition, fusion ART models may be used for representing various types of memories, notably episodic memory, semantic memory and procedural memory. In accordance with the notion of embodied intelligence, such neural models thus provide a computational account of how an autonomous agent may learn and adapt in a real-world environment. The efficacy of fusion ART in learning and memory shall be discussed through various examples and illustrative case studies. (Copyright © 2019 Elsevier Ltd. All rights reserved.)
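The abstract's description of fusion ART as a multi-channel extension of ART that learns multimodal pattern associations can be illustrated with a small sketch. The following Python code is not the authors' implementation; it is a minimal illustration assuming the standard ART dynamics commonly used in the fusion ART literature (per-channel fuzzy-AND choice and match functions, per-channel vigilance, and template learning), with complement coding omitted for brevity. All class and parameter names (FusionARTSketch, alpha, beta, rho, gamma) are hypothetical.

```python
import numpy as np

class FusionARTSketch:
    """Illustrative multi-channel (fusion ART-style) category learner."""

    def __init__(self, channel_dims, rho, gamma=None, alpha=0.01, beta=1.0):
        self.dims = channel_dims                  # input dimension of each channel
        self.rho = rho                            # per-channel vigilance parameters
        self.gamma = gamma or [1.0 / len(channel_dims)] * len(channel_dims)
        self.alpha = alpha                        # choice parameter
        self.beta = beta                          # learning rate
        self.weights = []                         # per-category list of per-channel templates

    def _choice(self, x):
        # Weighted sum over channels of |x^k AND w_j^k| / (alpha + |w_j^k|).
        scores = []
        for w in self.weights:
            t = 0.0
            for k, (xk, wk) in enumerate(zip(x, w)):
                t += self.gamma[k] * np.minimum(xk, wk).sum() / (self.alpha + wk.sum())
            scores.append(t)
        return scores

    def _resonates(self, x, w):
        # Resonance requires |x^k AND w_j^k| / |x^k| >= rho_k in every channel.
        return all(np.minimum(xk, wk).sum() / max(xk.sum(), 1e-12) >= r
                   for xk, wk, r in zip(x, w, self.rho))

    def learn(self, x):
        # Search committed categories in order of choice value;
        # commit a new category if none reaches resonance in all channels.
        order = np.argsort(self._choice(x))[::-1] if self.weights else []
        for j in order:
            if self._resonates(x, self.weights[j]):
                self.weights[j] = [self.beta * np.minimum(xk, wk) + (1 - self.beta) * wk
                                   for xk, wk in zip(x, self.weights[j])]
                return j
        self.weights.append([xk.copy() for xk in x])  # new node learns the input directly
        return len(self.weights) - 1

# Example: associate a 4-D "sensory" pattern with a 3-D "label" pattern across two channels.
net = FusionARTSketch(channel_dims=[4, 3], rho=[0.7, 0.9])
category = net.learn([np.array([1.0, 0.0, 1.0, 0.0]), np.array([0.0, 1.0, 0.0])])
```

Reading one channel as input and another as output gives the supervised or associative-mapping use the abstract mentions; using a single channel recovers ordinary unsupervised ART clustering.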
- Published
- 2019