1. Dynamically evolving deep neural networks with continuous online learning.
- Authors
- Zhong, Yuan; Zhou, Jing; Li, Ping; Gong, Jie
- Subjects
- *ARTIFICIAL neural networks; *ONLINE education; *RECOLLECTION (Psychology); *DATA distribution; *BIG data
- Abstract
In a big data environment, data streams are sequences of dynamically changing data of unlimited length; they are often associated with concept drift, caused by shifts in the underlying data distribution. The uncertainty of data streams challenges the prediction ability of traditional static algorithms and hampers their practical application. Although many deep neural network models can be tailored to run in an online manner, they are not suitable for non-stationary data streams that change continually over time. In this work, we propose a deep neural network model that can dynamically adjust to changes in the data distribution through self-growth, that is, by changing its structure from narrow to wide and/or from shallow to deep. Specifically, initialized from a standard multilayer perceptron, the model grows by adding neurons according to a local error and adding layers according to the change in Jensen–Shannon divergence. The growth involves an adaptive memory mechanism, designed to avoid the catastrophic forgetting that may occur when hidden layers are added. In experiments, the proposed model achieved remarkable improvements in adaptability and prediction performance across 12 regression datasets (including non-stationary and stationary data streams), compared with the state-of-the-art dynamic neural network and other online approaches. The influence of the parameters, timeliness, convergence, and the size of the pretrained model is also explored in the paper. [ABSTRACT FROM AUTHOR]
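The two growth triggers described in the abstract can be sketched in a few lines: widen a layer when a local error estimate stays high, and deepen the network when the Jensen–Shannon divergence between old and new input distributions jumps. The class, thresholds, and initialization below are illustrative assumptions for exposition, not the authors' actual algorithm.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: float(np.sum(a * np.log(a / b)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

class GrowingMLP:
    """Minimal MLP whose structure can grow wider or deeper (a sketch)."""
    def __init__(self, n_in, n_hidden=4, seed=0):
        self.rng = np.random.default_rng(seed)
        # Weight matrices: input -> hidden layer(s) -> scalar output.
        self.layers = [self.rng.normal(0, 0.1, (n_in, n_hidden))]
        self.out = self.rng.normal(0, 0.1, (n_hidden, 1))

    def forward(self, x):
        h = x
        for W in self.layers:
            h = np.tanh(h @ W)
        return h @ self.out

    def grow_width(self):
        """Add one neuron to the last hidden layer (narrow -> wide)."""
        W = self.layers[-1]
        new_col = self.rng.normal(0, 0.1, (W.shape[0], 1))
        self.layers[-1] = np.hstack([W, new_col])
        self.out = np.vstack([self.out, self.rng.normal(0, 0.1, (1, 1))])

    def grow_depth(self):
        """Append a hidden layer initialized near identity (shallow -> deep)."""
        n = self.layers[-1].shape[1]
        self.layers.append(np.eye(n) + self.rng.normal(0, 0.01, (n, n)))

net = GrowingMLP(n_in=3)
x = np.ones((1, 3))
net.grow_width()   # e.g. triggered by a persistently high local error
net.grow_depth()   # e.g. triggered by a jump in JS divergence
y = net.forward(x)  # prediction still well-defined after growth
```

Initializing the new layer near identity is one way to keep the grown network's output close to the pre-growth network, which is in the spirit of the paper's memory mechanism for avoiding catastrophic forgetting; the exact mechanism in the paper may differ.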
- Published
- 2023