
Energy-Efficient and Timeliness-Aware Continual Learning Management System.

Authors :
Kang, Dong-Ki
Source :
Energies (19961073); Dec 2023, Vol. 16, Issue 24, p8018, 19p
Publication Year :
2023

Abstract

Continual learning has recently become a primary paradigm for deep neural network models in modern artificial intelligence services, where streaming data patterns change frequently and irregularly over time in dynamic environments. Unfortunately, there is still a lack of studies on computing cluster management for the processing of continual learning tasks, particularly in terms of the timeliness of model updates and the associated energy consumption. In this paper, we propose a novel timeliness-aware continual learning management (TA-CLM) system aimed at ensuring timely deep neural network model updates for continual learning tasks while minimizing the energy consumption of computing worker nodes in clusters. We introduce novel penalty cost functions that quantitatively penalize deep neural network model update latency and present the associated optimization formulation to ensure the best task allocation. Additionally, we design a simulated annealing-based optimizer, an easy-to-implement meta-heuristic technique, to solve the non-convex and non-linear optimization problem. We demonstrate that the proposed TA-CLM system improves both latency and energy performance over its competitors by an average of 51.3% and 51.6%, respectively, based on experimental results using raw data from well-known deep neural network models on an NVIDIA GPU-based testbed and a large-scale simulation environment. [ABSTRACT FROM AUTHOR]
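As an illustration of the general approach summarized above, the following is a minimal Python sketch of a simulated annealing loop for assigning continual learning tasks to worker nodes. It is not the paper's implementation: the cost model (energy_cost, latency_penalty), the neighbor move, and all function and parameter names are assumptions introduced here purely for illustration.

import math
import random

def total_cost(assignment, energy_cost, latency_penalty):
    # Hypothetical objective: per-node energy cost plus per-task latency penalty.
    return sum(
        energy_cost(node, tasks) + sum(latency_penalty(task, node) for task in tasks)
        for node, tasks in assignment.items()
    )

def neighbor(assignment, nodes):
    # Propose a new allocation by moving one randomly chosen task to another node.
    new = {n: list(ts) for n, ts in assignment.items()}
    src = random.choice([n for n in nodes if new[n]])
    task = new[src].pop(random.randrange(len(new[src])))
    dst = random.choice([n for n in nodes if n != src])
    new[dst].append(task)
    return new

def simulated_annealing(initial, nodes, energy_cost, latency_penalty,
                        t_start=1.0, t_end=1e-3, alpha=0.95, steps_per_temp=50):
    current = best = initial
    cur_cost = best_cost = total_cost(initial, energy_cost, latency_penalty)
    temperature = t_start
    while temperature > t_end:
        for _ in range(steps_per_temp):
            candidate = neighbor(current, nodes)
            cand_cost = total_cost(candidate, energy_cost, latency_penalty)
            # Always accept improvements; accept worse moves with Boltzmann probability.
            if cand_cost < cur_cost or random.random() < math.exp((cur_cost - cand_cost) / temperature):
                current, cur_cost = candidate, cand_cost
                if cur_cost < best_cost:
                    best, best_cost = current, cur_cost
        temperature *= alpha  # geometric cooling schedule
    return best, best_cost

In this sketch, a candidate assignment that worsens the objective is still accepted with probability exp(-delta/T), which is what allows simulated annealing to escape local optima of a non-convex, non-linear objective; the geometric cooling schedule gradually makes the search greedy as the temperature falls.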

Details

Language :
English
ISSN :
19961073
Volume :
16
Issue :
24
Database :
Complementary Index
Journal :
Energies (19961073)
Publication Type :
Academic Journal
Accession number :
174437597
Full Text :
https://doi.org/10.3390/en16248018