
SAED: self-attentive energy disaggregation.

Authors :
Virtsionis-Gkalinikis, Nikolaos
Nalmpantis, Christoforos
Vrakas, Dimitris
Source :
Machine Learning; Nov 2023, Vol. 112, Issue 11, p4081-4100, 20p
Publication Year :
2023

Abstract

The field of energy disaggregation deals with approximating appliance-level electric consumption using only the aggregate consumption measured by a mains meter. Recent research developments have used deep neural networks and outperformed previous methods based on Hidden Markov Models. However, deep learning models are computationally heavy and require large amounts of data. The main objective of the current paper is to incorporate the attention mechanism into neural networks in order to reduce their computational complexity. Two versions of the attention mechanism are utilized, named Additive and Dot Attention. The experiments show that they perform on par, while the Dot mechanism is slightly faster. The two versions of self-attentive neural networks are compared against two state-of-the-art energy disaggregation deep learning models. The experimental results show that the proposed architecture achieves faster or equal training and inference times, with only a minor performance drop depending on the device and the dataset. [ABSTRACT FROM AUTHOR]
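For context, a minimal NumPy sketch of the two scoring functions the abstract names, scaled dot-product attention and Bahdanau-style additive attention, is given below. The function names, tensor shapes, and the window length are illustrative assumptions, not the paper's actual SAED implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dot_attention(q, k, v):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d)) V."""
    scores = q @ k.T / np.sqrt(q.shape[-1])                 # (Tq, Tk)
    return softmax(scores) @ v                              # (Tq, d_v)

def additive_attention(q, k, v, w_q, w_k, u):
    """Additive (Bahdanau-style) attention: score_ij = u^T tanh(W_q q_i + W_k k_j)."""
    pq = q @ w_q                                            # (Tq, h)
    pk = k @ w_k                                            # (Tk, h)
    scores = np.tanh(pq[:, None, :] + pk[None, :, :]) @ u   # (Tq, Tk)
    return softmax(scores) @ v                              # (Tq, d_v)

# Hypothetical usage: a window of encoded aggregate-power features,
# used as queries, keys, and values (self-attention).
rng = np.random.default_rng(0)
T, d, h = 60, 32, 32
x = rng.standard_normal((T, d))
out_dot = dot_attention(x, x, x)
out_add = additive_attention(x, x, x,
                             rng.standard_normal((d, h)),
                             rng.standard_normal((d, h)),
                             rng.standard_normal(h))
print(out_dot.shape, out_add.shape)                         # (60, 32) (60, 32)
```

The speed difference the abstract reports is consistent with the shapes above: the dot score is a single matrix product, while the additive score builds a (Tq, Tk, h) intermediate and applies a nonlinearity before reducing it.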

Details

Language :
English
ISSN :
0885-6125
Volume :
112
Issue :
11
Database :
Complementary Index
Journal :
Machine Learning
Publication Type :
Academic Journal
Accession number :
173179371
Full Text :
https://doi.org/10.1007/s10994-021-06106-3