1. Enhancing Multivariate Time Series Classifiers Through Self-Attention and Relative Positioning Infusion
- Author
Mehryar Abbasi and Parvaneh Saeedi
- Subjects
Multivariate time series classification, positional information, temporal attention, time series analysis, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
Time Series Classification (TSC) is an important and challenging task for many visual computing applications. Despite the extensive range of methods developed for TSC, only a few are based on Deep Neural Networks (DNNs). In this paper, we present two novel attention blocks, Global Temporal Attention (GTA) and Temporal Pseudo-Gaussian augmented Self-attention (TPS), that can enhance deep learning-based TSC approaches, even when such approaches are designed and optimized for specific datasets or tasks. We validate the performance of the proposed blocks using multiple state-of-the-art deep learning-based TSC models on the University of East Anglia (UEA) benchmark, a standardized collection of 30 Multivariate Time Series Classification (MTSC) datasets. We demonstrate that adding the proposed attention blocks increases the base models' average accuracy by up to 3.6%. Additionally, the proposed TPS block uses a new injection module to incorporate relative positional information into transformers. As a standalone unit with lower computational complexity, TPS outperforms most state-of-the-art DNN-based TSC methods. The source code for our setups and the attention blocks is publicly available (https://github.com/mehryar72/TimeSeriesClassification-TPS).
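The abstract describes injecting relative positional information into self-attention. The sketch below is illustrative only and is not taken from the authors' released code: it shows one plausible way to add a pseudo-Gaussian relative-position bias to the attention logits over time steps. The class name, bias form, and hyper-parameters (`max_len`, `sigma`) are assumptions made for clarity.

```python
# Minimal sketch: self-attention over time steps with an additive,
# Gaussian-shaped relative-position bias (illustrative, not the paper's exact TPS block).
import torch
import torch.nn as nn


class PseudoGaussianSelfAttention(nn.Module):
    def __init__(self, dim: int, max_len: int, sigma: float = 8.0):
        super().__init__()
        self.qkv = nn.Linear(dim, 3 * dim)
        self.proj = nn.Linear(dim, dim)
        self.scale = dim ** -0.5
        # Absolute distances |i - j| between all pairs of time steps.
        pos = torch.arange(max_len)
        rel = (pos[None, :] - pos[:, None]).abs().float()
        # Gaussian-shaped bias: nearby time steps receive a larger additive score.
        self.register_buffer("bias", torch.exp(-(rel ** 2) / (2 * sigma ** 2)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels)
        b, t, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        attn = (q @ k.transpose(-2, -1)) * self.scale   # (batch, time, time)
        attn = attn + self.bias[:t, :t]                 # inject relative positional information
        attn = attn.softmax(dim=-1)
        return self.proj(attn @ v)


if __name__ == "__main__":
    layer = PseudoGaussianSelfAttention(dim=64, max_len=128)
    out = layer(torch.randn(2, 100, 64))  # 2 series, 100 time steps, 64 channels
    print(out.shape)                      # torch.Size([2, 100, 64])
```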
- Published
- 2024