Time-Varying Sequence Model.

Authors :
Jadhav, Sneha
Zhao, Jianxiang
Fan, Yepeng
Li, Jingjing
Lin, Hao
Yan, Chenggang
Chen, Minghan
Source :
Mathematics (2227-7390), Jan 2023, Vol. 11, Issue 2, p. 336. 15 pp.
Publication Year :
2023

Abstract

Traditional machine learning sequence models, such as RNNs and LSTMs, handle sequential data by maintaining internal memory states. However, their neuron units and weights are shared across time steps to reduce computational cost, which limits their ability to learn time-varying relationships between model inputs and outputs. In this context, this paper proposes two methods to characterize the dynamic relationships in real-world sequential data: the internal time-varying sequence model (ITV model) and the external time-varying sequence model (ETV model). Both methods use an automated basis expansion module to adapt internal or external parameters at each time step without incurring high computational complexity. Extensive experiments on synthetic and real-world data demonstrated superior prediction and classification performance compared with conventional sequence models. The proposed ETV model is particularly effective at handling long sequences. [ABSTRACT FROM AUTHOR]
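The abstract describes adapting a sequence model's parameters at each time step through a basis expansion, so that the per-step weights vary with time without storing a separate weight matrix for every step. The sketch below is a minimal illustration of that general idea, not the paper's actual ITV/ETV architecture: the weight matrices at step t are linear combinations of K learnable basis matrices, with coefficients given by polynomial basis functions of normalized time. All function and variable names here (`poly_basis`, `time_varying_rnn`, `W_basis`, `U_basis`) are hypothetical.

```python
import numpy as np

# Hedged sketch (assumed mechanism, not the paper's exact method):
# W(t) = sum_k phi_k(t) * W_k, where phi_k are fixed basis functions of
# the time step and W_k are learnable basis matrices. The model stores
# K weight tensors instead of one per step, so the parameter count is
# independent of sequence length while the effective weights still vary.

def poly_basis(t, T, K):
    """Evaluate K polynomial basis functions at normalized time t/(T-1)."""
    s = t / max(T - 1, 1)
    return np.array([s ** k for k in range(K)])  # shape (K,)

def time_varying_rnn(x, W_basis, U_basis, b):
    """Simple tanh RNN whose input and recurrent weights are
    basis expansions over the time step.

    x:       (T, d_in) input sequence
    W_basis: (K, d_h, d_in) basis matrices for the input weights
    U_basis: (K, d_h, d_h) basis matrices for the recurrent weights
    b:       (d_h,) bias
    """
    T = x.shape[0]
    K, d_h, _ = W_basis.shape
    h = np.zeros(d_h)
    states = []
    for t in range(T):
        phi = poly_basis(t, T, K)                 # (K,) coefficients
        W_t = np.tensordot(phi, W_basis, axes=1)  # (d_h, d_in) at step t
        U_t = np.tensordot(phi, U_basis, axes=1)  # (d_h, d_h) at step t
        h = np.tanh(W_t @ x[t] + U_t @ h + b)
        states.append(h)
    return np.stack(states)                       # (T, d_h)

rng = np.random.default_rng(0)
T, d_in, d_h, K = 12, 3, 4, 3
x = rng.normal(size=(T, d_in))
W_basis = rng.normal(scale=0.3, size=(K, d_h, d_in))
U_basis = rng.normal(scale=0.3, size=(K, d_h, d_h))
b = np.zeros(d_h)
H = time_varying_rnn(x, W_basis, U_basis, b)
print(H.shape)  # (12, 4)
```

With K fixed and small, the per-step cost is only K extra tensor contractions over a standard RNN cell, which is consistent with the abstract's claim of avoiding high computational complexity.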

Details

Language :
English
ISSN :
2227-7390
Volume :
11
Issue :
2
Database :
Academic Search Index
Journal :
Mathematics (2227-7390)
Publication Type :
Academic Journal
Accession number :
161478198
Full Text :
https://doi.org/10.3390/math11020336