Large Multivariate Time Series Forecasting: Survey on Methods and Scalability
- Author
- Youssef Hmamouche, Piotr Przymus, Lotfi Lakhal, Hana Alouaoui, Alain Casali (Laboratoire d'Informatique et Systèmes (LIS), Aix Marseille Université (AMU), Université de Toulon (UTLN), Centre National de la Recherche Scientifique (CNRS); Nicolaus Copernicus University, Toruń; Data Mining at scale (DANA))
- Subjects
- Process (engineering), business.industry, Computer science, Big data, Univariate, Context (language use), Machine learning, computer.software_genre, Variable (computer science), [STAT.ML]Statistics [stat]/Machine Learning [stat.ML], Business intelligence, Scalability, Artificial intelligence, Time series, business, computer
- Abstract
Research on the analysis of time series has gained momentum in recent years, as knowledge derived from time series analysis can improve decision-making in industrial and scientific fields. Furthermore, time series analysis is often an essential part of business intelligence systems. With the growing interest in this topic, a novel set of challenges emerges. Using forecasting models that can handle a large number of predictors is a popular approach that can improve results compared to univariate models. However, issues arise for high-dimensional data: not all variables have a direct impact on the target variable, and adding unrelated variables may make the forecasts less accurate. Thus, the authors explore methods that can effectively handle time series with many predictors, and discuss state-of-the-art methods for optimizing the selection, dimension reduction, and shrinkage of predictors. While similar research exists, it exclusively targets small and medium datasets; this survey therefore aims to fill the knowledge gap in the context of big data applications.
- Published
- 2019
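
The abstract above mentions three families of methods for forecasting with many predictors: selection, dimension reduction, and shrinkage. As a hedged illustration of the shrinkage idea only (not taken from the survey itself), the sketch below uses closed-form ridge regression in NumPy on synthetic data: weights of predictors unrelated to the target are pulled toward zero, while the few relevant ones keep large coefficients. All names and data here are invented for the example.

```python
import numpy as np

# Synthetic setup: n time steps, p candidate predictor series, of which
# only the first 3 truly influence the next-step target values.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.normal(size=(n, p))               # lagged predictor series
beta = np.zeros(p)
beta[:3] = [1.5, -2.0, 0.8]               # only 3 predictors matter
y = X @ beta + 0.1 * rng.normal(size=n)   # target series with noise

# Ridge (L2) shrinkage, closed form: w = (X'X + lam*I)^-1 X'y.
# The penalty lam shrinks all weights, damping unrelated predictors
# without discarding them outright (unlike hard selection).
lam = 5.0
w = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

print(np.round(w[:3], 2))       # close to the true coefficients
print(np.abs(w[3:]).max())      # unrelated predictors shrunk near zero
```

Selection methods would instead zero out weak predictors entirely (e.g. an L1 penalty), and dimension-reduction methods would replace the p predictors with a few latent factors before fitting; the shrinkage sketch is only one of the three strategies the abstract names.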