A comparative analysis of state-of-the-art time series forecasting algorithms

Cathal Murray, Priyanka Chaurasia, L.E. Hollywood, Damien Coyle

Research output: Chapter in a published conference proceeding


Abstract

In recent years many new algorithms have been developed for applications in speech and image processing which may be repurposed for time series prediction. This paper presents a comprehensive comparative analysis of the time series forecasting capabilities of eight such state-of-the-art algorithms, namely: the Vanilla Long Short-Term Memory (V-LSTM), Gated Recurrent Unit (GRU), Bidirectional LSTM (BD-LSTM), Autoencoder LSTM (AE-LSTM), Convolutional Neural Network LSTM (CNN-LSTM), LSTM with convolutional encoder (ConvLSTM), Attention mechanism networks and Transformer networks. Model performance is evaluated across ten benchmark datasets spanning fields of interest such as finance, weather and sales. Direct and iterative multi-step-ahead forecasting methods are also comprehensively evaluated. For comprehensive and efficient model optimization, the asynchronous successive halving algorithm (ASHA) is applied within the training folds of a 10-fold cross-validation framework. Statistical tests are used to compare algorithm performances within and across datasets. We show that whilst there are differences between all models, the differences are insignificant among the top-performing models, which include the Transformer, Attention, V-LSTM, CNN-LSTM and ConvLSTM. However, the Transformer model consistently produces the lowest prediction error. We also show that the iterative multi-step-ahead prediction method is optimal for long-range prediction with these new algorithms.
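
To illustrate the two forecasting strategies the abstract compares, the sketch below contrasts iterative and direct multi-step-ahead prediction on a toy series. This is not code from the paper: the linear one-step learner, the make_windows helper, and the window length (12) and horizon (6) are illustrative assumptions standing in for the deep learning architectures and benchmark datasets used in the study.

# Minimal sketch (assumptions, not the paper's code): iterative vs. direct
# multi-step-ahead forecasting with a simple windowed linear model.
import numpy as np
from sklearn.linear_model import LinearRegression

WINDOW, HORIZON = 12, 6  # illustrative choices

# Toy series: a noisy sine wave standing in for a benchmark dataset.
rng = np.random.default_rng(0)
series = np.sin(np.arange(300) / 10.0) + 0.05 * rng.standard_normal(300)

def make_windows(y, window, lead):
    """Supervised pairs: each input is `window` past values; the target
    is the value `lead` steps beyond the end of that window."""
    X, t = [], []
    for i in range(len(y) - window - lead + 1):
        X.append(y[i:i + window])
        t.append(y[i + window + lead - 1])
    return np.array(X), np.array(t)

# Iterative strategy: one model trained for 1-step-ahead prediction,
# rolled forward by feeding each prediction back in as the newest input.
X1, y1 = make_windows(series, WINDOW, lead=1)
one_step = LinearRegression().fit(X1, y1)

history = list(series[-WINDOW:])
iterative = []
for _ in range(HORIZON):
    yhat = one_step.predict(np.array(history[-WINDOW:])[None, :])[0]
    iterative.append(yhat)
    history.append(yhat)  # prediction becomes the next input value

# Direct strategy: a separate model per lead time, each mapping the same
# input window straight to its own horizon step.
direct = []
for lead in range(1, HORIZON + 1):
    Xh, yh = make_windows(series, WINDOW, lead)
    m = LinearRegression().fit(Xh, yh)
    direct.append(m.predict(series[-WINDOW:][None, :])[0])

print("iterative:", np.round(iterative, 3))
print("direct:   ", np.round(direct, 3))

In the paper's setting, the same contrast would apply with the LSTM, attention and Transformer variants taking the place of the one-step (or per-horizon) learners; the abstract's finding is that the iterative strategy works best for long-range prediction with those models.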
Original language: English
Title of host publication: International Conference on Computational Science and Computational Intelligence
Place of publication: United States
Publisher: IEEE Computational Intelligence Society
Publication status: Published - 25 Aug 2023

Bibliographical note

We are grateful for access to the Tier 2 High Performance Computing resources provided by the Northern Ireland High Performance Computing (NI-HPC) facility funded by the UK Engineering and Physical Sciences Research Council (EPSRC), Grant Nos. EP/T022175/ and EP/W03204X/1. DC is grateful for the UKRI Turing AI Fellowship 2021-2025 funded by the EPSRC (grant number EP/V025724/1). CM is supported by a Department for Economy Northern Ireland funded PhD Scholarship.

Keywords

  • time series prediction
  • forecasting
  • multi-horizon
  • attention
  • transformer
  • LSTM

