Exploring the sources of uncertainty: why does bagging for time series forecasting work?

Fotios Petropoulos, Rob J. Hyndman, Christoph Bergmeir

Research output: Contribution to journal › Article

Abstract

In a recent study, Bergmeir, Hyndman and Benítez (2016, Bagging exponential smoothing methods using STL decomposition and Box-Cox transformation, International Journal of Forecasting 32, 303-312) successfully employed a bootstrap aggregation (bagging) technique for improving the performance of exponential smoothing. Each series is Box-Cox transformed, and decomposed by Seasonal and Trend decomposition using Loess (STL); then bootstrapping is applied on the remainder series before the trend and seasonality are added back, and the transformation reversed to create bootstrapped versions of the series. Subsequently, they apply automatic exponential smoothing on the original series and the bootstrapped versions of the series, with the final forecast being the equal-weight combination across all forecasts. In this study we attempt to address the question: why does bagging for time series forecasting work? We assume three sources of uncertainty (model uncertainty, data uncertainty, and parameter uncertainty) and we separately explore the benefits of bagging for time series forecasting for each one of them. Our analysis considers 4,004 time series (from the M- and M3-competitions) and two families of models. The results show that the benefits of bagging predominantly originate from the model uncertainty: the fact that different models might be selected as optimal for the bootstrapped series. As such, a suitable weighted combination of the most suitable models should be preferred to selecting a single model.
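
The bagging procedure summarised in the abstract can be sketched in a few steps. Below is a minimal, hypothetical Python outline; the original work is implemented in R, so scipy and statsmodels are used here only as stand-ins, a simple moving block bootstrap replaces the exact resampling scheme, and a fixed additive exponential smoothing model replaces the automatic model selection used by Bergmeir et al. All function names and parameter choices are illustrative assumptions, not the authors' exact method.

# Hypothetical sketch of Box-Cox -> STL -> bootstrap -> ETS -> equal-weight combination.
import numpy as np
from scipy.stats import boxcox
from scipy.special import inv_boxcox
from statsmodels.tsa.seasonal import STL
from statsmodels.tsa.holtwinters import ExponentialSmoothing


def moving_block_bootstrap(remainder, block_size, rng):
    """Resample the STL remainder with a simple moving block bootstrap."""
    n = len(remainder)
    blocks = [remainder[i:i + block_size] for i in range(n - block_size + 1)]
    out = []
    while len(out) < n:
        out.extend(blocks[rng.integers(len(blocks))])
    return np.array(out[:n])


def bagged_ets_forecast(y, period, horizon, n_bootstraps=30, seed=0):
    """Equal-weight combination of ETS forecasts over the original and bootstrapped series."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)

    # 1. Box-Cox transform (series must be strictly positive) and STL decomposition.
    y_bc, lam = boxcox(y)
    stl = STL(y_bc, period=period).fit()

    series_versions = [y]  # the original series is always included
    for _ in range(n_bootstraps):
        # 2. Bootstrap the remainder, add trend and seasonality back, invert the Box-Cox transform.
        boot_resid = moving_block_bootstrap(stl.resid, block_size=2 * period, rng=rng)
        series_versions.append(inv_boxcox(stl.trend + stl.seasonal + boot_resid, lam))

    # 3. Fit an exponential smoothing model to each version and forecast.
    #    (Fixed additive components here; the original study selects the model automatically.)
    forecasts = []
    for series in series_versions:
        model = ExponentialSmoothing(series, trend="add",
                                     seasonal="add", seasonal_periods=period).fit()
        forecasts.append(model.forecast(horizon))

    # 4. Final forecast: equal-weight (mean) combination across all forecasts.
    return np.mean(forecasts, axis=0)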
Language: English
Pages: 545-554
Journal: European Journal of Operational Research
Volume: 268
Issue number: 2
Early online date: 2 Feb 2018
DOIs: 10.1016/j.ejor.2018.01.045
Status: Published - 16 Jul 2018


Cite this

Exploring the sources of uncertainty: why does bagging for time series forecasting work? / Petropoulos, Fotios; Hyndman, Rob J.; Bergmeir, Christoph.

In: European Journal of Operational Research, Vol. 268, No. 2, 16.07.2018, p. 545-554.

Research output: Contribution to journal › Article

@article{797efe1048b74687a67629000f3405b6,
title = "Exploring the sources of uncertainty: why does bagging for time series forecasting work?",
abstract = "In a recent study, Bergmeir, Hyndman and Ben´ıtez (2016, Bagging exponential smoothing methods using STL decomposition and Box-Cox transformation, International Journal of Forecasting 32, 303-312) successfully employed a bootstrap aggregation (bagging) technique for improving the performance of exponential smoothing. Each series is Box-Cox transformed, and decomposed by Seasonal and Trend decomposition using Loess (STL); then bootstrapping is applied on the remainder series before the trend and seasonality are added back, and the transformation reversed to create bootstrapped versions of the series. Subsequently, they apply automatic exponential smoothing on the original series and the bootstrapped versions of the series, with the final forecast being the equal-weight combination across all forecasts. In this study we attempt to address the question: why does bagging for time series forecasting work? We assume three sources of uncertainty (model uncertainty, data uncertainty, and parameter uncertainty) and we separately explore the benefits of bagging for time series forecasting for each one of them. Our analysis considers 4,004 time series (from the M- and M3-competitions) and two families of models. The results show that the benefits of bagging predominantly originate from the model uncertainty: the fact that different models might be selected as optimal for the bootstrapped series. As such, a suitable weighted combination of the most suitable models should be preferred to selecting a single model.",
author = "Fotios Petropoulos and Hyndman, {Rob J.} and Christoph Bergmeir",
year = "2018",
month = "7",
day = "16",
doi = "10.1016/j.ejor.2018.01.045",
language = "English",
volume = "268",
pages = "545--554",
journal = "European Journal of Operational Research",
issn = "0377-2217",
publisher = "Elsevier",
number = "2",

}

TY - JOUR

T1 - Exploring the sources of uncertainty: why does bagging for time series forecasting work?

AU - Petropoulos, Fotios

AU - Hyndman, Rob J.

AU - Bergmeir, Christoph

PY - 2018/7/16

Y1 - 2018/7/16

N2 - In a recent study, Bergmeir, Hyndman and Benítez (2016, Bagging exponential smoothing methods using STL decomposition and Box-Cox transformation, International Journal of Forecasting 32, 303-312) successfully employed a bootstrap aggregation (bagging) technique for improving the performance of exponential smoothing. Each series is Box-Cox transformed, and decomposed by Seasonal and Trend decomposition using Loess (STL); then bootstrapping is applied on the remainder series before the trend and seasonality are added back, and the transformation reversed to create bootstrapped versions of the series. Subsequently, they apply automatic exponential smoothing on the original series and the bootstrapped versions of the series, with the final forecast being the equal-weight combination across all forecasts. In this study we attempt to address the question: why does bagging for time series forecasting work? We assume three sources of uncertainty (model uncertainty, data uncertainty, and parameter uncertainty) and we separately explore the benefits of bagging for time series forecasting for each one of them. Our analysis considers 4,004 time series (from the M- and M3-competitions) and two families of models. The results show that the benefits of bagging predominantly originate from the model uncertainty: the fact that different models might be selected as optimal for the bootstrapped series. As such, a suitable weighted combination of the most suitable models should be preferred to selecting a single model.

AB - In a recent study, Bergmeir, Hyndman and Benítez (2016, Bagging exponential smoothing methods using STL decomposition and Box-Cox transformation, International Journal of Forecasting 32, 303-312) successfully employed a bootstrap aggregation (bagging) technique for improving the performance of exponential smoothing. Each series is Box-Cox transformed, and decomposed by Seasonal and Trend decomposition using Loess (STL); then bootstrapping is applied on the remainder series before the trend and seasonality are added back, and the transformation reversed to create bootstrapped versions of the series. Subsequently, they apply automatic exponential smoothing on the original series and the bootstrapped versions of the series, with the final forecast being the equal-weight combination across all forecasts. In this study we attempt to address the question: why does bagging for time series forecasting work? We assume three sources of uncertainty (model uncertainty, data uncertainty, and parameter uncertainty) and we separately explore the benefits of bagging for time series forecasting for each one of them. Our analysis considers 4,004 time series (from the M- and M3-competitions) and two families of models. The results show that the benefits of bagging predominantly originate from the model uncertainty: the fact that different models might be selected as optimal for the bootstrapped series. As such, a suitable weighted combination of the most suitable models should be preferred to selecting a single model.

U2 - 10.1016/j.ejor.2018.01.045

DO - 10.1016/j.ejor.2018.01.045

M3 - Article

VL - 268

SP - 545

EP - 554

JO - European Journal of Operational Research

T2 - European Journal of Operational Research

JF - European Journal of Operational Research

SN - 0377-2217

IS - 2

ER -