Do Principles Aid Forecasts?

A test of trained and untrained judgment using Rule-based Forecasting

Authors

  • Monica Adya
  • Edward J. Lusk
  • Moncef Balhadjali

Keywords:

Time Series, RBF, Non-Expert User Group, Forecasting Expertise

Abstract

Recent undertakings in forecasting have called for the assimilation of forecasting knowledge in the form of practical principles that can better inform practitioners, researchers, and educators. In this paper, we examine one aspect of this effort: whether the use of forecasting principles can improve forecast accuracy. Our study used forecasting principles modeled in Rule-based Forecasting (RBF) and explored the following fundamental question: can principles from RBF guide non-experts towards improved forecast accuracy? To investigate this, we compared forecasts generated by participants who received instruction in RBF procedures to forecasts generated by participants who did not. We also compared both sets of results to those generated by the automated RBF system. Using multiple error measures, we observed that forecasts from the trained group were about 16% to 23% more accurate than those from the untrained group. Forecasts from the automated RBF system were superior to those of the trained group by about 6-12% across the six forecast horizons. Responses to post-study questionnaires revealed that, while participants would have welcomed more training and exposure to RBF, they were satisfied with the level of instruction they received in the classroom setting.

Published

2018-10-02

Issue

Section

Article