Testing for Changes in Forecasting Performance
Pierre Perron and
No WP2019-03, Boston University - Department of Economics - Working Papers Series from Boston University - Department of Economics
We consider the issue of forecast failure (or breakdown) and propose methods to assess retrospectively whether a given forecasting model provides forecasts that show evidence of changes with respect to some loss function. We adapt the classical structural change tests to the forecast failure context. First, we recommend that all tests be carried out with a fixed forecasting scheme to achieve the best power. This ensures a maximum difference between the fitted in-sample and out-of-sample means of the losses and avoids contamination issues that arise under the rolling and recursive schemes. With a fixed scheme, Giacomini and Rossi's (2009) (GR) test is simply a Wald test for a one-time change in the mean of the total (the in-sample plus out-of-sample) losses at a known break date, say m, the value that separates the in-sample and out-of-sample periods. Such a test, however, can suffer from a non-monotonic power function. To alleviate this problem, we consider a variety of tests: maximizing the GR test over values of m within a pre-specified range; a Double sup-Wald (DSW) test which, for each m, performs a sup-Wald test for a change in the mean of the out-of-sample losses and takes the maximum of such tests over some range; we also propose to work directly with the total loss series to define the Total Loss sup-Wald (TLSW) and Total Loss UDmax (TLUD) tests. Using theoretical analyses and simulations, we show that with forecasting models potentially involving lagged dependent variables, the only tests having a monotonic power function for all data-generating processes considered are the DSW and TLUD tests, constructed with a fixed forecasting window scheme. Some explanations are provided, and empirical applications illustrate the relevance of our findings in practice.
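The DSW construction described in the abstract — for each candidate split m, run a sup-Wald test for a one-time mean change in the out-of-sample losses, then maximize over m — can be sketched as follows. This is a minimal illustration only: it uses a simple homoskedastic Wald statistic, whereas the paper's tests would require HAC-robust variance estimates for serially correlated losses, and the function names, trimming fraction, and grid of m values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def wald_mean_break(x, k):
    """Wald statistic for a one-time change in the mean of x at break point k.
    Uses a simple homoskedastic variance estimate (no HAC correction)."""
    T = len(x)
    m1, m2 = x[:k].mean(), x[k:].mean()
    # pooled variance under the alternative of a mean shift at k
    s2 = (np.sum((x[:k] - m1) ** 2) + np.sum((x[k:] - m2) ** 2)) / (T - 2)
    return (m1 - m2) ** 2 / (s2 * (1.0 / k + 1.0 / (T - k)))

def sup_wald(x, trim=0.15):
    """sup-Wald: maximize the break-point Wald statistic over a trimmed range
    of candidate break dates (trimming fraction is an illustrative choice)."""
    T = len(x)
    lo, hi = max(2, int(trim * T)), min(T - 2, int((1 - trim) * T))
    return max(wald_mean_break(x, k) for k in range(lo, hi + 1))

def dsw(losses, m_grid, trim=0.15):
    """Double sup-Wald (DSW) sketch: for each candidate split m (the end of the
    in-sample period under a fixed scheme), run a sup-Wald test on the
    out-of-sample losses and take the maximum over the grid of m values."""
    return max(sup_wald(losses[m:], trim) for m in m_grid)
```

For example, on a simulated loss series whose mean shifts upward partway through the out-of-sample period, `dsw` returns a much larger statistic than on a series with constant mean, which is the behavior the test exploits.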
Keywords: forecast breakdown; non-monotonic power; structural change; out-of-sample forecast
JEL-codes: C14 C22
Pages: 52 pages
Date: 2018-05, Revised 2018-12
New Economics Papers: this item is included in nep-ets, nep-for and nep-ore
Journal Article: Testing for Changes in Forecasting Performance (2021)
Working Paper: Testing for Changes in Forecasting Performance (2019)
Working Paper: Testing for Changes in Forecasting Performance (2018)
Persistent link: https://EconPapers.repec.org/RePEc:bos:wpaper:wp2019-003
More papers in Boston University - Department of Economics - Working Papers Series from Boston University - Department of Economics Contact information at EDIRC.
Bibliographic data for series maintained by Program Coordinator ().