Bootstrap prediction intervals for SETAR models
Jing Li
International Journal of Forecasting, 2011, vol. 27, issue 2, 320-332
Abstract:
This paper considers four methods for obtaining bootstrap prediction intervals (BPIs) for the self-exciting threshold autoregressive (SETAR) model. Method 1 ignores the sampling variability of the threshold parameter estimator. Method 2 corrects the finite sample biases of the autoregressive coefficient estimators before constructing BPIs. Method 3 takes into account the sampling variability of both the autoregressive coefficient estimators and the threshold parameter estimator. Method 4 resamples the residuals in each regime separately. A Monte Carlo experiment shows that (1) accounting for the sampling variability of the threshold parameter estimator is necessary, despite its super-consistency; (2) correcting the small-sample biases of the autoregressive parameter estimators improves the small-sample properties of bootstrap prediction intervals under certain circumstances; and (3) the two-sample bootstrap can improve the long-term forecasts when the error terms are regime-dependent.
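The core idea behind these methods can be illustrated with a minimal residual-bootstrap sketch, assuming a two-regime SETAR(2; 1, 1) model. This is not the paper's exact algorithm; all function names, parameter values, and the simple grid search are illustrative. Re-estimating both the threshold and the autoregressive coefficients on each bootstrap path reflects the spirit of Method 3 (accounting for the sampling variability of all estimators):

```python
import numpy as np

def simulate_setar(n, phi_low=0.5, phi_high=-0.3, r=0.0, sigma=1.0, seed=0):
    """Simulate a two-regime SETAR(2; 1, 1) series with threshold r."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        phi = phi_low if y[t - 1] <= r else phi_high
        y[t] = phi * y[t - 1] + sigma * rng.standard_normal()
    return y

def fit_setar(y):
    """Least-squares fit: grid-search the threshold over inner-sample
    quantiles, then fit an AR(1) without intercept in each regime."""
    x, z = y[:-1], y[1:]
    best = None
    for r in np.quantile(y, np.linspace(0.15, 0.85, 50)):
        low = x <= r
        if low.sum() < 5 or (~low).sum() < 5:
            continue  # require enough observations per regime
        phi_l = (x[low] @ z[low]) / (x[low] @ x[low])
        phi_h = (x[~low] @ z[~low]) / (x[~low] @ x[~low])
        resid = z - np.where(low, phi_l, phi_h) * x
        ssr = resid @ resid
        if best is None or ssr < best[0]:
            best = (ssr, r, phi_l, phi_h, resid)
    _, r, phi_l, phi_h, resid = best
    return r, phi_l, phi_h, resid - resid.mean()

def bootstrap_pi(y, h=5, B=500, alpha=0.10, seed=1):
    """Bootstrap prediction interval for y[T+h]: rebuild a bootstrap
    series from pooled resampled residuals (one-sample resampling),
    re-fit the model, then forecast h steps ahead."""
    rng = np.random.default_rng(seed)
    r, phi_l, phi_h, resid = fit_setar(y)
    paths = np.empty(B)
    for b in range(B):
        e = rng.choice(resid, size=len(y) + h)
        yb = np.empty(len(y))
        yb[0] = y[0]
        for t in range(1, len(y)):
            phi = phi_l if yb[t - 1] <= r else phi_h
            yb[t] = phi * yb[t - 1] + e[t]
        rb, pl, ph, _ = fit_setar(yb)      # re-estimate on each path
        f = y[-1]
        for j in range(h):
            phi = pl if f <= rb else ph
            f = phi * f + e[len(y) + j]
        paths[b] = f
    return np.quantile(paths, [alpha / 2, 1 - alpha / 2])

y = simulate_setar(200)
lo, hi = bootstrap_pi(y, h=5)
```

Method 4's two-sample variant would instead resample residuals separately within each regime, which matters when the error terms are regime-dependent; Method 1 would skip the per-path re-estimation of the threshold.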
Keywords: Bootstrap; Interval forecasting; SETAR models; Time series; Simulation
Date: 2011
Citations: 10 (in EconPapers)
Downloads: http://www.sciencedirect.com/science/article/pii/S0169207010000385 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:intfor:v:27:y:2011:i:2:p:320-332
DOI: 10.1016/j.ijforecast.2010.01.013
International Journal of Forecasting is currently edited by R. J. Hyndman
More articles in International Journal of Forecasting from Elsevier
Bibliographic data for series maintained by Catherine Liu.