Using High Frequency Data to Calculate, Model and Forecast Realized Volatility
Roel Oomen
No 75, Computing in Economics and Finance 2001 from Society for Computational Economics
The objective of this paper is to calculate, model, and forecast realized volatility using high-frequency stock-market index data. The approach differs from existing ones in several ways. First, it is shown that the decay of the serial dependence of high-frequency returns with the sampling frequency is consistent with an ARMA process under temporal aggregation. This is important both for modelling high-frequency returns and for choosing the optimal sampling frequency when calculating realized volatility. Second, based on several test statistics for long memory in realized volatility, it is found that the realized volatility series can be modelled as an ARFIMA process. The ARFIMA model's forecasting performance is assessed in a simulation study; although it outperforms representative GARCH models, it does so at a cost in complexity and data intensiveness that may not be worthwhile relative to GARCH's simplicity and flexibility.
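As a point of reference for the abstract's central quantity, realized volatility over a day is conventionally computed as the sum of squared intraday log returns sampled at some fixed frequency. The sketch below is a minimal illustration of that convention, not the paper's exact procedure; the 5-minute sampling grid and the simulated price path are assumptions made purely for the example.

```python
import numpy as np

def realized_volatility(prices):
    """Daily realized variance: sum of squared intraday log returns.

    `prices` is a 1-D array of intraday prices sampled at a fixed
    frequency (e.g. every 5 minutes over one trading day).
    """
    log_returns = np.diff(np.log(prices))
    return float(np.sum(log_returns ** 2))

# Illustrative only: simulate 78 five-minute log returns (a 6.5-hour
# session) and build the corresponding price path from 100.0.
rng = np.random.default_rng(0)
increments = rng.normal(0.0, 0.001, 78)
prices = 100.0 * np.exp(np.cumsum(np.concatenate(([0.0], increments))))

rv = realized_volatility(prices)
```

The paper's first contribution bears directly on the sampling step here: because high-frequency returns are serially dependent, the choice of sampling frequency (5 minutes above) affects the resulting realized volatility estimate.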
Keywords: High Frequency Data; Long Memory; GARCH; Realized Volatility
JEL-codes: C51 C52 C53 G12
Persistent link: https://EconPapers.repec.org/RePEc:sce:scecf1:75