Autoregressive Conditional Heteroskedasticity
Gebhard Kirchgässner and Jürgen Wolters
Chapter 7 in Introduction to Modern Time Series Analysis, Springer, 2007, pp. 241-265
Abstract: All models discussed so far use the conditional expectation to describe the mean development of one or more time series. The optimal forecast, in the sense that the variance of the forecast errors is minimised, is given by the conditional mean of the underlying model. Here, it is assumed that the residuals are not only uncorrelated but also homoskedastic, i.e. that the unexplained fluctuations have no dependencies in the second moments. However, Benoit Mandelbrot (1963) had already shown that financial market data have more outliers than would be compatible with the (usually assumed) normal distribution and that there are 'volatility clusters': small (large) shocks tend to be followed by small (large) shocks. This may lead to 'leptokurtic distributions', which, compared to a normal distribution, exhibit more mass at the centre and in the tails. This results in 'excess kurtosis', i.e. the kurtosis takes values above three.
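As a brief illustration of the quantities the abstract refers to (standard textbook notation, not necessarily the chapter's own): the kurtosis of a random variable $x$ with mean $\mu$ and variance $\sigma^2$ is $\kappa = E[(x-\mu)^4]/\sigma^4$; for a normal distribution $\kappa = 3$, so 'excess kurtosis' means $\kappa - 3 > 0$. The simplest model of the class the chapter treats, Engle's (1982) ARCH(1), lets the conditional variance depend on the last squared shock,
$$u_t = \varepsilon_t \sqrt{h_t}, \qquad h_t = \alpha_0 + \alpha_1 u_{t-1}^2, \qquad \varepsilon_t \sim \text{i.i.d. } N(0,1), \quad \alpha_0 > 0, \ 0 \le \alpha_1 < 1,$$
which generates volatility clusters and, provided the fourth moment exists, an unconditional distribution with excess kurtosis even though the conditional distribution is normal.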
Keywords: Conditional Variance; Leverage Effect; ARCH Effect; Excess Kurtosis; Volatility Cluster
Date: 2007
Related works:
Chapter: Autoregressive Conditional Heteroscedasticity (2013)
Persistent link: https://EconPapers.repec.org/RePEc:spr:sprchp:978-3-540-73291-4_7
Ordering information: This item can be ordered from http://www.springer.com/9783540732914
DOI: 10.1007/978-3-540-73291-4_7