A minimum Hellinger distance estimator for stochastic differential equations: An application to statistical inference for continuous time interest rate models
Ludovic Giet and Michel Lubrano
Computational Statistics & Data Analysis, 2008, vol. 52, issue 6, 2945-2965
Abstract:
A minimum disparity estimator minimizes a φ-divergence between the marginal density of a parametric model and its non-parametric estimate. This principle is applied to the estimation of stochastic differential equation models, with the Hellinger distance chosen as the particular φ-divergence. Under a stationarity hypothesis, the parametric marginal density is obtained by solving the Kolmogorov forward equation. Particular emphasis is put on the non-parametric estimation of the sample marginal density, which has to take into account sample dependence and kurtosis; a new window-size selection rule is provided. The classical estimator is presented alternatively as a distance minimizer and as a pseudo-likelihood maximizer; the latter presentation opens the way to Bayesian inference. The method is applied to continuous-time models of the interest rate. In particular, various models are compared using alternative tests and the results are discussed.
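As an illustration of the estimation principle described in the abstract, the following Python sketch fits the stationary marginal density of a CIR-type short-rate model by minimizing the Hellinger distance to a kernel density estimate. This is a minimal toy under stated assumptions (a Gamma stationary density parametrized by its identified shape and scale, a default kernel bandwidth, and simulated data); it is not the authors' implementation, and in particular it does not use the window-size rule proposed in the paper.

import numpy as np
from scipy.stats import gamma, gaussian_kde
from scipy.optimize import minimize

# Illustrative sketch (not the authors' code): minimum Hellinger distance
# fit of the stationary marginal density of a CIR short-rate model
# dr = kappa*(mu - r) dt + sigma*sqrt(r) dW.  Solving the Kolmogorov
# forward equation under stationarity gives a Gamma marginal with
#   shape a = 2*kappa*mu/sigma^2,  scale b = sigma^2/(2*kappa),
# so only (a, b) are identified from the marginal and are estimated here.

def hellinger_objective(params, grid, f_hat, dx):
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    f_theta = gamma.pdf(grid, a, scale=b)
    # Squared Hellinger distance, approximated on the grid.
    return 0.5 * np.sum((np.sqrt(f_hat) - np.sqrt(f_theta)) ** 2) * dx

# Simulated positive "interest rate" sample standing in for real data.
rng = np.random.default_rng(0)
r = rng.gamma(shape=5.0, scale=0.01, size=2000)

# Non-parametric marginal density.  A default Gaussian-kernel bandwidth is
# used here; the paper proposes its own window-size rule that accounts for
# sample dependence and kurtosis.
kde = gaussian_kde(r)
grid = np.linspace(1e-6, r.max() * 1.5, 1024)
dx = grid[1] - grid[0]
f_hat = kde(grid)

res = minimize(hellinger_objective, x0=np.array([1.0, r.mean()]),
               args=(grid, f_hat, dx), method="Nelder-Mead")
print("estimated Gamma (shape, scale):", res.x)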
Date: 2008
Full text (ScienceDirect subscribers only): http://www.sciencedirect.com/science/article/pii/S0167-9473(07)00392-1
Persistent link: https://EconPapers.repec.org/RePEc:eee:csdana:v:52:y:2008:i:6:p:2945-2965