Cressie-Read Power Divergence for Moment-Based Estimation: Hyperparameter and Finite-Sample Behavior
Jieun Lee and Anil K. Bera
Papers from arXiv.org
Abstract:
We study Cressie-Read power divergence (CRPD) estimation for moment-based models, focusing on finite-sample behavior. While generalized empirical likelihood estimators, which are dual to CRPD, are known to outperform generalized method of moments estimators in small to moderate samples, the power parameter is typically chosen arbitrarily by the researcher and treated as a mere index. We instead interpret it as a hyperparameter that determines the loss function and governs the learning procedure, shaping the curvature of the objective and influencing finite-sample performance. Using second-order asymptotics, we show that it affects both the structural estimator and the associated Lagrange multipliers, governing robustness, bias, and sensitivity to sampling variation. Monte Carlo simulations illustrate how estimator performance varies with the choice of the power parameter and underlying distributional features, with implications for second-order bias and coverage distortion. An empirical illustration based on Owen's (2001) classical example highlights the practical relevance of tuning the power parameter.
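As background for readers unfamiliar with the CRPD family, the sketch below computes the Cressie-Read divergence between a vector of probability weights and the uniform empirical weights 1/n, in one parameterization common in the generalized empirical likelihood literature (where the limit gamma -> -1 recovers empirical likelihood, gamma -> 0 exponential tilting, and gamma = 1 the Euclidean/continuous-updating case). The function name, the parameterization, and the mapping of special cases are illustrative assumptions, not taken from the paper, whose conventions may differ.

```python
import numpy as np

def cr_divergence(p, gamma, tol=1e-8):
    """Cressie-Read power divergence between probability weights p and the
    uniform weights 1/n:

        D_gamma(p) = sum_i p_i * ((n * p_i)**gamma - 1) / (gamma * (gamma + 1))

    The singular points gamma = 0 and gamma = -1 are handled by their
    analytic limits (KL divergences in the two directions)."""
    p = np.asarray(p, dtype=float)
    n = p.size
    if abs(gamma) < tol:          # limit gamma -> 0: KL(p || uniform), exponential tilting
        return float(np.sum(p * np.log(n * p)))
    if abs(gamma + 1.0) < tol:    # limit gamma -> -1: KL(uniform || p), empirical likelihood
        return float(-np.mean(np.log(n * p)))
    return float(np.sum(p * ((n * p) ** gamma - 1.0)) / (gamma * (gamma + 1.0)))

# The divergence is zero at the uniform weights and positive elsewhere,
# with gamma controlling the curvature of the criterion around 1/n --
# the "hyperparameter" role the abstract emphasizes.
uniform = np.ones(4) / 4
tilted = np.array([0.5, 0.3, 0.2])
print(cr_divergence(uniform, 0.5))   # 0.0
print(cr_divergence(tilted, 1.0))    # Euclidean case, positive for non-uniform weights
```

Varying `gamma` over a grid and comparing the resulting criteria is one simple way to see how the power parameter reshapes the objective before any estimation is done.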
Date: 2026-03
New Economics Papers: this item is included in nep-dcm
Downloads: http://arxiv.org/pdf/2603.22599 (application/pdf, latest version)
Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:2603.22599