Expected Kullback-Leibler-based characterizations of score-driven updates
Ramon Punder, Timo Dimitriadis and Rutger-Jan Lange
Papers from arXiv.org
Abstract:
Score-driven (SD) models are a standard tool in statistics and econometrics, with applications in hundreds of published articles in the past decade. We provide an information-theoretic characterization of SD updates based on reductions in the expected Kullback-Leibler (EKL) divergence relative to the true -- but unknown -- data-generating density. EKL reductions occur if and only if the expected update direction aligns with the expected score; i.e., their inner product should be positive. This equivalence condition uniquely identifies SD updates (including scaled or clipped variants) as being EKL reducing, even in non-concave, multivariate, and misspecified settings. We further derive explicit bounds on admissible learning rates in terms of score moments, linking SD methods to adaptive optimization techniques. By contrast, alternative performance measures in the literature impose stronger conditions (e.g., concave logarithmic densities) and do not characterize SD updates: other updating rules may improve these measures, while SD updates need not. Our results provide a rigorous justification for SD models and establish EKL as their natural information-theoretic foundation.
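To make the abstract's equivalence condition concrete, the following is a minimal numerical sketch in Python; it is not from the paper. It assumes a Gaussian working model N(f, 1) and a misspecified Student-t data-generating density, and the names (TRUE_LOC, LEARNING_RATE, score, loglik) are illustrative choices rather than the paper's notation. The sketch checks that the expected score-driven update direction has a positive inner product with the expected score, and that taking the expected step reduces the EKL divergence.

    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed setup (not from the paper): Gaussian working model N(f, 1),
    # data drawn from a misspecified Student-t truth with nonzero location.
    TRUE_LOC, TRUE_DF = 1.0, 5.0   # assumed data-generating parameters
    LEARNING_RATE = 0.1            # assumed score-driven step size

    def score(y, f):
        """Score of log N(y | f, 1) with respect to the parameter f."""
        return y - f

    def loglik(y, f):
        """Log-density of N(f, 1) evaluated at y."""
        return -0.5 * (y - f) ** 2 - 0.5 * np.log(2.0 * np.pi)

    f = -2.0                                              # current parameter
    y = rng.standard_t(TRUE_DF, size=200_000) + TRUE_LOC  # draws from the truth

    # Expected score and expected update direction under the true density;
    # the SD step is linear in the score, so E[f' - f] = lr * E[score].
    expected_score = score(y, f).mean()
    expected_direction = LEARNING_RATE * expected_score

    # Alignment condition from the abstract: the inner product (a scalar
    # product in this univariate sketch) is positive.
    print("inner product:", expected_direction * expected_score)

    # EKL(f) = E[log p_true(y)] - E[log p(y | f)]; the first term does not
    # depend on f, so any increase in E[log p(y | f)] is an EKL reduction.
    f_new = f + expected_direction
    print("EKL reduction:", loglik(y, f_new).mean() - loglik(y, f).mean())

Both printed quantities come out positive in this stylized example, consistent with the abstract's claim that alignment of the expected update direction with the expected score is exactly what delivers an EKL reduction, even though the Gaussian model is misspecified for the Student-t data.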
Date: 2024-08, Revised 2026-03
New Economics Papers: this item is included in nep-ecm
Downloads: http://arxiv.org/pdf/2408.02391 (latest version, application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:2408.02391