A new toolkit for robust distributional change detection
Anna‐Lena Kißlinger and Wolfgang Stummer
Applied Stochastic Models in Business and Industry, 2018, vol. 34, issue 5, 682-699
Abstract:
Divergences (distances), which measure the dissimilarity (respectively, proximity) between two probability distributions, have turned out to be very useful for several different tasks in statistics (e.g., parameter estimation and goodness‐of‐fit testing), econometrics, machine learning, information theory, etc. Some prominent examples are the Kullback‐Leibler information (relative entropy), the Csiszár‐Ali‐Silvey ϕ‐divergences, the "ordinary" (i.e., unscaled) Bregman divergences, and the recently developed, more general scaled Bregman divergences. Out of the latter, together with a novel extension to nonconvex generators, we form a new toolkit for detecting distributional changes in random data (streams and clouds). Some sample‐size asymptotics are investigated as well.
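To illustrate the kind of divergence-based change detection the abstract refers to, the following is a minimal, self-contained sketch (not the authors' toolkit): it computes a discrete scaled Bregman divergence B_φ(P, Q | M) = Σ m(x)·[φ(p/m) − φ(q/m) − φ'(q/m)(p/m − q/m)] with the KL-type generator φ(t) = t log t − t + 1, scaled by a reference histogram, and flags a change in a data stream when a sliding-window divergence exceeds a threshold. All function names, window sizes, and the threshold are illustrative assumptions; the paper's method additionally covers nonconvex generators and derives sample-size asymptotics, neither of which is reproduced here.

```python
# Illustrative sketch only: a discrete scaled Bregman divergence and a naive
# sliding-window change detector. Generator, scaling choice, windows, and
# threshold are assumptions for demonstration, not the authors' toolkit.
import numpy as np

def scaled_bregman(p, q, m, phi, dphi, eps=1e-12):
    """B_phi(P, Q | M) = sum_x m(x) * [phi(p/m) - phi(q/m) - dphi(q/m)*(p/m - q/m)]."""
    p, q, m = (np.asarray(a, dtype=float) + eps for a in (p, q, m))
    s, t = p / m, q / m
    return float(np.sum(m * (phi(s) - phi(t) - dphi(t) * (s - t))))

# KL-type convex generator phi(t) = t*log(t) - t + 1, with phi(1) = 0.
phi = lambda t: t * np.log(t) - t + 1.0
dphi = lambda t: np.log(t)

def detect_change(stream, ref_window=200, test_window=200, bins=20, threshold=0.1):
    """Return the first window start where the divergence between the reference
    histogram and a sliding test histogram exceeds `threshold` (toy rule only)."""
    ref = np.asarray(stream[:ref_window])
    edges = np.histogram_bin_edges(ref, bins=bins)
    p_ref, _ = np.histogram(ref, bins=edges)
    p_ref = p_ref / p_ref.sum()
    for start in range(ref_window, len(stream) - test_window + 1):
        win = np.asarray(stream[start:start + test_window])
        q, _ = np.histogram(win, bins=edges)
        q = q / max(q.sum(), 1)
        # Scaling by the reference distribution (M = P_ref) makes this divergence
        # reduce to a KL-type distance between the test window and the reference.
        d = scaled_bregman(q, p_ref, p_ref, phi, dphi)
        if d > threshold:
            return start, d
    return None, None

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stream with a mean shift after 600 observations.
    data = np.concatenate([rng.normal(0, 1, 600), rng.normal(1.5, 1, 400)])
    print(detect_change(data))
```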
Date: 2018
Downloads: https://doi.org/10.1002/asmb.2357
Persistent link: https://EconPapers.repec.org/RePEc:wly:apsmbi:v:34:y:2018:i:5:p:682-699