Information Theoretic Criteria
Badong Chen,
Lujuan Dang,
Nanning Zheng and
Jose C. Principe
Additional contact information
Badong Chen: National Engineering Research Center for Visual Information and Applications, and Institute of Artificial Intelligence and Robotics, Xi’an Jiaotong University, National Key Laboratory of Human-Machine Hybrid Augmented Intelligence
Lujuan Dang: National Engineering Research Center for Visual Information and Applications, and Institute of Artificial Intelligence and Robotics, Xi’an Jiaotong University, National Key Laboratory of Human-Machine Hybrid Augmented Intelligence
Nanning Zheng: National Engineering Research Center for Visual Information and Applications, and Institute of Artificial Intelligence and Robotics, Xi’an Jiaotong University, National Key Laboratory of Human-Machine Hybrid Augmented Intelligence
Jose C. Principe: University of Florida, Electrical and Computer Engineering Department
Chapter 3 in Kalman Filtering Under Information Theoretic Criteria, 2023, pp 53-87, from Springer
Abstract:
The optimization criteria of information theoretic learning (ITL) have gained increasing attention in recent years. They use information theoretic quantities (e.g., entropy or correntropy) estimated directly from the data as optimization costs, instead of the usual second-order statistical measures such as variance and covariance. In ITL, correntropy is a local similarity measure: it concerns only the part of the error PDF that falls within the kernel size, so it is naturally a robust cost, and the resulting criterion is the maximum correntropy criterion (MCC). Entropy, another key quantity in ITL, measures the uncertainty of the error; the corresponding criterion is minimum error entropy (MEE), which reduces error uncertainty by minimizing Renyi's quadratic entropy. Whereas MCC handles heavy-tailed non-Gaussian noises, MEE improves model prediction under more complex non-Gaussian noises, such as noises from multimodal distributions. To obtain an optimal solution, MEE sometimes requires manually adding a bias to the model to yield zero-mean error. To adjust the error mean more naturally, MEE with fiducial points (MEEF) was proposed, which automatically anchors the error mean around zero. This chapter briefly reviews MCC, MEE, and MEEF.
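To make the three criteria concrete, below is a minimal NumPy sketch (not taken from the chapter) of the standard sample estimators for the MCC, MEE, and MEEF costs from a vector of prediction errors. The kernel size sigma and the fiducial-point weight lam are illustrative parameters, not values prescribed by the authors.

```python
import numpy as np

def gaussian_kernel(x, sigma):
    """Gaussian kernel commonly used in ITL estimators."""
    return np.exp(-x**2 / (2.0 * sigma**2))

def mcc_cost(e, sigma=1.0):
    """Maximum correntropy criterion: average kernel evaluated at the errors.
    Maximizing this keeps errors small while discounting large outliers."""
    return np.mean(gaussian_kernel(e, sigma))

def information_potential(e, sigma=1.0):
    """Quadratic information potential: average kernel over all error pairs.
    Renyi's quadratic entropy is -log(IP), so minimizing entropy maximizes the IP."""
    diffs = e[:, None] - e[None, :]
    return np.mean(gaussian_kernel(diffs, sigma))

def mee_cost(e, sigma=1.0):
    """Minimum error entropy criterion expressed as Renyi's quadratic entropy."""
    return -np.log(information_potential(e, sigma))

def meef_cost(e, sigma=1.0, lam=0.5):
    """MEE with fiducial points: a weighted mix of correntropy about the
    fiducial point (zero error) and the pairwise information potential,
    anchoring the error mean near zero without a manually added bias."""
    return lam * mcc_cost(e, sigma) + (1.0 - lam) * information_potential(e, sigma)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Errors from a bimodal mixture, the kind of non-Gaussian noise the
    # abstract says MEE/MEEF handle better than MCC alone.
    e = np.concatenate([rng.normal(-1.0, 0.3, 200), rng.normal(1.5, 0.3, 200)])
    print("MCC cost :", mcc_cost(e))
    print("MEE cost :", mee_cost(e))
    print("MEEF cost:", meef_cost(e))
```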
Keywords: Information theoretic learning; Maximum correntropy criterion; Minimum error entropy; MEE with fiducial points
Date: 2023
Persistent link: https://EconPapers.repec.org/RePEc:spr:sprchp:978-3-031-33764-2_3
Ordering information: This item can be ordered from
http://www.springer.com/9783031337642
DOI: 10.1007/978-3-031-33764-2_3