The Journey from Entropy to Generalized Maximum Entropy
Amjad Al-Nasser
Journal of Quantitative Methods, 2019, vol. 3, issue 1, 1-7
Abstract:
Currently, we are witnessing a revolution in huge data resources that must be analyzed carefully in order to draw the right conclusions about world problems. Such big data are statistically risky because the data are a combination of (useful) signals and (useless) noise, which are unorganized facts that need to be filtered and processed. Using only the signals and discarding the noise means that the data are restructured and reorganized to become useful; the result is called information. So, for any data set, we need only the information. In the context of information theory, entropy is used as a statistical measure to quantify the maximum amount of information in a random event.
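As an illustration of that last point (a minimal sketch, not taken from the paper), the code below computes the Shannon entropy H(p) = -sum_i p_i log p_i of a discrete distribution and checks that the uniform distribution attains the maximum value log(n), i.e. the "maximum amount of information" in a random event. The probability vectors and function name are assumptions chosen for the example.

    # Minimal sketch (illustrative, not from the paper): Shannon entropy of a
    # discrete distribution, H(p) = -sum_i p_i * log(p_i), in nats.
    import math

    def shannon_entropy(p):
        """Entropy (in nats) of a discrete probability vector p; zero entries contribute 0."""
        return -sum(pi * math.log(pi) for pi in p if pi > 0)

    skewed = [0.7, 0.2, 0.1]      # concentrated distribution: lower entropy
    uniform = [1/3, 1/3, 1/3]     # maximally uncertain: entropy = log(3)

    print(shannon_entropy(skewed))   # about 0.80 nats
    print(shannon_entropy(uniform))  # about 1.0986 nats = log(3), the maximum for n = 3

The uniform case is the unconstrained maximum-entropy solution; adding data-based (moment) constraints to this maximization is what leads to the Maximum Entropy and Generalized Maximum Entropy formulations discussed in the article.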
Keywords: Entropy; Generalized Maximum Entropy; Maximum Entropy (ME); Mathematical Programming Problem
JEL-codes: C46
Date: 2019
Downloads: https://ojs.umt.edu.pk/index.php/jqm/article/view/32/21 Full text (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:ris:jqmumt:0018
Journal of Quantitative Methods is currently edited by Sajid Ali
More articles in Journal of Quantitative Methods from the Department of Quantitative Methods, School of Business and Economics, University of Management and Technology, Lahore, Pakistan.
Bibliographic data for series maintained by Romila Qamar.