EconPapers
On the Use of Variability Measures to Analyze Source Coding Data Based on the Shannon Entropy

Helio M. de Oliveira, Raydonal Ospina, Carlos Martin-Barreiro, Víctor Leiva and Christophe Chesneau
Additional contact information
Helio M. de Oliveira: Department of Statistics, CASTLab, Universidade Federal de Pernambuco, Recife 50670-901, Brazil
Raydonal Ospina: Department of Statistics, CASTLab, Universidade Federal de Pernambuco, Recife 50670-901, Brazil
Carlos Martin-Barreiro: Faculty of Natural Sciences and Mathematics, Escuela Superior Politécnica del Litoral (ESPOL), Guayaquil 090902, Ecuador
Víctor Leiva: School of Industrial Engineering, Pontificia Universidad Católica de Valparaíso, Valparaíso 2362807, Chile
Christophe Chesneau: Department of Mathematics, Université de Caen-Normandie, 14032 Caen, France

Mathematics, 2023, vol. 11, issue 2, 1-16

Abstract: Source coding maps elements from an information source to a sequence of alphabetic symbols, from which the source symbols can then be recovered exactly. In this paper, we derive an approach that incorporates information variation into source coding, making it more realistic than the standard version. We employ the Shannon entropy for coding the sequences of a source. Our approach is also helpful for short sequences, where the central limit theorem does not apply. We rely on a quantifier of the information variation of a source. This quantifier corresponds to the second central moment of a random variable that measures the information content of a source symbol; that is, the variance of the information content, with the standard deviation as its square root. An interpretation of typical sequences is also provided through this approach. We illustrate it with a binary memoryless source. In addition, Monte Carlo simulation studies are conducted to evaluate the performance of our approach. We apply this approach to two real datasets related to purity and wheat prices in Brazil.
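The quantities described in the abstract can be sketched numerically. The example below is a minimal illustration, not the authors' implementation: for a binary memoryless source emitting one symbol with probability p, it computes the Shannon entropy (the mean of the information content −log₂ P(X)) and the information variation the abstract refers to (the second central moment of that same random variable). The function name is hypothetical.

```python
import math

def entropy_and_varentropy(p):
    """Shannon entropy (bits) and information variance for a binary
    memoryless source that emits symbol 1 with probability p."""
    probs = [p, 1.0 - p]
    # Information content of each symbol: I(x) = -log2 P(x)
    info = [-math.log2(q) for q in probs]
    # Entropy is the expectation of I; the information variation is
    # the second central moment of I (its variance).
    h = sum(q * i for q, i in zip(probs, info))
    var = sum(q * (i - h) ** 2 for q, i in zip(probs, info))
    return h, var

h, v = entropy_and_varentropy(0.5)
# A fair binary source: entropy 1 bit, information variance 0,
# since both symbols carry exactly 1 bit of information.
```

For a biased source (p ≠ 0.5) the two symbols carry different amounts of information, so the variance is strictly positive; this is the spread that the paper's approach accounts for when the central limit theorem is not applicable to short sequences.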

Keywords: communication science; discrete memoryless source; entropy; information theory; Monte Carlo simulation; Newton–Raphson method; statistical moments; variance
JEL-codes: C
Date: 2023

Downloads: (external link)
https://www.mdpi.com/2227-7390/11/2/293/pdf (application/pdf)
https://www.mdpi.com/2227-7390/11/2/293/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:11:y:2023:i:2:p:293-:d:1026782


Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

Page updated 2025-03-19
Handle: RePEc:gam:jmathe:v:11:y:2023:i:2:p:293-:d:1026782