
Regularities Unseen, Randomness Observed: Levels of Entropy Convergence

James P. Crutchfield and David P. Feldman

Working Papers from Santa Fe Institute

Abstract: We study how the Shannon entropy of sequences produced by an information source converges to the source's entropy rate. Using a hierarchy of derivatives of Shannon entropy convergence, we synthesize several phenomenological approaches to applying information-theoretic measures of randomness and memory to stochastic and deterministic processes. This leads, in turn, to natural measures of (i) the apparent memory stored in a source and (ii) the amounts of information that must be extracted from observations of a source in order (a) for it to be optimally predicted and (b) for an observer to synchronize to it. One consequence of ignoring these structural properties is that the missed regularities are converted to apparent randomness. We demonstrate that this problem arises particularly for small data sets, e.g., in settings where one has access to relatively few, short measurement sequences.
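The convergence the abstract describes can be illustrated numerically: the Shannon entropy H(L) of length-L blocks grows with L, and its discrete derivative h(L) = H(L) - H(L-1) converges from above to the source's entropy rate. Below is a minimal sketch in Python; the choice of example source (the golden-mean process, in which no two 1s occur consecutively) and the plug-in entropy estimator are illustrative assumptions, not details taken from this page.

```python
import math
import random
from collections import Counter

def block_entropy(seq, L):
    """Plug-in estimate (in bits) of the Shannon entropy of length-L blocks."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def golden_mean_sequence(n, seed=0):
    """Sample the golden-mean process: after a 1 emit 0; after a 0 emit 0 or 1 equiprobably."""
    rng = random.Random(seed)
    seq, prev = [], 0
    for _ in range(n):
        bit = 0 if prev == 1 else rng.randint(0, 1)
        seq.append(bit)
        prev = bit
    return seq

seq = golden_mean_sequence(200_000)

# Block entropies H(L) for L = 1..8, with H(0) = 0 by convention.
H = [0.0] + [block_entropy(seq, L) for L in range(1, 9)]

# Entropy-rate estimates h(L) = H(L) - H(L-1) converge from above
# to the true entropy rate (2/3 bit per symbol for this process).
h = [H[L] - H[L - 1] for L in range(1, 9)]
```

Because the golden-mean process is a first-order Markov chain, h(L) settles at the entropy rate already at L = 2; for sources with longer-range memory the convergence is slower, and truncating the estimate at small L inflates the apparent randomness, as the abstract notes.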

Date: 2001-02
New Economics Papers: this item is included in nep-evo
Citations: View citations in EconPapers (1)

There are no downloads for this item; see the EconPapers FAQ for hints on obtaining it.



Persistent link: https://EconPapers.repec.org/RePEc:wop:safiwp:01-02-012


More papers in Working Papers from Santa Fe Institute.
Bibliographic data for this series is maintained by Thomas Krichel.

 
Page updated 2025-03-22
Handle: RePEc:wop:safiwp:01-02-012