Some Information Theoretic Ideas Useful in Statistical Inference
Takis Papaioannou (University of Piraeus),
Kosmas Ferentinos (University of Ioannina) and
Charalampos Tsairidis (Democritus University of Thrace)
Methodology and Computing in Applied Probability, 2007, vol. 9, issue 2, 307-323
Abstract:
In this paper we discuss four information theoretic ideas and present their implications for statistical inference: (1) Fisher information and divergence generating functions, (2) information optimum unbiased estimators, (3) the information content of various statistics, and (4) characterizations based on Fisher information.
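The first idea listed in the abstract concerns Fisher information. As a minimal illustrative sketch (not taken from the paper itself), the Fisher information of a Bernoulli(p) model can be computed analytically and checked against its definition as the expected squared score:

```python
def fisher_information_bernoulli(p):
    # Analytic Fisher information for Bernoulli(p): I(p) = 1 / (p * (1 - p))
    return 1.0 / (p * (1.0 - p))

def fisher_information_from_score(p):
    # Definition-based check: I(p) = E[(d/dp log f(X; p))^2].
    # The score of Bernoulli(p) at x is x/p - (1 - x)/(1 - p).
    score_at_1 = 1.0 / p            # score when X = 1 (probability p)
    score_at_0 = -1.0 / (1.0 - p)   # score when X = 0 (probability 1 - p)
    return p * score_at_1 ** 2 + (1.0 - p) * score_at_0 ** 2
```

The two functions agree for any p in (0, 1), since p·(1/p)² + (1−p)·(1/(1−p))² = 1/p + 1/(1−p) = 1/(p(1−p)).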
Keywords: Information generating function; Information optimum estimation; Information content; Acid test properties; Quantal random censoring; Koziol–Green model; Truncated data; Characterizations of Fisher information; Primary 62B10; Secondary 94A17
Date: 2007
Downloads: http://link.springer.com/10.1007/s11009-007-9017-7 (abstract, text/html; full-text access is restricted)
Persistent link: https://EconPapers.repec.org/RePEc:spr:metcap:v:9:y:2007:i:2:d:10.1007_s11009-007-9017-7
Ordering information: https://www.springer.com/journal/11009
DOI: 10.1007/s11009-007-9017-7
Methodology and Computing in Applied Probability is currently edited by Joseph Glaz
More articles in Methodology and Computing in Applied Probability from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.