
Worst-case bounds for the logarithmic loss of predictors

Nicolò Cesa-Bianchi and Gábor Lugosi

Economics Working Papers from Department of Economics and Business, Universitat Pompeu Fabra

Abstract: We investigate on-line prediction of individual sequences. Given a class of predictors, the goal is to predict as well as the best predictor in the class, where the loss is measured by the self information (logarithmic) loss function. The excess loss (regret) is closely related to the redundancy of the associated lossless universal code. Using Shtarkov's theorem and tools from empirical process theory, we prove a general upper bound on the best possible (minimax) regret. The bound depends on certain metric properties of the class of predictors. We apply the bound to both parametric and nonparametric classes of predictors. Finally, we point out a suboptimal behavior of the popular Bayesian weighted average algorithm.
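The minimax regret described in the abstract is characterized by Shtarkov's theorem as the logarithm of a sum, over all sequences, of the maximized likelihood. A minimal Python sketch for the Bernoulli (coin-flipping) predictor class illustrates the quantity and its roughly (1/2) log n parametric growth; the function name and the restriction to the Bernoulli class are illustrative choices, not taken from the paper:

```python
import math

def shtarkov_regret(n):
    """Minimax regret (in nats) of the Bernoulli class on binary
    sequences of length n, via Shtarkov's theorem:
        regret = log sum_{x^n} max_theta p_theta(x^n).
    Sequences are grouped by k, the number of ones; the maximum-
    likelihood parameter for such a sequence is theta = k / n."""
    total = 0.0
    for k in range(n + 1):
        # (k/n)^k * ((n-k)/n)^(n-k); Python's 0**0 == 1 handles k = 0 and k = n.
        max_lik = (k / n) ** k * ((n - k) / n) ** (n - k)
        total += math.comb(n, k) * max_lik
    return math.log(total)

print(shtarkov_regret(1))    # log 2 ≈ 0.693: one bit, two sequences, each with max likelihood 1
print(shtarkov_regret(100))  # grows like (1/2) log n, the parametric rate
```

Grouping sequences by their number of ones keeps the sum to n + 1 terms instead of 2^n, which is what makes the Shtarkov sum computable for this class.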

Keywords: Universal prediction; universal coding; empirical processes; on-line learning; metric entropy
JEL-codes: C1; C13
Date: 1999-10
References: complete reference list available from CitEc
Citations: 2 (in EconPapers)

Downloads: https://econ-papers.upf.edu/papers/418.pdf (whole paper, application/pdf)



Persistent link: https://EconPapers.repec.org/RePEc:upf:upfgen:418



 
Page updated 2025-04-01