Undiscounted Bandit Games

Godfrey Keller and Sven Rady

Papers from arXiv.org

Abstract: We analyze undiscounted continuous-time games of strategic experimentation with two-armed bandits. The risky arm generates payoffs according to a Lévy process with an unknown average payoff per unit of time which nature draws from an arbitrary finite set. Observing all actions and realized payoffs, plus a free background signal, players use Markov strategies with the common posterior belief about the unknown parameter as the state variable. We show that the unique symmetric Markov perfect equilibrium can be computed in a simple closed form involving only the payoff of the safe arm, the expected current payoff of the risky arm, and the expected full-information payoff, given the current belief. In particular, the equilibrium does not depend on the precise specification of the payoff-generating processes.
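The three belief-dependent quantities the abstract refers to can be illustrated with a short sketch. The equilibrium's closed form itself is not reproduced here; the code below only computes the three inputs the abstract names, under assumed notation: a safe flow payoff s, a posterior p over a finite set of possible risky-arm means, m(p) as the expected current payoff of the risky arm, and f(p) as the expected full-information payoff (once the true mean is known, a player simply takes the better of the two arms). All function names and the numerical example are illustrative, not from the paper.

```python
# Illustrative sketch of the objects named in the abstract (assumed notation,
# not the paper's actual equilibrium formula).

def expected_current_payoff(p, means):
    """m(p): posterior-expected flow payoff of the risky arm."""
    return sum(pi * mu for pi, mu in zip(p, means))

def full_information_payoff(p, means, s):
    """f(p): expected payoff if the true mean were revealed, after which
    the player picks the better arm, i.e. earns max(mu, s) in each state."""
    return sum(pi * max(mu, s) for pi, mu in zip(p, means))

# Hypothetical example: safe payoff s = 1, risky mean drawn from {0.5, 2.0},
# current common posterior p = (0.6, 0.4).
s = 1.0
means = [0.5, 2.0]
p = [0.6, 0.4]

m = expected_current_payoff(p, means)     # 0.6*0.5 + 0.4*2.0 = 1.1
f = full_information_payoff(p, means, s)  # 0.6*1.0 + 0.4*2.0 = 1.4
```

Note that f(p) >= max(m(p), s) always holds, since resolving the uncertainty can only help; the gap f(p) - s is one natural measure of the value of experimentation at belief p.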

Date: 2019-09, Revised 2020-08
New Economics Papers: this item is included in nep-gth and nep-mic

Published in Games and Economic Behavior 124 (2020) 43-61

Downloads: (external link)
http://arxiv.org/pdf/1909.13323 Latest version (application/pdf)

Related works:
Journal Article: Undiscounted bandit games (2020) Downloads
Working Paper: Undiscounted Bandit Games (2020) Downloads
Working Paper: Undiscounted Bandit Games (2019) Downloads
Working Paper: Undiscounted Bandit Games (2019) Downloads
Working Paper: UNDISCOUNTED BANDIT GAMES (2019) Downloads


Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:1909.13323



Handle: RePEc:arx:papers:1909.13323