Reinforcement learning in population games
Ratul Lahkar and Robert M. Seymour
Games and Economic Behavior, 2013, vol. 80, issue C, 10-38
Abstract:
We study reinforcement learning in a population game. Agents in a population game revise mixed strategies using the Cross rule of reinforcement learning. The population state—the probability distribution over the set of mixed strategies—evolves according to the replicator continuity equation which, in its simplest form, is a partial differential equation. The replicator dynamic is a special case in which the initial population state is homogeneous, i.e., when all agents use the same mixed strategy. We apply the continuity dynamic to various classes of symmetric games. Using 3×3 coordination games, we show that equilibrium selection depends on the variance of the initial strategy distribution, or initial population heterogeneity. We give an example of a 2×2 game in which heterogeneity persists even as the mean population state converges to a mixed equilibrium. Finally, we apply the dynamic to negative definite and doubly symmetric games.
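The Cross rule mentioned in the abstract reinforces the action an agent just played, shifting probability mass toward it in proportion to the realized payoff. A minimal sketch of such a heterogeneous population process, under assumptions not taken from the paper (the payoff matrix, matching against the mean field, and all parameters are illustrative; payoffs are scaled to [0, 1], as the Cross rule requires):

```python
import numpy as np

# Hypothetical symmetric 2x2 coordination game with payoffs in [0, 1].
# This matrix is illustrative, not an example from the paper.
A = np.array([[1.0, 0.0],
              [0.0, 0.5]])

rng = np.random.default_rng(0)

def cross_update(p, action, payoff):
    """Cross rule: after playing `action` and receiving `payoff` in [0, 1],
    shift probability mass toward that action in proportion to the payoff."""
    q = p * (1.0 - payoff)   # discount every action's probability
    q[action] += payoff      # reinforce the action just played
    return q                 # still a probability vector (sums to 1)

# Heterogeneous initial population: each agent carries its own mixed strategy.
n_agents, n_rounds = 500, 200
x = rng.uniform(0.2, 0.8, size=n_agents)       # prob. of playing action 0
pop = np.column_stack([x, 1.0 - x])

for _ in range(n_rounds):
    mean = pop.mean(axis=0)                    # mean population state
    acts = (rng.random(n_agents) > pop[:, 0]).astype(int)
    opps = (rng.random(n_agents) > mean[0]).astype(int)  # match vs. mean field
    pays = A[acts, opps]
    for k in range(n_agents):
        pop[k] = cross_update(pop[k], acts[k], pays[k])

print(pop.mean(axis=0))   # mean strategy; pop.var(axis=0) tracks heterogeneity
```

With a homogeneous initial state the mean of this stochastic process tracks the replicator dynamic; with a heterogeneous initial strategy distribution, the paper shows the population state evolves by the replicator continuity equation.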
Keywords: Reinforcement learning; Continuity equation; Replicator dynamics
JEL codes: C72; C73
Date: 2013
Citations: 7
Downloads: http://www.sciencedirect.com/science/article/pii/S0899825613000286 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:gamebe:v:80:y:2013:i:c:p:10-38
DOI: 10.1016/j.geb.2013.02.006
Games and Economic Behavior is currently edited by E. Kalai