EconPapers    

Multi-Armed Bandit Processes

Xiaoqiang Cai, Xianyi Wu and Xian Zhou
Additional contact information
Xiaoqiang Cai: The Chinese University of Hong Kong
Xianyi Wu: East China Normal University
Xian Zhou: Macquarie University

Chapter 6 in Optimal Stochastic Scheduling, 2014, pp. 225-252, from Springer

Abstract: This chapter studies a powerful tool for stochastic scheduling: theoretically elegant multi-armed bandit processes, used to maximize expected total discounted rewards. Multi-armed bandit models form a particular class of optimal resource allocation problems, in which a number of machines or processors are allocated to serve a set of competing projects (arms). We introduce the classical theory of multi-armed bandit processes in Section 6.1 and consider open bandit processes, in which infinitely many arms are allowed, in Section 6.2. An extension to generalized open bandit processes is given in Section 6.3. Finally, a concise account of closed bandit processes in continuous time is presented in Section 6.4.
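The index policies referred to in the keywords can be illustrated in their simplest setting. The sketch below is not from the chapter: it uses the standard reduction of the Gittins index for an arm with a known, deterministic reward sequence, where the index equals the maximum over horizons of the ratio of discounted rewards to discounted time, and an index policy that at each decision epoch plays the arm with the largest current index while the other arms stay frozen. Function names and the toy arms are illustrative.

```python
# Hedged sketch: Gittins indices for deterministic reward sequences and
# the resulting index policy for a discounted multi-armed bandit.

def gittins_index_deterministic(rewards, beta):
    """Index = max over horizons tau >= 1 of
    sum_{t < tau} beta^t * r_t  /  sum_{t < tau} beta^t
    (the deterministic special case of the Gittins index)."""
    best = float("-inf")
    num = den = 0.0
    for t, r in enumerate(rewards):
        num += beta ** t * r
        den += beta ** t
        best = max(best, num / den)
    return best

def index_policy_total_reward(arms, beta, steps):
    """Run the index policy: at each decision epoch, engage the arm with
    the largest current index; frozen arms do not change state.
    Assumes steps does not exceed the total number of rewards available."""
    positions = [0] * len(arms)
    total, discount = 0.0, 1.0
    for _ in range(steps):
        # Recompute each arm's index from its current state; an
        # exhausted arm gets index -inf and is never chosen.
        idx = [gittins_index_deterministic(arm[p:], beta)
               if p < len(arm) else float("-inf")
               for arm, p in zip(arms, positions)]
        k = max(range(len(arms)), key=lambda i: idx[i])
        total += discount * arms[k][positions[k]]
        positions[k] += 1
        discount *= beta
    return total
```

For example, with arms `[10, 1, 1]` and `[5, 5, 5]` and discount factor 0.5, the policy takes the large early reward of the first arm, then drains the second arm (whose index stays at 5) before returning to the first arm's tail of small rewards.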

Keywords: Optimal Policy; Reward Rate; Index Policy; Bandit Problem; Decision Epoch
Date: 2014

There are no downloads for this item, see the EconPapers FAQ for hints about obtaining it.

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:spr:isochp:978-1-4899-7405-1_6

Ordering information: This item can be ordered from
http://www.springer.com/9781489974051

DOI: 10.1007/978-1-4899-7405-1_6


More chapters in International Series in Operations Research & Management Science from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-04-01
Handle: RePEc:spr:isochp:978-1-4899-7405-1_6