This chapter presents developments in the theory of stochastic games that have taken place in recent years. It complements the contribution by Mertens. The main emphasis is on stochastic games with finite state and action sets. In the zero-sum case, a classical result of Mertens and Neyman states that, given ε > 0, each player has a strategy that is ε-optimal for all discount factors sufficiently close to zero. Extensions to non-zero-sum games are dealt with here. In particular, the proof of existence of uniform equilibrium payoffs for two-player games is discussed, as well as the results available for games with more than two players. Important open problems related to N-player games are introduced by means of a class of simple stochastic games, called quitting, or stopping, games. Finally, recent results on zero-sum games with imperfect monitoring and on zero-sum games with incomplete information are surveyed.
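The Mertens–Neyman result on uniform ε-optimality mentioned above can be stated formally as follows; this is a sketch in notation introduced here for illustration, where γ_λ(σ, τ) denotes player 1's expected λ-discounted payoff under the strategy pair (σ, τ) and v_λ the value of the λ-discounted game:

```latex
% Sketch of the Mertens--Neyman uniform optimality statement (player 1's side).
% $\gamma_\lambda(\sigma,\tau)$: player 1's expected $\lambda$-discounted payoff;
% $v_\lambda$: value of the $\lambda$-discounted game.
% The key point is that the single strategy $\sigma_\varepsilon$ works for
% every sufficiently small discount factor $\lambda$.
\[
\forall \varepsilon > 0,\ \exists\, \sigma_\varepsilon,\ \exists\, \lambda_0 > 0
\ \text{such that}\ \forall \lambda \in (0,\lambda_0),\ \forall \tau:
\quad \gamma_\lambda(\sigma_\varepsilon, \tau) \;\ge\; v_\lambda - \varepsilon .
\]
```

A symmetric statement holds for player 2, with the inequality reversed around v_λ + ε.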