This paper analyzes learning in multi-player noncooperative games with risky payoffs. Its goal is to assess the relative importance of stochastic payoffs and expected payoffs in the learning process. A general learning model that nests several variants of reinforcement learning, belief-based learning, and experience-weighted attraction learning is used to analyze behavior in coordination-game and prisoner's dilemma experiments with probabilistic payoffs. In all experiments, some subjects learn from past lottery outcomes, though the importance of these stochastic payoffs relative to expected payoffs depends on the game: stochastic payoffs matter less when posted probabilities are equal to expected payoffs, and more when subjects are informed how much they would have earned from foregone strategies.