Speaking on Data’s Behalf: What Researchers Say and How Audiences Choose
Jesse J. Chandler, Ignacio Martinez, Mariel M. Finucane, Jeffrey G. Terziev and Alexandra M. Resch
Evaluation Review, 2020, vol. 44, issue 4, 325-353
Abstract:
Background: Bayesian statistics have become popular in the social sciences, in part because they are thought to present more useful information than traditional frequentist statistics. Unfortunately, little is known about whether or how interpretations of frequentist and Bayesian results differ.
Objectives: We test whether presenting Bayesian or frequentist results based on the same underlying data influences the decisions people make.
Research design: Participants were randomly assigned to read Bayesian or frequentist interpretations of hypothetical evaluations of new education technologies with varying degrees of uncertainty, ranging from posterior probabilities of 99.8% to 52.9%, which correspond to frequentist p values of .001 and .65, respectively.
Subjects: Across three studies, 933 U.S. adults were recruited from Amazon Mechanical Turk.
Measures: The primary outcome was the proportion of participants who recommended adopting the new technology. We also measured respondents' certainty in their choice and (in Study 3) how easy the results were to understand.
Results: When presented with Bayesian results, participants were more likely to recommend switching to the new technology. This pattern held across all degrees of uncertainty, and was especially pronounced when the frequentist results reported a p value > .05. Participants who recommended change based on Bayesian results were more certain about their choice, and all respondents reported that the Bayesian display was easier to understand.
Conclusions: Presenting the same data in either frequentist or Bayesian terms can influence the decisions that people make. This finding highlights the importance of understanding how the statistical presentation of results shapes audiences' interpretation of evaluation evidence.
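To illustrate the kind of correspondence the abstract describes, the sketch below shows how a posterior probability and a two-sided p value relate under one simple set of assumptions: a flat prior and a normal likelihood, where the posterior probability that the effect is positive equals the standard normal CDF of the z statistic. This is a minimal illustration, not the authors' model; the paper's reported posteriors (e.g., 52.9% alongside p = .65) come from its own specification and will not match a flat-prior calculation exactly.

```python
from math import erf, sqrt

def std_normal_cdf(x: float) -> float:
    """Standard normal CDF computed via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def frequentist_p(z: float) -> float:
    """Two-sided p value for a z statistic."""
    return 2.0 * (1.0 - std_normal_cdf(abs(z)))

def posterior_prob_positive(z: float) -> float:
    """P(effect > 0 | data) under a flat prior and normal likelihood:
    this reduces to the standard normal CDF of z (an assumption of this
    sketch, not the paper's model)."""
    return std_normal_cdf(z)

# Two illustrative z statistics: one clearly "significant", one not.
for z in (3.29, 0.45):
    print(f"z = {z:4.2f}: p = {frequentist_p(z):.3f}, "
          f"P(effect > 0) = {posterior_prob_positive(z):.1%}")
```

Under these assumptions a z of 3.29 yields p ≈ .001 while the posterior probability of a positive effect is near certainty, whereas a z of 0.45 yields a nonsignificant p alongside a posterior probability still above 50% — one way to see why the two framings of the same data can push readers toward different decisions.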
Keywords: real-world dissemination; design and evaluation of programs and policies; methodological development; content area
Date: 2020
Downloads: https://journals.sagepub.com/doi/10.1177/0193841X19834968 (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:sae:evarev:v:44:y:2020:i:4:p:325-353
DOI: 10.1177/0193841X19834968