Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty
Nate Breznau,
Eike Mark Rinke,
Alexander Wuttke,
Hung H. V. Nguyen,
Muna Adem,
Jule Adriaans,
Amalia Alvarez-Benjumea,
Henrik K. Andersen,
Daniel Auer,
Flavio Azevedo,
Oke Bahnsen,
Elmar Schlueter,
Regine Schmidt,
Katja M. Schmidt,
Alexander Schmidt-Catran,
Claudia Schmiedeberg,
Jürgen Schneider,
Martijn Schoonvelde,
Julia Schulte-Cloos,
Sandy Schumann,
Paul C. Bauer,
Pablo Christmann,
Reinhard Schunck,
Jürgen Schupp,
Julian Seuring,
Henning Silber,
Willem Sleegers,
Nico Sonntag,
Alexander Staudt,
Nadia Steiber,
Nils Steiner,
Sebastian Sternberg,
Roxanne Connelly,
Markus Baumann,
Dieter Stiers,
Dragana Stojmenovska,
Nora Storz,
Erich Striessnig,
Anne-Kathrin Stroppe,
Janna Teltemann,
Andrey Tibajev,
Brian Tung,
Giacomo Vagni,
Christian S. Czymara,
Jasper Van Assche,
Sharon Baute,
Meta van der Linden,
Jolanda van der Noll,
Arno Van Hootegem,
Stefan Vogtenhuber,
Bogdan Voicu,
Fieke Wagemans,
Nadja Wehl,
Hannah Werner,
Elena Damian,
Brenton M. Wiernik,
Fabian Winter,
Verena Benoit,
Christof Wolf,
Yuki Yamada,
Nan Zhang,
Conrad Ziller,
Stefan Zins,
Tomasz Żółtak,
Julian Bernauer,
Alejandro Ecker,
Carl Berning,
Anna Berthold,
Felix S. Bethke,
Thomas Biegert,
Katharina Blinzler,
Johannes N. Blumenberg,
Licia Bobzien,
Andrea Bohman,
Thijs Bol,
Amie Bostic,
Achim Edelmann,
Zuzanna Brzozowska,
Katharina Burgdorf,
Kaspar Burger,
Kathrin B. Busch,
Juan Carlos-Castillo,
Nathan Chan,
Maureen A. Eger,
Simon Ellerbrock,
Anna Forke,
Andrea Forster,
Leticia Micheli,
Chris Gaasendam,
Konstantin Gavras,
Vernon Gayle,
Theresa Gessler,
Timo Gnambs,
Amélie Godefroidt,
Max Grömping,
Martin Groß,
Stefan Gruber,
Tobias Gummer,
Jonathan Mijs,
Andreas Hadjar,
Jan Paul Heisig,
Sebastian Hellmeier,
Stefanie Heyne,
Magdalena Hirsch,
Mikael Hjerm,
Oshrat Hochman,
Andreas Hövermann,
Sophia Hunger,
Christian Hunkler,
Cristóbal Moya,
Nora Huth,
Zsófia S. Ignácz,
Laura Jacobs,
Jannes Jacobsen,
Bastian Jaeger,
Sebastian Jungkunz,
Nils Jungmann,
Mathias Kauff,
Manuel Kleinert,
Julia Klinger,
Marcel Neunhoeffer,
Jan-Philipp Kolb,
Marta Kołczyńska,
John Kuk,
Katharina Kunißen,
Dafina Kurti Sinatra,
Alexander Langenkamp,
Philipp M. Lersch,
Lea-Maria Löbel,
Philipp Lutscher,
Matthias Mader,
Daniel Nüst,
Joan E. Madia,
Natalia Malancu,
Luis Maldonado,
Helge Marahrens,
Nicole Martin,
Paul Martinez,
Jochen Mayerl,
Oscar J. Mayorga,
Patricia McManus,
Kyle McWagner,
Olav Nygård,
Cecil Meeusen,
Daniel Meierrieks,
Jonathan Mellon,
Friedolin Merhout,
Samuel Merk,
Daniel Meyer,
Fabian Ochsenfeld,
Gunnar Otte,
Anna O. Pechenkina,
Christopher Prosser,
Dave Balzer,
Louis Raes,
Kevin Ralston,
Miguel R. Ramos,
Arne Roets,
Jonathan Rogers,
Guido Ropers,
Robin Samuel,
Gregor Sand,
Ariela Schachter,
Merlin Schaeffer,
Gerrit Bauer and
David Schieferdecker
EconStor Open Access Articles and Book Chapters, 2022, vol. 119, issue 44, No e2203150119, 8 pages
Abstract:
This study explores how researchers’ analytical choices affect the reliability of scientific findings. Most discussions of reliability problems in science focus on systematic biases. We broaden the lens to emphasize the idiosyncrasy of conscious and unconscious decisions that researchers make during data analysis. We coordinated 161 researchers in 73 research teams and observed their research decisions as they used the same data to independently test the same prominent social science hypothesis: that greater immigration reduces support for social policies among the public. In this typical case of social science research, research teams reported both widely diverging numerical findings and substantive conclusions despite identical start conditions. Researchers’ expertise, prior beliefs, and expectations barely predict the wide variation in research outcomes. More than 95% of the total variance in numerical results remains unexplained even after qualitative coding of all identifiable decisions in each team’s workflow. This reveals a universe of uncertainty that remains hidden when considering a single study in isolation. The idiosyncratic nature of how researchers’ results and conclusions varied is a previously underappreciated explanation for why many scientific hypotheses remain contested. These results call for greater epistemic humility and clarity in reporting scientific findings.
Keywords: metascience; many analysts; researcher degrees of freedom; analytical flexibility; immigration and policy preferences
Date: 2022
Citations: 9 (in EconPapers)
Downloads: https://www.econstor.eu/bitstream/10419/266342/1/F ... many-researchers.pdf (application/pdf)
Related works:
Journal Article: Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty (2022) 
Working Paper: Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty (2022) 
Persistent link: https://EconPapers.repec.org/RePEc:zbw:espost:266342
DOI: 10.1073/pnas.2203150119
More articles in EconStor Open Access Articles and Book Chapters from ZBW - Leibniz Information Centre for Economics.