Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty
Nate Breznau,
Eike Mark Rinke,
Alexander Wuttke,
Hung H.V. Nguyen,
Muna Adem,
Jule Adriaans,
Amalia Alvarez-Benjumea,
Henrik K. Andersen,
Daniel Auer,
Flavio Azevedo,
Oke Bahnsen,
Dave Balzer,
Gerrit Bauer,
Paul C. Bauer,
Markus Baumann,
Sharon Baute,
Verena Benoit,
Julian Bernauer,
Carl Berning,
Anna Berthold,
Felix S. Bethke,
Thomas Biegert,
Katharina Blinzler,
Johannes N. Blumenberg,
Licia Bobzien,
Andrea Bohman,
Thijs Bol,
Amie Bostic,
Zuzanna Brzozowska,
Katharina Burgdorf,
Kaspar Burger,
Kathrin B. Busch,
Juan Carlos Castillo,
Nathan Chan,
Pablo Christmann,
Roxanne Connelly,
Christian S. Czymara,
Elena Damian,
Alejandro Ecker,
Achim Edelmann,
Maureen A. Eger,
Simon Ellerbrock,
Anna Forke,
Andrea Forster,
Chris Gaasendam,
Konstantin Gavras,
Vernon Gayle,
Theresa Gessler,
Timo Gnambs,
Amélie Godefroidt,
Max Grömping,
Martin Groß,
Stefan Gruber,
Tobias Gummer,
Andreas Hadjar,
Jan Paul Heisig,
Sebastian Hellmeier,
Stefanie Heyne,
Magdalena Hirsch,
Mikael Hjerm,
Oshrat Hochman,
Andreas Hövermann,
Sophia Hunger,
Christian Hunkler,
Nora Huth,
Zsófia S. Ignácz,
Laura Jacobs,
Jannes Jacobsen,
Bastian Jaeger,
Sebastian Jungkunz,
Nils Jungmann,
Mathias Kauff,
Manuel Kleinert,
Julia Klinger,
Jan Philipp Kolb,
Marta Kołczyńska,
John Kuk,
Katharina Kunißen,
Dafina Kurti Sinatra,
Alexander Langenkamp,
Philipp M. Lersch,
Lea Maria Löbel,
Philipp Lutscher,
Matthias Mader,
Joan E. Madia,
Natalia Malancu,
Luis Maldonado,
Helge Marahrens,
Nicole Martin,
Paul Martinez,
Jochen Mayerl,
Oscar J. Mayorga,
Patricia McManus,
Kyle McWagner,
Cecil Meeusen,
Daniel Meierrieks,
Jonathan Mellon,
Friedolin Merhout,
Samuel Merk,
Daniel Meyer,
Leticia Micheli,
Jonathan Mijs,
Cristóbal Moya,
Marcel Neunhoeffer,
Daniel Nüst,
Olav Nygård,
Fabian Ochsenfeld,
Gunnar Otte,
Anna O. Pechenkina,
Christopher Prosser,
Louis Raes,
Kevin Ralston,
Miguel R. Ramos,
Arne Roets,
Jonathan Rogers,
Guido Ropers,
Robin Samuel,
Gregor Sand,
Ariela Schachter,
Merlin Schaeffer,
David Schieferdecker,
Elmar Schlueter,
Regine Schmidt,
Katja M. Schmidt,
Alexander Schmidt-Catran,
Claudia Schmiedeberg,
Jürgen Schneider,
Martijn Schoonvelde,
Julia Schulte-Cloos,
Sandy Schumann,
Reinhard Schunck,
Jürgen Schupp,
Julian Seuring,
Henning Silber,
Willem Sleegers,
Nico Sonntag,
Alexander Staudt,
Nadia Steiber,
Nils Steiner,
Sebastian Sternberg,
Dieter Stiers,
Dragana Stojmenovska,
Nora Storz,
Erich Striessnig,
Anne Kathrin Stroppe,
Janna Teltemann,
Andrey Tibajev,
Brian Tung,
Giacomo Vagni,
Jasper Van Assche,
Meta van der Linden,
Jolanda van der Noll,
Arno Van Hootegem,
Stefan Vogtenhuber,
Bogdan Voicu,
Fieke Wagemans,
Nadja Wehl,
Hannah Werner,
Brenton M. Wiernik,
Fabian Winter,
Christof Wolf,
Yuki Yamada,
Nan Zhang,
Conrad Ziller,
Stefan Zins and
Tomasz Żółtak
LSE Research Online Documents on Economics from London School of Economics and Political Science, LSE Library
Abstract:
This study explores how researchers' analytical choices affect the reliability of scientific findings. Most discussions of reliability problems in science focus on systematic biases. We broaden the lens to emphasize the idiosyncrasy of conscious and unconscious decisions that researchers make during data analysis. We coordinated 161 researchers in 73 research teams and observed their research decisions as they used the same data to independently test the same prominent social science hypothesis: that greater immigration reduces support for social policies among the public. In this typical case of social science research, research teams reported both widely diverging numerical findings and substantive conclusions despite identical start conditions. Researchers' expertise, prior beliefs, and expectations barely predict the wide variation in research outcomes. More than 95% of the total variance in numerical results remains unexplained even after qualitative coding of all identifiable decisions in each team's workflow. This reveals a universe of uncertainty that remains hidden when considering a single study in isolation. The idiosyncratic nature of how researchers' results and conclusions varied is a previously underappreciated explanation for why many scientific hypotheses remain contested. These results call for greater epistemic humility and clarity in reporting scientific findings.
Keywords: analytical flexibility; immigration and policy preferences; many analysts; metascience; researcher degrees of freedom
JEL-codes: C1
Pages: 8 pages
Date: 2022-11-01
Citations: View citations in EconPapers (2)
Published in Proceedings of the National Academy of Sciences of the United States of America, November 1, 2022, 119(44). ISSN: 1091-6490
Downloads: http://eprints.lse.ac.uk/117278/ Open access version. (application/pdf)
Related works:
Journal Article: Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty (2022) 
Journal Article: Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty (2022) 
Working Paper: Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty (2022) 
Persistent link: https://EconPapers.repec.org/RePEc:ehl:lserod:117278
More papers in LSE Research Online Documents on Economics from London School of Economics and Political Science, LSE Library, Portugal Street, London, WC2A 2HD, U.K. Contact information at EDIRC.
Bibliographic data for series maintained by LSERO Manager.