A Toolbox to Evaluate the Trustworthiness of Published Findings

Susanne Jana Adler, Lukas Röseler and Martina Katharina Schöniger
Additional contact information
Susanne Jana Adler: Ludwig-Maximilians-University Munich
Lukas Röseler: University of Bamberg

No s5mzp, OSF Preprints from Center for Open Science

Abstract: In recent years, researchers have criticized their disciplines for admitting false-positive results that arise from publication bias and questionable research practices such as p-hacking (i.e., selectively reporting analyses that yield a p-value below .05). To identify trustworthy effects, researchers advocate replication studies and the adoption of open-science practices such as preregistration. Nevertheless, because these developments are still emerging in consumer research, most prior findings have not been replicated, leaving researchers in the dark as to whether a line of research or a particular effect is trustworthy. We tackle this problem by providing a toolbox of heuristics for identifying data patterns that, based on the information reported in published articles, may indicate publication bias and p-hacking. The toolbox is an easy-to-use instrument for an initial assessment of a given set of findings.
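The inflation of false positives that the abstract attributes to p-hacking can be illustrated with a minimal simulation. The sketch below (not taken from the paper; function name and parameters are illustrative) assumes a true null effect, so each test's p-value is uniform on [0, 1], and models a p-hacker who runs several analyses but reports only the most favorable one. It also assumes the analyses are independent, so the inflation shown is an upper bound; real p-hacked analyses are typically correlated.

```python
import random

random.seed(1)

def p_hacked_significance(n_analyses, n_sims=100_000, alpha=0.05):
    """Share of simulations in which at least one of n_analyses
    tests of a true null effect comes out 'significant'.

    Under the null, each p-value is uniform on [0, 1], so for
    independent analyses the theoretical rate is 1 - (1 - alpha)**n_analyses.
    """
    hits = 0
    for _ in range(n_sims):
        # Run several analyses of the same null effect ...
        p_values = [random.random() for _ in range(n_analyses)]
        # ... but report only the best-looking one.
        if min(p_values) < alpha:
            hits += 1
    return hits / n_sims

for k in (1, 5, 20):
    rate = p_hacked_significance(k)
    print(f"{k:2d} analyses -> false-positive rate ~ {rate:.2f}")
```

With a single honest analysis the rate stays near the nominal 5 %; with 20 tries it approaches 1 - 0.95**20 ≈ 0.64, which is the kind of distortion the toolbox's heuristics are designed to flag from published results.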

Date: 2023-07-29
New Economics Papers: this item is included in nep-sog


DOI: 10.31219/


Page updated 2023-09-19
Handle: RePEc:osf:osfxxx:s5mzp