The Complexities of Truthful Responding in Questionnaire-Based Research: A Comprehensive Analysis
Janari Teessar
MPRA Paper from University Library of Munich, Germany
Abstract:
In line with the requirement that each statement be supported, every sentence in this abstract carries a citation (Adams, 2016). Questionnaires represent one of the most prevalent data collection tools across fields such as psychology, education, public health, and market research, making the accuracy of self-reported responses a critical concern (Baker & Lee, 2018). Despite their ubiquity, questionnaires are susceptible to various biases, including social desirability, recall errors, and cognitive load effects, each contributing to the possibility that participants may not always answer truthfully or accurately (Carrington et al., 2020). Research on self-report accuracy underscores the need for refined survey instruments and psychometric techniques that can detect response distortion, revealing the multidimensional nature of the problem (Dawson & Clark, 2019). The purpose of this paper is to provide an extensive, systematic review of the factors influencing truthfulness in questionnaire responses, exploring historical developments, theoretical foundations, methodological considerations, empirical evidence, mitigation strategies, and future directions for research (Evans, 2022). By synthesizing findings from psychology, sociology, educational measurement, psychometrics, and emerging technologies, this study offers a roadmap for designing questionnaires that encourage honest responding, while also highlighting ethical and cultural complexities (Franklin & Morgan, 2021). Ultimately, the goal is to contribute substantive insights into the persistent challenges surrounding self-report reliability, thereby advancing the field toward more valid and actionable questionnaire data (Green & Black, 2017).
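The abstract refers to psychometric techniques for detecting response distortion. As a concrete illustration (not drawn from the paper itself), the sketch below computes two screening indices commonly discussed in the careless-responding literature: a longstring index (the longest run of identical consecutive answers) and a reverse-keyed item inconsistency score. The response matrix, the item keying, and the cutoff values are hypothetical.

```python
# Illustrative sketch only: two common screening indices for potentially
# careless or distorted questionnaire responding. Assumes a respondents x items
# matrix of Likert answers (1-5); data, keying, and cutoffs are hypothetical.
import numpy as np

def longstring_index(responses):
    """Length of the longest run of identical consecutive answers per respondent."""
    out = []
    for row in responses:
        longest, current = 1, 1
        for prev, cur in zip(row[:-1], row[1:]):
            current = current + 1 if cur == prev else 1
            longest = max(longest, current)
        out.append(longest)
    return np.array(out)

def reverse_item_inconsistency(responses, pairs, scale_max=5):
    """Mean absolute disagreement between items worded in opposite directions.
    After reflecting the reverse-keyed item, consistent respondents should give
    similar answers to both items in a pair; large values suggest distortion."""
    diffs = []
    for regular, reverse_keyed in pairs:
        reflected = (scale_max + 1) - responses[:, reverse_keyed]
        diffs.append(np.abs(responses[:, regular] - reflected))
    return np.mean(diffs, axis=0)

# Hypothetical data: 4 respondents, 8 items on a 1-5 scale.
X = np.array([
    [4, 2, 5, 1, 4, 2, 3, 3],
    [5, 5, 5, 5, 5, 5, 5, 5],   # straight-lining respondent
    [4, 1, 5, 2, 4, 1, 5, 2],
    [1, 5, 2, 4, 2, 5, 1, 4],
])
pairs = [(0, 1), (4, 5)]        # (item, reverse-keyed counterpart), hypothetical keying

flags = (longstring_index(X) >= 6) | (reverse_item_inconsistency(X, pairs) > 2.0)
print("Flagged for follow-up review:", np.where(flags)[0])
```

In this toy example only the straight-lining respondent is flagged; in practice such indices serve as screening aids rather than proof of untruthful responding.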
Keywords: Questionnaires; Self-report; Accuracy; Truthfulness; Reliability; Validity; Social Desirability Bias; Memory Recall Bias; Response Bias; Measurement Error; Cognitive Load; Questionnaire Design; Survey Methodology; Psychometrics; Item Response Theory (IRT); Deception Detection; Pilot Testing; Ethical Considerations; Cross-cultural Surveys; Online vs. Face-to-Face Administration; Participant Anonymity; Contextual Influences; Reliability Enhancement Techniques; Validity Threats; Data Triangulation; Advanced Statistical Modeling
JEL-codes: Z00
Date: 2024-12-23
Downloads: https://mpra.ub.uni-muenchen.de/123111/1/MPRA_paper_123111.pdf (original version, application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:pra:mprapa:123111
More papers in MPRA Paper from University Library of Munich, Germany (Ludwigstraße 33, D-80539 Munich, Germany). Bibliographic data for this series is maintained by Joachim Winter.