Using Response Times in Answer Similarity Analysis
Kylie Gorney (Michigan State University) and James A. Wollack (University of Wisconsin-Madison)
Journal of Educational and Behavioral Statistics, 2025, vol. 50, issue 3, 449-470
Abstract:
Recent decades have seen a tremendous growth in the development of collusion detection methods, many of which rest on the assumption that examinees who engage in collusion will display unusually similar scores/responses. In this article, we expand the definition of answer similarity to include not only the item scores/responses but also the item response times (RTs). Using detailed simulations and an experimental data set, we show that (a) both the new and existing similarity statistics are able to control the Type I error rate in most of the studied conditions and (b) the new statistics are much more powerful, on average, than the existing statistics at detecting several types of simulated collusion.
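The core idea of the abstract — scoring a pair of examinees as suspicious when both their responses and their response times are unusually alike — can be illustrated with a minimal sketch. This is not the authors' similarity statistic; the pairwise counts and the log-RT threshold `delta` are hypothetical choices made only for illustration.

```python
# Illustrative sketch (not the statistic from the article): for one pair of
# examinees, count (a) items answered identically and (b) items whose log
# response times differ by less than a hypothetical threshold `delta`.
import math

def joint_similarity(responses_a, responses_b, rts_a, rts_b, delta=0.1):
    """Return (n_matching_responses, n_similar_rts) for an examinee pair."""
    # (a) exact response matches across items
    n_match = sum(ra == rb for ra, rb in zip(responses_a, responses_b))
    # (b) items where the two log response times are within `delta`
    n_rt = sum(abs(math.log(ta) - math.log(tb)) < delta
               for ta, tb in zip(rts_a, rts_b))
    return n_match, n_rt
```

In a full analysis, counts like these would be compared against their null distribution under independent test-taking to control the Type I error rate; the sketch only shows how response and RT information can enter a similarity measure jointly.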
Keywords: answer similarity analysis; preknowledge; response times; test collusion
Date: 2025
Downloads: https://journals.sagepub.com/doi/10.3102/10769986241248770 (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:sae:jedbes:v:50:y:2025:i:3:p:449-470
DOI: 10.3102/10769986241248770
Bibliographic data for this series is maintained by SAGE Publications.