Factors Associated with the Equivalence of the Scores of Computer-Based Test and Paper-and-Pencil Test: Presentation Type, Item Difficulty and Administration Order

Tzu-Hua Wang, Chien-Hui Kao and Hsiang-Chun Chen
Additional contact information
Tzu-Hua Wang: Department of Education and Learning Technology, National Tsing Hua University, Hsinchu City 300193, Taiwan
Chien-Hui Kao: Department of Education and Learning Technology, National Tsing Hua University, Hsinchu City 300193, Taiwan
Hsiang-Chun Chen: Department of Early Childhood Education, National Tsing Hua University, Hsinchu City 300193, Taiwan

Sustainability, 2021, vol. 13, issue 17, 1-14

Abstract: Because schools could not administer face-to-face tests to evaluate students’ learning effectiveness during the COVID-19 pandemic, many schools implemented computer-based tests (CBT) for this evaluation. From the perspective of Sustainable Development Goal 4, whether this conversion of test format affects students’ answering performance is an issue worthy of attention. However, studies have not yielded consistent findings on the score equivalence of computer-based tests (CBT) and paper-and-pencil tests (PPT) when examinees take the same multiple-choice tests: some studies have found no significant differences between the two formats, whereas others have found significant differences. This study adopted a counterbalanced experimental design to investigate the effects of test format, computerised presentation type, difficulty of item group, and administration order of item groups of different difficulty levels on examinees’ answering performance. In this study, 381 fifth graders from primary schools in northern Taiwan completed a 16-item multiple-choice achievement test on Structure and Functions of Plants, a topic in the primary school Natural Science course. After data collection and analysis, no significant differences in examinees’ answering performance were identified among the PPT, the CBT with single-item presentation, and the CBT with multiple-item presentation. Further analysis, however, indicated that the difficulty of the item group and the administration order of item groups of different difficulty levels had significant influences on answering performance. The findings suggest that, compared with a PPT, examinees exhibit better answering performance when taking multiple-choice tests in a CBT with multiple-item presentation.
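The headline comparison in the abstract amounts to a test for mean-score equivalence across the three administration formats. The following is a minimal illustrative sketch in Python, not the paper's actual analysis or data: the group sizes, the simulated score distribution, and the use of a plain one-way ANOVA (rather than the full counterbalanced design with difficulty and order factors) are all assumptions made here for illustration.

# Hypothetical sketch of the headline comparison: a one-way ANOVA of
# total scores on a 16-item multiple-choice test across three formats
# (PPT, CBT single-item, CBT multiple-item). All data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Three simulated groups of examinees, one per format; sizes sum to
# roughly the study's N = 381. Equal means are assumed, mirroring the
# reported null result for test format.
ppt = rng.binomial(16, 0.65, size=127)          # paper-and-pencil test
cbt_single = rng.binomial(16, 0.65, size=127)   # CBT, single-item presentation
cbt_multi = rng.binomial(16, 0.65, size=127)    # CBT, multiple-item presentation

f_stat, p_value = stats.f_oneway(ppt, cbt_single, cbt_multi)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
# A p-value above 0.05 would be consistent with the paper's finding of
# no significant format effect on answering performance.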

Keywords: administration order; computer-based test; COVID-19 pandemic; item difficulty; multiple-choice test; paper-and-pencil test
JEL-codes: O13 Q Q0 Q2 Q3 Q5 Q56
Date: 2021
References: View complete reference list from CitEc

Downloads: (external link)
https://www.mdpi.com/2071-1050/13/17/9548/pdf (application/pdf)
https://www.mdpi.com/2071-1050/13/17/9548/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Persistent link: https://EconPapers.repec.org/RePEc:gam:jsusta:v:13:y:2021:i:17:p:9548-:d:621440

Sustainability is currently edited by Ms. Alexandra Wu

More articles in Sustainability from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

Handle: RePEc:gam:jsusta:v:13:y:2021:i:17:p:9548-:d:621440