Evaluating Item Response Format and Content Using Partial Credit Trees in Scale Development
Nana Amma Berko Asamoah,
Ronna C Turner,
Wen-Juo Lo,
Brandon L Crawford,
Sara McClelland and
Kristen N Jozkowski
Journal of Survey Statistics and Methodology, 2025, vol. 13, issue 2, 280-305
Abstract:
The type of response options selected for survey items, how many options to include, and whether to allow a neutral midpoint all affect the data obtained from survey collections and the interpretations made from the results. Further, if subgroups within a population (e.g., racial/ethnic, gender, age) interpret response options differently, this variance can create artificial differences between groups or mask true ones. In this study, we apply two recursive partitioning procedures for investigating differential item functioning (DIF) in an experiment evaluating seven item response formats (five levels of an agree–disagree [AD] format and two levels of an item-specific [IS] format). Partial credit tree procedures allow multiple covariates to be evaluated without prespecifying the subgroups to be compared. We applied the procedures to items measuring adults’ attitudes toward legal abortion; all response formats functioned without DIF for age, gender, race, education, and religion when evaluated using global DIF screening approaches. Item-focused analyses indicated that odd-numbered response formats were less susceptible to content-based DIF. Taken together, the psychometric properties indicated that five-point AD and IS formats may be preferable for abortion attitude measurement based on the screening procedures conducted in this study.
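The partial credit trees described in the abstract are typically fit with specialized IRT software (e.g., R's psychotree package); they are not reproduced here. As a much simpler stand-in, the sketch below illustrates the general idea of covariate-based DIF screening using a likelihood-ratio logistic-regression DIF test on simulated, dichotomized item responses. The simulation setup, variable names, and covariate are all hypothetical illustrations, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

rng = np.random.default_rng(0)

# Hypothetical simulation: one dichotomous item, no DIF built in.
n = 600
theta = rng.normal(size=n)            # latent trait (attitude strength)
group = rng.integers(0, 2, size=n)    # hypothetical covariate (e.g., gender)
p = 1.0 / (1.0 + np.exp(-theta))      # item with difficulty 0 for both groups
y = rng.binomial(1, p)
matching = theta                      # stand-in for a rest-score matching variable

def negloglik(beta, X, y):
    # Negative log-likelihood of a logistic regression model.
    eta = X @ beta
    return np.sum(np.log1p(np.exp(eta)) - y * eta)

def fit(X, y):
    # Maximum-likelihood fit; returns coefficients and log-likelihood.
    res = minimize(negloglik, np.zeros(X.shape[1]), args=(X, y), method="BFGS")
    return res.x, -negloglik(res.x, X, y)

# Null model: response depends only on the matching variable.
X0 = np.column_stack([np.ones(n), matching])
# Alternative model: adds a group main effect (uniform DIF term).
X1 = np.column_stack([np.ones(n), matching, group])

_, ll0 = fit(X0, y)
_, ll1 = fit(X1, y)
stat = 2.0 * (ll1 - ll0)      # likelihood-ratio statistic, 1 df
pval = chi2.sf(stat, df=1)    # small p-value would flag uniform DIF
```

Unlike partial credit trees, this test requires the analyst to prespecify the covariate and split; the tree procedures in the paper search over multiple covariates and cut points automatically, which is precisely their advantage in scale development.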
Keywords: Abortion; Content validity; Differential item functioning; Item bias; Item response format; Item validity; Survey development
Date: 2025
Downloads:
http://hdl.handle.net/10.1093/jssam/smae028 (application/pdf)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:oup:jassam:v:13:y:2025:i:2:p:280-305.
Journal of Survey Statistics and Methodology is currently edited by Emily Berg and Brad Edwards
More articles in Journal of Survey Statistics and Methodology from American Association for Public Opinion Research and American Statistical Association
Bibliographic data for series maintained by Oxford University Press.