Implementing a general framework for assessing interrater agreement in Stata
Daniel Klein (International Centre for Higher Education Research Kassel)
Stata Journal, 2018, vol. 18, issue 4, 871-901
Abstract:
Despite its well-known weaknesses, researchers continue to choose the kappa coefficient (Cohen, 1960, Educational and Psychological Measurement 20: 37–46; Fleiss, 1971, Psychological Bulletin 76: 378–382) to quantify agreement among raters. Part of kappa’s persistent popularity seems to arise from a lack of available alternative agreement coefficients in statistical software packages such as Stata. In this article, I review Gwet’s (2014, Handbook of Inter-Rater Reliability) recently developed framework of interrater agreement coefficients. This framework extends several agreement coefficients to handle any number of raters, any number of rating categories, any level of measurement, and missing values. I introduce the kappaetc command, which implements this framework in Stata.
Keywords: kappaetc; kappaetci; Cohen; Fleiss; Gwet; interrater agreement; kappa; Krippendorff; reliability
Date: 2018
Note: to access the software from within Stata, type net describe http://www.stata-journal.com/software/sj18-4/st0544/
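A minimal sketch of installing the st0544 package and running kappaetc from within Stata; the rater variable names below are hypothetical, and the full syntax and options are documented in the article and the package help file:

    * Describe and install the st0544 package (kappaetc) from the Stata Journal archive
    net describe http://www.stata-journal.com/software/sj18-4/st0544/
    net install st0544, from(http://www.stata-journal.com/software/sj18-4/)

    * Compute agreement coefficients; assumes one observation per rated subject
    * and one variable per rater (rater1-rater3 are hypothetical variable names)
    kappaetc rater1 rater2 rater3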
Downloads: http://www.stata-journal.com/article.html?article=st0544 (link to article purchase)
Persistent link: https://EconPapers.repec.org/RePEc:tsj:stataj:v:18:y:2018:i:4:p:871-901
Ordering information: This journal article can be ordered from
http://www.stata-journal.com/subscription.html
Stata Journal is published by StataCorp LLC and is currently edited by Nicholas J. Cox and Stephen P. Jenkins.
Bibliographic data for the series is maintained by Christopher F. Baum and Lisa Gilmore.