Residential College: false
Status: Published
Title: We Agreed to Measure Agreement--Redefining Reliability De-justifies Krippendorff's Alpha
Authors: Xinshu Zhao [1]; Guangchao Charles Feng [2]; Jun S. Liu [3]; Ke Deng [4]
Year: 2018
Source Publication: China Media Research
ISSN: 1556-889X
Volume: 14; Issue: 2; Pages: 1-15
Abstract

Zhao, Liu, & Deng (2013) reviewed 22 intercoder reliability indices and found that each makes unrealistic assumptions about coder behavior, leading to paradoxes and abnormalities. Krippendorff's α makes more such assumptions, and consequently produces more paradoxes and abnormalities, than any other index.
Professor Krippendorff (2013) countered that "most of the authors' discoveries are the artifacts of being led astray by strange, almost conspiratorial uses of language." The commentary reiterated Krippendorff's long-standing position that Krippendorff's α is the standard reliability measure and the only index qualified to serve that function (Hayes & Krippendorff, 2007; Krippendorff, 2004b, 2016).
This paper continues that dialogue. We review the literature to show that the scientific community, including Krippendorff, has defined intercoder reliability as intercoder agreement, and that Krippendorff's α, like all of its main competitors, was designed and declared to measure intercoder agreement. Now that evidence is mounting that α, like Scott's π and Cohen's κ, does not accurately measure intercoder agreement, Krippendorff has chosen to redefine intercoder reliability and, furthermore, to redefine information, variation, sensitivity, and specificity.
By redefining reliability, we argue, Prof. Krippendorff has redefined the function of Krippendorff's α, thereby disqualifying α as an indicator of intercoder agreement. The search for a better index of intercoder agreement, also known as intercoder reliability, should continue.
We also note, however, a spiral of inertia in science communication in general, and in reliability research in particular. That powerful spiral, we argue, should not forever keep up appearances for α, π, or κ.

Keywords: Spiral of Inertia; Agreement; Inter-rater Reliability; Inter-coder Reliability; Reliability; Selective Spiral; Cohen's Kappa; Multi-signs; Multi-signified; Multi-concepts; Multi-signification; Krippendorff's Alpha; Scott's Pi; Multi-signifiers; Aggregate Estimation; Human Information; Mechanical Information; Specificity; Sensitivity; Individual Classification; Individual Prediction
Language: English
Document Type: Journal article
Collection: Personal research not belonging to the institution
Affiliations:
1. Hong Kong Baptist University / University of North Carolina at Chapel Hill
2. Shenzhen University
3. Harvard University
4. Tsinghua University
Recommended Citation
GB/T 7714: Xinshu Zhao, Guangchao Charles Feng, Jun S. Liu, et al. We Agreed to Measure Agreement--Redefining Reliability De-justifies Krippendorff's Alpha[J]. China Media Research, 2018, 14(2): 1-15.
APA: Zhao, X., Feng, G. C., Liu, J. S., & Deng, K. (2018). We agreed to measure agreement--redefining reliability de-justifies Krippendorff's alpha. China Media Research, 14(2), 1-15.
MLA: Zhao, Xinshu, et al. "We Agreed to Measure Agreement--Redefining Reliability De-justifies Krippendorff's Alpha." China Media Research 14.2 (2018): 1-15.
Files in This Item:
02 Zhao et al. - 2018 - We agreed to measure agreement redefining reliability de-justifies Krippendorffs alpha-annotated.pdf (547 KB; Journal article; Open Access; CC BY-NC-SA)
Zhao et al. - 2018 - We agreed to measure agreement – redefining reliability de-justifies Krippendorff’s alpha.docx (190 KB; Journal article; Open Access; CC BY-NC-SA)
File formats: Adobe PDF (annotated version); Microsoft Word.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.