Randolph, Justus J. – Online Submission, 2005
Fleiss' popular multirater kappa is known to be influenced by prevalence and bias, which can lead to the paradox of high agreement but low kappa. It also assumes that raters are restricted in how they can distribute cases across categories, which is not a typical feature of many agreement studies. In this article, a free-marginal, multirater…
Descriptors: Multivariate Analysis, Statistical Distributions, Statistical Bias, Interrater Reliability
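The abstract contrasts Fleiss' fixed-marginal kappa with a free-marginal alternative. As a minimal sketch, assuming the Brennan & Prediger (1981) / Randolph (2005) formulation in which chance agreement is fixed at 1/k (k = number of categories) rather than estimated from possibly skewed rating marginals, the statistic can be computed as follows; the function name and example data are illustrative, not from the source.

```python
def free_marginal_kappa(ratings):
    """Free-marginal multirater kappa (assumed formulation).

    ratings: N x k matrix where ratings[i][j] is the number of raters
    who assigned case i to category j. Every row must sum to the same
    number of raters n (n >= 2).
    """
    N = len(ratings)
    k = len(ratings[0])
    n = sum(ratings[0])  # raters per case

    # Observed agreement P_o, computed as in Fleiss' kappa
    p_o = sum(
        sum(n_ij * n_ij for n_ij in row) - n
        for row in ratings
    ) / (N * n * (n - 1))

    # Free-marginal chance agreement: each category equally likely
    p_e = 1.0 / k
    return (p_o - p_e) / (1 - p_e)


# Hypothetical example: 5 cases, 3 raters, 2 categories
print(free_marginal_kappa([[3, 0], [3, 0], [2, 1], [0, 3], [1, 2]]))
```

Because p_e here does not depend on how raters actually distributed cases across categories, this variant avoids the prevalence- and bias-driven paradox of high observed agreement but low kappa that the abstract attributes to Fleiss' statistic.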