Randolph, Justus J. – Online Submission, 2005
Fleiss' popular multirater kappa is known to be influenced by prevalence and bias, which can lead to the paradox of high agreement but low kappa. It also assumes that raters are restricted in how they can distribute cases across categories, which is not a typical feature of many agreement studies. In this article, a free-marginal, multirater…
Descriptors: Multivariate Analysis, Statistical Distributions, Statistical Bias, Interrater Reliability
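For readers who want to see the statistic the abstract describes in action, here is a minimal sketch in Python. It assumes the standard free-marginal formulation, in which chance agreement is fixed at 1/k for k categories instead of being estimated from the observed category marginals as in Fleiss' kappa; the function name free_marginal_kappa and the toy ratings are illustrative, not taken from the paper.

import numpy as np

def free_marginal_kappa(counts):
    """Free-marginal multirater kappa (sketch).

    counts: (N, k) array where counts[i, j] is the number of raters
    who assigned case i to category j. Every row must sum to the
    same number of raters n, with n >= 2.
    """
    counts = np.asarray(counts, dtype=float)
    N, k = counts.shape
    n = counts[0].sum()  # raters per case
    # Observed agreement: proportion of agreeing rater pairs per case,
    # averaged over cases (same P_o as in Fleiss' kappa).
    p_i = (np.square(counts).sum(axis=1) - n) / (n * (n - 1))
    p_o = p_i.mean()
    # Free-marginal chance agreement: raters are unconstrained in how
    # they distribute cases across categories, so P_e = 1/k.
    p_e = 1.0 / k
    return (p_o - p_e) / (1.0 - p_e)

# Hypothetical example: 5 cases, 3 raters, 2 categories.
counts = [[3, 0], [0, 3], [2, 1], [3, 0], [1, 2]]
print(free_marginal_kappa(counts))  # about 0.47

Because P_e depends only on the number of categories, this version cannot be dragged down by skewed prevalence or rater bias, which is the paradox of high agreement but low kappa that the abstract attributes to the Fleiss statistic.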