Huynh, Huynh; Saunders, Joseph C. – Journal of Educational Measurement, 1980
Single administration (beta-binomial) estimates for the raw agreement index p and the corrected-for-chance kappa index in mastery testing are compared with those based on two test administrations in terms of estimation bias and sampling variability. Bias is about 2.5 percent for p and 10 percent for kappa. (Author/RL)
Descriptors: Comparative Analysis, Error of Measurement, Mastery Tests, Mathematical Models
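The two-administration indices compared above have standard definitions: p is the proportion of examinees who receive the same mastery classification on both administrations, and kappa corrects p for the agreement expected by chance from the marginal classification rates. A minimal sketch follows, assuming dichotomous master/nonmaster decisions against a cut score; the test length, cut score, and score distributions are illustrative assumptions, and the single-administration beta-binomial estimator discussed in the article is not reproduced here.

```python
# Sketch: two-administration raw agreement p and corrected-for-chance kappa
# for mastery decisions. All parameter values are illustrative assumptions.
import numpy as np

def decision_consistency(form1_scores, form2_scores, cut):
    """Raw agreement p and kappa for master/nonmaster decisions on two forms."""
    m1 = np.asarray(form1_scores) >= cut   # mastery decisions, administration 1
    m2 = np.asarray(form2_scores) >= cut   # mastery decisions, administration 2

    # Raw agreement: proportion classified the same way on both administrations.
    p = np.mean(m1 == m2)

    # Chance agreement from the marginal mastery rates of each administration.
    p1, p2 = m1.mean(), m2.mean()
    p_chance = p1 * p2 + (1 - p1) * (1 - p2)

    # Corrected-for-chance kappa.
    kappa = (p - p_chance) / (1 - p_chance)
    return p, kappa

# Illustrative use with simulated scores (hypothetical 20-item test, cut of 15).
rng = np.random.default_rng(0)
domain = rng.beta(5, 2, size=500)          # hypothetical domain-score distribution
x1 = rng.binomial(20, domain)              # administration 1
x2 = rng.binomial(20, domain)              # parallel administration 2
p_hat, kappa_hat = decision_consistency(x1, x2, cut=15)
print(f"p = {p_hat:.3f}, kappa = {kappa_hat:.3f}")
```

In a comparison like the one described, two-administration values of this kind serve as the benchmark against which single-administration estimates are judged for estimation bias and sampling variability.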
Eignor, Daniel R.; Hambleton, Ronald K. – 1979
The purpose of the investigation was to examine the relationships among (1) test lengths, (2) shape of domain-score distributions, (3) advancement scores, and (4) several criterion-referenced test score reliability and validity indices. The study was conducted using computer simulation methods. The values of variables under study were set to be…
Descriptors: Comparative Analysis, Computer Assisted Testing, Criterion Referenced Tests, Cutting Scores
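A simulation design of this kind can be sketched with a beta-binomial model: domain scores are drawn from a beta distribution whose parameters control the distribution shape, parallel forms of a given length are generated binomially, and a decision-consistency index is computed at a chosen advancement score. The shapes, test lengths, and advancement scores below are illustrative assumptions, not the values used in the study, and the raw agreement index shown is only one of the indices such a study might examine.

```python
# Sketch of a beta-binomial simulation relating test length, domain-score
# shape, and advancement score to decision consistency. Illustrative values only.
import numpy as np

def simulate_consistency(n_items, alpha, beta, cut_prop, n_examinees=2000, seed=0):
    rng = np.random.default_rng(seed)
    domain = rng.beta(alpha, beta, size=n_examinees)   # domain-score distribution
    cut = int(np.ceil(cut_prop * n_items))             # advancement score in raw-score units
    x1 = rng.binomial(n_items, domain)                 # simulated parallel form 1
    x2 = rng.binomial(n_items, domain)                 # simulated parallel form 2
    return np.mean((x1 >= cut) == (x2 >= cut))         # raw agreement index p

# Vary test length and advancement score for one domain-score shape.
for n_items in (10, 20, 40):
    for cut_prop in (0.6, 0.8):
        p = simulate_consistency(n_items, alpha=6, beta=3, cut_prop=cut_prop)
        print(f"n_items={n_items:>2}  cut_prop={cut_prop:.1f}  p={p:.3f}")
```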