Peer reviewed
Wang, Jue; Engelhard, George, Jr.; Wolfe, Edward W. – Educational and Psychological Measurement, 2016
The number of performance assessments continues to increase around the world, and it is important to explore new methods for evaluating the quality of ratings obtained from raters. This study describes an unfolding model for examining rater accuracy. Accuracy is defined as the difference between observed and expert ratings. Dichotomous accuracy…
Descriptors: Evaluators, Accuracy, Performance Based Assessment, Models
Peer reviewed
Zhu, Xiaowen; Stone, Clement A. – Educational and Psychological Measurement, 2012
This study examined the relative effectiveness of Bayesian model comparison methods in selecting an appropriate graded response (GR) model for performance assessment applications. Three popular methods were considered: deviance information criterion (DIC), conditional predictive ordinate (CPO), and posterior predictive model checking (PPMC). Using…
Descriptors: Bayesian Statistics, Item Response Theory, Comparative Analysis, Models
Peer reviewed
Abedi, Jamal; Baker, Eva L. – Educational and Psychological Measurement, 1995
Results from a performance assessment in which 68 high school students wrote essays support the use of latent variable modeling for estimating reliability, concurrent validity, and generalizability of a scoring rubric. The latent variable modeling approach overcomes the limitations of certain conventional statistical techniques in handling…
Descriptors: Criteria, Essays, Estimation (Mathematics), Generalizability Theory