ERIC Number: ED583292
Record Type: Non-Journal
Publication Date: 2017
Pages: 118
Abstractor: As Provided
ISBN: 978-0-3556-1812-9
ISSN: N/A
EISSN: N/A
Available Date: N/A
The Impact of Rater Variability on Relationships among Different Effect-Size Indices for Inter-Rater Agreement between Human and Automated Essay Scoring
Yun, Jiyeo
ProQuest LLC, Ph.D. Dissertation, The Florida State University
Since researchers began investigating automatic scoring systems for writing assessments, they have examined the relationships between human and machine scoring and proposed evaluation criteria for inter-rater agreement. The main purpose of my study is to investigate the magnitudes of and relationships among indices for inter-rater agreement used to assess the relatedness of human and automated essay scoring, and to examine the impact of rater variability on inter-rater agreement. To carry out these investigations, my study consists of two parts: an empirical study and a simulation study. Based on the results of the empirical study, the overall effects for inter-rater agreement were 0.63 and 0.99 for exact and adjacent proportions of agreement, respectively, 0.48 for kappas, and between 0.75 and 0.78 for correlations. Additionally, significant differences existed between 6-point scales and the other scales (i.e., 3-, 4-, and 5-point scales) for correlations, kappas, and proportions of agreement. Moreover, based on the results for the simulated data, the highest agreements and lowest discrepancies were achieved in the matched rater distribution pairs. Specifically, the means of the exact and adjacent proportions of agreement, kappa and weighted kappa values, and correlations were 0.58, 0.95, 0.42, 0.78, and 0.78, respectively, while the average standardized mean difference was 0.0005 in the matched rater distribution pairs. Acceptable values for inter-rater agreement as evaluation criteria for automated essay scoring, the impact of rater variability on inter-rater agreement, and relationships among inter-rater agreement indices are discussed. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone (1-800-521-0600). Web page: http://www.proquest.com/en-US/products/dissertations/individuals.shtml.]
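The inter-rater agreement indices named in the abstract can be illustrated with a minimal Python sketch. This is not the dissertation's code; the function name, sample data, and pooled-standard-deviation convention for the standardized mean difference are assumptions made for illustration, and the weighted kappa is omitted for brevity.

```python
from collections import Counter

def agreement_indices(human, machine, scale):
    """Illustrative inter-rater agreement indices for two score vectors.

    human, machine: equal-length sequences of integer scores.
    scale: iterable of the possible score points (e.g. range(1, 7)).
    """
    n = len(human)
    pairs = list(zip(human, machine))
    # Exact agreement: identical scores; adjacent: within one scale point.
    exact = sum(h == m for h, m in pairs) / n
    adjacent = sum(abs(h - m) <= 1 for h, m in pairs) / n
    # Cohen's kappa: exact agreement corrected for chance agreement,
    # where chance is estimated from each rater's marginal distribution.
    ph, pm = Counter(human), Counter(machine)
    p_chance = sum(ph[k] * pm[k] for k in scale) / n**2
    kappa = (exact - p_chance) / (1 - p_chance)
    # Pearson correlation between the two score vectors.
    mh, mm = sum(human) / n, sum(machine) / n
    cov = sum((h - mh) * (m - mm) for h, m in pairs) / n
    sh = (sum((h - mh) ** 2 for h in human) / n) ** 0.5
    sm = (sum((m - mm) ** 2 for m in machine) / n) ** 0.5
    r = cov / (sh * sm)
    # Standardized mean difference, pooling the two standard deviations
    # (an assumed convention; the dissertation may define it differently).
    smd = (mh - mm) / (((sh**2 + sm**2) / 2) ** 0.5)
    return {"exact": exact, "adjacent": adjacent,
            "kappa": kappa, "r": r, "smd": smd}

# Hypothetical example: six essays scored by a human and a machine
# on a 4-point scale.
indices = agreement_indices([1, 2, 3, 4, 3, 2],
                            [1, 2, 4, 4, 2, 2],
                            range(1, 5))
```

On matched rater distributions, as the simulation study suggests, the two marginals coincide, so the standardized mean difference tends toward zero while the agreement indices rise.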
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://www.proquest.com/en-US/products/dissertations/individuals.shtml
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A