Dadi Ramesh; Suresh Kumar Sanampudi – European Journal of Education, 2024
Automatic essay scoring (AES) is an essential educational application of natural language processing. Automating this process alleviates the assessment burden while increasing the reliability and consistency of scoring. With advances in text-embedding libraries and neural network models, AES systems have achieved good results in terms of accuracy.…
Descriptors: Scoring, Essays, Writing Evaluation, Memory
Paul Deane; Duanli Yan; Katherine Castellano; Yigal Attali; Michelle Lamar; Mo Zhang; Ian Blood; James V. Bruno; Chen Li; Wenju Cui; Chunyi Ruan; Colleen Appel; Kofi James; Rodolfo Long; Farah Qureshi – ETS Research Report Series, 2024
This paper presents a multidimensional model of variation in writing quality, register, and genre in student essays, trained and tested via confirmatory factor analysis of 1.37 million essay submissions to ETS's digital writing service, Criterion®. The model was also validated with several other corpora, which indicated that it provides a…
Descriptors: Writing (Composition), Essays, Models, Elementary School Students