Danielle S. McNamara; Scott A. Crossley; Rod D. Roscoe; Laura K. Allen; Jianmin Dai – Grantee Submission, 2015
This study evaluates the use of a hierarchical classification approach to automated assessment of essays. Automated essay scoring (AES) generally relies on machine learning techniques that compute essay scores using a set of text variables. Unlike previous studies that rely on regression models, this study computes essay scores using a hierarchical…
Descriptors: Automation, Scoring, Essays, Persuasive Discourse

Peer reviewed
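
To make the idea in the abstract concrete, here is a minimal, hypothetical sketch of a two-stage ("hierarchical") classification pipeline for essay scores, as opposed to fitting a single regression model. It is not the authors' implementation: the feature set, the score range (1-6), the low/high band split, and the use of random forests are all illustrative assumptions.

```python
# Illustrative sketch only: a generic two-stage hierarchical classifier for
# essay scores, not the method from McNamara et al. (2015). The synthetic
# features, the 1-6 score scale, and the band split are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Stand-in data: each row is an essay described by text variables
# (e.g., length, lexical diversity, cohesion indices); scores run 1-6.
X = rng.normal(size=(300, 5))
y = rng.integers(1, 7, size=300)

# Stage 1: a coarse classifier predicts a score band (0 = scores 1-3, 1 = scores 4-6).
band = (y >= 4).astype(int)
coarse = RandomForestClassifier(random_state=0).fit(X, band)

# Stage 2: one fine-grained classifier per band predicts the exact score
# within that band.
fine = {
    b: RandomForestClassifier(random_state=0).fit(X[band == b], y[band == b])
    for b in (0, 1)
}

def predict_score(features):
    """Route an essay through the coarse classifier, then the band-specific one."""
    features = np.asarray(features).reshape(1, -1)
    b = int(coarse.predict(features)[0])
    return int(fine[b].predict(features)[0])

print(predict_score(X[0]))
```

The design point the abstract gestures at is that classification into discrete score levels (here, coarse bands refined by per-band classifiers) can replace a single regression over the same text variables; how the hierarchy is actually structured in the study is not recoverable from the truncated abstract.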
