Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 4
Since 2016 (last 10 years): 6
Since 2006 (last 20 years): 7
Descriptor
Computer Assisted Testing: 7
Writing Evaluation: 7
Essays: 5
Elementary School Students: 4
Scoring: 4
Writing Skills: 4
Automation: 3
Feedback (Response): 3
Grade 4: 3
Grade 5: 3
Prompting: 3
Source
American Educational Research…: 1
Educational Technology…: 1
Elementary School Journal: 1
International Journal of…: 1
Journal of Educational…: 1
Learning Disabilities: A…: 1
Reading and Writing: An…: 1
Author
Wilson, Joshua: 7
Chen, Dandan: 2
Hebert, Michael: 2
Andrada, Gilbert N.: 1
Myers, Matthew C.: 1
Olinghouse, Natalie G.: 1
Potter, Andrew: 1
Sandbank, Micheal P.: 1
Wen, Huijing: 1
Publication Type
Journal Articles: 7
Reports - Research: 7
Education Level
Elementary Education: 6
Middle Schools: 4
Grade 4: 3
Grade 5: 3
Intermediate Grades: 3
Secondary Education: 3
Grade 7: 2
Grade 8: 2
Junior High Schools: 2
Early Childhood Education: 1
Grade 3: 1
Chen, Dandan; Hebert, Michael; Wilson, Joshua – American Educational Research Journal, 2022
We used multivariate generalizability theory to examine the reliability of hand-scoring and automated essay scoring (AES) and to identify how these scoring methods could be used in conjunction to optimize writing assessment. Students (n = 113) included subsamples of struggling writers and non-struggling writers in Grades 3-5 drawn from a larger…
Descriptors: Reliability, Scoring, Essays, Automation
Myers, Matthew C.; Wilson, Joshua – International Journal of Artificial Intelligence in Education, 2023
This study evaluated the construct validity of six scoring traits of an automated writing evaluation (AWE) system called "MI Write." Persuasive essays (N = 100) written by students in grades 7 and 8 were randomized at the sentence-level using a script written with Python's NLTK module. Each persuasive essay was randomized 30 times (n =…
Descriptors: Construct Validity, Automation, Writing Evaluation, Algorithms
Potter, Andrew; Wilson, Joshua – Educational Technology Research and Development, 2021
Automated Writing Evaluation (AWE) provides automatic writing feedback and scoring to support student writing and revising. The purpose of the present study was to analyze a statewide implementation of an AWE software (n = 114,582) in grades 4-11. The goals of the study were to evaluate: (1) to what extent AWE features were used; (2) if equity and…
Descriptors: Computer Assisted Testing, Writing Evaluation, Feedback (Response), Scoring
Wilson, Joshua; Wen, Huijing – Elementary School Journal, 2022
This study investigated fourth and fifth graders' metacognitive knowledge about writing and its relationship to writing performance to help identify areas that might be leveraged when designing effective writing instruction. Students' metacognitive knowledge was probed using a 30-minute informative writing prompt requiring students to teach their…
Descriptors: Elementary School Students, Metacognition, Writing Attitudes, Writing (Composition)
Wilson, Joshua; Chen, Dandan; Sandbank, Micheal P.; Hebert, Michael – Journal of Educational Psychology, 2019
The present study examined issues pertaining to the reliability of writing assessment in the elementary grades, and among samples of struggling and nonstruggling writers. The present study also extended nascent research on the reliability and the practical applications of automated essay scoring (AES) systems in Response to Intervention frameworks…
Descriptors: Computer Assisted Testing, Automation, Scores, Writing Tests
Wilson, Joshua – Reading and Writing: An Interdisciplinary Journal, 2017
The present study examined growth in writing quality associated with feedback provided by an automated essay evaluation system called PEG Writing. Equal numbers of students with disabilities (SWD) and typically-developing students (TD) matched on prior writing achievement were sampled (n = 1196 total). Data from a subsample of students (n = 655)…
Descriptors: Computer Assisted Testing, Essays, Writing Evaluation, Writing Skills
Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N. – Learning Disabilities: A Contemporary Journal, 2014
The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…
Descriptors: Writing Skills, Writing Ability, Feedback (Response), Computer Mediated Communication