Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 5 |
Since 2016 (last 10 years) | 8 |
Since 2006 (last 20 years) | 8 |
Descriptor
Automation | 8 |
Writing Evaluation | 7 |
Elementary School Students | 6 |
Essay Tests | 4 |
Essays | 4 |
Feedback (Response) | 4 |
Scores | 4 |
Teacher Attitudes | 4 |
Computer Assisted Testing | 3 |
Grade 3 | 3 |
Grade 4 | 3 |
Source
Grantee Submission | 2 |
International Journal of Artificial Intelligence in Education | 2 |
American Educational Research Journal | 1 |
Assessment in Education: Principles, Policy & Practice | 1 |
Journal of Educational Computing Research | 1 |
Journal of Educational Psychology | 1 |
Author
Wilson, Joshua | 8 |
Beard, Gaysha | 2 |
Chen, Dandan | 2 |
Hebert, Michael | 2 |
Huang, Yue | 2 |
MacArthur, Charles A. | 2 |
Myers, Matthew C. | 2 |
Palermo, Corey | 2 |
Potter, Andrew | 1 |
Rodrigues, Jessica | 1 |
Roscoe, Rod D. | 1 |
Publication Type
Reports - Research | 8 |
Journal Articles | 6 |
Education Level
Elementary Education | 8 |
Middle Schools | 5 |
Intermediate Grades | 4 |
Early Childhood Education | 3 |
Grade 3 | 3 |
Grade 4 | 3 |
Grade 5 | 3 |
Primary Education | 3 |
Junior High Schools | 2 |
Secondary Education | 2 |
Grade 6 | 1 |
Location
Texas | 1 |
Wilson, Joshua; Myers, Matthew C.; Potter, Andrew – Assessment in Education: Principles, Policy & Practice, 2022
We investigated the promise of a novel approach to formative writing assessment at scale that involved an automated writing evaluation (AWE) system called MI Write. Specifically, we investigated elementary teachers' perceptions and implementation of MI Write and changes in students' writing performance in three genres from Fall to Spring…
Descriptors: Writing Evaluation, Formative Evaluation, Automation, Elementary School Teachers
Chen, Dandan; Hebert, Michael; Wilson, Joshua – American Educational Research Journal, 2022
We used multivariate generalizability theory to examine the reliability of hand-scoring and automated essay scoring (AES) and to identify how these scoring methods could be used in conjunction to optimize writing assessment. Students (n = 113) included subsamples of struggling writers and non-struggling writers in Grades 3-5 drawn from a larger…
Descriptors: Reliability, Scoring, Essays, Automation
Myers, Matthew C.; Wilson, Joshua – International Journal of Artificial Intelligence in Education, 2023
This study evaluated the construct validity of six scoring traits of an automated writing evaluation (AWE) system called "MI Write." Persuasive essays (N = 100) written by students in grades 7 and 8 were randomized at the sentence-level using a script written with Python's NLTK module. Each persuasive essay was randomized 30 times (n =…
Descriptors: Construct Validity, Automation, Writing Evaluation, Algorithms
Wilson, Joshua; Rodrigues, Jessica – Grantee Submission, 2020
The present study leveraged advances in automated essay scoring (AES) technology to explore a proof of concept for a writing screener using the "Project Essay Grade" (PEG) program. First, the study investigated the extent to which an AES-scored multi-prompt writing screener accurately classified students as at risk of failing a Common…
Descriptors: Writing Tests, Screening Tests, Classification, Accuracy
Wilson, Joshua; Huang, Yue; Palermo, Corey; Beard, Gaysha; MacArthur, Charles A. – International Journal of Artificial Intelligence in Education, 2021
This study examined a naturalistic, districtwide implementation of an automated writing evaluation (AWE) software program called "MI Write" in elementary schools. We specifically examined the degree to which aspects of MI Write were implemented, teacher and student attitudes towards MI Write, and whether MI Write usage along with other…
Descriptors: Automation, Writing Evaluation, Feedback (Response), Computer Software
Wilson, Joshua; Huang, Yue; Palermo, Corey; Beard, Gaysha; MacArthur, Charles A. – Grantee Submission, 2021
This study examined a naturalistic, districtwide implementation of an automated writing evaluation (AWE) software program called "MI Write" in elementary schools. We specifically examined the degree to which aspects of MI Write were implemented, teacher and student attitudes towards MI Write, and whether MI Write usage along with other…
Descriptors: Automation, Writing Evaluation, Feedback (Response), Computer Software
Wilson, Joshua; Chen, Dandan; Sandbank, Micheal P.; Hebert, Michael – Journal of Educational Psychology, 2019
The present study examined issues pertaining to the reliability of writing assessment in the elementary grades, and among samples of struggling and nonstruggling writers. The present study also extended nascent research on the reliability and the practical applications of automated essay scoring (AES) systems in Response to Intervention frameworks…
Descriptors: Computer Assisted Testing, Automation, Scores, Writing Tests
Wilson, Joshua; Roscoe, Rod D. – Journal of Educational Computing Research, 2020
The present study extended research on the effectiveness of automated writing evaluation (AWE) systems. Sixth graders were randomly assigned by classroom to an AWE condition that used "Project Essay Grade Writing" (n = 56) or a word-processing condition that used Google Docs (n = 58). Effectiveness was evaluated using multiple metrics:…
Descriptors: Automation, Writing Evaluation, Feedback (Response), Instructional Effectiveness