Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 9 |
Since 2016 (last 10 years) | 14 |
Since 2006 (last 20 years) | 16 |
Author
Wilson, Joshua | 16 |
Myers, Matthew C. | 3 |
Olinghouse, Natalie G. | 3 |
Potter, Andrew | 3 |
Beard, Gaysha | 2 |
Chen, Dandan | 2 |
Hebert, Michael | 2 |
Huang, Yue | 2 |
MacArthur, Charles A. | 2 |
Palermo, Corey | 2 |
Roscoe, Rod D. | 2 |
Publication Type
Journal Articles | 15 |
Reports - Research | 15 |
Reports - Evaluative | 1 |
Education Level
Elementary Education | 11 |
Middle Schools | 8 |
Secondary Education | 6 |
Grade 5 | 5 |
Intermediate Grades | 5 |
Junior High Schools | 5 |
Grade 4 | 4 |
Early Childhood Education | 2 |
Grade 3 | 2 |
Grade 6 | 2 |
Grade 7 | 2 |
Location
Texas | 1 |
Assessments and Surveys
National Assessment of Educational Progress | 1 |
Wilson, Joshua; Myers, Matthew C.; Potter, Andrew – Assessment in Education: Principles, Policy & Practice, 2022
We investigated the promise of a novel approach to formative writing assessment at scale that involved an automated writing evaluation (AWE) system called MI Write. Specifically, we investigated elementary teachers' perceptions and implementation of MI Write and changes in students' writing performance in three genres from Fall to Spring…
Descriptors: Writing Evaluation, Formative Evaluation, Automation, Elementary School Teachers
Chen, Dandan; Hebert, Michael; Wilson, Joshua – American Educational Research Journal, 2022
We used multivariate generalizability theory to examine the reliability of hand-scoring and automated essay scoring (AES) and to identify how these scoring methods could be used in conjunction to optimize writing assessment. Students (n = 113) included subsamples of struggling writers and non-struggling writers in Grades 3-5 drawn from a larger…
Descriptors: Reliability, Scoring, Essays, Automation
Deane, Paul; Wilson, Joshua; Zhang, Mo; Li, Chen; van Rijn, Peter; Guo, Hongwen; Roth, Amanda; Winchester, Eowyn; Richter, Theresa – International Journal of Artificial Intelligence in Education, 2021
Educators need actionable information about student progress during the school year. This paper explores an approach to this problem in the writing domain that combines three measurement approaches intended for use in interim-assessment fashion: scenario-based assessments (SBAs), to simulate authentic classroom tasks, automated writing evaluation…
Descriptors: Vignettes, Writing Evaluation, Writing Improvement, Progress Monitoring
Myers, Matthew C.; Wilson, Joshua – International Journal of Artificial Intelligence in Education, 2023
This study evaluated the construct validity of six scoring traits of an automated writing evaluation (AWE) system called "MI Write." Persuasive essays (N = 100) written by students in grades 7 and 8 were randomized at the sentence-level using a script written with Python's NLTK module. Each persuasive essay was randomized 30 times (n =…
Descriptors: Construct Validity, Automation, Writing Evaluation, Algorithms
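The sentence-level randomization described in the Myers and Wilson (2023) abstract above can be sketched in a few lines of Python. This is an illustrative sketch only, not the authors' script: it assumes NLTK's Punkt tokenizer has been downloaded (nltk.download("punkt")), that each essay is a plain-text string, and that 30 shuffled versions per essay are wanted, mirroring the abstract. The function name randomize_sentences is a placeholder introduced here.

# Illustrative sketch of sentence-level randomization with NLTK (not the study's actual script).
import random
import nltk

def randomize_sentences(essay_text, n_versions=30, seed=0):
    """Return n_versions copies of the essay with sentence order shuffled."""
    sentences = nltk.sent_tokenize(essay_text)   # split the essay into sentences
    rng = random.Random(seed)                    # seeded for reproducibility
    versions = []
    for _ in range(n_versions):
        shuffled = sentences[:]                  # copy so the original order is preserved
        rng.shuffle(shuffled)                    # permute sentence order
        versions.append(" ".join(shuffled))
    return versions

# Example: 30 sentence-shuffled variants of one persuasive essay.
variants = randomize_sentences("First sentence. Second sentence. Third sentence.")

Each shuffled variant could then be submitted to the AWE system and its trait scores compared with those of the original essay, which is the kind of comparison the construct-validity study describes.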
Potter, Andrew; Wilson, Joshua – Educational Technology Research and Development, 2021
Automated Writing Evaluation (AWE) provides automatic writing feedback and scoring to support student writing and revising. The purpose of the present study was to analyze a statewide implementation of an AWE software (n = 114,582) in grades 4-11. The goals of the study were to evaluate: (1) to what extent AWE features were used; (2) if equity and…
Descriptors: Computer Assisted Testing, Writing Evaluation, Feedback (Response), Scoring
Wilson, Joshua; Huang, Yue; Palermo, Corey; Beard, Gaysha; MacArthur, Charles A. – International Journal of Artificial Intelligence in Education, 2021
This study examined a naturalistic, districtwide implementation of an automated writing evaluation (AWE) software program called "MI Write" in elementary schools. We specifically examined the degree to which aspects of MI Write were implemented, teacher and student attitudes towards MI Write, and whether MI Write usage along with other…
Descriptors: Automation, Writing Evaluation, Feedback (Response), Computer Software
Wilson, Joshua; Potter, Andrew; Cordero, Tania Cruz; Myers, Matthew C. – Innovation in Language Learning and Teaching, 2023
Purpose: This study presents results from a pilot intervention that integrated self-regulation through reflection and goal setting with automated writing evaluation (AWE) technology to improve students' writing outcomes. Methods: We employed a single-group pretest-posttest design. All students in Grades 5-8 (N = 56) from one urban, all female,…
Descriptors: Goal Orientation, Writing Instruction, Writing Evaluation, Pilot Projects
Wilson, Joshua; Wen, Huijing – Elementary School Journal, 2022
This study investigated fourth and fifth graders' metacognitive knowledge about writing and its relationship to writing performance to help identify areas that might be leveraged when designing effective writing instruction. Students' metacognitive knowledge was probed using a 30-minute informative writing prompt requiring students to teach their…
Descriptors: Elementary School Students, Metacognition, Writing Attitudes, Writing (Composition)
Wilson, Joshua; Chen, Dandan; Sandbank, Micheal P.; Hebert, Michael – Journal of Educational Psychology, 2019
The present study examined issues pertaining to the reliability of writing assessment in the elementary grades, and among samples of struggling and nonstruggling writers. The present study also extended nascent research on the reliability and the practical applications of automated essay scoring (AES) systems in Response to Intervention frameworks…
Descriptors: Computer Assisted Testing, Automation, Scores, Writing Tests
Troia, Gary A.; Olinghouse, Natalie G.; Zhang, Mingcai; Wilson, Joshua; Stewart, Kelly A.; Mo, Ya; Hawkins, Lisa – Reading and Writing: An Interdisciplinary Journal, 2018
We examined the degree to which content of states' writing standards and assessments (using measures of content range, frequency, balance, and cognitive complexity) and their alignment were related to student writing achievement on the 2007 National Assessment of Educational Progress (NAEP), while controlling for student, school, and state…
Descriptors: State Standards, Academic Standards, Writing Instruction, Writing Evaluation
Wilson, Joshua; Roscoe, Rod D. – Journal of Educational Computing Research, 2020
The present study extended research on the effectiveness of automated writing evaluation (AWE) systems. Sixth graders were randomly assigned by classroom to an AWE condition that used "Project Essay Grade Writing" (n = 56) or a word-processing condition that used Google Docs (n = 58). Effectiveness was evaluated using multiple metrics:…
Descriptors: Automation, Writing Evaluation, Feedback (Response), Instructional Effectiveness
Wilson, Joshua – Reading and Writing: An Interdisciplinary Journal, 2017
The present study examined growth in writing quality associated with feedback provided by an automated essay evaluation system called PEG Writing. Equal numbers of students with disabilities (SWD) and typically-developing students (TD) matched on prior writing achievement were sampled (n = 1196 total). Data from a subsample of students (n = 655)…
Descriptors: Computer Assisted Testing, Essays, Writing Evaluation, Writing Skills
Roscoe, Rod D.; Wilson, Joshua; Johnson, Adam C.; Mayra, Christopher R. – Grantee Submission, 2017
Automated writing evaluation (AWE) is a popular form of educational technology designed to supplement writing instruction and feedback, yet research on the effectiveness of AWE has observed mixed findings. The current study considered how students' perceptions of automated essay scoring and feedback influenced their writing performance, revising…
Descriptors: Student Attitudes, Writing Instruction, Writing Evaluation, Feedback (Response)
Olinghouse, Natalie G.; Wilson, Joshua – Reading and Writing: An Interdisciplinary Journal, 2013
The purpose of this study was to examine the role of vocabulary in writing across three genres. Fifth graders (N = 105) wrote three compositions: story, persuasive, and informative. Each composition revolved around the topic of outer space to control for background knowledge. Written compositions were scored for holistic writing quality and…
Descriptors: Writing (Composition), Grade 5, Vocabulary, Story Telling