Showing all 13 results
Peer reviewed
Latifi, Syed; Gierl, Mark – Language Testing, 2021
An automated essay scoring (AES) program is a software system that uses techniques from corpus and computational linguistics and machine learning to grade essays. In this study, we aimed to describe and evaluate particular language features of Coh-Metrix for a novel AES program that would score junior and senior high school students' essays from…
Descriptors: Writing Evaluation, Computer Assisted Testing, Scoring, Essays
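The entry above describes a feature-based approach to automated essay scoring: linguistic indices feed a machine-learning model trained against human scores. As a rough illustration only (Coh-Metrix is a separate tool whose indices are not reproduced here), a minimal sketch of that general pipeline, with invented proxy features and scikit-learn, might look like this:

```python
# Illustrative sketch of a feature-based AES pipeline (not the study's code).
# Coh-Metrix indices are replaced with crude proxy features for illustration.
import numpy as np
from sklearn.linear_model import Ridge

def text_features(essay: str) -> list[float]:
    """Crude stand-ins for descriptive Coh-Metrix-style indices."""
    words = essay.split() or [""]
    sentences = [s for s in essay.split(".") if s.strip()] or [essay]
    return [
        len(words),                                    # essay length in words
        len(sentences),                                # number of sentences
        len(words) / len(sentences),                   # mean sentence length
        sum(len(w) for w in words) / len(words),       # mean word length
        len({w.lower() for w in words}) / len(words),  # type-token ratio
    ]

def train_aes(essays: list[str], human_scores: list[float]) -> Ridge:
    """Fit a simple regression model mapping features to human scores."""
    X = np.array([text_features(e) for e in essays])
    y = np.array(human_scores)
    return Ridge(alpha=1.0).fit(X, y)
```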
Yi Gui – ProQuest LLC, 2024
This study explores the use of transfer learning, a machine learning technique for natural language processing (NLP), to create generic automated essay scoring (AES) models, providing instant online scoring for statewide writing assessments in K-12 education. The goal is to develop an instant online scorer that is generalizable to any prompt, addressing the…
Descriptors: Writing Tests, Natural Language Processing, Writing Evaluation, Scoring
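The record above concerns transfer learning for prompt-generic AES. One common way to realize that idea, fine-tuning a pretrained transformer encoder as a single-output score regressor over essays pooled across prompts, is sketched below; the base model, libraries, and settings are assumptions for illustration and not the dissertation's actual setup.

```python
# Hypothetical sketch: fine-tune a pretrained encoder as a prompt-generic
# essay-score regressor (transfer learning); not the dissertation's code.
import torch
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

MODEL = "bert-base-uncased"  # assumed base model
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL, num_labels=1, problem_type="regression")

class EssayDataset(torch.utils.data.Dataset):
    """Wraps essays pooled across prompts with their human scores."""
    def __init__(self, essays, scores):
        self.enc = tokenizer(essays, truncation=True, padding=True)
        self.scores = scores
    def __len__(self):
        return len(self.scores)
    def __getitem__(self, i):
        item = {k: torch.tensor(v[i]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(float(self.scores[i]))
        return item

def finetune(essays, scores):
    args = TrainingArguments(output_dir="aes-model", num_train_epochs=3,
                             per_device_train_batch_size=8)
    Trainer(model=model, args=args,
            train_dataset=EssayDataset(essays, scores)).train()
```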
Peer reviewed
Chen, Dandan; Hebert, Michael; Wilson, Joshua – American Educational Research Journal, 2022
We used multivariate generalizability theory to examine the reliability of hand-scoring and automated essay scoring (AES) and to identify how these scoring methods could be used in conjunction to optimize writing assessment. Students (n = 113) included subsamples of struggling writers and non-struggling writers in Grades 3-5 drawn from a larger…
Descriptors: Reliability, Scoring, Essays, Automation
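The study above applies multivariate generalizability theory. As a simplified, univariate illustration of the underlying machinery, the sketch below estimates variance components and a generalizability coefficient for a crossed persons-by-raters (p × r) design; it does not reproduce the multivariate analysis reported in the article.

```python
# Simplified univariate G-study sketch for a persons x raters (p x r) design;
# the article's multivariate analysis is not reproduced here.
import numpy as np

def g_study(scores: np.ndarray, n_raters_decision: int = 1) -> dict:
    """scores: matrix of shape (n_persons, n_raters), one score per cell."""
    n_p, n_r = scores.shape
    grand = scores.mean()
    person_means = scores.mean(axis=1)
    rater_means = scores.mean(axis=0)

    ms_p = n_r * np.sum((person_means - grand) ** 2) / (n_p - 1)
    ms_r = n_p * np.sum((rater_means - grand) ** 2) / (n_r - 1)
    resid = scores - person_means[:, None] - rater_means[None, :] + grand
    ms_pr = np.sum(resid ** 2) / ((n_p - 1) * (n_r - 1))

    var_pr = ms_pr                        # person-by-rater interaction + error
    var_p = max((ms_p - ms_pr) / n_r, 0)  # true-score (person) variance
    var_r = max((ms_r - ms_pr) / n_p, 0)  # rater severity variance

    # Relative generalizability coefficient for a decision study with
    # n_raters_decision raters scoring each essay.
    g_coef = var_p / (var_p + var_pr / n_raters_decision)
    return {"var_p": var_p, "var_r": var_r, "var_pr": var_pr, "g": g_coef}
```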
Peer reviewed
Myers, Matthew C.; Wilson, Joshua – International Journal of Artificial Intelligence in Education, 2023
This study evaluated the construct validity of six scoring traits of an automated writing evaluation (AWE) system called "MI Write." Persuasive essays (N = 100) written by students in grades 7 and 8 were randomized at the sentence level using a script written with Python's NLTK module. Each persuasive essay was randomized 30 times (n =…
Descriptors: Construct Validity, Automation, Writing Evaluation, Algorithms
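The abstract above notes that essays were scrambled at the sentence level with a script using Python's NLTK module. The authors' script is not reproduced in the record, so the following is only a plausible reconstruction of that kind of scramble, using nltk.sent_tokenize and a seeded shuffle:

```python
# Hedged reconstruction of a sentence-level scrambling script of the kind the
# abstract describes (NLTK-based); not the authors' actual code.
import random
import nltk

nltk.download("punkt", quiet=True)  # sentence tokenizer models
                                    # (newer NLTK versions may also need "punkt_tab")

def scramble_essay(essay: str, n_versions: int = 30, seed: int = 0) -> list[str]:
    """Return n_versions copies of the essay with sentence order shuffled."""
    sentences = nltk.sent_tokenize(essay)
    rng = random.Random(seed)
    versions = []
    for _ in range(n_versions):
        shuffled = sentences[:]
        rng.shuffle(shuffled)
        versions.append(" ".join(shuffled))
    return versions
```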
Peer reviewed
Wilson, Joshua; Wen, Huijing – Elementary School Journal, 2022
This study investigated fourth and fifth graders' metacognitive knowledge about writing and its relationship to writing performance to help identify areas that might be leveraged when designing effective writing instruction. Students' metacognitive knowledge was probed using a 30-minute informative writing prompt requiring students to teach their…
Descriptors: Elementary School Students, Metacognition, Writing Attitudes, Writing (Composition)
Peer reviewed
Zhang, Mo; Bennett, Randy E.; Deane, Paul; van Rijn, Peter W. – Educational Measurement: Issues and Practice, 2019
This study compared gender groups on the processes used in writing essays in an online assessment. Middle-school students from four grades responded to essays in two persuasive subgenres, argumentation and policy recommendation. Writing processes were inferred from four indicators extracted from students' keystroke logs. In comparison to males, on…
Descriptors: Gender Differences, Essays, Computer Assisted Testing, Persuasive Discourse
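The study above infers writing processes from four indicators extracted from keystroke logs, but the record does not name them. The sketch below computes a few commonly used illustrative indicators (total time, long pauses, deletion rate, inter-key interval) from an assumed log format; neither the event schema nor the indicator choices come from the study.

```python
# Illustrative keystroke-log indicators; the event format and the specific
# indicators are assumptions, not those used in the study.
from dataclasses import dataclass

@dataclass
class KeyEvent:
    time_ms: int   # timestamp of the keystroke
    kind: str      # "insert" or "delete"

def process_indicators(log: list[KeyEvent], pause_threshold_ms: int = 2000) -> dict:
    """Summarize a single student's keystroke log into process indicators."""
    if len(log) < 2:
        return {}
    gaps = [b.time_ms - a.time_ms for a, b in zip(log, log[1:])]
    deletions = sum(1 for e in log if e.kind == "delete")
    return {
        "total_time_min": (log[-1].time_ms - log[0].time_ms) / 60000,
        "long_pauses": sum(1 for g in gaps if g >= pause_threshold_ms),
        "deletion_rate": deletions / len(log),
        "mean_interkey_ms": sum(gaps) / len(gaps),
    }
```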
Peer reviewed
Gerard, Libby F.; Linn, Marcia – AERA Online Paper Repository, 2016
We investigate how technologies that automatically score student written essays and assign individualized guidance can support student writing and revision in science. We used the automated scoring tools to assign guidance for student written essays in an online science unit, and studied how students revised their essays based on the guidance and…
Descriptors: Science Instruction, Technical Writing, Revision (Written Composition), Grade 7
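The paper above pairs automated essay scores with individualized revision guidance. A toy sketch of that score-to-guidance mapping is shown below; the score bands and messages are invented for illustration and are not the study's rubric.

```python
# Toy sketch of mapping an automated essay score to revision guidance;
# the score bands and messages are invented, not taken from the study.
GUIDANCE = {
    1: "Add evidence from the unit to support your claim.",
    2: "Explain how your evidence connects to your claim.",
    3: "Distinguish your ideas from alternative explanations.",
    4: "Strengthen your conclusion by addressing a counterargument.",
}

def assign_guidance(auto_score: int) -> str:
    """Map an automated score band (1-4, invented here) to a revision hint."""
    return GUIDANCE.get(auto_score, GUIDANCE[1])
```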
Peer reviewed
Foxworth, Lauren L.; Hashey, Andrew; Sukhram, Diana P. – Reading & Writing Quarterly, 2019
In an age when students are increasingly expected to demonstrate technology-based writing proficiency, fluency challenges with word processing programs can pose a barrier to successful writing when students are asked to compose using these tools. The current study was designed to determine whether differences existed in typing fluency and digital…
Descriptors: Writing Skills, Students with Disabilities, Learning Disabilities, Word Processing
Peer reviewed
Shelley, Mack, Ed.; Akcay, Hakan, Ed.; Ozturk, Omer Tayfur, Ed. – International Society for Technology, Education, and Science, 2022
"Proceedings of International Conference on Research in Education and Science" includes full papers presented at the International Conference on Research in Education and Science (ICRES) which took place on March 24-27, 2022 in Antalya, Turkey. The aim of the conference is to offer opportunities to share ideas, to discuss theoretical and…
Descriptors: Educational Technology, Technology Uses in Education, Computer Peripherals, Equipment
Peer reviewed
Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N. – Learning Disabilities: A Contemporary Journal, 2014
The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…
Descriptors: Writing Skills, Writing Ability, Feedback (Response), Computer Mediated Communication
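The study above asks whether use of automated feedback is associated with gains in writing quality across revisions. A minimal sketch of computing first-to-final draft gains per essay is given below, under an assumed data layout (the column names are not from the study's dataset).

```python
# Minimal sketch of computing first-to-final draft score gains per essay;
# the column names and data layout are assumed, not from the study.
import pandas as pd

def revision_gains(df: pd.DataFrame) -> pd.Series:
    """df columns: student_id, essay_id, revision_number, auto_score."""
    ordered = df.sort_values("revision_number")
    grouped = ordered.groupby(["student_id", "essay_id"])["auto_score"]
    return (grouped.last() - grouped.first()).rename("gain")
```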
Peer reviewed
Ferster, Bill; Hammond, Thomas C.; Alexander, R. Curby; Lyman, Hunt – Journal of Interactive Learning Research, 2012
The hurried pace of the modern classroom does not permit formative feedback on writing assignments at the frequency or quality recommended by the research literature. One solution for increasing individual feedback to students is to incorporate some form of computer-generated assessment. This study explores the use of automated assessment of…
Descriptors: Feedback (Response), Scripts, Formative Evaluation, Essays
Peer reviewed
Hu, Xiangen, Ed.; Barnes, Tiffany, Ed.; Hershkovitz, Arnon, Ed.; Paquette, Luc, Ed. – International Educational Data Mining Society, 2017
The 10th International Conference on Educational Data Mining (EDM 2017) is held under the auspices of the International Educational Data Mining Society at the Optics Valley Kingdom Plaza Hotel, Wuhan, Hubei Province, in China. This year's conference features two invited talks by: Dr. Jie Tang, Associate Professor with the Department of Computer…
Descriptors: Data Analysis, Data Collection, Graphs, Data Use
Stamper, John, Ed.; Pardos, Zachary, Ed.; Mavrikis, Manolis, Ed.; McLaren, Bruce M., Ed. – International Educational Data Mining Society, 2014
The 7th International Conference on Educational Data Mining, held on July 4th-7th, 2014, at the Institute of Education, London, UK, is the leading international forum for high-quality research that mines large data sets in order to answer educational research questions that shed light on the learning process. These data sets may come from the traces…
Descriptors: Information Retrieval, Data Processing, Data Analysis, Data Collection