Showing all 8 results
Öncel, Püren; Flynn, Lauren E.; Sonia, Allison N.; Barker, Kennis E.; Lindsay, Grace C.; McClure, Caleb M.; McNamara, Danielle S.; Allen, Laura K. – Grantee Submission, 2021
Automated Writing Evaluation systems have been developed to help students improve their writing skills through the automated delivery of both summative and formative feedback. These systems have demonstrated strong potential in a variety of educational contexts; however, they remain limited in their personalization and scope. The purpose of the…
Descriptors: Computer Assisted Instruction, Writing Evaluation, Formative Evaluation, Summative Evaluation
Lynette Hazelton; Jessica Nastal; Norbert Elliot; Jill Burstein; Daniel F. McCaffrey – Grantee Submission, 2021
In writing studies research, automated writing evaluation technology is typically examined for a specific, often narrow purpose: to evaluate a particular writing improvement measure, to mine data for changes in writing performance, or to demonstrate the effectiveness of a single technology and accompanying validity arguments. This article adopts a…
Descriptors: Formative Evaluation, Writing Evaluation, Automation, Natural Language Processing
Peer reviewed
Full text available on ERIC (PDF)
Zhang, H.; Magooda, A.; Litman, D.; Correnti, R.; Wang, E.; Matsumura, L. C.; Howe, E.; Quintana, R. – Grantee Submission, 2019
Writing a good essay typically involves students revising an initial paper draft after receiving feedback. We present eRevise, a web-based writing and revising environment that uses natural language processing features generated for rubric-based essay scoring to trigger formative feedback messages regarding students' use of evidence in…
Descriptors: Formative Evaluation, Essays, Writing (Composition), Revision (Written Composition)
Allen, Laura K.; Likens, Aaron D.; McNamara, Danielle S. – Grantee Submission, 2018
The assessment of writing proficiency generally includes analyses of the specific linguistic and rhetorical features contained in the individual essays produced by students. However, researchers have recently proposed that an individual's ability to flexibly adapt the linguistic properties of their writing might more closely capture writing skill.…
Descriptors: Writing Evaluation, Writing Tests, Computer Assisted Testing, Writing Skills
Crossley, Scott A.; Kyle, Kristopher; McNamara, Danielle S. – Grantee Submission, 2015
This study investigates the relative efficacy of using linguistic micro-features, the aggregation of such features, and a combination of micro-features and aggregated features in developing automatic essay scoring (AES) models. Although the use of aggregated features is widespread in AES systems (e.g., e-rater; Intellimetric), very little…
Descriptors: Essays, Scoring, Feedback (Response), Writing Evaluation
Allen, Laura K.; Likens, Aaron D.; McNamara, Danielle S. – Grantee Submission, 2018
The assessment of argumentative writing generally includes analyses of the specific linguistic and rhetorical features contained in the individual essays produced by students. However, researchers have recently proposed that an individual's ability to flexibly adapt the linguistic properties of their writing may more accurately capture their…
Descriptors: Writing (Composition), Persuasive Discourse, Essays, Language Usage
Peer reviewed
Full text available on ERIC (PDF)
Jacovina, Matthew E.; McNamara, Danielle S. – Grantee Submission, 2017
In this chapter, we describe several intelligent tutoring systems (ITSs) designed to support student literacy through reading comprehension and writing instruction and practice. Although adaptive instruction can be a powerful tool in the literacy domain, developing these technologies poses significant challenges. For example, evaluating the…
Descriptors: Intelligent Tutoring Systems, Literacy Education, Educational Technology, Technology Uses in Education
Peer reviewed
Full text available on ERIC (PDF)
Roscoe, Rod D.; Varner, Laura K.; Crossley, Scott A.; McNamara, Danielle S. – Grantee Submission, 2013
Various computer tools have been developed to support educators' assessment of student writing, including automated essay scoring and automated writing evaluation systems. Research demonstrates that these systems exhibit relatively high scoring accuracy but uncertain instructional efficacy. Students' writing proficiency does not necessarily…
Descriptors: Writing Instruction, Intelligent Tutoring Systems, Computer Assisted Testing, Writing Evaluation