Publication Date
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 0 |
| Since 2007 (last 20 years) | 4 |
Descriptor
| Natural Language Processing | 4 |
| Intelligent Tutoring Systems | 3 |
| Scoring | 3 |
| Accuracy | 2 |
| Essays | 2 |
| High School Students | 2 |
| Tests | 2 |
| Writing Apprehension | 2 |
| Academic Achievement | 1 |
| Academic Persistence | 1 |
| Automation | 1 |
Author
| Crossley, Scott | 4 |
| McNamara, Danielle S. | 4 |
| Allen, Laura K. | 2 |
| Baker, Ryan | 1 |
| Barnes, Tiffany | 1 |
| Bergner, Yoav | 1 |
| D'Mello, Sidney | 1 |
| Davenport, Jodi | 1 |
| Jacovina, Matthew E. | 1 |
| Kyle, Kristopher | 1 |
| Mills, Caitlin | 1 |
Publication Type
| Reports - Research | 4 |
| Speeches/Meeting Papers | 4 |
Education Level
| High Schools | 2 |
| Secondary Education | 2 |
| Higher Education | 1 |
Location
| Arizona (Phoenix) | 1 |
| California | 1 |
Assessments and Surveys
| Writing Apprehension Test | 2 |
| Gates MacGinitie Reading Tests | 1 |
Crossley, Scott; Kyle, Kristopher; Davenport, Jodi; McNamara, Danielle S. – International Educational Data Mining Society, 2016
This study introduces the Constructed Response Analysis Tool (CRAT), a freely available tool to automatically assess student responses in online tutoring systems. The study tests CRAT on a dataset of chemistry responses collected in the ChemVLab+. The findings indicate that CRAT can differentiate and classify student responses based on semantic…
Descriptors: Intelligent Tutoring Systems, Chemistry, Natural Language Processing, High School Students
Allen, Laura K.; Mills, Caitlin; Jacovina, Matthew E.; Crossley, Scott; D'Mello, Sidney; McNamara, Danielle S. – Grantee Submission, 2016
Writing training systems have been developed to provide students with instruction and deliberate practice on their writing. Although these systems are generally successful in providing accurate scores, a common criticism is their lack of personalization and adaptive instruction. In particular, these systems tend to place the strongest emphasis on…
Descriptors: Learner Engagement, Psychological Patterns, Writing Instruction, Essays
Crossley, Scott; Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2015
This study investigates a new approach to automatically assessing essay quality that combines traditional assessments of textual features with new measures of student attributes such as demographic information, standardized test scores, and survey results. The results demonstrate that combining both text features and…
Descriptors: Automation, Scoring, Essays, Evaluation Methods
Crossley, Scott; McNamara, Danielle S.; Baker, Ryan; Wang, Yuan; Paquette, Luc; Barnes, Tiffany; Bergner, Yoav – International Educational Data Mining Society, 2015
Completion rates for massive open online courses (MOOCs) are notoriously low, but learner intent is an important factor. By studying students who drop out despite their intent to complete the MOOC, it may be possible to develop interventions to improve retention and learning outcomes. Previous research into predicting MOOC completion has focused…
Descriptors: Online Courses, Large Group Instruction, Information Retrieval, Data Analysis
