Publication Date
| Publication Date | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 2 |
| Since 2017 (last 10 years) | 6 |
| Since 2007 (last 20 years) | 21 |
Source
| Source | Count |
| --- | --- |
| Grantee Submission | 14 |
| International Educational Data Mining Society | 2 |
| Discourse Processes: A Multidisciplinary Journal | 1 |
| Journal of Educational Data Mining | 1 |
| Journal of Educational Psychology | 1 |
| Journal of Learning Analytics | 1 |
| Written Communication | 1 |
Publication Type
| Publication Type | Count |
| --- | --- |
| Reports - Research | 20 |
| Speeches/Meeting Papers | 10 |
| Journal Articles | 8 |
| Tests/Questionnaires | 4 |
| Reports - Evaluative | 1 |
Education Level
| Education Level | Count |
| --- | --- |
| High Schools | 21 |
| Secondary Education | 17 |
| Higher Education | 7 |
| Postsecondary Education | 7 |
| Junior High Schools | 2 |
| Elementary Education | 1 |
| Grade 10 | 1 |
| Grade 4 | 1 |
| Grade 5 | 1 |
| Grade 6 | 1 |
| Grade 7 | 1 |
Audience
| Audience | Count |
| --- | --- |
| Researchers | 1 |
| Teachers | 1 |
Location
| Location | Count |
| --- | --- |
| Arizona (Phoenix) | 2 |
| Arizona | 1 |
| California | 1 |
Laws, Policies, & Programs
Assessments and Surveys
| Assessments and Surveys | Count |
| --- | --- |
| Gates MacGinitie Reading Tests | 9 |
| Writing Apprehension Test | 1 |
Allen, Laura Kristen; Magliano, Joseph P.; McCarthy, Kathryn S.; Sonia, Allison N.; Creer, Sarah D.; McNamara, Danielle S. – Grantee Submission, 2021
The current study examined the extent to which the cohesion detected in readers' constructed responses to multiple documents was predictive of persuasive, source-based essay quality. Participants (N=95) completed multiple-documents reading tasks wherein they were prompted to think-aloud, self-explain, or evaluate the sources while reading a set of…
Descriptors: Reading Comprehension, Connected Discourse, Reader Response, Natural Language Processing
Sonia, Allison N.; Magliano, Joseph P.; McCarthy, Kathryn S.; Creer, Sarah D.; McNamara, Danielle S.; Allen, Laura K. – Grantee Submission, 2022
The constructed responses individuals generate while reading can provide insights into their coherence-building processes. The current study examined how the cohesion of constructed responses relates to performance on an integrated writing task. Participants (N = 95) completed a multiple document reading task wherein they were prompted to think…
Descriptors: Natural Language Processing, Connected Discourse, Reading Processes, Writing Skills
Sonia, Allison N.; Magliano, Joseph P.; McCarthy, Kathryn S.; Creer, Sarah D.; McNamara, Danielle S.; Allen, Laura K. – Discourse Processes: A Multidisciplinary Journal, 2022
The constructed responses individuals generate while reading can provide insights into their coherence-building processes. The current study examined how the cohesion of constructed responses relates to performance on an integrated writing task. Participants (N = 95) completed a multiple document reading task wherein they were prompted to think…
Descriptors: Natural Language Processing, Connected Discourse, Reading Processes, Writing Skills
Crossley, Scott; Kyle, Kristopher; Davenport, Jodi; McNamara, Danielle S. – International Educational Data Mining Society, 2016
This study introduces the Constructed Response Analysis Tool (CRAT), a freely available tool to automatically assess student responses in online tutoring systems. The study tests CRAT on a dataset of chemistry responses collected in the ChemVLab+. The findings indicate that CRAT can differentiate and classify student responses based on semantic…
Descriptors: Intelligent Tutoring Systems, Chemistry, Natural Language Processing, High School Students
Allen, Laura K.; Likens, Aaron D.; McNamara, Danielle S. – Grantee Submission, 2018
The assessment of writing proficiency generally includes analyses of the specific linguistic and rhetorical features contained in the singular essays produced by students. However, researchers have recently proposed that an individual's ability to flexibly adapt the linguistic properties of their writing might more closely capture writing skill.…
Descriptors: Writing Evaluation, Writing Tests, Computer Assisted Testing, Writing Skills
Allen, Laura K.; McNamara, Danielle S. – International Educational Data Mining Society, 2015
The current study investigates the degree to which the lexical properties of students' essays can inform stealth assessments of their vocabulary knowledge. In particular, we used indices calculated with the natural language processing tool, TAALES, to predict students' performance on a measure of vocabulary knowledge. To this end, two corpora were…
Descriptors: Vocabulary, Knowledge Level, Models, Natural Language Processing
Allen, Laura K.; Likens, Aaron D.; McNamara, Danielle S. – Grantee Submission, 2018
The assessment of argumentative writing generally includes analyses of the specific linguistic and rhetorical features contained in the individual essays produced by students. However, researchers have recently proposed that an individual's ability to flexibly adapt the linguistic properties of their writing may more accurately capture their…
Descriptors: Writing (Composition), Persuasive Discourse, Essays, Language Usage
Crossley, Scott; Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2015
This study investigates a new approach to automatically assessing essay quality that combines traditional approaches based on assessing textual features with new approaches that measure student attributes such as demographic information, standardized test scores, and survey results. The results demonstrate that combining both text features and…
Descriptors: Automation, Scoring, Essays, Evaluation Methods
Allen, Laura K.; Crossley, Scott A.; McNamara, Danielle S. – Grantee Submission, 2015
We investigated linguistic factors that relate to misalignment between students' and teachers' ratings of essay quality. Students (n = 126) wrote essays and rated the quality of their work. Teachers then provided their own ratings of the essays. Results revealed that students who were less accurate in their self-assessments produced essays that…
Descriptors: Essays, Scores, Natural Language Processing, Interrater Reliability
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2015
This study builds upon previous work aimed at developing a student model of reading comprehension ability within the intelligent tutoring system, iSTART. Currently, the system evaluates students' self-explanation performance using a local, sentence-level algorithm and does not adapt content based on reading ability. The current study leverages…
Descriptors: Reading Comprehension, Reading Skills, Natural Language Processing, Intelligent Tutoring Systems
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2014
In the current study, we utilize natural language processing techniques to examine relations between the linguistic properties of students' self-explanations and their reading comprehension skills. Linguistic features of students' aggregated self-explanations were analyzed using the Linguistic Inquiry and Word Count (LIWC) software. Results…
Descriptors: Natural Language Processing, Reading Comprehension, Linguistics, Predictor Variables
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2016
A commonly held belief among educators, researchers, and students is that high-quality texts are easier to read than low-quality texts, as they contain more engaging narrative and story-like elements. Interestingly, these assumptions have typically failed to be supported by the literature on writing. Previous research suggests that higher quality…
Descriptors: Role, Writing (Composition), Natural Language Processing, Hypothesis Testing
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Journal of Educational Psychology, 2016
A commonly held belief among educators, researchers, and students is that high-quality texts are easier to read than low-quality texts, as they contain more engaging narrative and story-like elements. Interestingly, these assumptions have typically failed to be supported by the literature on writing. Previous research suggests that higher quality…
Descriptors: Role, Writing (Composition), Natural Language Processing, Hypothesis Testing
Varner, Laura K.; Jackson, G. Tanner; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2013
This study expands upon an existing model of students' reading comprehension ability within an intelligent tutoring system. The current system evaluates students' natural language input using a local student model. We examine the potential to expand this model by assessing the linguistic features of self-explanations aggregated across entire…
Descriptors: Reading Comprehension, Intelligent Tutoring Systems, Natural Language Processing, Reading Ability
Crossley, Scott A.; Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Journal of Educational Data Mining, 2016
This study investigates a novel approach to automatically assessing essay quality that combines natural language processing approaches that assess text features with approaches that assess individual differences in writers such as demographic information, standardized test scores, and survey results. The results demonstrate that combining text…
Descriptors: Essays, Scoring, Writing Evaluation, Natural Language Processing
