Showing all 7 results
Peer reviewed
Ngoc My Bui; Jessie S. Barrot – Education and Information Technologies, 2025
With generative artificial intelligence (AI) tools' remarkable capabilities in understanding and generating meaningful content, intriguing questions have been raised about their potential as automated essay scoring (AES) systems. One such tool is ChatGPT, which is capable of scoring any written work based on predefined criteria. However,…
Descriptors: Artificial Intelligence, Natural Language Processing, Technology Uses in Education, Automation
Peer reviewed
PDF on ERIC
Hunkoog Jho; Minsu Ha – Journal of Baltic Science Education, 2024
This study aimed at examining the performance of generative artificial intelligence to extract argumentation elements from text. Thus, the researchers developed a web-based framework to provide automated assessment and feedback relying on a large language model, ChatGPT. The results produced by ChatGPT were compared to human experts across…
Descriptors: Feedback (Response), Artificial Intelligence, Persuasive Discourse, Models
Peer reviewed
PDF on ERIC
Wan, Qian; Crossley, Scott; Banawan, Michelle; Balyan, Renu; Tian, Yu; McNamara, Danielle; Allen, Laura – International Educational Data Mining Society, 2021
The current study explores the ability to predict argumentative claims in structurally annotated student essays to gain insights into the role of argumentation structure in the quality of persuasive writing. Our annotation scheme specified six types of argumentative components based on Toulmin's well-established model of argumentation. We…
Descriptors: Essays, Persuasive Discourse, Automation, Identification
McCaffrey, Daniel F.; Zhang, Mo; Burstein, Jill – Grantee Submission, 2022
Background: This exploratory writing analytics study uses argumentative writing samples from two performance contexts--standardized writing assessments and university English course writing assignments--to compare: (1) linguistic features in argumentative writing; and (2) relationships between linguistic characteristics and academic performance…
Descriptors: Persuasive Discourse, Academic Language, Writing (Composition), Academic Achievement
Peer reviewed
PDF on ERIC
Wan, Qian; Crossley, Scott; Allen, Laura; McNamara, Danielle – Grantee Submission, 2020
In this paper, we extracted content-based and structure-based features of text to predict human annotations for claims and nonclaims in argumentative essays. We compared Logistic Regression, Bernoulli Naive Bayes, Gaussian Naive Bayes, Linear Support Vector Classification, Random Forest, and Neural Networks to train classification models. Random…
Descriptors: Persuasive Discourse, Essays, Writing Evaluation, Natural Language Processing
Mozer, Reagan; Miratrix, Luke; Relyea, Jackie Eunjung; Kim, James S. – Annenberg Institute for School Reform at Brown University, 2021
In a randomized trial that collects text as an outcome, traditional approaches for assessing treatment impact require that each document first be manually coded for constructs of interest by human raters. An impact analysis can then be conducted to compare treatment and control groups, using the hand-coded scores as a measured outcome. This…
Descriptors: Scoring, Automation, Data Analysis, Natural Language Processing
Allen, Laura K.; Likens, Aaron D.; McNamara, Danielle S. – Grantee Submission, 2018
The assessment of argumentative writing generally includes analyses of the specific linguistic and rhetorical features contained in the individual essays produced by students. However, researchers have recently proposed that an individual's ability to flexibly adapt the linguistic properties of their writing may more accurately capture their…
Descriptors: Writing (Composition), Persuasive Discourse, Essays, Language Usage