Showing all 5 results
Peer reviewed
Geden, Michael; Emerson, Andrew; Carpenter, Dan; Rowe, Jonathan; Azevedo, Roger; Lester, James – International Journal of Artificial Intelligence in Education, 2021
Game-based learning environments are designed to provide effective and engaging learning experiences for students. Predictive student models use trace data extracted from students' in-game learning behaviors to unobtrusively generate early assessments of student knowledge and skills, equipping game-based learning environments with the capacity to…
Descriptors: Game Based Learning, Middle School Students, Microbiology, Secondary School Science
Peer reviewed
Katie Lai – College & Research Libraries, 2023
To explore whether artificial intelligence can be used to enhance library services, this study used ChatGPT to answer reference questions. An assessment rubric was used to evaluate how well ChatGPT handled different question types and difficulty levels. Overall, ChatGPT's performance was fair, but it performed poorly on information accuracy. It scored…
Descriptors: Artificial Intelligence, Technology Uses in Education, Library Services, Reference Services
Nicula, Bogdan; Perret, Cecile A.; Dascalu, Mihai; McNamara, Danielle S. – Grantee Submission, 2020
Open-ended comprehension questions are a common type of assessment used to evaluate how well students understand one of multiple documents. Our aim is to use natural language processing (NLP) to infer the level and type of inferencing within readers' answers to comprehension questions using linguistic and semantic features within their responses.…
Descriptors: Natural Language Processing, Taxonomy, Responses, Semantics
Peer reviewed
PDF on ERIC
Crossley, Scott; Kyle, Kristopher; Davenport, Jodi; McNamara, Danielle S. – International Educational Data Mining Society, 2016
This study introduces the Constructed Response Analysis Tool (CRAT), a freely available tool to automatically assess student responses in online tutoring systems. The study tests CRAT on a dataset of chemistry responses collected in the ChemVLab+. The findings indicate that CRAT can differentiate and classify student responses based on semantic…
Descriptors: Intelligent Tutoring Systems, Chemistry, Natural Language Processing, High School Students
Peer reviewed
Zehner, Fabian; Sälzer, Christine; Goldhammer, Frank – Educational and Psychological Measurement, 2016
Automatic coding of short text responses opens new doors in assessment. We implemented and integrated baseline methods of natural language processing and statistical modeling by means of software components that are available under open licenses. The accuracy of automatic text coding is demonstrated by using data collected in the "Programme…
Descriptors: Educational Assessment, Coding, Automation, Responses