Larranaga, Mikel; Aldabe, Itziar; Arruarte, Ana; Elorriaga, Jon A.; Maritxalar, Montse – IEEE Transactions on Learning Technologies, 2022
In a concept-learning scenario, any technology-supported learning system must provide students with mechanisms that help them acquire the target concepts. For such systems to succeed in this task, the development of didactic material is crucial--a hard task that could be alleviated by means…
Descriptors: Computer Assisted Testing, Science Tests, Multiple Choice Tests, Textbooks
Gombert, Sebastian; Di Mitri, Daniele; Karademir, Onur; Kubsch, Marcus; Kolbe, Hannah; Tautz, Simon; Grimm, Adrian; Bohm, Isabell; Neumann, Knut; Drachsler, Hendrik – Journal of Computer Assisted Learning, 2023
Background: Formative assessments are needed to monitor how student knowledge develops throughout a unit. Constructed response items, which require learners to formulate their own free-text responses, are well suited for testing their active knowledge. However, assessing such constructed responses in an automated fashion is a complex task…
Descriptors: Coding, Energy, Scientific Concepts, Formative Evaluation
Crossley, Scott; Kyle, Kristopher; Davenport, Jodi; McNamara, Danielle S. – International Educational Data Mining Society, 2016
This study introduces the Constructed Response Analysis Tool (CRAT), a freely available tool to automatically assess student responses in online tutoring systems. The study tests CRAT on a dataset of chemistry responses collected in the ChemVLab+. The findings indicate that CRAT can differentiate and classify student responses based on semantic…
Descriptors: Intelligent Tutoring Systems, Chemistry, Natural Language Processing, High School Students
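CRAT's actual linguistic indices are not reproduced in this abstract, so the following is only a minimal sketch of the general idea it describes: classifying free-text student responses by their semantic resemblance to labeled reference answers. The reference texts and labels here are invented for illustration, and plain bag-of-words cosine similarity stands in for CRAT's richer feature set.

```python
import math
from collections import Counter

def cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity between bag-of-words vectors of two texts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify_response(response: str, references: dict) -> str:
    """Assign the label of the most similar reference answer."""
    return max(references, key=lambda lbl: cosine_similarity(response, references[lbl]))

# Hypothetical reference answers for a single chemistry item.
references = {
    "correct": "the reaction rate increases because temperature raises molecular kinetic energy",
    "partial": "the reaction goes faster when it is hot",
    "incorrect": "the color of the solution changes",
}
label = classify_response(
    "higher temperature gives molecules more kinetic energy so the rate increases",
    references,
)
print(label)  # -> correct
```

A production system would replace word overlap with semantic features (embeddings, lexical sophistication measures) and validate against human raters, but the classify-by-nearest-reference structure is the same.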
Aldabe, Itziar; Maritxalar, Montse – IEEE Transactions on Learning Technologies, 2014
The work we present in this paper aims to help teachers create multiple-choice science tests. We focus on a scientific vocabulary-learning scenario taking place in a Basque-language educational environment. In this particular scenario, we explore the option of automatically generating Multiple-Choice Questions (MCQ) by means of Natural Language…
Descriptors: Science Tests, Test Construction, Computer Assisted Testing, Multiple Choice Tests
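The paper's corpus-based, Basque-language distractor selection is not reproduced in this abstract; as a toy illustration of the general pattern (automatically picking distractors that resemble the key), the sketch below uses string similarity from the standard library's `difflib`. The glossary words and item stem are invented; real MCQ generators use semantic similarity over a domain corpus rather than surface similarity.

```python
import difflib

def make_mcq(stem: str, key: str, vocabulary: list, n_distractors: int = 3) -> dict:
    """Build a multiple-choice item by choosing distractors that are
    superficially similar to the key. String similarity is a naive
    stand-in for corpus-based semantic similarity."""
    candidates = [w for w in vocabulary if w != key]
    distractors = difflib.get_close_matches(key, candidates, n=n_distractors, cutoff=0.0)
    return {"stem": stem, "options": sorted([key] + distractors)}

# Invented vocabulary from a hypothetical science glossary.
vocab = ["photosynthesis", "photoperiodism", "phototropism", "mitosis", "osmosis"]
item = make_mcq(
    "The process by which plants convert light into chemical energy is ___.",
    "photosynthesis",
    vocab,
)
print(item["options"])  # four options, key included
```

Sorting the options removes any positional cue about which one is the key; the `cutoff=0.0` simply guarantees enough distractors are returned from a small vocabulary.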
Wang, Hao-Chuan; Chang, Chun-Yen; Li, Tsai-Yen – Computers & Education, 2008
The work aims to improve the assessment of creative problem-solving in science education by employing language technologies and computational-statistical machine learning methods to grade students' natural language responses automatically. To evaluate constructs like creative problem-solving with validity, open-ended questions that elicit…
Descriptors: Interrater Reliability, Earth Science, Problem Solving, Grading