Showing all 13 results
Peer reviewed
Direct link
Christophe O. Soulage; Fabien Van Coppenolle; Fitsum Guebre-Egziabher – Advances in Physiology Education, 2024
Artificial intelligence (AI) has gained massive interest with the public release of the conversational AI "ChatGPT," but it also has become a matter of concern for academia as it can easily be misused. We performed a quantitative evaluation of the performance of ChatGPT on a medical physiology university examination. Forty-one answers…
Descriptors: Medical Students, Medical Education, Artificial Intelligence, Computer Software
Peer reviewed
PDF on ERIC Download full text
Marli Crabtree; Kenneth L. Thompson; Ellen M. Robertson – HAPS Educator, 2024
Research has suggested that changing one's answer on multiple-choice examinations is more likely to lead to positive academic outcomes. This study aimed to further understand the relationship between changing answer selections and item attributes, student performance, and time within a population of 158 first-year medical students enrolled in a…
Descriptors: Anatomy, Science Tests, Medical Students, Medical Education
Peer reviewed
Direct link
Gerd Kortemeyer; Julian Nöhl; Daria Onishchuk – Physical Review Physics Education Research, 2024
[This paper is part of the Focused Collection in Artificial Intelligence Tools in Physics Teaching and Physics Education Research.] Using a high-stakes thermodynamics exam as the sample (252 students, four multipart problems), we investigate the viability of four workflows for AI-assisted grading of handwritten student solutions. We find that the…
Descriptors: Grading, Physics, Science Instruction, Artificial Intelligence
Peer reviewed
Direct link
Xiong-Skiba, P.; Buckner, S.; Little, C.; Kovalskiy, A. – Physics Teacher, 2020
This paper reports our work on replacing lab report grading with post-online lab quizzes using Desire2Learn (D2L, an online course management system), specifically how we circumvented some of the limitations imposed by D2L, and the outcomes.
Descriptors: Grading, Physics, Science Tests, Computer Assisted Testing
Peer reviewed
Direct link
Zhang, Lishan; VanLehn, Kurt – Interactive Learning Environments, 2021
Despite their drawbacks, multiple-choice questions are an enduring feature in instruction because they can be answered more rapidly than open-response questions and are easily scored. However, it can be difficult to generate good incorrect choices (called "distractors"). We designed an algorithm to generate distractors from a…
Descriptors: Semantics, Networks, Multiple Choice Tests, Teaching Methods
Peer reviewed
Direct link
Pujayanto, Pujayanto; Budiharti, Rini; Adhitama, Egy; Nuraini, Niken Rizky Amalia; Putri, Hanung Vernanda – Physics Education, 2018
This research proposes the development of a web-based assessment system to identify students' misconceptions. The system, named WAS (web-based assessment system), can identify a student's misconception profile on linear kinematics automatically after the student has finished the test. The test instrument was developed and validated. Items were…
Descriptors: Misconceptions, Physics, Science Instruction, Databases
Peer reviewed
PDF on ERIC Download full text
Kim, Kerry J.; Meir, Eli; Pope, Denise S.; Wendel, Daniel – Journal of Educational Data Mining, 2017
Computerized classification of student answers offers the possibility of instant feedback and improved learning. Open response (OR) questions provide greater insight into student thinking and understanding than more constrained multiple choice (MC) questions, but development of automated classifiers is more difficult, often requiring training a…
Descriptors: Classification, Computer Assisted Testing, Multiple Choice Tests, Test Format
Mullis, Ina V. S., Ed.; Martin, Michael O., Ed.; von Davier, Matthias, Ed. – International Association for the Evaluation of Educational Achievement, 2021
TIMSS (Trends in International Mathematics and Science Study) is a long-standing international assessment of mathematics and science at the fourth and eighth grades that has been collecting trend data every four years since 1995. About 70 countries use TIMSS trend data for monitoring the effectiveness of their education systems in a global…
Descriptors: Achievement Tests, International Assessment, Science Achievement, Mathematics Achievement
Peer reviewed
Direct link
Alexander, Cara J.; Crescini, Weronika M.; Juskewitch, Justin E.; Lachman, Nirusha; Pawlina, Wojciech – Anatomical Sciences Education, 2009
The goals of our study were to determine the predictive value and usability of an audience response system (ARS) as a knowledge assessment tool in an undergraduate medical curriculum. Over a three year period (2006-2008), data were collected from first year didactic blocks in Genetics/Histology and Anatomy/Radiology (n = 42-50 per class). During…
Descriptors: Feedback (Response), Medical Education, Audience Response, Genetics
Peer reviewed
Browning, Mark E.; Lehman, James D. – Journal of Research in Science Teaching, 1988
Describes a computer program that presents four genetics problems to monitor the problem-solving process of college students. Identifies three main areas of difficulty: computational skills, determination of gametes, and application of previous learning to new situations. (Author/YP)
Descriptors: Biology, College Science, Computer Assisted Testing, Computer Software
Peer reviewed
Iona, Mario – Physics Teacher, 1989
Refutes 14 items from "Computer Test Bank for Heath Physical Science." Discusses the correct answers item by item. (YP)
Descriptors: Computer Assisted Testing, Computer Software, Computer Software Reviews, Item Banks
Peer reviewed
Science Teacher, 1989
Reviews seven software programs: (1) "Science Baseball: Biology" (testing a variety of topics); (2) "Wildways: Understanding Wildlife Conservation"; (3) "Earth Science Computer Test Bank"; (4) "Biology Computer Test Bank"; (5) "Computer Play & Learn Series" (a series of drill and test…
Descriptors: Computer Assisted Instruction, Computer Assisted Testing, Computer Software, Computer Software Reviews
Bock, H. Darrell – 1992
The hardware and software system used to create the National Opinion Research Center/Center for Research on Evaluation, Standards, and Student Testing (NORC/CRESST) item databases and test booklets for the 12th-grade science assessment are described. A general description of the capabilities of the system is given, with some specific information…
Descriptors: Computer Assisted Testing, Computer Graphics, Computer Software, Databases