Showing all 8 results
Peer reviewed
Direct link
Slepkov, A. D.; Van Bussel, M. L.; Fitze, K. M.; Burr, W. S. – SAGE Open, 2021
There is a broad literature on multiple-choice test development, both in terms of item-writing guidelines and of psychometric functionality as a measurement tool. However, most of the published literature concerns multiple-choice testing in the context of expert-designed, high-stakes standardized assessments, with little attention paid to the…
Descriptors: Foreign Countries, Undergraduate Students, Student Evaluation, Multiple Choice Tests
Peer reviewed
PDF on ERIC (full text)
Krell, Moritz; Khan, Samia; van Driel, Jan – Education Sciences, 2021
The development and evaluation of valid assessments of scientific reasoning are an integral part of research in science education. In the present study, we used the linear logistic test model (LLTM) to analyze how item features related to text complexity and the presence of visual representations influence the overall item difficulty of an…
Descriptors: Cognitive Processes, Difficulty Level, Science Tests, Logical Thinking
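For context on the model named in this abstract, here is the standard LLTM formulation (general notation, not details taken from the study itself): the linear logistic test model extends the Rasch model,

P(X_{vi} = 1) = \frac{\exp(\theta_v - \beta_i)}{1 + \exp(\theta_v - \beta_i)},

by decomposing each item's difficulty into a weighted sum of item-feature effects,

\beta_i = \sum_{j=1}^{k} q_{ij} \eta_j + c,

where \theta_v is the ability of person v, \beta_i is the difficulty of item i, q_{ij} encodes how strongly feature j (e.g., text complexity or the presence of a visual representation) applies to item i, and \eta_j is the estimated difficulty contribution of feature j.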
Peer reviewed
Direct link
Shin, Jinnie; Bulut, Okan; Gierl, Mark J. – Journal of Experimental Education, 2020
The arrangement of response options in multiple-choice (MC) items, especially the location of the most attractive distractor, is considered critical in constructing high-quality MC items. In the current study, a sample of 496 undergraduate students taking an educational assessment course was given three test forms consisting of the same items but…
Descriptors: Foreign Countries, Undergraduate Students, Multiple Choice Tests, Item Response Theory
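As background on the item response theory framework listed in the descriptors (the choice of model here is an assumption; the article may use a different parameterization), the three-parameter logistic model commonly fitted to multiple-choice items is

P(X_i = 1 \mid \theta) = c_i + (1 - c_i) \frac{\exp[a_i(\theta - b_i)]}{1 + \exp[a_i(\theta - b_i)]},

with item discrimination a_i, difficulty b_i, and pseudo-guessing parameter c_i. Comparing these parameter estimates across the three test forms is one way to test whether the location of the most attractive distractor changes how an item behaves.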
Peer reviewed
Direct link
Fuchs, Travis T.; Arsenault, Mike – School Science Review, 2017
Students, as well as teachers, often learn what makes sense to them, even when it is wrong. These misconceptions are a problem. The authors sought a quick, quantitative way of identifying student misconceptions in secondary science. Using the University of Toronto's National Biology Competition test data, this article presents a method of quickly…
Descriptors: Science Education, Secondary School Science, Misconceptions, Scientific Concepts
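The abstract does not spell out the method's criterion, but a common quantitative approach to spotting misconceptions in multiple-choice data is to flag any distractor chosen by an unusually large share of students, since a popular wrong answer suggests a shared misconception. A minimal sketch in Python (the threshold and data layout are assumptions, not drawn from the article):

from collections import Counter

def flag_misconceptions(responses, answer_key, threshold=0.25):
    """Flag distractors selected by more than `threshold` of students.

    responses:  item id -> list of options students selected
    answer_key: item id -> correct option
    Returns (item, distractor, fraction) tuples, worst first.
    """
    flags = []
    for item, picks in responses.items():
        counts = Counter(picks)
        n = len(picks)
        for option, count in counts.items():
            if option != answer_key[item] and count / n > threshold:
                flags.append((item, option, count / n))
    return sorted(flags, key=lambda f: -f[2])

# Example: 60% of the class chose distractor "C" on Q1, a candidate misconception.
responses = {"Q1": ["C", "C", "C", "A", "B"], "Q2": ["B", "B", "A", "B", "B"]}
answer_key = {"Q1": "A", "Q2": "B"}
print(flag_misconceptions(responses, answer_key))   # [('Q1', 'C', 0.6)]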
Peer reviewed
PDF on ERIC (full text)
Pachai, Matthew V.; DiBattista, David; Kim, Joseph A. – Canadian Journal for the Scholarship of Teaching and Learning, 2015
Multiple-choice item-writing guidelines are decidedly split on the use of "none of the above" (NOTA), with some authors discouraging and others advocating its use. Moreover, empirical studies of NOTA have produced mixed results. Generally, these studies have utilized NOTA as either the correct response or a distractor and assessed its effect…
Descriptors: Multiple Choice Tests, Test Items, Introductory Courses, Psychology
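Studies of NOTA effects typically compare classical item statistics across forms. As an illustration of the indices involved (a generic sketch, not the authors' actual analysis): item difficulty is the proportion of examinees answering correctly, and discrimination can be measured as the point-biserial correlation between item score and total score.

import statistics

def item_stats(item_scores, total_scores):
    """Classical item difficulty (p) and point-biserial discrimination.

    item_scores:  0/1 score on one item for each examinee
    total_scores: each examinee's total test score
    """
    n = len(item_scores)
    p = sum(item_scores) / n                  # difficulty: proportion correct
    mean_total = statistics.mean(total_scores)
    sd_total = statistics.pstdev(total_scores)
    mean_correct = statistics.mean(
        t for t, x in zip(total_scores, item_scores) if x == 1
    )
    # Point-biserial: Pearson r between a 0/1 item score and the total score.
    r_pb = (mean_correct - mean_total) / sd_total * (p / (1 - p)) ** 0.5
    return p, r_pb

item = [1, 1, 0, 1, 0, 0, 1, 1]
totals = [34, 30, 18, 28, 15, 20, 31, 26]
print(item_stats(item, totals))   # difficulty ~0.62, discrimination ~0.92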
Peer reviewed
Direct link
DiBattista, David; Sinnige-Egger, Jo-Anne; Fortuna, Glenda – Journal of Experimental Education, 2014
The authors assessed the effects of using "none of the above" as an option in a 40-item, general-knowledge multiple-choice test administered to undergraduate students. Examinees who selected "none of the above" were given an incentive to write the correct answer to the question posed. Using "none of the above" as the…
Descriptors: Multiple Choice Tests, Testing, Undergraduate Students, Test Items
Peer reviewed
Direct link
Kalas, Pamela; O'Neill, Angie; Pollock, Carol; Birol, Gulnur – CBE - Life Sciences Education, 2013
We have designed, developed, and validated a 17-question Meiosis Concept Inventory (Meiosis CI) to diagnose student misconceptions about meiosis, a fundamental concept in genetics. We targeted large introductory biology and genetics courses and used published methodology for question development, which included the validation of questions by…
Descriptors: Scientific Concepts, Misconceptions, Genetics, Introductory Courses
Velanoff, John – 1987
This report describes courseware for comprehensive computer-assisted testing and instruction. With this program, a personal computer can be used to: (1) generate multiple test versions to meet test objectives; (2) create study guides for self-directed learning; and (3) evaluate student and teacher performance. Numerous multiple-choice examples,…
Descriptors: Computer Assisted Instruction, Computer Assisted Testing, Computer Uses in Education, Courseware
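The report predates modern tooling, but the first capability it lists, generating multiple test versions that cover the same objectives, is straightforward to sketch. A minimal illustration in Python (the item-bank layout and names are hypothetical, not taken from the report):

import random

# Hypothetical item bank: each objective maps to a pool of interchangeable items.
ITEM_BANK = {
    "objective_1": ["Q1", "Q2", "Q3"],
    "objective_2": ["Q4", "Q5", "Q6"],
    "objective_3": ["Q7", "Q8", "Q9"],
}

def make_version(seed, items_per_objective=2):
    """Draw items for every objective, then shuffle; seeding keeps versions reproducible."""
    rng = random.Random(seed)
    version = []
    for pool in ITEM_BANK.values():
        version.extend(rng.sample(pool, items_per_objective))
    rng.shuffle(version)
    return version

# Three distinct but objective-balanced versions of the same test.
for v in range(1, 4):
    print(f"Version {v}:", make_version(seed=v))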