Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 2 |
Since 2016 (last 10 years) | 4 |
Since 2006 (last 20 years) | 7 |
Descriptor
Difficulty Level | 8 |
Foreign Countries | 8 |
Multiple Choice Tests | 8 |
Test Items | 7 |
Undergraduate Students | 4 |
Item Analysis | 3 |
Test Construction | 3 |
Biology | 2 |
Correlation | 2 |
Introductory Courses | 2 |
Misconceptions | 2 |
Source
Journal of Experimental Education | 2 |
CBE - Life Sciences Education | 1 |
Canadian Journal for the Scholarship of Teaching and Learning | 1 |
Education Sciences | 1 |
SAGE Open | 1 |
School Science Review | 1 |
Author
DiBattista, David | 2 |
Birol, Gulnur | 1 |
Bulut, Okan | 1 |
Burr, W. S. | 1 |
Fitze, K. M. | 1 |
Fortuna, Glenda | 1 |
Gierl, Mark J. | 1 |
van Driel, Jan | 1 |
Kalas, Pamela | 1 |
Kim, Joseph A. | 1 |
Krell, Moritz | 1 |
Publication Type
Journal Articles | 7 |
Reports - Research | 6 |
Reports - Descriptive | 1 |
Reports - Evaluative | 1 |
Tests/Questionnaires | 1 |
Education Level
Higher Education | 6 |
Postsecondary Education | 6 |
Secondary Education | 1 |
Location
Canada | 8 |
Australia | 1 |
United Kingdom | 1 |
Slepkov, A. D.; Van Bussel, M. L.; Fitze, K. M.; Burr, W. S. – SAGE Open, 2021
There is a broad literature on multiple-choice test development, both in terms of item-writing guidelines and psychometric functionality as a measurement tool. However, most of the published literature concerns multiple-choice testing in the context of expert-designed, high-stakes standardized assessments, with little attention being paid to the…
Descriptors: Foreign Countries, Undergraduate Students, Student Evaluation, Multiple Choice Tests
Krell, Moritz; Khan, Samia; van Driel, Jan – Education Sciences, 2021
The development and evaluation of valid assessments of scientific reasoning are an integral part of research in science education. In the present study, we used the linear logistic test model (LLTM) to analyze how item features related to text complexity and the presence of visual representations influence the overall item difficulty of an…
Descriptors: Cognitive Processes, Difficulty Level, Science Tests, Logical Thinking
Shin, Jinnie; Bulut, Okan; Gierl, Mark J. – Journal of Experimental Education, 2020
The arrangement of response options in multiple-choice (MC) items, especially the location of the most attractive distractor, is considered critical in constructing high-quality MC items. In the current study, a sample of 496 undergraduate students taking an educational assessment course was given three test forms consisting of the same items but…
Descriptors: Foreign Countries, Undergraduate Students, Multiple Choice Tests, Item Response Theory
Fuchs, Travis T.; Arsenault, Mike – School Science Review, 2017
Students, as well as teachers, often learn what makes sense to them, even when it is wrong. These misconceptions are a problem. The authors sought a quick, quantitative way of identifying student misconceptions in secondary science. Using the University of Toronto's National Biology Competition test data, this article presents a method of quickly…
Descriptors: Science Education, Secondary School Science, Misconceptions, Scientific Concepts
Pachai, Matthew V.; DiBattista, David; Kim, Joseph A. – Canadian Journal for the Scholarship of Teaching and Learning, 2015
Multiple choice writing guidelines are decidedly split on the use of "none of the above" (NOTA), with some authors discouraging and others advocating its use. Moreover, empirical studies of NOTA have produced mixed results. Generally, these studies have utilized NOTA as either the correct response or a distractor and assessed its effect…
Descriptors: Multiple Choice Tests, Test Items, Introductory Courses, Psychology
DiBattista, David; Sinnige-Egger, Jo-Anne; Fortuna, Glenda – Journal of Experimental Education, 2014
The authors assessed the effects of using "none of the above" as an option in a 40-item, general-knowledge multiple-choice test administered to undergraduate students. Examinees who selected "none of the above" were given an incentive to write the correct answer to the question posed. Using "none of the above" as the…
Descriptors: Multiple Choice Tests, Testing, Undergraduate Students, Test Items
Kalas, Pamela; O'Neill, Angie; Pollock, Carol; Birol, Gulnur – CBE - Life Sciences Education, 2013
We have designed, developed, and validated a 17-question Meiosis Concept Inventory (Meiosis CI) to diagnose student misconceptions on meiosis, which is a fundamental concept in genetics. We targeted large introductory biology and genetics courses and used published methodology for question development, which included the validation of questions by…
Descriptors: Scientific Concepts, Misconceptions, Genetics, Introductory Courses
Velanoff, John – 1987
This report describes courseware for comprehensive computer-assisted testing and instruction. With this program, a personal computer can be used to: (1) generate multiple test versions to meet test objectives; (2) create study guides for self-directed learning; and (3) evaluate student and teacher performance. Numerous multiple-choice examples,…
Descriptors: Computer Assisted Instruction, Computer Assisted Testing, Computer Uses in Education, Courseware