Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 0 |
Since 2016 (last 10 years) | 1 |
Since 2006 (last 20 years) | 1 |
Descriptor
Difficulty Level | 8 |
Multiple Choice Tests | 8 |
Test Items | 7 |
Test Construction | 5 |
Responses | 4 |
Test Format | 4 |
Test Interpretation | 3 |
Cognitive Processes | 2 |
Item Response Theory | 2 |
Literature Reviews | 2 |
Reading Tests | 2 |
Source
Educational Measurement:… | 2 |
Journal of Educational… | 2 |
Educational and Psychological… | 1 |
National Center for Education… | 1 |
Review of Educational Research | 1 |
Author
Albanese, Mark A. | 1 |
Bolden, Bernadine J. | 1 |
Bulut, Okan | 1 |
Donlon, Thomas F. | 1 |
Frisbie, David A. | 1 |
Garavaglia, Diane R. | 1 |
Gierl, Mark J. | 1 |
Gorin, Joanna S. | 1 |
Guo, Qi | 1 |
Knowles, Susan L. | 1 |
Pearson, P. David | 1 |
Publication Type
Information Analyses | 8 |
Journal Articles | 6 |
Reports - Research | 4 |
Reports - Evaluative | 3 |
Speeches/Meeting Papers | 2 |
Education Level
Elementary Secondary Education | 1 |
Assessments and Surveys
National Assessment of… | 1 |
Gierl, Mark J.; Bulut, Okan; Guo, Qi; Zhang, Xinxin – Review of Educational Research, 2017
Multiple-choice testing is considered one of the most effective and enduring forms of educational assessment in practice today. This study presents a comprehensive review of the literature on multiple-choice testing in education, focused specifically on the development, analysis, and use of the incorrect options, which are also…
Descriptors: Multiple Choice Tests, Difficulty Level, Accuracy, Error Patterns
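The analysis of incorrect options that Gierl et al. review is often operationalized as classical distractor analysis: for each option, compute the proportion of examinees choosing it and the mean total score of those examinees (weak distractors attract few examinees, or attract high scorers). A minimal sketch, using fabricated response data and an invented helper name:

```python
# Hypothetical distractor analysis for a single multiple-choice item.
# All responses, scores, and the function name are invented for illustration.
from collections import Counter

def distractor_stats(responses, total_scores):
    """For each option: proportion choosing it, and mean total score of choosers."""
    counts = Counter(responses)
    n = len(responses)
    stats = {}
    for option in sorted(counts):
        choosers = [s for r, s in zip(responses, total_scores) if r == option]
        stats[option] = {
            "proportion": counts[option] / n,
            "mean_score": sum(choosers) / len(choosers),
        }
    return stats

# Fabricated example: 8 examinees answering one item whose key is "B".
responses = ["B", "A", "B", "C", "B", "D", "A", "B"]
scores    = [9, 4, 8, 5, 10, 3, 6, 7]   # total test scores
stats = distractor_stats(responses, scores)
```

In this toy data, the keyed option "B" draws half the examinees and the highest-scoring ones, while each distractor draws lower scorers — the pattern a functioning item is expected to show.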

Donlon, Thomas F. – Journal of Educational Measurement, 1981
Scores within the chance range are differentiated, "uninterpretable" scores being those that demonstrate randomness (broadly defined) by failing to achieve typical levels of correlation with group-determined difficulty. The relevant literature is reviewed. Finally, randomness and uninterpretability are examined in light of the…
Descriptors: Difficulty Level, Guessing (Tests), Multiple Choice Tests, Scores

Albanese, Mark A. – Educational Measurement: Issues and Practice, 1993
A comprehensive review is given of evidence bearing on the recommendation to avoid complex multiple-choice (CMC) items. Avoiding Type K items (four primary responses and five secondary choices) seems warranted, but the evidence against CMC formats in general is less clear. (SLD)
Descriptors: Cues, Difficulty Level, Multiple Choice Tests, Responses

Gorin, Joanna S. – Journal of Educational Measurement, 2005
Based on a previously validated cognitive processing model of reading comprehension, this study experimentally examines potential generative components of text-based multiple-choice reading comprehension test questions. Previous research (Embretson & Wetzel, 1987; Gorin & Embretson, 2005; Sheehan & Ginther, 2001) shows text encoding and decision…
Descriptors: Reaction Time, Reading Comprehension, Difficulty Level, Test Items

Knowles, Susan L.; Welch, Cynthia A. – Educational and Psychological Measurement, 1992
A meta-analysis of the difficulty and discrimination of the "none-of-the-above" (NOTA) test option was conducted with 12 articles (20 effect sizes) for difficulty and 7 studies (11 effect sizes) for discrimination. Findings indicate that using the NOTA option does not result in items of lesser quality. (SLD)
Descriptors: Difficulty Level, Effect Size, Meta Analysis, Multiple Choice Tests
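The effect sizes Knowles and Welch aggregate are standardized mean differences between NOTA and conventional items. A minimal sketch of that statistic (Cohen's d with a pooled standard deviation), using fabricated item-difficulty values:

```python
# Standardized mean difference (Cohen's d, pooled SD) between two groups
# of item statistics. The data below are invented for illustration only.
import math

def cohens_d(group1, group2):
    """(mean1 - mean2) / pooled sample standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    var1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    var2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled = math.sqrt(((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

# Fabricated proportion-correct values for NOTA vs. standard-option items.
nota     = [0.55, 0.60, 0.58, 0.52]
standard = [0.62, 0.65, 0.60, 0.63]
d = cohens_d(nota, standard)
```

A negative d here would mean the NOTA items were answered correctly less often, i.e. were more difficult; a meta-analysis averages such effect sizes across studies, weighting by sample size.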

Frisbie, David A. – Educational Measurement: Issues and Practice, 1992
Literature related to the multiple true-false (MTF) item format is reviewed. Each answer cluster of an MTF item may contain several true statements, and the correctness of each is judged independently. MTF tests appear efficient and reliable, although they are somewhat more difficult for examinees than multiple-choice items. (SLD)
Descriptors: Achievement Tests, Difficulty Level, Literature Reviews, Multiple Choice Tests

Bolden, Bernadine J.; Stoddard, Ann – 1980
This study examined how two ways of phrasing questions, crossed with three styles of expository writing, affected the test performance of elementary school children. Multiple-choice questions were developed for sets of passages written in three different syntactic structures and at different levels of difficulty. The…
Descriptors: Difficulty Level, Elementary Education, Kernel Sentences, Multiple Choice Tests

Pearson, P. David; Garavaglia, Diane R. – National Center for Education Statistics, 2003
The purpose of this essay is to explore both what is known and what needs to be learned about the information value of performance items "when they are used in large scale assessments." Within the context of the National Assessment of Educational Progress (NAEP), there is substantial motivation for answering these questions. Over the…
Descriptors: Measurement, National Competency Tests, Test Items, Performance