Kane, Michael T.; Moloney, James M. – 1974
Gilman and Ferry have shown that when the student's score on a multiple choice test is the total number of responses necessary to get all items correct, substantial increases in reliability can occur. In contrast, similar procedures giving partial credit on multiple choice items have resulted in relatively small gains in reliability. The analysis…
Descriptors: Feedback, Guessing (Tests), Multiple Choice Tests, Response Style (Tests)
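The answer-until-correct scoring rule summarized above can be illustrated with a short sketch; the function name and data layout are illustrative assumptions, not Kane and Moloney's (or Gilman and Ferry's) implementation.

```python
# Sketch of answer-until-correct scoring: the score is the total number of
# responses an examinee needed before every item was answered correctly.
# Names and data layout are illustrative, not taken from the paper.

def answer_until_correct_score(attempts_per_item):
    """attempts_per_item[i] is how many responses were made on item i
    up to and including the keyed answer; lower totals are better."""
    return sum(attempts_per_item)

if __name__ == "__main__":
    examinee = [1, 1, 3, 2, 1]  # five 4-option items
    print(answer_until_correct_score(examinee))  # -> 8
```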
Moodie, Allan G. – 1972
Percentile scores for Vancouver students in grades 9, 10, 11 and 12 on the AAHPER Cooperative Physical Education Tests are presented. Two of the six forms of the tests were used in these administrations. Every form consists of 60 multiple-choice questions to be completed in 40 minutes. A single score, based on the number of questions answered…
Descriptors: Multiple Choice Tests, Norms, Physical Education, Scores
Schmeiser, Cynthia Board; Whitney, Douglas R. – 1973
Violations of four selected principles of writing multiple-choice items were introduced into an undergraduate religion course mid-term examination. Three of the flaws significantly increased test difficulty. KR-20 values were lower for all of the tests containing the flawed items than for the "good" versions of the items but significantly so…
Descriptors: Item Analysis, Multiple Choice Tests, Research Reports, Test Construction
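The KR-20 coefficient cited above is the standard Kuder-Richardson reliability formula for dichotomously scored items, r = (k/(k-1))(1 − Σp·q / σ²_total). A minimal computation sketch follows, assuming a 0/1 response matrix and population variance; the data are invented for illustration and are not from the study.

```python
# Minimal KR-20 sketch for dichotomously scored (0/1) responses.
# Rows = examinees, columns = items; population variance is used.
# Data and names are illustrative, not taken from the study.

def kr20(responses):
    k = len(responses[0])                      # number of items
    n = len(responses)                         # number of examinees
    totals = [sum(row) for row in responses]
    mean_total = sum(totals) / n
    var_total = sum((t - mean_total) ** 2 for t in totals) / n
    pq_sum = 0.0
    for i in range(k):
        p = sum(row[i] for row in responses) / n   # proportion correct on item i
        pq_sum += p * (1 - p)
    return (k / (k - 1)) * (1 - pq_sum / var_total)

if __name__ == "__main__":
    data = [
        [1, 1, 1, 0, 1],
        [1, 0, 1, 0, 0],
        [0, 0, 1, 0, 0],
        [1, 1, 1, 1, 1],
    ]
    print(round(kr20(data), 2))  # about 0.81
```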
Diamond, James; Evans, William – 1972
Gibb (1964) defined test-wiseness (TW) as the ability to respond advantageously to item clues in a multiple-choice setting and therefore to obtain credit without knowledge of the subject matter being tested. This study investigated TW in a sample of 6th grade pupils. A test instrument was developed utilizing fictitious material similar to the…
Descriptors: Correlation, Cues, Factor Analysis, Grade 6
Van Mondfrans, Adrian P.; And Others – 1972
Retroactive inhibition, a loss of memory due to learning other materials between exposure to the original materials and recall, was investigated in relation to prose. Two variables were manipulated in the study: similarity of interpolated stories (dissimilar or similar), and the response requirements (completion-recall or multiple-choice). The 190…
Descriptors: Educational Research, Learning Theories, Memory, Multiple Choice Tests
O'Reilly, Robert P.; Streeter, Ronald E. – 1976
The results of a series of factor analyses of a new test of literal comprehension using a multiple-choice cloze format are summarized. These analyses were conducted in the validation of a test designed to measure primarily a factor of literal comprehension independent of IQ and inferential reading processes, yet marked by certain related…
Descriptors: Cloze Procedure, Elementary Education, Factor Analysis, Multiple Choice Tests
McMorris, Robert F.; Leonard, Gregory – 1976
According to conventional wisdom, a test taker should not change his or her first response to a multiple-choice item, although empirical evidence has consistently supported such changes. Quizzes for master's-level students in educational measurement and evaluation showed increments due to answer changing. Low-anxiety students tended to make more changes…
Descriptors: Academic Achievement, Cognitive Style, College Students, Multiple Choice Tests
Scherich, Henry; Hanna, Gerald – 1976
The reading comprehension items for the Nelson Reading Skills Test, a revision of a widely used standardized reading test, were administered to several hundred fourth- and sixth-grade students in order to determine whether the student's ability to answer correctly actually depended on his comprehension of the accompanying passage. All the…
Descriptors: Elementary Education, Multiple Choice Tests, Reading Comprehension, Reading Tests
Peer reviewed: Frary, Robert B.; And Others – Journal of Experimental Education, 1977
To date, a theoretical basis has not been developed for determining changes in reliability when score points from random guessing are eliminated and those from non-random guessing are retained. This paper presents a derivation of an expression for the reliability coefficient which displays the effect of deleting score components due to random…
Descriptors: Data Analysis, Guessing (Tests), Multiple Choice Tests, Scoring Formulas
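For context on the "Scoring Formulas" descriptor: the conventional correction-for-guessing (formula) score subtracts the expected contribution of random guessing, rights minus wrongs/(options − 1). The sketch below shows that convention only; it is not the reliability derivation presented in the paper, and the numbers are invented.

```python
# Conventional correction-for-guessing ("formula") score:
# rights - wrongs / (options - 1); omitted items score zero.
# Illustrative only; not the paper's derivation.

def formula_score(n_right, n_wrong, n_options):
    return n_right - n_wrong / (n_options - 1)

if __name__ == "__main__":
    # 30 right, 12 wrong, 8 omitted on a 50-item, 5-option test
    print(formula_score(30, 12, 5))  # -> 27.0
```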
Peer reviewed: Tanner, Jerome; Dwyer, Francis M. – British Journal of Educational Technology, 1978
Investigated the instructional effects of externally controlled pacing of multiple choice questions on the achievement of students when tested by means of projected slides. (Author)
Descriptors: Analysis of Variance, Higher Education, Media Research, Multiple Choice Tests
Peer reviewed: McClelland, G. – Physics Education, 1978
Reviews techniques to consider when writing multiple-choice objective test items. (SL)
Descriptors: College Science, Evaluation, Higher Education, Multiple Choice Tests
Peer reviewed: Lord, Frederic M. – Journal of Educational Measurement, 1977
Two approaches presently in the literature for determining the optimal number of choices for a test item are compared with two new approaches. (Author)
Descriptors: Forced Choice Technique, Latent Trait Theory, Multiple Choice Tests, Test Items
Peer reviewed: Pyrczak, Fred – Journal of Reading, 1977
Students know more than they think they know, so guessing gives better scores even when there's a penalty for errors. (JM)
Descriptors: Guessing (Tests), Multiple Choice Tests, Reading Research, Reading Tests
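A brief expected-value check makes the annotation above concrete (the derivation is standard and illustrative, not taken from the article): under the usual penalty of 1/(k − 1) points per wrong answer on a k-option item, a blind guess has expected value zero, but eliminating even one distractor makes guessing pay.

```latex
% Expected score from guessing on a k-option item with a 1/(k-1) penalty
% (standard illustrative derivation, not reproduced from the article).
\begin{align*}
E[\text{blind guess}] &= \frac{1}{k}(1) + \frac{k-1}{k}\left(-\frac{1}{k-1}\right) = 0,\\
E[\text{one distractor eliminated}] &= \frac{1}{k-1}(1) + \frac{k-2}{k-1}\left(-\frac{1}{k-1}\right) = \frac{1}{(k-1)^2} > 0.
\end{align*}
```

For a 5-option item, an informed guess after ruling out one distractor is worth 1/16 of a point on average, which is the sense in which students who know more than they think they know come out ahead despite the penalty.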
Peer reviewed: Tollefson, Nona – Educational and Psychological Measurement, 1987
This study compared the item difficulty, item discrimination, and test reliability of three forms of multiple-choice items: (1) one correct answer; (2) "none of the above" as a foil; and (3) "none of the above" as the correct answer. Twelve items in the three formats were administered in a college statistics examination. (BS)
Descriptors: Difficulty Level, Higher Education, Item Analysis, Multiple Choice Tests
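The item statistics compared above are standard classical indices; a minimal sketch of item difficulty (proportion correct) and a point-biserial discrimination index follows. The function names and data are illustrative, and point-biserial is one common choice of discrimination index, not necessarily the one used in the study.

```python
# Classical item statistics on a 0/1 response matrix
# (rows = examinees, columns = items). Illustrative sketch only.
from statistics import mean, pstdev

def item_difficulty(responses, item):
    """Proportion of examinees answering the item correctly (p-value)."""
    return mean(row[item] for row in responses)

def point_biserial(responses, item):
    """Point-biserial correlation between item score and total score.
    Assumes the item has at least one correct and one incorrect response."""
    totals = [sum(row) for row in responses]
    scores = [row[item] for row in responses]
    sd = pstdev(totals)
    if sd == 0:
        return 0.0
    m1 = mean(t for t, x in zip(totals, scores) if x == 1)
    m0 = mean(t for t, x in zip(totals, scores) if x == 0)
    p = mean(scores)
    return (m1 - m0) / sd * (p * (1 - p)) ** 0.5

if __name__ == "__main__":
    data = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 0]]
    print(item_difficulty(data, 0), round(point_biserial(data, 0), 2))
```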
Peer reviewed: Owen, Steven V.; Froman, Robin D. – Educational and Psychological Measurement, 1987
To further test the efficacy of three-option achievement items, parallel three- and five-option item tests were distributed randomly to college students. Results showed no differences in mean item difficulty, mean discrimination, or total test score, but a substantial reduction in time spent on three-option items. (Author/BS)
Descriptors: Achievement Tests, Higher Education, Multiple Choice Tests, Test Format


