Wilcox, Rand R. – Journal of Educational Measurement, 1987 (Peer reviewed)
Four procedures are discussed for obtaining a confidence interval when answer-until-correct scoring is used in multiple choice tests. Simulated data show that the choice of procedure depends upon sample size. (GDC)
Descriptors: Computer Simulation, Multiple Choice Tests, Sample Size, Scoring
Ramsey, Philip H.; And Others – Teaching of Psychology, 1987 (Peer reviewed)
Reports on an evaluation of answer changing on multiple choice tests. Finds that any change a student was inclined to make, no matter how low the student's confidence, led to a significant gain. Concludes that students who change answers are likely to benefit despite the widely held belief that "first impressions are best." (GEA)
Descriptors: Higher Education, Multiple Choice Tests, Response Style (Tests), Test Wiseness
Nield, Anthony F.; Wintre, Maxine Gallander – Teaching of Psychology, 1986 (Peer reviewed)
Introductory psychology students were graded on four tests using a multiple-choice format with an explicit option (E-option) to explain their answers. Students preferred the E-option format over regular multiple-choice, short answer, essay, true/false, and fill-in-the-blank formats. Use of the E-option averaged less than one explanation per test over four…
Descriptors: Educational Psychology, Educational Testing, Higher Education, Multiple Choice Tests
Gilman, David Alan; Ferry, Paula – Journal of Educational Measurement, 1972 (Peer reviewed)
Results indicate that scoring tests by the self-scoring method can result in higher split-half reliability than scoring by the traditional right-wrong method. (Authors)
Descriptors: Data Analysis, Multiple Choice Tests, Scoring, Test Construction
Tamir, P. – Journal of Biological Education, 1971 (Peer reviewed)
Illustrates the use of answers to free-response questions to generate distractors for equivalent multiple-choice forms, with examples from high school biology questions. (AL)
Descriptors: Biology, Multiple Choice Tests, Secondary School Science, Test Construction
Kroll, Neal E. A. – J Exp Psychol, 1970
Descriptors: Cues, Feedback, Learning Processes, Multiple Choice Tests
Roberge, James J.; Kubiniec, Cathleen M. – Educ Psychol Meas, 1970
Descriptors: Computer Programs, Feedback, Multiple Choice Tests, Test Interpretation
Sabers, Darrell L.; White, Gordon W. – J Educ Meas, 1969
Descriptors: Aptitude Tests, Multiple Choice Tests, Predictive Validity, Scoring
Ebel, Robert L. – Educ Psychol Meas, 1969
Descriptors: Item Analysis, Multiple Choice Tests, Objective Tests, Test Reliability
Kendall, Janet Ross; And Others – Journal of Educational Research, 1980 (Peer reviewed)
The choice of a particular reading comprehension passage and testing procedure, whether in research or practice, does not allow generalization to other operational definitions of reading comprehension, suggesting serious limitations in most contemporary reading comprehension research and testing. (JD)
Descriptors: Cloze Procedure, Multiple Choice Tests, Reading Comprehension, Research Methodology
Brown, F. Dale; Mitchell, Thomas O. – Educational Technology, 1979
Describes a self-instructional practice test utilizing negative-image slides of multiple choice questions and subsequent correct answer slides to provide immediate feedback. (RAO)
Descriptors: Audiovisual Aids, Autoinstructional Aids, Feedback, Multiple Choice Tests
Kline, Keith – Sierra Club Bulletin, 1979
Presents a 23-question, multiple-choice test on energy. Answers are provided. The test is designed for the general public.
Descriptors: Ecology, Energy, Environmental Education, Evaluation
Linn, Robert L. – Educational and Psychological Measurement, 1976 (Peer reviewed)
Testing procedures in which examinees assign probabilities of correctness to all multiple-choice alternatives are examined. Two basic assumptions underlying these procedures are reviewed. Empirical examinee response data are examined, and it is suggested that these assumptions should not be taken lightly in empirical studies of personal probability…
Descriptors: Confidence Testing, Guessing (Tests), Measurement Techniques, Multiple Choice Tests
Wisner, Joel D.; Wisner, Robert J. – Business Education Forum, 1997 (Peer reviewed)
Undergraduate business students completed two multiple-choice tests: one in which they indicated their answer and one of three levels of confidence, and one in which they circled the item only when they possessed high confidence in the answer. The three-level test took longer to take and grade; students preferred the second format. Both types…
Descriptors: Business Education, Confidence Testing, Higher Education, Multiple Choice Tests
Frary, Robert B. – Applied Measurement in Education, 1989 (Peer reviewed)
Multiple-choice response and scoring methods that attempt to determine an examinee's degree of knowledge about each item in order to produce a total test score are reviewed. There is apparently little advantage to such schemes; however, they may have secondary benefits such as providing feedback to enhance learning. (SLD)
Descriptors: Knowledge Level, Multiple Choice Tests, Scoring, Scoring Formulas


