Publication Date
| Date range | Results |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 89 |
| Since 2022 (last 5 years) | 457 |
| Since 2017 (last 10 years) | 1245 |
| Since 2007 (last 20 years) | 2519 |
Audience
| Audience | Results |
| --- | --- |
| Practitioners | 122 |
| Teachers | 105 |
| Researchers | 64 |
| Students | 46 |
| Administrators | 14 |
| Policymakers | 7 |
| Counselors | 3 |
| Parents | 3 |
Location
| Location | Results |
| --- | --- |
| Canada | 134 |
| Turkey | 131 |
| Australia | 123 |
| Iran | 66 |
| Indonesia | 61 |
| Germany | 51 |
| United Kingdom | 51 |
| Taiwan | 46 |
| United States | 43 |
| China | 39 |
| California | 35 |
| … | |
What Works Clearinghouse Rating
| Rating | Results |
| --- | --- |
| Meets WWC Standards without Reservations | 3 |
| Meets WWC Standards with or without Reservations | 5 |
| Does not meet standards | 6 |
Peer reviewed: Levin, Joel R.; And Others – Contemporary Educational Psychology, 1978
High and low achievers listened to 20 sentences under imagery or repetition instructions. The next day, learning was assessed by multiple choice items in which correct alternatives were stated in either verbatim or synonym form, and incorrect alternatives contained either familiar or new words. Implications for test construction were discussed.…
Descriptors: Academic Achievement, Elementary Education, Learning Activities, Multiple Choice Tests
Peer reviewed: Ebel, Robert L. – Educational and Psychological Measurement, 1978
A multiple true-false item is one where a testee has to identify statements as true or false within a cluster (of two or more) of such statements. Clusters are then scored as items. This study showed such a procedure to yield less reliable results than traditional true-false items. (JKS)
Descriptors: Guessing (Tests), Higher Education, Item Analysis, Multiple Choice Tests
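Ebel's comparison turns on how the same true-false responses are aggregated before reliability is computed. The abstract gives neither data nor formulas, so the following is only a minimal sketch on simulated responses: it scores each statement as its own item and then rescores with all-or-nothing credit per cluster (one plausible reading of "clusters are then scored as items"), comparing Cronbach's alpha for the two scorings.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an examinees-by-items score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
n_examinees, n_clusters, per_cluster = 200, 20, 3

# Simulated correctness of each statement (1 = classified correctly),
# driven by an ability factor so that items correlate, as real data would.
ability = rng.normal(size=(n_examinees, 1))
correct = (ability + rng.normal(size=(n_examinees, n_clusters * per_cluster)) > 0).astype(int)

# Scoring 1: every statement is its own item (traditional true-false scoring).
alpha_statements = cronbach_alpha(correct)

# Scoring 2: each cluster is one item, credited only when all of its
# statements are classified correctly (an assumed all-or-nothing rule).
clusters = correct.reshape(n_examinees, n_clusters, per_cluster).min(axis=2)
alpha_clusters = cronbach_alpha(clusters)

print(f"alpha, statement-level scoring: {alpha_statements:.3f}")
print(f"alpha, cluster-level scoring:   {alpha_clusters:.3f}")
```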
Peer reviewed: Aiken, Lewis R.; Williams, Newsom – Educational and Psychological Measurement, 1978
Seven formulas for scoring test items with two options (true-false or multiple choice with only two choices) were investigated. Also examined were several conditions, such as varying directions for guessing and whether testees had prior knowledge of the proportion of false items on the test. (Author/JKS)
Descriptors: Guessing (Tests), Higher Education, Knowledge Level, Multiple Choice Tests
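The seven scoring formulas themselves are not listed in the abstract. As a hedged illustration of the general idea behind formula scoring, the sketch below applies the textbook correction for guessing, S = R - W/(k - 1), which for two-option items reduces to rights minus wrongs; this standard formula is not necessarily one of the seven Aiken and Williams compared.

```python
def corrected_score(rights: int, wrongs: int, n_options: int = 2) -> float:
    """Classic correction for guessing: S = R - W / (k - 1).

    With two options (true-false), this is simply rights minus wrongs.
    Omitted items count as neither right nor wrong.
    """
    if n_options < 2:
        raise ValueError("an item needs at least two options")
    return rights - wrongs / (n_options - 1)

# A testee answers 40 two-option items: 28 right, 8 wrong, 4 omitted.
print(corrected_score(28, 8))               # 20.0 under the two-option formula
print(corrected_score(28, 8, n_options=4))  # about 25.3 if the items had four options
```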
Peer reviewed: Longstreth, Langdon E. – Journal of Educational Psychology, 1978
The hypothesis that multiple-choice exams load higher on Level II ability than on Level I ability was confirmed by correlating multiple choice tests with the Cognitive Abilities Test Nonverbal Battery, the forward digit span test, and essay measures. Differences in scores among Asian, black, white, and Mexican American students are discussed in…
Descriptors: Academic Achievement, Cognitive Ability, Difficulty Level, Essay Tests
Peer reviewed: Karras, Ray W. – History Teacher, 1978
Suggests ways to construct multiple choice questions which test both factual knowledge and critical thinking. Three models explain how to relate facts, fact sets, and reasons to hypotheses and concepts. (Author/DB)
Descriptors: Critical Thinking, Educational Improvement, History Instruction, Knowledge Level
Smith, Paul – ADE Bulletin, 1978
Examines the way standardized test questions are developed by the Advanced Placement Development Committee; shows that the tests continue to reflect features of New Criticism and have not been changed to reflect recent developments in the discipline of English. (GW)
Descriptors: Advanced Placement, English Instruction, Essay Tests, Higher Education
Peer reviewed: Stevens, J. M.; And Others – Medical Education, 1977
Five of the medical schools in the University of London collaborated in administering one multiple choice question paper in obstetrics and gynecology, and results showed differences in performance between the five schools on questions and alternatives within questions. The rank order of the schools may result from differences in teaching methods.…
Descriptors: Academic Achievement, Comparative Analysis, Gynecology, Higher Education
Peer reviewed: Clutterbuck, Michael; Mowchanuk, Timothy – Babel: Journal of the Australian Federation of Modern Language Teachers' Associations, 1977
In October 1976, a new type of question was introduced into the Victorian HSC German test: a listening comprehension question with multiple-choice answers replaced the written reproduction question. Results of a computer analysis of the answers given in the exam are reported. (SW)
Descriptors: Achievement Tests, German, Language Instruction, Language Skills
Firges, Jean; Gaessler, Roland – Praxis des Neusprachlichen Unterrichts, 1977
The test, multiple choice in form, makes no provision for oral expression. Other criticisms are given as well, e.g., that the test appears to rest on a faulty teaching plan, specifically in the sequencing of the grammatical problems to be mastered. (Text is in German.) (IFS/WGA)
Descriptors: Achievement Tests, Content Analysis, French, Graduation Requirements
Ibe, Milagros D. – RELC Journal, 1975
This investigation seeks to determine: the validity and reliability of cloze tests for measuring reading comprehension; the relation of cloze scores to difficulty levels of reading passages; the relation of cloze test performance to length of English training; and the merits of judgmental vs. random word deletion in test construction. (DB)
Descriptors: Cloze Procedure, English (Second Language), Indochinese, Language Teachers
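The mechanics of building a cloze test are simple enough to sketch. Below is a minimal illustration of fixed-ratio deletion (blanking every nth word), one common systematic alternative to judgmental deletion; the deletion ratio and starting offset are arbitrary choices for the example, not values taken from Ibe's study.

```python
def make_cloze(passage: str, every_nth: int = 7, start: int = 3):
    """Build a cloze test by blanking every nth word of a passage.

    Returns the gapped text and the answer key (deleted words in order).
    """
    words = passage.split()
    gapped, key = [], []
    for i, word in enumerate(words):
        if i >= start and (i - start) % every_nth == 0:
            key.append(word)
            gapped.append("_____")
        else:
            gapped.append(word)
    return " ".join(gapped), key

text = ("Cloze tests measure reading comprehension by deleting words "
        "from a passage and asking the reader to restore them.")
gapped, answers = make_cloze(text, every_nth=5)
print(gapped)
print(answers)
```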
Peer reviewed: Haladyna, Thomas M. – Evaluation and the Health Professions, 1987
A set of guidelines for planning and establishing a certification testing program is presented. The guidelines are organized into three components: (1) content specification; (2) item development; and (3) test design. A mythical new certification testing program, called the Tennis Players' Certification Testing Program, is used to illustrate the…
Descriptors: Certification, Guidelines, Health Occupations, Higher Education
Peer reviewed: Frary, Robert B. – Journal of Educational Measurement, 1985
Responses to a sample test were simulated for examinees under free-response and multiple-choice formats. Test score sets were correlated with randomly generated sets of unit-normal measures. The extent of superiority of free response tests was sufficiently small so that other considerations might justifiably dictate format choice. (Author/DWH)
Descriptors: Comparative Analysis, Computer Simulation, Essay Tests, Guessing (Tests)
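Frary's design, as the abstract describes it, is a Monte Carlo comparison: simulate scores under each format and correlate them with unit-normal criterion measures. His actual response model is not given, so the sketch below uses a crude stand-in in which multiple-choice responses add a chance of lucky guessing that free-response items lack, and the criterion is simply the unit-normal proficiency variable that generated the responses.

```python
import numpy as np

rng = np.random.default_rng(42)
n_examinees, n_items, n_options = 500, 40, 4

# Latent proficiency, used here as the unit-normal criterion measure.
criterion = rng.normal(size=n_examinees)

# Probability of genuinely knowing each item, increasing with proficiency.
p_know = 1 / (1 + np.exp(-(criterion[:, None] - rng.normal(size=n_items))))
knows = rng.random((n_examinees, n_items)) < p_know

# Free response: correct only when the answer is actually known.
free_response = knows.sum(axis=1)

# Multiple choice: unknown items are guessed with chance 1 / n_options.
guesses = rng.random((n_examinees, n_items)) < 1 / n_options
multiple_choice = (knows | (~knows & guesses)).sum(axis=1)

r_free = np.corrcoef(free_response, criterion)[0, 1]
r_mc = np.corrcoef(multiple_choice, criterion)[0, 1]
print(f"free response vs criterion:   r = {r_free:.3f}")
print(f"multiple choice vs criterion: r = {r_mc:.3f}")
```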
Leuba, Richard J. – Engineering Education, 1986
Explains how multiple choice test items can be devised to measure higher-order learning, including engineering problem solving. Discusses the value and information provided in item analysis procedures with machine-scored tests. Suggests elements to consider in test design. (ML)
Descriptors: College Science, Creative Thinking, Engineering Education, Evaluation Methods
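The item analysis output Leuba refers to typically reports per-item difficulty and discrimination from machine scoring. As a minimal, generic sketch over a 0/1 response matrix (not tied to any particular scoring system), with the corrected point-biserial correlation used as the discrimination index:

```python
import numpy as np

def item_analysis(responses):
    """Difficulty and discrimination for an examinees-by-items 0/1 matrix.

    Difficulty is the proportion answering the item correctly.
    Discrimination is the corrected point-biserial: the correlation of the
    item with the total score computed from the remaining items.
    """
    responses = np.asarray(responses, dtype=float)
    total = responses.sum(axis=1)
    difficulty = responses.mean(axis=0)
    discrimination = np.array([
        np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
        for j in range(responses.shape[1])
    ])
    return difficulty, discrimination

# Tiny illustrative response matrix: 6 examinees, 4 items.
data = [[1, 1, 1, 0],
        [1, 1, 0, 0],
        [1, 0, 1, 1],
        [0, 1, 0, 0],
        [1, 1, 1, 1],
        [0, 0, 0, 0]]
diff, disc = item_analysis(data)
print("difficulty:    ", np.round(diff, 2))
print("discrimination:", np.round(disc, 2))
```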
Peer reviewed: Karras, Ray – Social Science Record, 1985
Multiple-choice tests are here to stay. They should do more than test for rote memory: they should test simultaneously both for knowledge of subject matter and for thinking skills applied to that subject matter. Suggestions to help teachers prepare better test questions are made. (RM)
Descriptors: Critical Thinking, Educational Strategies, Elementary Secondary Education, Multiple Choice Tests
Peer reviewed: Lienert, Gustav A.; Oeveste, Hans Zur – Educational and Psychological Measurement, 1985
Configural frequency analysis (CFA) is suggested as a technique for longitudinal research in developmental psychology. Stability and change in answers to multiple choice and yes-no item patterns obtained with repeated measurements are identified by CFA and illustrated by developmental analysis of an item from Gorham's Proverb Test. (Author/DWH)
Descriptors: Cognitive Development, Developmental Psychology, Elementary Secondary Education, Longitudinal Studies
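Configural frequency analysis, as described here, asks which patterns of answers across repeated measurements occur more often (types) or less often (antitypes) than expected by chance under independence. The abstract reports no data, so the sketch below runs a bare-bones first-order CFA on invented counts for a yes/no item answered at two time points, comparing each configuration's observed frequency with its independence expectation via a binomial tail probability.

```python
from itertools import product
from math import comb

def binom_sf(k, n, p):
    """Upper tail P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Invented counts of (time 1, time 2) answer configurations for a yes/no item.
observed = {("yes", "yes"): 46, ("yes", "no"): 9,
            ("no", "yes"): 14, ("no", "no"): 31}
n = sum(observed.values())

# Marginal proportions of each answer at each time point.
p_t1 = {a: sum(v for (x, _), v in observed.items() if x == a) / n for a in ("yes", "no")}
p_t2 = {a: sum(v for (_, y), v in observed.items() if y == a) / n for a in ("yes", "no")}

# First-order CFA: expected cell frequency under independence of the two
# time points; a cell far above expectation is a candidate "type"
# (a stable configuration), one far below a candidate "antitype".
for cfg in product(("yes", "no"), repeat=2):
    p = p_t1[cfg[0]] * p_t2[cfg[1]]
    expected = n * p
    p_upper = binom_sf(observed[cfg], n, p)
    print(f"{cfg}: observed {observed[cfg]:2d}, expected {expected:5.1f}, "
          f"P(X >= observed) = {p_upper:.4f}")
```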


