Publication Date

| Date range | Records |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 85 |
| Since 2022 (last 5 years) | 453 |
| Since 2017 (last 10 years) | 1241 |
| Since 2007 (last 20 years) | 2515 |
Audience

| Audience | Records |
| --- | --- |
| Practitioners | 122 |
| Teachers | 105 |
| Researchers | 64 |
| Students | 46 |
| Administrators | 14 |
| Policymakers | 7 |
| Counselors | 3 |
| Parents | 3 |
Location

| Location | Records |
| --- | --- |
| Canada | 134 |
| Turkey | 131 |
| Australia | 123 |
| Iran | 66 |
| Indonesia | 61 |
| United Kingdom | 51 |
| Germany | 50 |
| Taiwan | 46 |
| United States | 43 |
| China | 39 |
| California | 35 |
Laws, Policies, & Programs
Assessments and Surveys
What Works Clearinghouse Rating

| Rating | Records |
| --- | --- |
| Meets WWC Standards without Reservations | 3 |
| Meets WWC Standards with or without Reservations | 5 |
| Does not meet standards | 6 |
Peer reviewed: Cureton, Edward E. – Educational and Psychological Measurement, 1971
A rebuttal of Frary's 1969 article in Educational and Psychological Measurement. (MS)
Descriptors: Error of Measurement, Guessing (Tests), Multiple Choice Tests, Scoring Formulas
Validity and Likability Ratings for Three Scoring Instructions for a Multiple-Choice Vocabulary Test
Peer reviewed: Waters, Carrie Wherry; Waters, Lawrence K. – Educational and Psychological Measurement, 1971
Descriptors: Guessing (Tests), Multiple Choice Tests, Response Style (Tests), Scoring Formulas
Peer reviewed: Harke, Douglas J. – Journal of Research in Science Teaching, 1971
Presents the rationale and methods of using a hierarchical analysis to validate the use of randomized multiple choice test items. Concludes that solving randomized multiple choice problems requires the same or similar cognitive processes or problem-solving skills as a free-response form. (DS)
Descriptors: Cognitive Processes, College Science, Evaluation, Multiple Choice Tests
Peer reviewed: Bauer, David H. – Psychology in the Schools, 1971
The results of this study suggest that instructions are a source of information to students about the testing environment that modifies their test-taking behavior. Individual students interpret the same instructions in different ways, and these differences, in turn, result in variations in behavior reflected in test scores. (Author)
Descriptors: Anxiety, Aptitude Tests, Behavior Patterns, Instruction
Peer reviewed: Cooper, Malcolm D. – English Language Teaching, 1970
The author, a secondary school teacher in Tanzania, presents a test designed to be used as a diagnostic test before a controlled program of integrated grammar and composition teaching and as an achievement test afterwards. (FB)
Descriptors: Diagnostic Tests, English (Second Language), Language Proficiency, Language Tests
Peer reviewed: Boyd, Rachel M. – Journal of Reading, 1970
Descriptors: Multiple Choice Tests, Performance Factors, Reading Comprehension, Reading Research
Levine, Harold G.; And Others – American Educational Research Journal, 1970
Descriptors: Achievement Tests, Factor Analysis, Measurement Instruments, Medicine
Peer reviewed: Willson, Victor L. – Educational and Psychological Measurement, 1982
The Serlin-Kaiser procedure is used to complete a principal components solution for scoring weights for all options of a given item. Coefficient alpha is maximized for a given multiple choice test. (Author/GK)
Descriptors: Analysis of Covariance, Factor Analysis, Multiple Choice Tests, Scoring Formulas
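The Serlin-Kaiser computation itself is not reproduced in the abstract, so the sketch below shows only the general idea under stated assumptions, not the exact procedure: one-hot encode every option of every item, take the leading principal component of that indicator matrix as option weights, and score examinees with those weights, which tends to raise coefficient alpha relative to 0/1 scoring. All function names and the toy data are illustrative.

```python
import numpy as np

def option_indicator(responses, n_options):
    """One-hot encode the chosen option index for one item."""
    return np.eye(n_options)[responses]

def weighted_scores(choices, n_options):
    """choices: (n_examinees, n_items) integer array of selected options.
    Scores every examinee on every item using first-principal-component
    option weights (an illustration of the idea, not Serlin-Kaiser exactly)."""
    n_people, n_items = choices.shape
    # Indicator matrix with one column per (item, option) pair.
    X = np.hstack([option_indicator(choices[:, j], n_options)
                   for j in range(n_items)])
    X = X - X.mean(axis=0)  # center so the PCA is on the covariance
    # Option weights = leading eigenvector of the indicator covariance.
    eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
    w = eigvecs[:, -1]  # eigenvector for the largest eigenvalue
    # Item score: weight of the chosen option (shifted by a per-item
    # constant because X was centered, which leaves alpha unchanged).
    return (X * w).reshape(n_people, n_items, n_options).sum(axis=2)

def coefficient_alpha(item_scores):
    """Cronbach's alpha for an examinee-by-item score matrix."""
    k = item_scores.shape[1]
    item_var = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Toy usage with random data, purely to show the shapes involved.
rng = np.random.default_rng(0)
choices = rng.integers(0, 4, size=(50, 10))  # 50 examinees, 10 four-option items
print(coefficient_alpha(weighted_scores(choices, 4)))
```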
Peer reviewed: Morrison, Donald G.; Brockway, George – Psychometrika, 1979
A modified beta binomial model is presented for use in analyzing random-guessing multiple choice tests and taste tests. Detection probabilities for each item are distributed beta across the population of subjects. Properties of the observable distribution of correct responses are derived. Two concepts of true score estimates are presented.…
Descriptors: Bayesian Statistics, Guessing (Tests), Mathematical Models, Multiple Choice Tests
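As a hedged sketch of this family of models, simplified to a single detection probability per subject rather than one per item, and with illustrative parameter names: p varies across subjects as Beta(a, b), and given p each of n items is answered correctly with probability p + (1 - p)/k, i.e. either detected outright or guessed among k options. Mixing the binomial over the beta gives the observable number-correct distribution:

```python
import numpy as np
from scipy import integrate
from scipy.stats import beta, binom

def number_correct_pmf(n_items, k_options, a, b):
    """P(X = x) for x = 0..n_items under a beta-mixed guessing model:
    detection probability p ~ Beta(a, b) across subjects; given p, each
    item is correct with probability p + (1 - p)/k (detected, or guessed
    among k options)."""
    def p_correct(p):
        return p + (1.0 - p) / k_options

    pmf = np.empty(n_items + 1)
    for x in range(n_items + 1):
        # Integrate the binomial likelihood against the beta density.
        integrand = lambda p, x=x: binom.pmf(x, n_items, p_correct(p)) * beta.pdf(p, a, b)
        pmf[x], _ = integrate.quad(integrand, 0.0, 1.0)
    return pmf

# Example: 20 four-option items, detection ability Beta(2, 3) across subjects.
pmf = number_correct_pmf(20, 4, 2.0, 3.0)
print(pmf.sum())  # should be ~1.0
```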
Peer reviewed: Strang, Harold R. – Journal of Educational Research, 1980
A question is raised as to how much emphasis should be placed on the use of technical terminology in lectures, reading assignments, and tests, particularly in introductory college courses, if all it facilitates is factual recall. (JD)
Descriptors: Comprehension, Higher Education, Multiple Choice Tests, Performance Factors
Peer reviewed: Cross, Lawrence H.; And Others – Journal of Experimental Education, 1980
Use of choice-weighted scores as a basis for assigning grades in college courses was investigated. Reliability and validity indices offer little to recommend either type of choice-weighted scoring over number-right scoring. The potential for choice-weighted scoring to enhance the teaching/testing process is discussed. (Author/GK)
Descriptors: Credit Courses, Grading, Higher Education, Multiple Choice Tests
Peer reviewed: Frary, Robert B. – Applied Psychological Measurement, 1980
Six scoring methods for assigning weights to right or wrong responses according to various instructions given to test takers are analyzed with respect to expected chance scores and the effect of various levels of information and misinformation. Three of the methods provide feedback to the test taker. (Author/CTM)
Descriptors: Guessing (Tests), Knowledge Level, Multiple Choice Tests, Scores
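The six methods are not listed in the abstract, but the baseline such weighting schemes are conventionally measured against is the classical correction for guessing, whose expected chance score is zero. A standard derivation (textbook psychometrics, not material taken from the article):

```latex
% Formula score for R right and W wrong answers on k-option items:
S = R - \frac{W}{k-1}.
% Blind guessing on a single item is right with probability 1/k and
% wrong with probability (k-1)/k, so its expected contribution is
E[S_{\mathrm{guess}}] = \frac{1}{k}\cdot 1
  + \frac{k-1}{k}\cdot\left(-\frac{1}{k-1}\right) = 0.
```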
Peer reviewed: Wood, Robert – Journal of Educational Measurement, 1976
The study concludes that all respondents are served best by following instructions which encourage them to answer all items. Such instructions appear to reduce omitting to a point where individual differences in confidence do not introduce any significant distortion into the estimation of ability. (Author/RC)
Descriptors: Guessing (Tests), High School Students, Instruction, Multiple Choice Tests
Peer reviewed: Reid, Frank J. – Journal of Economic Education, 1976
Examines the conventional scoring formula for multiple-choice tests and proposes an alternative scoring formula which takes into account the situation in which the student does not know the right answer but is able to eliminate one or more of the incorrect alternatives. (Author/AV)
Descriptors: Economics Education, Guessing (Tests), Higher Education, Multiple Choice Tests
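Reid's alternative formula is not given in the abstract, but the situation it targets is easy to make concrete. A standard expected-value computation (an illustration, not Reid's method): if a student eliminates k - m distractors on a k-option item and guesses uniformly among the remaining m, the conventional formula score per item has expectation

```latex
E[S] = \frac{1}{m}\cdot 1
  + \frac{m-1}{m}\cdot\left(-\frac{1}{k-1}\right)
  = \frac{k-m}{m(k-1)},
% zero only when m = k (no elimination) and positive otherwise, so the
% conventional formula rewards partial knowledge only in expectation,
% which is the gap an elimination-aware scoring formula can address.
```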
Peer reviewed: Martinez, Michael E.; Katz, Irvin R. – Educational Assessment, 1996
Item level differences between a type of constructed response item (figural response) and comparable multiple choice items in the domain of architecture were studied. Data from 120 architects and architecture students show that item level differences in difficulty correspond to differences in cognitive processing requirements and that relations…
Descriptors: Architects, Architecture, Cognitive Processes, Constructed Response


