McKenna, Peter – International Association for Development of the Information Society, 2018
Multiple Choice Questions come with the correct answer. Examinees have various reasons for selecting their answer, other than knowing it to be correct. Yet MCQs are common as summative assessments in the education of Computer Science and Information Systems students. To what extent can MCQs be answered correctly without knowing the answer; and can…
Descriptors: Multiple Choice Tests, Summative Evaluation, Student Evaluation, Evaluation Methods

McKenna, Peter – Interactive Technology and Smart Education, 2019
Purpose: This paper aims to examine whether multiple choice questions (MCQs) can be answered correctly without knowing the answer and whether constructed response questions (CRQs) offer more reliable assessment. Design/methodology/approach: The paper presents a critical review of existing research on MCQs, then reports on an experimental study…
Descriptors: Multiple Choice Tests, Accuracy, Test Wiseness, Objective Tests

Brassil, Chad E.; Couch, Brian A. – International Journal of STEM Education, 2019
Background: Within undergraduate science courses, instructors often assess student thinking using closed-ended question formats, such as multiple-choice (MC) and multiple-true-false (MTF), where students provide answers with respect to predetermined response options. While MC and MTF questions both consist of a question stem followed by a series…
Descriptors: Multiple Choice Tests, Objective Tests, Student Evaluation, Thinking Skills

Schaap, Lydia; Verkoeijen, Peter; Schmidt, Henk – Assessment & Evaluation in Higher Education, 2014
This study investigated the effects of two different true-false questions on memory awareness and long-term retention of knowledge. Participants took four subsequent knowledge tests on curriculum learning material that they studied at different retention intervals prior to the start of this study (i.e. prior to the first test). At the first and…
Descriptors: Objective Tests, Test Items, Memory, Long Term Memory

Wakabayashi, Tomoko; Guskin, Karen – American Journal of Evaluation, 2010
A total of 271 early childhood professionals completed pre- and post training knowledge assessments in True-False only (TF) or True-False with "unsure" option formats (TFU). In Study 1, only TFU format was used. In Study 2, participants were randomly assigned to TF or TFU formats. Responses which were initially "unsure" were…
Descriptors: Early Childhood Education, Total Quality Management, Pretests Posttests, Young Children

Mueller, Daniel J.; Wasser, Virginia – Journal of Educational Measurement, 1977
Eighteen studies of the effects of changing initial answers to objective test items are reviewed. While students throughout the total test score range tended to gain more points than they lost, higher-scoring students gained more than lower-scoring students did. Suggestions for further research are made. (Author/JKS)
Descriptors: Guessing (Tests), Literature Reviews, Multiple Choice Tests, Objective Tests

Hsu, Louis M. – Educational and Psychological Measurement, 1979
Though the Paired-Item-Score (Eakin and Long) (EJ 174 780) method of scoring true-false tests has certain advantages over the traditional scoring methods (percentage right and right minus wrong), these advantages are attained at the cost of a larger risk of misranking the examinees. (Author/BW)
Descriptors: Comparative Analysis, Guessing (Tests), Objective Tests, Probability

Abu-Sayf, F. K. – Educational Technology, 1979
Compares methods of scoring multiple-choice tests and discusses right-number scoring, guessing, and omitted items. Test instructions and answer changing are addressed, and attempts to weight test items are reviewed. It is concluded that, since innovations in test scoring are not well-established, the number right method is most appropriate. (RAO)
Descriptors: Guessing (Tests), Multiple Choice Tests, Objective Tests, Scoring
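The scoring rules compared in entries like Abu-Sayf's can be sketched concretely. The snippet below implements the two classical rules the literature contrasts: number-right scoring and the standard correction-for-guessing ("rights minus wrongs") formula S = R − W/(k − 1), where k is the number of options per item. This is the textbook formula, offered as an illustrative sketch, not a reconstruction of Abu-Sayf's specific analysis; the function names and example figures are invented for illustration.

```python
def number_right(right: int) -> float:
    """Number-right score: each correct answer counts 1; wrongs and omits count 0."""
    return float(right)


def corrected_for_guessing(right: int, wrong: int, options: int) -> float:
    """Rights-minus-wrongs ('formula') score: S = R - W/(k - 1).

    Each wrong answer is penalized by 1/(k - 1) so that blind guessing
    across k options has an expected score of zero; omitted items are
    neither rewarded nor penalized.
    """
    return right - wrong / (options - 1)


# Hypothetical example: a 40-item test with 4 options per item;
# the examinee answers 30 right, 10 wrong.
print(number_right(30))                   # 30.0
print(corrected_for_guessing(30, 10, 4))  # 30 - 10/3, about 26.67
```

Under the corrected rule, an examinee who guesses blindly on the 10 items they answered wrong would, on average, have gained back exactly the penalty, which is the sense in which the formula "removes" the guessing advantage that number-right scoring leaves in.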

Choppin, B. – British Journal of Educational Psychology, 1975
Using data obtained as part of the cross-cultural IEA study on academic standards, the tendency of particular pupils to guess on multiple-choice tests was measured using an index proposed by Ziller. (Editor)
Descriptors: Educational Psychology, Guessing (Tests), Models, Multiple Choice Tests

Kolstad, Rosemarie K.; And Others – Educational Research Quarterly, 1983
Complex multiple choice (CMC) items are frequently used to test knowledge about repetitive information. In two independent comparisons, performance on the CMC items surpassed that of the multiple true-false clusters. Data indicate that performance on CMC items is inflated, and distractors on CMC items fail to prevent guessing. (Author/PN)
Descriptors: Guessing (Tests), Higher Education, Multiple Choice Tests, Objective Tests

Slakter, Malcolm J.; And Others – Journal of Experimental Education, 1971
Descriptors: Educational Research, Elementary School Students, Guessing (Tests), Objective Tests

Harris, Diana K.; And Others – Educational Gerontology, 1996
Multiple-choice and true-false versions of Palmore's first Facts on Aging Quiz were completed by 501 college students. Multiple choice reduced the chances of guessing and had less measurement error for average and above-average respondents. (Author/SK)
Descriptors: Aging (Individuals), College Students, Error of Measurement, Guessing (Tests)

Oosterhof, Albert C.; Glasnapp, Douglas R. – Journal of Experimental Education, 1974
The present study was initiated to investigate the comparability of multiple-choice and true-false item formats when the time necessary to respond to each type of item was equated empirically. (Editor)
Descriptors: Data Analysis, Guessing (Tests), Multiple Choice Tests, Objective Tests

Shatz, Mark A.; Best, John B. – Teaching of Psychology, 1987
Investigates the circumstances under which answer changing is beneficial or detrimental to test performance. Analyzes the success of answer changing in relation to the reasons offered for changing. Concludes that students who reported guessing as their reason for changing answers were not nearly as likely to benefit from changing as were students…
Descriptors: Confidence Testing, Guessing (Tests), Higher Education, Objective Tests

Burton, Richard F.; Miller, David J. – Assessment & Evaluation in Higher Education, 1999
Discusses statistical procedures for estimating test unreliability due to guessing in multiple-choice and true/false tests. Proposes two new measures of test unreliability: one concerned with resolution of defined levels of knowledge and the other with the probability of examinees being incorrectly ranked. Both models are based on the binomial…
Descriptors: Guessing (Tests), Higher Education, Multiple Choice Tests, Objective Tests
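The binomial model of guessing that underlies reliability arguments like Burton and Miller's can be sketched briefly: if an examinee guesses blindly on n items with k options each, the number correct follows a Binomial(n, 1/k) distribution. The snippet below computes the chance of reaching a given score by guessing alone; the function name and example parameters are illustrative assumptions, not taken from the paper.

```python
from math import comb


def prob_at_least(n: int, k: int, m: int) -> float:
    """P(at least m correct out of n items by pure guessing, k options per item).

    The number correct under blind guessing is Binomial(n, p) with p = 1/k,
    so we sum the binomial probability mass from m up to n.
    """
    p = 1.0 / k
    return sum(comb(n, r) * p**r * (1 - p) ** (n - r) for r in range(m, n + 1))


# Illustrative example: the chance of scoring 60% or better (12 of 20)
# on a 20-item true/false test (k = 2) by guessing alone.
print(round(prob_at_least(20, 2, 12), 4))  # about 0.25
```

The point such models make is that on true/false tests this probability is far from negligible, which is why formats with more options per item, or explicit corrections for guessing, are argued to rank examinees more reliably.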