Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 1
Since 2016 (last 10 years): 4
Since 2006 (last 20 years): 5
Descriptor
Guessing (Tests): 46
Multiple Choice Tests: 46
Testing Problems: 46
Response Style (Tests): 18
Test Items: 16
Test Wiseness: 14
Scoring Formulas: 13
Test Reliability: 13
Test Construction: 11
Higher Education: 10
Item Analysis: 9
Author
Frary, Robert B.: 3
Ebel, Robert L.: 2
Kolstad, Rosemarie K.: 2
Plake, Barbara S.: 2
Waller, Michael I.: 2
Wilcox, Rand R.: 2
Abu-Sayf, F. K.: 1
Aiken, Lewis R.: 1
Bar-Hillel, Maya: 1
Bliss, Leonard B.: 1
Boldt, Robert F.: 1
Publication Type
Reports - Research: 24
Journal Articles: 15
Reports - Evaluative: 6
Speeches/Meeting Papers: 6
Collected Works - General: 1
Guides - Classroom - Teacher: 1
Guides - Non-Classroom: 1
Opinion Papers: 1
Tests/Questionnaires: 1
Education Level
Postsecondary Education: 3
Higher Education: 2
Audience
Researchers: 2
Practitioners: 1
Location
New Hampshire: 1
Texas: 1
Turkey: 1
United Kingdom: 1
Laws, Policies, & Programs
Elementary and Secondary…: 1
Assessments and Surveys
Childrens Manifest Anxiety…: 1
Iowa Tests of Basic Skills: 1
National Assessment of…: 1
Cesur, Kursat – Educational Policy Analysis and Strategic Research, 2019
Examinees' performances are assessed using a wide variety of techniques, and multiple-choice (MC) tests are among the most frequently used. Nearly all standardized achievement tests make use of MC test items, and there is a variety of ways to score these tests. The study compares number right and liberal scoring (SAC) methods. Mixed…
Descriptors: Multiple Choice Tests, Scoring, Evaluation Methods, Guessing (Tests)
Sarac, Merve; Loken, Eric – International Journal of Testing, 2023
This study is an exploratory analysis of examinee behavior in a large-scale language proficiency test. Despite a number-right scoring system with no penalty for guessing, we found that 16% of examinees omitted at least one answer and that women were more likely than men to omit answers. Item-response theory analyses treating the omitted responses…
Descriptors: English (Second Language), Language Proficiency, Language Tests, Second Language Learning
Haladyna, Thomas M.; Rodriguez, Michael C.; Stevens, Craig – Applied Measurement in Education, 2019
The evidence is mounting regarding the guidance to employ more three-option multiple-choice items. From theoretical analyses, empirical results, and practical considerations, such items are of equal or higher quality than four- or five-option items, and more items can be administered to improve content coverage. This study looks at 58 tests,…
Descriptors: Multiple Choice Tests, Test Items, Testing Problems, Guessing (Tests)
Bramley, Tom; Crisp, Victoria – Assessment in Education: Principles, Policy & Practice, 2019
For many years, question choice has been used in some UK public examinations, with students free to choose which questions they answer from a selection (within certain parameters). There has been little published research on choice of exam questions in recent years in the UK. In this article we distinguish different scenarios in which choice…
Descriptors: Test Items, Test Construction, Difficulty Level, Foreign Countries
Ventouras, Errikos; Triantis, Dimos; Tsiakas, Panagiotis; Stergiopoulos, Charalampos – Computers & Education, 2010
The aim of the present research was to compare the use of multiple-choice questions (MCQs) as an examination method to examination based on constructed-response questions (CRQs). Although MCQs have an advantage in objectivity of grading and speed in the production of results, they also introduce an error in the final…
Descriptors: Computer Assisted Instruction, Scoring, Grading, Comparative Analysis
Abu-Sayf, F. K. – Educational Technology, 1979
Compares methods of scoring multiple-choice tests and discusses right-number scoring, guessing, and omitted items. Test instructions and answer changing are addressed, and attempts to weight test items are reviewed. It is concluded that, since innovations in test scoring are not well-established, the number right method is most appropriate. (RAO)
Descriptors: Guessing (Tests), Multiple Choice Tests, Objective Tests, Scoring
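The number-right versus formula-scoring contrast discussed in this entry (and in the Cesur and Frary entries) can be sketched numerically. A minimal illustration, not drawn from any of the cited studies: classical formula scoring subtracts W/(k−1) from the number right on k-option items, so that a pure blind guesser's expected score is zero.

```python
def number_right(right: int) -> float:
    """Number-right score: the count of correct answers; guessing is unpenalized."""
    return float(right)

def formula_score(right: int, wrong: int, options: int) -> float:
    """Classical correction for guessing: R - W/(k-1).

    A blind guesser on k-option items expects one right answer per k-1
    wrong ones, so the expected formula score of pure guessing is zero.
    Omitted items count in neither R nor W.
    """
    return right - wrong / (options - 1)

# Hypothetical 40-item, 4-option test: 25 right, 12 wrong, 3 omitted.
print(number_right(25))          # 25.0
print(formula_score(25, 12, 4))  # 25 - 12/3 = 21.0
```

Under number-right instructions there is no reason to omit; under formula scoring, omitting is rational only when an examinee's chance of success falls below 1/k, which is why the two instruction conditions in the Frary study below can yield different scores.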
Ford, Valeria A. – 1973
The purpose of this paper is to acquaint the reader with the topic of test-wiseness. The first section of this paper presents a series of multiple-choice items. The reader is asked to respond to them and is encouraged to read carefully the remainder of this paper for techniques which could improve test-taking performance. The next section defines…
Descriptors: Guessing (Tests), Literature Reviews, Multiple Choice Tests, Response Style (Tests)

Kolstad, Rosemarie K.; And Others – Educational Research Quarterly, 1983
Complex multiple choice (CMC) items are frequently used to test knowledge about repetitive information. In two independent comparisons, performance on the CMC items surpassed that of the multiple true-false clusters. Data indicate that performance on CMC items is inflated, and distractors on CMC items fail to prevent guessing. (Author/PN)
Descriptors: Guessing (Tests), Higher Education, Multiple Choice Tests, Objective Tests

Pickering, M. J. – System, 1976
This article is mainly concerned with the relation between the content of a multiple choice test and its composition, focusing on those features of tests that enable the testee to derive the correct answer by studying the composition of the test itself. (Author/POP)
Descriptors: Guessing (Tests), Language Tests, Multiple Choice Tests, Test Construction
Wang, Jianjun – 1995
Effects of blind guessing on the success of passing true-false and multiple-choice tests are investigated under a stochastic binomial model. Critical values of guessing are thresholds which signify when the effect of guessing is negligible. By checking a table of critical values assembled in this paper, one can make a decision with 95% confidence…
Descriptors: Bayesian Statistics, Grading, Guessing (Tests), Models
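Wang's stochastic binomial framing can be sketched with stdlib Python. This is a hedged illustration of the idea, not a reconstruction of Wang's table: the chance of reaching a given score by blind guessing is a binomial tail probability, and a critical value is the smallest score whose tail probability drops to or below a chosen level such as 0.05.

```python
from math import comb

def p_at_least(n: int, c: int, p: float) -> float:
    """P(X >= c) for X ~ Binomial(n, p): the chance a blind guesser
    gets at least c of n items right, each with success probability p."""
    return sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(c, n + 1))

def critical_value(n: int, p: float, alpha: float = 0.05) -> int:
    """Smallest score c with P(X >= c) <= alpha: scores at or above c
    are unlikely, at level alpha, to arise from blind guessing alone."""
    for c in range(n + 1):
        if p_at_least(n, c, p) <= alpha:
            return c
    return n + 1  # even a perfect score is not improbable at this alpha

# 20 true-false items (p = 0.5) vs. 20 four-option items (p = 0.25)
print(critical_value(20, 0.5))   # 15
print(critical_value(20, 0.25))  # 9
```

The comparison makes the familiar point behind several entries in this list: because the per-item guessing probability is higher, a true-false test needs a noticeably higher score than a four-option test of the same length before guessing can be ruled out with the same confidence.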
Frary, Robert B.; And Others – 1985
Students in an introductory college course (n=275) responded to equivalent 20-item halves of a test under number-right and formula-scoring instructions. Formula scores of those who omitted items averaged about one point lower than their comparable (formula adjusted) scores on the test half administered under number-right instructions. In contrast,…
Descriptors: Guessing (Tests), Higher Education, Multiple Choice Tests, Questionnaires

Rowley, Glenn L. – Journal of Educational Measurement, 1974
Descriptors: Achievement Tests, Anxiety, Educational Testing, Guessing (Tests)

Jones, Phillip D.; Kaufman, Gary G. – Educational and Psychological Measurement, 1975
Different forms of a vocabulary test were administered to college students. Results indicated that as the frequency of specific determiners increased, they formed increasingly strong but differential guessing response sets in high and low scoring groups; however, the magnitude of the effect was much stronger for position specific determiners.…
Descriptors: College Students, Cues, Guessing (Tests), Higher Education

Reiling, Eldon; Taylor, Ryland – Journal of Educational Measurement, 1972
The hypothesis that it is unwise to change answers to multiple choice questions was tested using multiple regression analysis. The hypothesis was rejected as results showed that there are gains to be made by changing responses. (Author/CK)
Descriptors: Guessing (Tests), Hypothesis Testing, Measurement Techniques, Multiple Choice Tests
Ebel, Robert L. – 1973
True-false achievement test items written by typical classroom teachers show about two-thirds of the discrimination of their multiple-choice test items. This is about what should be expected in view of the higher probability of chance success on the true-false items. However, at least half again as many true-false items as multiple-choice items…
Descriptors: Guessing (Tests), Multiple Choice Tests, Objective Tests, Scoring