Showing 1 to 15 of 46 results
Peer reviewed
Cesur, Kursat – Educational Policy Analysis and Strategic Research, 2019
Examinees' performances are assessed using a wide variety of techniques. Multiple-choice (MC) tests are among the most frequently used. Nearly all standardized achievement tests make use of MC test items, and there is a variety of ways to score these tests. The study compares number-right and liberal (SAC) scoring methods. Mixed…
Descriptors: Multiple Choice Tests, Scoring, Evaluation Methods, Guessing (Tests)
Peer reviewed
Sarac, Merve; Loken, Eric – International Journal of Testing, 2023
This study is an exploratory analysis of examinee behavior in a large-scale language proficiency test. Despite a number-right scoring system with no penalty for guessing, we found that 16% of examinees omitted at least one answer and that women were more likely than men to omit answers. Item-response theory analyses treating the omitted responses…
Descriptors: English (Second Language), Language Proficiency, Language Tests, Second Language Learning
Peer reviewed
Haladyna, Thomas M.; Rodriguez, Michael C.; Stevens, Craig – Applied Measurement in Education, 2019
The evidence is mounting regarding the guidance to employ more three-option multiple-choice items. From theoretical analyses, empirical results, and practical considerations, such items are of equal or higher quality than four- or five-option items, and more items can be administered to improve content coverage. This study looks at 58 tests,…
Descriptors: Multiple Choice Tests, Test Items, Testing Problems, Guessing (Tests)
Peer reviewed
Bramley, Tom; Crisp, Victoria – Assessment in Education: Principles, Policy & Practice, 2019
For many years, question choice has been used in some UK public examinations, with students free to choose which questions they answer from a selection (within certain parameters). There has been little published research on choice of exam questions in recent years in the UK. In this article we distinguish different scenarios in which choice…
Descriptors: Test Items, Test Construction, Difficulty Level, Foreign Countries
Peer reviewed
Ventouras, Errikos; Triantis, Dimos; Tsiakas, Panagiotis; Stergiopoulos, Charalampos – Computers & Education, 2010
The aim of the present research was to compare the use of multiple-choice questions (MCQs) as an examination method with examination based on constructed-response questions (CRQs). Although MCQs have an advantage in objectivity of grading and speed of producing results, they also introduce an error in the final…
Descriptors: Computer Assisted Instruction, Scoring, Grading, Comparative Analysis
Abu-Sayf, F. K. – Educational Technology, 1979
Compares methods of scoring multiple-choice tests and discusses number-right scoring, guessing, and omitted items. Test instructions and answer changing are addressed, and attempts to weight test items are reviewed. It is concluded that, since innovations in test scoring are not well-established, the number-right method is most appropriate. (RAO)
Descriptors: Guessing (Tests), Multiple Choice Tests, Objective Tests, Scoring
Ford, Valeria A. – 1973
The purpose of this paper is to acquaint the reader with the topic of test-wiseness. The first section of this paper presents a series of multiple-choice items. The reader is asked to respond to them and is encouraged to read carefully the remainder of this paper for techniques which could improve test-taking performance. The next section defines…
Descriptors: Guessing (Tests), Literature Reviews, Multiple Choice Tests, Response Style (Tests)
Peer reviewed
Kolstad, Rosemarie K.; And Others – Educational Research Quarterly, 1983
Complex multiple choice (CMC) items are frequently used to test knowledge about repetitive information. In two independent comparisons, performance on the CMC items surpassed that of the multiple true-false clusters. Data indicate that performance on CMC items is inflated, and distractors on CMC items fail to prevent guessing. (Author/PN)
Descriptors: Guessing (Tests), Higher Education, Multiple Choice Tests, Objective Tests
Peer reviewed
Pickering, M. J. – System, 1976
This article is mainly concerned with a study of the relation between the content of a multiple choice test and its composition, from the point of view of those features in tests which enable the testee to derive the correct answer by studying the composition of the test itself. (Author/POP)
Descriptors: Guessing (Tests), Language Tests, Multiple Choice Tests, Test Construction
Wang, Jianjun – 1995
Effects of blind guessing on the success of passing true-false and multiple-choice tests are investigated under a stochastic binomial model. Critical values of guessing are thresholds which signify when the effect of guessing is negligible. By checking a table of critical values assembled in this paper, one can make a decision with 95% confidence…
Descriptors: Bayesian Statistics, Grading, Guessing (Tests), Models
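The stochastic binomial model of blind guessing that Wang describes can be illustrated with a short sketch. The test length, option count, and pass mark below are hypothetical examples, not values taken from the paper.

```python
# Rough illustration of a stochastic binomial model of blind guessing,
# as described in the Wang (1995) abstract. Test length, number of
# options, and passing score are hypothetical examples.
from math import comb

def p_pass_by_guessing(n_items: int, n_options: int, pass_score: int) -> float:
    """Probability that pure blind guessing yields at least pass_score
    correct answers on n_items items with n_options choices each."""
    p = 1.0 / n_options  # chance of guessing any single item correctly
    return sum(comb(n_items, k) * p**k * (1 - p)**(n_items - k)
               for k in range(pass_score, n_items + 1))

# Example: 50 four-option MC items, pass mark of 30 correct.
print(p_pass_by_guessing(50, 4, 30))   # effectively zero
# Same pass mark on 50 true-false items: guessing is far riskier to ignore.
print(p_pass_by_guessing(50, 2, 30))   # roughly 0.10
```

The gap between the two printed values mirrors the paper's point: whether blind guessing is negligible depends on the item format and the chosen critical value.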
Frary, Robert B.; And Others – 1985
Students in an introductory college course (n=275) responded to equivalent 20-item halves of a test under number-right and formula-scoring instructions. Formula scores of those who omitted items averaged about one point lower than their comparable (formula-adjusted) scores on the test half administered under number-right instructions. In contrast,…
Descriptors: Guessing (Tests), Higher Education, Multiple Choice Tests, Questionnaires
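For readers unfamiliar with the two scoring rules Frary and colleagues compare, a minimal sketch follows, assuming the conventional correction-for-guessing formula R - W/(k - 1) for formula scoring; the response strings are hypothetical.

```python
# Minimal sketch of number-right scoring versus formula scoring.
# Number-right counts only correct answers; formula scoring applies the
# usual correction for guessing, R - W/(k - 1), where omitted items are
# neither rewarded nor penalized. Responses below are hypothetical.

def number_right(responses, key):
    """Count of correct answers; omits ('') and wrong answers score 0."""
    return sum(r == k for r, k in zip(responses, key))

def formula_score(responses, key, n_options=4):
    """Rights minus wrongs/(options - 1); omitted items are skipped."""
    rights = sum(r == k for r, k in zip(responses, key) if r)
    wrongs = sum(r != k for r, k in zip(responses, key) if r)
    return rights - wrongs / (n_options - 1)

key       = ["A", "C", "B", "D", "A"]
responses = ["A", "C", "",  "B", "A"]   # one omit, one wrong answer

print(number_right(responses, key))   # 3
print(formula_score(responses, key))  # 3 - 1/3, about 2.67
```

Under formula scoring an omit costs nothing while a wrong answer carries a penalty, which is why examinees who omit items can end up with slightly lower scores than they would earn by answering everything, as the study's comparison illustrates.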
Peer reviewed
Rowley, Glenn L. – Journal of Educational Measurement, 1974
Descriptors: Achievement Tests, Anxiety, Educational Testing, Guessing (Tests)
Peer reviewed
Jones, Phillip D.; Kaufman, Gary G. – Educational and Psychological Measurement, 1975
Different forms of a vocabulary test were administered to college students. Results indicated that as the frequency of specific determiners increased, they formed increasingly strong but differential guessing response sets in high- and low-scoring groups; however, the magnitude of the effect was much stronger for position-specific determiners.…
Descriptors: College Students, Cues, Guessing (Tests), Higher Education
Peer reviewed
Reiling, Eldon; Taylor, Ryland – Journal of Educational Measurement, 1972
The hypothesis that it is unwise to change answers to multiple choice questions was tested using multiple regression analysis. The hypothesis was rejected as results showed that there are gains to be made by changing responses. (Author/CK)
Descriptors: Guessing (Tests), Hypothesis Testing, Measurement Techniques, Multiple Choice Tests
Ebel, Robert L. – 1973
True-false achievement test items written by typical classroom teachers show about two-thirds of the discrimination of their multiple-choice test items. This is about what should be expected in view of the higher probability of chance success on the true-false items. However, at least half again as many true-false items as multiple-choice items…
Descriptors: Guessing (Tests), Multiple Choice Tests, Objective Tests, Scoring