Showing 1 to 15 of 40 results
Peer reviewed
Alamri, Aeshah; Higham, Philip A. – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2022
Corrective feedback is often touted as a critical benefit to learning, boosting testing effects when retrieval is poor and reducing negative testing effects. Here, we explore the dark side of corrective feedback. In three experiments, we found that corrective feedback on multiple-choice (MC) practice questions is later endorsed as the answer to…
Descriptors: Feedback (Response), Multiple Choice Tests, Cues, Recall (Psychology)
Peer reviewed
Uysal, Ibrahim; Sahin-Kürsad, Merve; Kiliç, Abdullah Faruk – Participatory Educational Research, 2022
The aim of the study was to examine whether common items in mixed-format tests (e.g., multiple-choice and essay items) show parameter drift in test equating performed with the common-item nonequivalent groups design. In this study, which was carried out using a Monte Carlo simulation with a fully crossed design, the factors of test…
Descriptors: Test Items, Test Format, Item Response Theory, Equated Scores
Peer reviewed
Taehyeong Kim; Byungmin Lee – Language Assessment Quarterly, 2025
The Korean College Scholastic Ability Test (CSAT) aims to assess Korean high school students' scholastic ability required for college readiness. As a high-stakes test, the examination serves as a pivotal hurdle for university admission and exerts a strong washback effect on the educational system in Korea. The present study set out to investigate…
Descriptors: Reading Comprehension, Reading Tests, Language Tests, Multiple Choice Tests
Peer reviewed
Leber, Jasmin; Renkl, Alexander; Nückles, Matthias; Wäschle, Kristin – Learning: Research and Practice, 2018
According to the model of constructive alignment, learners adjust their learning strategies to the announced assessment (backwash effect). Hence, when teaching for understanding, the assessment method should be aligned with this teaching goal to ensure that learners engage in corresponding learning strategies. A quasi-experimental field study with…
Descriptors: Learning Strategies, Testing Problems, Educational Objectives, Learning Motivation
Peer reviewed
Ventouras, Errikos; Triantis, Dimos; Tsiakas, Panagiotis; Stergiopoulos, Charalampos – Computers & Education, 2011
The aim of the present research was to compare the use of multiple-choice questions (MCQs) as an examination method against the oral examination (OE) method. MCQs are widely used and their importance seems likely to grow, due to their inherent suitability for electronic assessment. However, MCQs are influenced by the tendency of examinees to guess…
Descriptors: Grades (Scholastic), Scoring, Multiple Choice Tests, Test Format
Peer reviewed
Ventouras, Errikos; Triantis, Dimos; Tsiakas, Panagiotis; Stergiopoulos, Charalampos – Computers & Education, 2010
The aim of the present research was to compare the use of multiple-choice questions (MCQs) as an examination method with examination based on constructed-response questions (CRQs). Although MCQs have the advantage of objectivity in the grading process and speed in the production of results, they also introduce an error in the final…
Descriptors: Computer Assisted Instruction, Scoring, Grading, Comparative Analysis
Peer reviewed
Houston, John P. – Journal of Educational Psychology, 1983
Using an index of answer copying developed by Houston, it was found that rearranged questions alone did not reduce answer copying, whereas rearrangement of both questions and answers effectively eliminated detectable cheating. (Author)
Descriptors: Cheating, Higher Education, Measurement Techniques, Multiple Choice Tests
Peer reviewed
Kolstad, Rosemarie K.; And Others – Educational Research Quarterly, 1983
Complex multiple choice (CMC) items are frequently used to test knowledge about repetitive information. In two independent comparisons, performance on the CMC items surpassed performance on multiple true-false clusters. Data indicate that performance on CMC items is inflated, and distractors on CMC items fail to prevent guessing. (Author/PN)
Descriptors: Guessing (Tests), Higher Education, Multiple Choice Tests, Objective Tests
Haladyna, Thomas M. – 1999
This book explains how to write effective multiple-choice test items and how to study item responses in order to evaluate and improve them, two topics that are central to the development of many cognitive tests. The chapters are: (1) "Providing a Context for Multiple-Choice Testing"; (2) "Constructed-Response and Multiple-Choice Item Formats"; (3)…
Descriptors: Constructed Response, Multiple Choice Tests, Test Construction, Test Format
Ferguson, William F. – 1983
College undergraduates (n=38) were administered identical multiple-choice tests with randomly presented answer sheets numbered either vertically or horizontally. Of the four tests originally scheduled during the semester, tests one and three were retested with entirely different multiple-choice questions, resulting in scores from tests…
Descriptors: Answer Sheets, Cheating, Higher Education, Multiple Choice Tests
Peer reviewed
Downing, Steven M. – Educational Measurement: Issues and Practice, 1992
Research on true-false (TF), multiple-choice, and alternate-choice (AC) tests is reviewed, discussing strengths, weaknesses, and the usefulness in classroom and large-scale testing of each. Recommendations are made for improving use of AC items to overcome some of the problems associated with TF items. (SLD)
Descriptors: Comparative Analysis, Educational Research, Multiple Choice Tests, Objective Tests
Slem, Charles M. – 1981
Over the years, many criticisms have been leveled against the multiple-choice test format. Such tests are said to be ambiguous, to emphasize isolated information, and to be the most difficult objective tests to construct. Over-interpretation is another danger of multiple-choice examinations, with students picking subtle answers that the test makers consider incorrect. Yet, the…
Descriptors: Constructed Response, Essay Tests, Higher Education, Multiple Choice Tests
Siskind, Theresa G.; Anderson, Lorin W. – 1982
The study was designed to examine the similarity of response options generated by different item writers using a systematic approach to item writing. The similarity of response options to student responses for the same item stems presented in an open-ended format was also examined. A non-systematic (subject matter expertise) approach and a…
Descriptors: Algorithms, Item Analysis, Multiple Choice Tests, Quality Control
Peer reviewed
Johnson, Bruce R. – American Mathematical Monthly, 1991
Described is an approach that substantially reduces the noted shortcomings of standard multiple-choice tests presented to lower-division college mathematics and statistics classes. Examples are included from each discipline. (JJK)
Descriptors: College Mathematics, Distractors (Tests), Higher Education, Mathematics Education
Klein, Stephen P.; Bolus, Roger – 1983
One way to reduce the likelihood of one examinee copying another's answers on large-scale tests that require all examinees to answer the same set of questions is to use multiple test forms that differ in item ordering. This study was conducted to determine whether varying the sequence in which blocks of items were presented to…
Descriptors: Adults, Cheating, Cost Effectiveness, Item Analysis