Publication Date
In 2025 | 0
Since 2024 | 1
Since 2021 (last 5 years) | 3
Since 2016 (last 10 years) | 7
Since 2006 (last 20 years) | 19
Descriptor
Multiple Choice Tests | 37
Responses | 37
Test Format | 37
Test Items | 26
Difficulty Level | 13
Test Construction | 9
Mathematics Tests | 8
Comparative Analysis | 7
Foreign Countries | 7
Equated Scores | 5
Scoring | 5
Publication Type
Journal Articles | 29
Reports - Research | 28
Reports - Evaluative | 8
Speeches/Meeting Papers | 6
Information Analyses | 3
Tests/Questionnaires | 1
Location
Chile (Santiago) | 1
Illinois | 1
Israel (Tel Aviv) | 1
Netherlands | 1
Nigeria | 1
Norway | 1
Thailand | 1
United Kingdom | 1
Assessments and Surveys
National Assessment of… | 3
Advanced Placement… | 1
Preliminary Scholastic… | 1
SAT (College Admission Test) | 1
Test of English as a Foreign… | 1
Stefanie A. Wind; Yuan Ge – Measurement: Interdisciplinary Research and Perspectives, 2024
Mixed-format assessments made up of multiple-choice (MC) items and constructed-response (CR) items that are scored using rater judgments involve unique psychometric considerations. When these item types are combined to estimate examinee achievement, information about the psychometric quality of each component can depend on that of the other. For…
Descriptors: Interrater Reliability, Test Bias, Multiple Choice Tests, Responses
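A minimal sketch of the bookkeeping such mixed-format scoring involves, assuming invented scores and a simple equally weighted composite (neither is taken from the article): rater agreement on the CR item is summarized with Cohen's kappa before the MC and CR components are combined.

```python
import numpy as np

def cohens_kappa(r1, r2, categories):
    """Chance-corrected agreement between two raters over the given score categories."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    observed = np.mean(r1 == r2)
    expected = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical scores for 6 examinees: an MC section (number correct out of 10)
# and one CR item scored 0-4 by two raters.
mc = np.array([8, 5, 9, 4, 7, 6])
rater1 = np.array([3, 2, 4, 1, 3, 2])
rater2 = np.array([3, 2, 3, 1, 3, 2])

print("kappa:", round(cohens_kappa(rater1, rater2, range(5)), 2))
composite = mc + (rater1 + rater2) / 2   # equally weighted composite (an assumption)
print("composite:", composite)
```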
Guo, Wenjing; Wind, Stefanie A. – Journal of Educational Measurement, 2021
The use of mixed-format tests made up of multiple-choice (MC) items and constructed response (CR) items is popular in large-scale testing programs, including the National Assessment of Educational Progress (NAEP) and many district- and state-level assessments in the United States. Rater effects, or raters' scoring tendencies that result in…
Descriptors: Test Format, Multiple Choice Tests, Scoring, Test Items
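As a rough illustration of the rater effects the abstract refers to, the sketch below simulates three hypothetical raters who apply different severity offsets when scoring the same responses; all parameters are invented, and this is not the model used in the article.

```python
import numpy as np

rng = np.random.default_rng(0)
quality = rng.normal(0, 1, size=500)   # latent quality of 500 CR responses
severities = [-0.5, 0.0, 0.8]          # hypothetical rater severity offsets

def awarded_score(quality, severity, n_cats=5):
    # A severe rater (positive offset) perceives the same response as weaker.
    perceived = quality - severity + rng.normal(0, 0.3, size=quality.shape)
    # Map the continuous judgment onto the 0..n_cats-1 rating scale.
    return np.clip(np.round(perceived + (n_cats - 1) / 2), 0, n_cats - 1)

for s in severities:
    print(f"severity {s:+.1f}: mean awarded score {awarded_score(quality, s).mean():.2f}")
```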
Wang, Yu; Chiu, Chia-Yi; Köhn, Hans Friedrich – Journal of Educational and Behavioral Statistics, 2023
The multiple-choice (MC) item format has been widely used in educational assessments across diverse content domains. MC items purportedly allow for collecting richer diagnostic information. The effectiveness and economy of administering MC items may have further contributed to their popularity beyond educational assessment. The MC item format…
Descriptors: Multiple Choice Tests, Nonparametric Statistics, Test Format, Educational Assessment
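The article's nonparametric approach for MC items is not reproduced here; as a simplified, binary-scored stand-in, the sketch below shows the core idea of nonparametric cognitive diagnosis: classify each examinee to the attribute profile whose ideal response pattern is closest in Hamming distance to the observed responses. The Q-matrix and response data are hypothetical.

```python
import numpy as np
from itertools import product

# Hypothetical Q-matrix: 4 items x 2 attributes (1 = item requires the attribute).
Q = np.array([[1, 0],
              [0, 1],
              [1, 1],
              [1, 0]])

def ideal_response(alpha, Q):
    # DINA-style rule: correct iff the examinee holds every required attribute.
    return np.all(alpha >= Q, axis=1).astype(int)

def classify(responses, Q):
    # Assign each examinee to the profile minimizing Hamming distance
    # between observed and ideal response patterns.
    K = Q.shape[1]
    profiles = np.array(list(product([0, 1], repeat=K)))
    ideals = np.array([ideal_response(a, Q) for a in profiles])
    dists = np.abs(responses[:, None, :] - ideals[None, :, :]).sum(axis=2)
    return profiles[dists.argmin(axis=1)]

obs = np.array([[1, 0, 0, 1],   # pattern consistent with attribute 1 only
                [1, 1, 1, 1]])  # pattern consistent with both attributes
print(classify(obs, Q))
```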
Bush, Martin – Assessment & Evaluation in Higher Education, 2015
The humble multiple-choice test is very widely used within education at all levels, but its susceptibility to guesswork makes it a suboptimal assessment tool. The reliability of a multiple-choice test is partly governed by the number of items it contains; however, longer tests are more time-consuming to take, and for some subject areas, it can be…
Descriptors: Guessing (Tests), Multiple Choice Tests, Test Format, Test Reliability
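The abstract's two quantitative points, the payoff of blind guessing and the link between test length and reliability, follow from standard classical test theory formulas; a small sketch with illustrative numbers:

```python
# Expected score from blind guessing, and the Spearman-Brown projection of
# reliability as a test is lengthened.

def chance_score(n_items, n_options):
    """Expected number correct if every item is answered at random."""
    return n_items / n_options

def spearman_brown(rel, length_factor):
    """Projected reliability when test length is multiplied by length_factor."""
    return (length_factor * rel) / (1 + (length_factor - 1) * rel)

print(chance_score(40, 4))                 # 10 of 40 items correct by luck alone
print(round(spearman_brown(0.70, 2), 3))   # doubling a rel=.70 test -> ~.824
```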
Reardon, Sean F.; Kalogrides, Demetra; Fahle, Erin M.; Podolsky, Anne; Zárate, Rosalía C. – Educational Researcher, 2018
Prior research suggests that males outperform females, on average, on multiple-choice items compared to their relative performance on constructed-response items. This paper characterizes the extent to which gender achievement gaps on state accountability tests across the United States are associated with those tests' item formats. Using roughly 8…
Descriptors: Test Items, Test Format, Gender Differences, Achievement Gap
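Purely to make the design concrete (the data below are simulated under an assumed relationship, not the paper's estimates), an analysis of this kind boils down to regressing each test's gender gap on its share of MC items:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical state tests: proportion of MC items and male-female gap (in SDs).
prop_mc = rng.uniform(0.3, 1.0, size=200)
gap = 0.05 + 0.10 * prop_mc + rng.normal(0, 0.04, size=200)  # assumed relation

slope, intercept = np.polyfit(prop_mc, gap, 1)
print(f"estimated gap change per 10% more MC items: {slope * 0.1:+.3f} SD")
```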
Beserra, Vagner; Nussbaum, Miguel; Grass, Antonio – Interactive Learning Environments, 2017
When using educational video games, particularly drill-and-practice video games, there are several ways of providing an answer to a quiz. The majority of paper-based options can be classified as either multiple-choice or constructed-response. Therefore, in the process of creating an educational drill-and-practice video game, one fundamental…
Descriptors: Multiple Choice Tests, Drills (Practice), Educational Games, Video Games
Sangwin, Christopher J.; Jones, Ian – Educational Studies in Mathematics, 2017
In this paper we report the results of an experiment designed to test the hypothesis that when faced with a question involving the inverse direction of a reversible mathematical process, students solve a multiple-choice version by verifying the answers presented to them by the direct method, not by undertaking the actual inverse calculation.…
Descriptors: Mathematics Achievement, Mathematics Tests, Multiple Choice Tests, Computer Assisted Testing
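The verification strategy the authors hypothesize is easy to mimic in code: check each offered answer by applying the direct (forward) operation rather than performing the inverse calculation. A sketch for a hypothetical antiderivative item, using numerical differentiation:

```python
# Verify candidate antiderivatives of f(x) = 2x by differentiating each one
# (the direct method), not by integrating f (the inverse calculation).

def numeric_derivative(g, x, h=1e-6):
    return (g(x + h) - g(x - h)) / (2 * h)

f = lambda x: 2 * x
candidates = {
    "A": lambda x: x ** 2,        # correct
    "B": lambda x: 2 * x ** 2,    # distractor
    "C": lambda x: x ** 2 + x,    # distractor
}

for label, g in candidates.items():
    ok = all(abs(numeric_derivative(g, x) - f(x)) < 1e-4 for x in (0.5, 1.0, 2.0))
    print(label, "verifies" if ok else "fails")
```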
Stevenson, Claire E.; Heiser, Willem J.; Resing, Wilma C. M. – Journal of Psychoeducational Assessment, 2016
Multiple-choice (MC) analogy items are often used in cognitive assessment. However, in dynamic testing, where the aim is to provide insight into potential for learning and the learning process, constructed-response (CR) items may be of benefit. This study investigated whether training with CR or MC items leads to differences in the strategy…
Descriptors: Logical Thinking, Multiple Choice Tests, Test Items, Cognitive Tests
Thanyapa, Inadaphat; Currie, Michael – Language Testing in Asia, 2014
The overall study from which findings are presented gave stem-equivalent short-answer and multiple-choice tests of English structure and reading to students at Prince of Songkla University, Thailand. A comparison of scores, facility, discrimination, reliability, and validity from 3-, 4-, and 5-option versions of the multiple-choice test found little…
Descriptors: Foreign Countries, Multiple Choice Tests, College Students, Language Tests
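Facility, discrimination, and reliability as compared in the study are standard classical item statistics; the sketch below computes them for a simulated 0/1 response matrix. The data and the one-parameter generating model are assumptions for the example, and the number of options is not modeled.

```python
import numpy as np

def item_analysis(X):
    """Facility, discrimination, and KR-20 reliability for a 0/1 item matrix."""
    X = np.asarray(X, dtype=float)
    total = X.sum(axis=1)
    facility = X.mean(axis=0)                    # proportion correct per item
    # Discrimination: correlation of each item with the rest-of-test score.
    disc = np.array([np.corrcoef(X[:, j], total - X[:, j])[0, 1]
                     for j in range(X.shape[1])])
    k = X.shape[1]
    kr20 = k / (k - 1) * (1 - (facility * (1 - facility)).sum() / total.var())
    return facility, disc, kr20

# Simulate 200 examinees x 20 items from a simple one-parameter model.
rng = np.random.default_rng(2)
theta = rng.normal(size=(200, 1))                # abilities
b = rng.uniform(-1, 1, size=(1, 20))             # difficulties
X = (rng.random((200, 20)) < 1 / (1 + np.exp(-(theta - b)))).astype(int)

fac, disc, kr20 = item_analysis(X)
print(fac.round(2), disc.round(2), round(kr20, 3))
```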
Wang, Zhen; Yao, Lihua – ETS Research Report Series, 2013
The current study used simulated data to investigate the properties of a newly proposed method (Yao's rater model) for modeling rater severity and its distribution under different conditions. Our study examined the effects of rater severity, distributions of rater severity, the difference between item response theory (IRT) models with rater effect…
Descriptors: Test Format, Test Items, Responses, Computation
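Yao's rater model itself is not reproduced here. As a crude stand-in for the kind of simulation the abstract describes, this sketch generates ratings from raters with known severities and recovers those severities, up to an additive constant, from deviations around the rater consensus; all values are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
true_sev = np.array([-0.6, 0.0, 0.4, 1.0])   # hypothetical rater severities
quality = rng.normal(0, 1, size=300)         # latent quality of 300 responses
# Every rater scores every response; severity lowers the perceived quality.
ratings = quality[None, :] - true_sev[:, None] + rng.normal(0, 0.3, (4, 300))

# Crude severity estimate: each rater's mean deviation from the consensus.
consensus = ratings.mean(axis=0)
est_sev = -(ratings - consensus[None, :]).mean(axis=1)
print((est_sev - est_sev.mean()).round(2))   # recovers severities up to a constant
```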
Hohensinn, Christine; Kubinger, Klaus D. – Educational and Psychological Measurement, 2011
In aptitude and achievement tests, different response formats are usually used. A fundamental distinction must be made between multiple-choice formats and constructed-response formats. Previous studies have examined the impact of different response formats using traditional statistical approaches, but these influences can also…
Descriptors: Item Response Theory, Multiple Choice Tests, Responses, Test Format
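One common way IRT-based approaches represent a response-format effect is as a shift in item difficulty. A minimal Rasch-model sketch, with format difficulties invented for illustration:

```python
import numpy as np

def rasch_p(theta, b):
    """Rasch model: P(correct) given ability theta and item difficulty b."""
    return 1 / (1 + np.exp(-(theta - b)))

# A harder response format is represented as a higher difficulty parameter.
for fmt, b in [("multiple-choice", -0.4), ("constructed-response", 0.6)]:
    print(fmt, round(float(rasch_p(0.0, b)), 2))
```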
Fauskanger, Janne; Mosvold, Reidar – North American Chapter of the International Group for the Psychology of Mathematics Education, 2012
The mathematical knowledge for teaching (MKT) measures have become widely used among researchers both within and outside the U.S. Despite the apparent success, the MKT measures and underlying framework have been subject to criticism. The multiple-choice format of the items has been criticized, and some critics have suggested that opening up the…
Descriptors: Foreign Countries, Elementary School Teachers, Secondary School Teachers, Mathematics Teachers
Kim, Sooyeon; Walker, Michael E. – ETS Research Report Series, 2009
We examined the appropriateness of the anchor composition in a mixed-format test, which includes both multiple-choice (MC) and constructed-response (CR) items, using subpopulation invariance indices. We derived linking functions in the nonequivalent groups with anchor test (NEAT) design using two types of anchor sets: (a) MC only and (b) a mix of…
Descriptors: Test Format, Equated Scores, Test Items, Multiple Choice Tests
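The NEAT design links two forms through scores on a common anchor. One generic method, chained linear equating, is sketched below with invented summary statistics; it is not the specific procedure or data of the report, and the anchor-composition question (MC only versus MC plus CR) only changes which items feed the anchor means and SDs.

```python
import numpy as np

def linear_link(m_from, s_from, m_to, s_to):
    """Linear function mapping scores with mean m_from/SD s_from onto the
    scale with mean m_to/SD s_to."""
    return lambda x: m_to + (s_to / s_from) * (x - m_from)

def chained_linear(x, stats):
    """Chained linear equating X -> anchor V -> Y in a NEAT design.
    `stats` holds (mean, sd) of X and V in group P, then of V and Y in group Q."""
    (mxp, sxp), (mvp, svp), (mvq, svq), (myq, syq) = stats
    x_to_v = linear_link(mxp, sxp, mvp, svp)   # estimated in group P
    v_to_y = linear_link(mvq, svq, myq, syq)   # estimated in group Q
    return v_to_y(x_to_v(np.asarray(x, float)))

# Hypothetical summary statistics (mean, SD) from the two nonequivalent groups.
stats = [(30.0, 6.0), (15.0, 3.0), (14.0, 3.2), (32.0, 6.5)]
print(chained_linear([24, 30, 36], stats).round(2))
```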
Kim, Sooyeon; Walker, Michael E.; McHale, Frederick – ETS Research Report Series, 2008
This study examined variations of the nonequivalent-groups equating design for mixed-format tests, which contain both multiple-choice (MC) and constructed-response (CR) items, to determine which design was most effective in producing equivalent scores across the two tests to be equated. Four linking designs were examined: (a) an anchor with…
Descriptors: Equated Scores, Test Format, Multiple Choice Tests, Responses
Kim, Sooyeon; Walker, Michael E.; McHale, Frederick – Journal of Educational Measurement, 2010
In this study we examined variations of the nonequivalent groups equating design for tests containing both multiple-choice (MC) and constructed-response (CR) items to determine which design was most effective in producing equivalent scores across the two tests to be equated. Using data from a large-scale exam, this study investigated the use of…
Descriptors: Measures (Individuals), Scoring, Equated Scores, Test Bias