Publication Date
In 2025 (2)
Since 2024 (3)
Since 2021, last 5 years (20)
Since 2016, last 10 years (50)
Since 2006, last 20 years (69)
Descriptor
Difficulty Level (78)
Item Response Theory (78)
Multiple Choice Tests (78)
Test Items (71)
Foreign Countries (32)
Test Construction (21)
Science Tests (17)
Item Analysis (16)
Statistical Analysis (15)
Psychometrics (14)
Test Format (14)
Author
DeBoer, George E. (3)
Herrmann-Abell, Cari F. (3)
Tindal, Gerald (3)
Alonzo, Julie (2)
Andrich, David (2)
Bucak, S. Deniz (2)
Bulut, Okan (2)
Gierl, Mark J. (2)
Hardcastle, Joseph (2)
Hohensinn, Christine (2)
Marais, Ida (2)
Publication Type
Reports - Research (63)
Journal Articles (60)
Reports - Evaluative (9)
Speeches/Meeting Papers (6)
Dissertations/Theses -… (4)
Numerical/Quantitative Data (4)
Information Analyses (2)
Reports - Descriptive (2)
Tests/Questionnaires (1)
Education Level
Higher Education (22)
Secondary Education (20)
Postsecondary Education (19)
Elementary Education (13)
Junior High Schools (7)
Middle Schools (7)
Grade 8 (6)
Grade 7 (5)
High Schools (5)
Grade 3 (3)
Grade 5 (3)
Location
Indonesia (5)
Nigeria (4)
Australia (3)
Turkey (3)
Germany (2)
Kentucky (2)
Malaysia (2)
Alabama (1)
Arizona (1)
Arkansas (1)
California (1)
Aiman Mohammad Freihat; Omar Saleh Bani Yassin – Educational Process: International Journal, 2025
Background/purpose: This study aimed to assess the accuracy of estimating multiple-choice test item parameters under item response theory models of measurement. Materials/methods: The researchers relied on measurement accuracy indicators, which express the absolute difference between the estimated and actual values of the…
Descriptors: Accuracy, Computation, Multiple Choice Tests, Test Items
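The accuracy criterion named in the abstract above is the absolute difference between estimated and true (generating) item parameter values. The following is a minimal sketch of that indicator, assuming a simulation setting in which the true parameters are known; all values are invented and this is not the authors' procedure.

```python
# Minimal sketch (not the authors' code): the accuracy indicator described in
# the abstract, i.e. the mean absolute difference between estimated and true
# (generating) item parameters. All parameter values below are invented.
import numpy as np

def mean_abs_error(true_params: np.ndarray, est_params: np.ndarray) -> float:
    """Average |estimated - true| across items for one parameter (e.g. difficulty b)."""
    return float(np.mean(np.abs(est_params - true_params)))

# Hypothetical true and estimated difficulty (b) parameters for 5 items.
b_true = np.array([-1.2, -0.4, 0.0, 0.7, 1.5])
b_est  = np.array([-1.1, -0.5, 0.2, 0.6, 1.4])

print(f"Mean absolute error of b estimates: {mean_abs_error(b_true, b_est):.3f}")
```

In a full simulation study the estimated values would come from calibrating simulated response data rather than being entered directly.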
Lang, Joseph B. – Journal of Educational and Behavioral Statistics, 2023
This article is concerned with the statistical detection of copying on multiple-choice exams. As an alternative to existing permutation- and model-based copy-detection approaches, a simple randomization p-value (RP) test is proposed. The RP test, which is based on an intuitive match-score statistic, makes no assumptions about the distribution of…
Descriptors: Identification, Cheating, Multiple Choice Tests, Item Response Theory
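Lang's RP test is defined in the article itself; the sketch below only illustrates the general idea of a randomization p-value computed from a match-score statistic, under the much cruder assumed reference model that the suspected copier answers every item uniformly at random.

```python
# Hedged sketch of a randomization test on a match-score statistic for two
# examinees' multiple-choice answer strings. This is NOT Lang's RP test as
# published; it only illustrates the general randomization-p-value idea.
import numpy as np

rng = np.random.default_rng(0)

def match_score(a: np.ndarray, b: np.ndarray) -> int:
    """Number of items on which the two response strings agree."""
    return int(np.sum(a == b))

def randomization_p_value(suspect, source, n_options=4, n_draws=10000):
    """P(match >= observed) when the suspect answers each item uniformly at
    random (an assumption of this sketch, not the paper's reference model)."""
    observed = match_score(suspect, source)
    random_answers = rng.integers(0, n_options, size=(n_draws, len(source)))
    random_matches = (random_answers == source).sum(axis=1)
    return float(np.mean(random_matches >= observed))

# Hypothetical 20-item answer strings coded 0..3; the suspect copies 15 answers.
source  = rng.integers(0, 4, size=20)
suspect = source.copy()
suspect[rng.choice(20, size=5, replace=False)] = rng.integers(0, 4, size=5)

print("randomization p-value:", randomization_p_value(suspect, source))
```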
Thompson, Kathryn N. – ProQuest LLC, 2023
It is imperative to collect validity evidence prior to interpreting and using test scores. During the process of collecting validity evidence, test developers should consider whether test scores are contaminated by sources of extraneous information. This is referred to as construct irrelevant variance, or the "degree to which test scores are…
Descriptors: Test Wiseness, Test Items, Item Response Theory, Scores
Jin, Kuan-Yu; Siu, Wai-Lok; Huang, Xiaoting – Journal of Educational Measurement, 2022
Multiple-choice (MC) items are widely used in educational tests. Distractor analysis, an important procedure for checking the utility of response options within an MC item, can be readily implemented in the framework of item response theory (IRT). Although random guessing is a common behavior of test-takers when answering MC items, none of the…
Descriptors: Guessing (Tests), Multiple Choice Tests, Item Response Theory, Attention
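As a point of contrast with the IRT-based distractor analysis the abstract describes, a classical distractor analysis simply tabulates option choices by ability group. The sketch below uses invented data that are unrelated to ability, so no real trend will appear; the point is only the tabulation mechanics, not the model proposed in the article.

```python
# Classical (non-IRT) distractor analysis sketch: for one MC item, tabulate how
# often each response option is chosen within low/mid/high total-score groups.
# All data are invented.
import numpy as np

rng = np.random.default_rng(1)
n_examinees = 300
total_score = rng.integers(10, 41, size=n_examinees)        # hypothetical total scores
choices = rng.choice(list("ABCD"), size=n_examinees,         # hypothetical option choices
                     p=[0.45, 0.25, 0.20, 0.10])

# Split examinees into thirds by total score and count option choices per group.
groups = np.digitize(total_score, np.quantile(total_score, [1/3, 2/3]))
for g, label in enumerate(["low", "mid", "high"]):
    mask = groups == g
    counts = {opt: int(np.sum(choices[mask] == opt)) for opt in "ABCD"}
    print(label, counts)
```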
Roger Young; Emily Courtney; Alexander Kah; Mariah Wilkerson; Yi-Hsin Chen – Teaching of Psychology, 2025
Background: Multiple-choice item (MCI) assessments are burdensome for instructors to develop. Artificial intelligence (AI, e.g., ChatGPT) can streamline the process without sacrificing quality. The quality of AI-generated MCIs is comparable to that of MCIs written by human experts. However, whether the quality of AI-generated MCIs is equally good across various domain-…
Descriptors: Item Response Theory, Multiple Choice Tests, Psychology, Textbooks
Liu, Chunyan; Jurich, Daniel; Morrison, Carol; Grabovsky, Irina – Applied Measurement in Education, 2021
The existence of outliers in the anchor items can be detrimental to the estimation of examinee ability and undermine the validity of score interpretation across forms. However, in practice, anchor item performance can become distorted for various reasons. This study compares the performance of modified "INFIT" and "OUTFIT"…
Descriptors: Equated Scores, Test Items, Item Response Theory, Difficulty Level
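The INFIT and OUTFIT statistics that the study modifies are the standard Rasch mean-square fit indices. The sketch below computes the unmodified versions for a set of anchor items, assuming abilities and difficulties are given rather than estimated; it is not the modified procedure compared in the article.

```python
# Sketch of standard Rasch INFIT/OUTFIT mean-square statistics for anchor items,
# under assumed (not estimated) person abilities and item difficulties.
import numpy as np

def rasch_prob(theta, b):
    """P(correct) under the Rasch model for abilities theta and difficulties b."""
    return 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))

def infit_outfit(responses, theta, b):
    p = rasch_prob(theta, b)                    # expected item scores
    w = p * (1 - p)                             # person-by-item information
    resid2 = (responses - p) ** 2               # squared residuals
    outfit = (resid2 / w).mean(axis=0)          # unweighted mean-square
    infit = resid2.sum(axis=0) / w.sum(axis=0)  # information-weighted mean-square
    return infit, outfit

rng = np.random.default_rng(2)
theta = rng.normal(size=500)                    # hypothetical abilities
b = np.array([-1.0, 0.0, 1.0, 2.5])             # hypothetical anchor difficulties
responses = (rng.random((500, 4)) < rasch_prob(theta, b)).astype(float)

infit, outfit = infit_outfit(responses, theta, b)
print("INFIT :", np.round(infit, 2))
print("OUTFIT:", np.round(outfit, 2))
```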
Al-zboon, Habis Saad; Alrekebat, Amjad Farhan – International Journal of Higher Education, 2021
This study aims to identify the effect of the difficulty level of multiple-choice test items on the reliability coefficient and the standard error of measurement under item response theory (IRT). To achieve the objectives of the study, the WinGen3 software was used to generate the IRT parameters (difficulty, discrimination, guessing) for four…
Descriptors: Multiple Choice Tests, Test Items, Difficulty Level, Error of Measurement
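Under IRT, the standard error of measurement is tied to test information, SEM(theta) = 1/sqrt(I(theta)), which is how the placement of item difficulties ends up shaping measurement precision. The sketch below is not WinGen3 output; it evaluates the conditional SEM for one invented set of 3PL item parameters.

```python
# Minimal sketch (not WinGen3): conditional standard error of measurement
# under a 3PL model, SEM(theta) = 1 / sqrt(I(theta)), for hypothetical items.
import numpy as np

def three_pl_info(theta, a, b, c):
    """Test information at each ability in theta, summed over 3PL items."""
    p = c + (1 - c) / (1 + np.exp(-a * (theta[:, None] - b)))
    info = (a ** 2) * ((p - c) ** 2 / (1 - c) ** 2) * ((1 - p) / p)
    return info.sum(axis=1)

theta = np.linspace(-3, 3, 7)
a = np.array([1.0, 1.2, 0.8, 1.5])    # hypothetical discriminations
b = np.array([-0.5, 0.0, 0.5, 1.0])   # hypothetical difficulties
c = np.array([0.2, 0.2, 0.25, 0.2])   # hypothetical guessing parameters

sem = 1.0 / np.sqrt(three_pl_info(theta, a, b, c))
for t, s in zip(theta, sem):
    print(f"theta={t:+.1f}  SEM={s:.2f}")
```

Shifting the difficulty values changes where along the ability scale the SEM is smallest, which is the relationship the study examines.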
Alicia A. Stoltenberg – ProQuest LLC, 2024
Multiple-select multiple-choice items, or multiple-choice items with more than one correct answer, are used to quickly assess content on standardized assessments. Because there are multiple keys to these item types, there are also multiple ways to score student responses to these items. The purpose of this study was to investigate how changing the…
Descriptors: Scoring, Evaluation Methods, Multiple Choice Tests, Standardized Tests
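The scoring issue raised in the entry above can be made concrete with two generic rules for multiple-select items: all-or-nothing scoring versus proportional partial credit. These are illustrative rules only, not the specific scoring conditions compared in the dissertation.

```python
# Two common scoring rules for multiple-select multiple-choice items, for
# illustration only (not the dissertation's scoring conditions).
def score_all_or_nothing(selected: set, key: set) -> int:
    """1 point only if the selected options match the key exactly."""
    return int(selected == key)

def score_partial_credit(selected: set, key: set, options: set) -> float:
    """Proportion of options classified correctly (keys selected + non-keys left unselected)."""
    correct = len(key & selected) + len((options - key) - selected)
    return correct / len(options)

options = {"A", "B", "C", "D", "E"}
key = {"A", "C"}
response = {"A", "C", "D"}            # both keys plus one incorrect option

print(score_all_or_nothing(response, key))            # 0
print(score_partial_credit(response, key, options))   # 0.8
```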
Herrmann-Abell, Cari F.; Hardcastle, Joseph; DeBoer, George E. – Grantee Submission, 2022
As implementation of the "Next Generation Science Standards" moves forward, there is a need for new assessments that can measure students' integrated three-dimensional science learning. The National Research Council has suggested that these assessments be multicomponent tasks that utilize a combination of item formats including…
Descriptors: Multiple Choice Tests, Conditioning, Test Items, Item Response Theory
Musa Adekunle Ayanwale – Discover Education, 2023
Examination scores obtained by students from the West African Examinations Council (WAEC) and the National Business and Technical Examinations Board (NABTEB) may not be directly comparable due to differences in examination administration, item characteristics of the subject in question, and student abilities. For more accurate comparisons, scores…
Descriptors: Equated Scores, Mathematics Tests, Test Items, Test Format
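Placing scores from two examination boards on a common scale is an equating problem. The study itself works within IRT, but a linear (mean-sigma) equating function is the simplest way to show what such a conversion involves; all numbers below are invented.

```python
# Minimal sketch of linear (mean-sigma) equating between two score scales.
# Illustrative only; not the IRT equating design used in the study.
def linear_equate(x, mean_x, sd_x, mean_y, sd_y):
    """Map a score x from form/board X onto the scale of form/board Y."""
    return mean_y + sd_y * (x - mean_x) / sd_x

# Hypothetical summary statistics for two boards' mathematics scores.
mean_x, sd_x = 48.0, 12.0    # board X
mean_y, sd_y = 55.0, 10.0    # board Y (target scale)

for raw in (30, 48, 70):
    print(raw, "->", round(linear_equate(raw, mean_x, sd_x, mean_y, sd_y), 1))
```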
Choi, Ikkyu; Zu, Jiyun – ETS Research Report Series, 2022
Synthetically generated speech (SGS) has become an integral part of our oral communication in a wide variety of contexts. It can be generated instantly at a low cost and allows precise control over multiple aspects of output, all of which can be highly appealing to second language (L2) assessment developers who have traditionally relied upon human…
Descriptors: Test Wiseness, Multiple Choice Tests, Test Items, Difficulty Level
Fadillah, Sarah Meilani; Ha, Minsu; Nuraeni, Eni; Indriyanti, Nurma Yunita – Malaysian Journal of Learning and Instruction, 2023
Purpose: Researchers discovered that when students were given the opportunity to change their answers, a majority changed their responses from incorrect to correct, and this change often increased the overall test results. What prompts students to modify their answers? This study aims to examine the modification of a scientific reasoning test, with…
Descriptors: Science Tests, Multiple Choice Tests, Test Items, Decision Making
Mustafa, Nazahiyah; Khairani, Ahmad Zamri; Ishak, Nor Asniza – International Journal of Evaluation and Research in Education, 2021
This study aimed to calibrate science process skills test items administered to primary school students in order to provide information on the difficulty of each item. Data were collected from 128 Standard Five students in a primary school in Penang. The test was given in multiple-choice format and comprised 40 items, consisting of 33 partial-credit…
Descriptors: Elementary School Students, Elementary School Science, Science Process Skills, Foreign Countries
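The partial-credit calibration mentioned in the entry above rests on the partial credit model (PCM), in which each item has step difficulties that govern its category probabilities. The sketch below evaluates those probabilities for one hypothetical item; the step values and abilities are invented and this is not the study's calibration.

```python
# Hedged sketch of partial credit model (PCM) category probabilities for a
# single item; threshold and ability values are invented for illustration.
import numpy as np

def pcm_probs(theta: float, deltas: np.ndarray) -> np.ndarray:
    """Category probabilities P(X = 0..m | theta) for one PCM item with
    step difficulties `deltas` (length m)."""
    # Cumulative sums of (theta - delta_j); category 0 has an empty sum (0).
    numerators = np.exp(np.concatenate(([0.0], np.cumsum(theta - deltas))))
    return numerators / numerators.sum()

deltas = np.array([-0.8, 0.6])          # hypothetical step difficulties (3 categories)
for theta in (-1.0, 0.0, 1.5):
    print(theta, np.round(pcm_probs(theta, deltas), 3))
```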
Shin, Jinnie; Bulut, Okan; Gierl, Mark J. – Journal of Experimental Education, 2020
The arrangement of response options in multiple-choice (MC) items, especially the location of the most attractive distractor, is considered critical in constructing high-quality MC items. In the current study, a sample of 496 undergraduate students taking an educational assessment course was given three test forms consisting of the same items but…
Descriptors: Foreign Countries, Undergraduate Students, Multiple Choice Tests, Item Response Theory
Azevedo, Jose Manuel; Oliveira, Ema P.; Beites, Patrícia Damas – International Journal of Information and Learning Technology, 2019
Purpose: The purpose of this paper is to find appropriate forms of analysis of multiple-choice questions (MCQ) so as to obtain an assessment method that is as fair as possible for students. The authors intend to ascertain whether it is possible to control the quality of the MCQ contained in a bank of questions implemented in Moodle, presenting some evidence…
Descriptors: Learning Analytics, Multiple Choice Tests, Test Theory, Item Response Theory
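Quality control of a question bank typically starts from classical item-analysis indices such as the difficulty index and the point-biserial discrimination. The sketch below computes both for invented 0/1 item scores; it is not the paper's Moodle workflow, and the flagging thresholds are conventional rules of thumb rather than values from the paper.

```python
# Sketch of two classical item-quality indices often used to screen a question
# bank: difficulty index (proportion correct) and point-biserial discrimination.
# Data and flagging thresholds are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
scores = rng.integers(0, 2, size=(200, 30)).astype(float)   # hypothetical 0/1 item scores

def item_analysis(scores):
    total = scores.sum(axis=1)
    p = scores.mean(axis=0)                                  # difficulty index per item
    # Point-biserial: correlation of each item score with the total score.
    rpb = np.array([np.corrcoef(scores[:, i], total)[0, 1]
                    for i in range(scores.shape[1])])
    return p, rpb

p, rpb = item_analysis(scores)
flagged = np.where((p < 0.2) | (p > 0.9) | (rpb < 0.1))[0]
print("Flagged items:", flagged)
```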