Quality Assurance of Learning Assessments in Large Information Systems and Decision Analysis Courses
Ugray, Zsolt; Dunn, Brian K. – Journal of Information Systems Education, 2022
As Information Systems courses have become more data-focused and student numbers have increased, a greater need has emerged to assess technical and analytical skills more efficiently and effectively. Multiple-choice examinations provide a means for accomplishing this, though creating effective multiple-choice assessment items within a…
Descriptors: Quality Assurance, Information Systems, Computer Science Education, Student Evaluation
Paul J. Walter; Edward Nuhfer; Crisel Suarez – Numeracy, 2021
We introduce an approach for making a quantitative comparison of the item response curves (IRCs) of any two populations on a multiple-choice test instrument. In this study, we employ simulated and actual data. We apply our approach to a dataset of 12,187 participants on the 25-item Science Literacy Concept Inventory (SLCI), which includes ample…
Descriptors: Item Analysis, Multiple Choice Tests, Simulation, Data Analysis
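The item response curves compared in the study above are, at their simplest, per-item proportions correct as a function of total score. A minimal illustrative sketch (assumed, not the authors' actual method or code) of building such empirical IRCs for one item in two populations:

```python
# Empirical item response curve (IRC) sketch: for one item, group
# examinees by total test score and compute the proportion answering
# that item correctly at each score level. Comparing these curves
# across two populations is the basic idea behind IRC-based analysis.
from collections import defaultdict

def empirical_irc(total_scores, item_correct):
    """Return {total_score: proportion answering the item correctly}."""
    hits = defaultdict(int)    # correct responses per score level
    counts = defaultdict(int)  # examinees per score level
    for s, y in zip(total_scores, item_correct):
        hits[s] += y
        counts[s] += 1
    return {s: hits[s] / counts[s] for s in counts}

# Toy data: two small populations on a short test
irc_a = empirical_irc([3, 3, 4, 5, 5], [0, 1, 1, 1, 1])
irc_b = empirical_irc([3, 4, 4, 5, 5], [0, 0, 1, 1, 1])
print(irc_a[3], irc_b[5])  # → 0.5 1.0
```

A real analysis would smooth or bin the curves and quantify the distance between them; this sketch only shows the raw curve construction.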
Badrasawi, Kamal J. I.; Abu Kassim, Noor Lide; Daud, Nuraihan Mat – Malaysian Journal of Learning and Instruction, 2017
Purpose: The study sought to determine the hierarchical nature of reading skills. Whether reading is a "unitary" or "multi-divisible" skill is still a contentious issue. So is the hierarchical order of reading skills. Determining the hierarchy of reading skills is challenging as item difficulty is greatly influenced by factors…
Descriptors: Foreign Countries, Secondary School Students, Reading Tests, Test Items
Chen, Haiwen H.; von Davier, Matthias; Yamamoto, Kentaro; Kong, Nan – ETS Research Report Series, 2015
One major issue with large-scale assessments is that the respondents might give no responses to many items, resulting in less accurate estimations of both assessed abilities and item parameters. This report studies how the types of items affect the item-level nonresponse rates and how different methods of treating item-level nonresponses have an…
Descriptors: Achievement Tests, Foreign Countries, International Assessment, Secondary School Students
National Assessment Governing Board, 2017
The National Assessment of Educational Progress (NAEP) is the only continuing and nationally representative measure of trends in academic achievement of U.S. elementary and secondary school students in various subjects. For more than four decades, NAEP assessments have been conducted periodically in reading, mathematics, science, writing, U.S.…
Descriptors: Mathematics Achievement, Multiple Choice Tests, National Competency Tests, Educational Trends
Suh, Youngsuk; Bolt, Daniel M. – Journal of Educational Measurement, 2011
In multiple-choice items, differential item functioning (DIF) in the correct response may or may not be caused by differentially functioning distractors. Identifying distractors as causes of DIF can provide valuable information for potential item revision or the design of new test items. In this paper, we examine a two-step approach based on…
Descriptors: Test Items, Test Bias, Multiple Choice Tests, Simulation
Suh, Youngsuk; Bolt, Daniel M. – Psychometrika, 2010
Nested logit item response models for multiple-choice data are presented. Relative to previous models, the new models are suggested to provide a better approximation to multiple-choice items where the application of a solution strategy precedes consideration of response options. In practice, the models also accommodate collapsibility across all…
Descriptors: Computation, Simulation, Psychometrics, Models
Innabi, Hanan; Dodeen, Hamzeh – School Science and Mathematics, 2006
The purpose of this study is to analyze items that exhibit gender-related Differential Item Functioning (DIF) in Mathematics in Jordan. Data were taken from the TIMSS 1999 of Jordan, which includes responses of 5,299 eighth-grade students. The Mantel-Haenszel (MH) DIF procedure was applied to 124 multiple-choice items. The results showed that 37 items…
Descriptors: Foreign Countries, Grade 8, Teaching Methods, Females
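The Mantel-Haenszel procedure named in the last entry flags DIF by stratifying examinees on total score and pooling 2x2 tables of group membership by item response. A minimal sketch of the pooled odds ratio (illustrative only; the counts and function name are invented, and a full analysis would also compute the MH chi-square and its significance):

```python
# Mantel-Haenszel common odds ratio for a single dichotomous item.
# Each stratum (score level) contributes a 2x2 table:
#   A = reference group correct,  B = reference group incorrect,
#   C = focal group correct,      D = focal group incorrect.
# alpha_MH near 1.0 suggests no DIF; values away from 1.0 suggest the
# item favors one group after conditioning on ability.

def mantel_haenszel_odds_ratio(strata):
    """strata: list of (A, B, C, D) counts, one tuple per score level."""
    num = 0.0
    den = 0.0
    for a, b, c, d in strata:
        n = a + b + c + d
        if n == 0:
            continue  # empty stratum contributes nothing
        num += a * d / n
        den += b * c / n
    return num / den

# Toy data: three score strata with similar odds in both groups
tables = [(30, 10, 28, 12), (40, 20, 38, 22), (15, 25, 14, 26)]
print(round(mantel_haenszel_odds_ratio(tables), 2))  # → 1.18
```

In practice the odds ratio is usually transformed to the ETS delta scale (-2.35 * ln(alpha_MH)) before classifying items into DIF severity categories.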