Ziying Li; A. Corinne Huggins-Manley; Walter L. Leite; M. David Miller; Eric A. Wright – Educational and Psychological Measurement, 2022
The unstructured multiple-attempt (MA) item response data in virtual learning environments (VLEs) often come from student-selected assessment data sets, which include missing data, single-attempt responses, multiple-attempt responses, and unknown ability growth across attempts, leading to a complicated scenario for using this kind of…
Descriptors: Sequential Approach, Item Response Theory, Data, Simulation
Manna, Venessa F.; Gu, Lixiong – ETS Research Report Series, 2019
When using the Rasch model, equating with a nonequivalent groups anchor test design is commonly achieved by adjustment of new form item difficulty using an additive equating constant. Using simulated 5-year data, this report compares 4 approaches to calculating the equating constants and the subsequent impact on equating results. The 4 approaches…
Descriptors: Item Response Theory, Test Items, Test Construction, Sample Size
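The report above concerns additive equating constants under the Rasch model in a nonequivalent-groups anchor-test (NEAT) design. As a rough illustration only (the report compares four approaches; this sketch shows just one common variant, the mean-mean method, with made-up anchor difficulties):

```python
# Hypothetical sketch: the mean-mean method computes an additive Rasch
# equating constant as the mean difference between anchor-item
# difficulties on the reference form and the same items' difficulties
# estimated on the new form. All values below are illustrative.

def mean_mean_constant(ref_anchor_b, new_anchor_b):
    """Additive equating constant via the mean-mean method."""
    if len(ref_anchor_b) != len(new_anchor_b):
        raise ValueError("anchor sets must have the same length")
    diffs = [r - n for r, n in zip(ref_anchor_b, new_anchor_b)]
    return sum(diffs) / len(diffs)

def equate(new_form_b, constant):
    """Place new-form item difficulties on the reference scale."""
    return [b + constant for b in new_form_b]

# Made-up anchor-item difficulty estimates on each form.
ref_anchor = [-1.0, -0.2, 0.4, 1.1]
new_anchor = [-1.2, -0.4, 0.2, 0.9]

c = mean_mean_constant(ref_anchor, new_anchor)
print(round(c, 3))  # 0.2
```

Adding the constant to every new-form difficulty places the new form on the reference scale; the report's comparison concerns how the constant itself is calculated, which this sketch does not reproduce.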
Lee, Woo-yeol; Cho, Sun-Joo – Journal of Educational Measurement, 2017
Cross-level invariance in a multilevel item response model can be investigated by testing whether the within-level item discriminations are equal to the between-level item discriminations. Testing the cross-level invariance assumption is important to understand constructs in multilevel data. However, in most multilevel item response model…
Descriptors: Test Items, Item Response Theory, Item Analysis, Simulation
Soysal, Sümeyra; Arikan, Çigdem Akin; Inal, Hatice – Online Submission, 2016
This study aims to investigate the effect of methods for dealing with missing data on item difficulty estimation under different test lengths and sample sizes. To this end, data sets with 10, 20, and 40 items and sample sizes of 100 and 5,000 were prepared. Deletion was applied at rates of 5%, 10%, and 20% under conditions…
Descriptors: Research Problems, Data Analysis, Item Response Theory, Test Items
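The study above manipulates test length, sample size, and deletion rate. The actual design and estimation methods are not fully visible here; the following is a minimal sketch of that kind of simulation, assuming a Rasch model, completely-at-random (MCAR) deletion, and a crude proportion-based difficulty estimate, with all parameter values illustrative:

```python
# Hypothetical sketch: generate Rasch responses, delete a fraction of
# them completely at random, and compare item-difficulty estimates on
# the full and the incomplete data. Not the study's actual procedure.
import math
import random

def simulate_responses(thetas, bs, rng):
    """Rasch model: P(correct) = 1 / (1 + exp(-(theta - b)))."""
    return [[1 if rng.random() < 1 / (1 + math.exp(-(t - b))) else 0
             for b in bs] for t in thetas]

def delete_mcar(data, rate, rng):
    """Replace responses with None completely at random."""
    return [[None if rng.random() < rate else x for x in row]
            for row in data]

def difficulty_logit(data, item):
    """Crude difficulty estimate: logit of the proportion incorrect."""
    obs = [row[item] for row in data if row[item] is not None]
    p = sum(obs) / len(obs)
    p = min(max(p, 1e-6), 1 - 1e-6)  # guard against 0 or 1
    return math.log((1 - p) / p)

rng = random.Random(42)
thetas = [rng.gauss(0, 1) for _ in range(5000)]  # 5,000 examinees
bs = [-1.0, 0.0, 1.0]                            # 3 illustrative items

full = simulate_responses(thetas, bs, rng)
missing = delete_mcar(full, 0.20, rng)           # 20% deletion rate

for j, b in enumerate(bs):
    print(b, round(difficulty_logit(full, j), 2),
          round(difficulty_logit(missing, j), 2))
```

Under MCAR, the full-data and incomplete-data estimates stay close; the study's interest is in how different missing-data handling methods behave across the crossed conditions, which would extend this loop over test lengths, sample sizes, and deletion rates.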