Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 1
Since 2006 (last 20 years): 1
Descriptor
Science Tests: 2
Test Format: 2
Test Items: 2
Accuracy: 1
Achievement Tests: 1
Computation: 1
Difficulty Level: 1
Elementary Secondary Education: 1
Field Tests: 1
Gender Bias: 1
Item Analysis: 1
Source
Educational Assessment: 2
Author
Hambleton, Ronald K.: 1
Morrison, Kristin M.: 1
Robin, Frederic: 1
Steedle, Jeffrey T.: 1
Zenisky, April L.: 1
Publication Type
Journal Articles: 2
Reports - Evaluative: 1
Reports - Research: 1
Education Level
Elementary Secondary Education: 2
Steedle, Jeffrey T.; Morrison, Kristin M. – Educational Assessment, 2019
Assessment items are commonly field tested prior to operational use to observe statistical item properties such as difficulty. Item parameter estimates from field testing may be used to assign scores via pre-equating or computer adaptive designs. This study examined differences between item difficulty estimates based on field test and operational…
Descriptors: Field Tests, Test Items, Statistics, Difficulty Level
Zenisky, April L.; Hambleton, Ronald K.; Robin, Frederic – Educational Assessment, 2004
Differential item functioning (DIF) analyses are a routine part of the development of large-scale assessments. Less common are studies to understand the potential sources of DIF. The goals of this study were (a) to identify gender DIF in a large-scale science assessment and (b) to look for trends in the DIF and non-DIF items due to content,…
Descriptors: Program Effectiveness, Test Format, Science Tests, Test Items