Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 0
Since 2006 (last 20 years): 2
Descriptor
Comparative Analysis: 2
Error Patterns: 2
Test Bias: 2
Asian Americans: 1
Cognitive Ability: 1
College Entrance Examinations: 1
Correlation: 1
Evaluation Criteria: 1
Grade 7: 1
Grade 8: 1
Item Analysis: 1
Source
ETS Research Report Series: 2
Publication Type
Journal Articles: 2
Reports - Research: 2
Numerical/Quantitative Data: 1
Education Level
Elementary Education: 1
Grade 7: 1
Grade 8: 1
Higher Education: 1
Junior High Schools: 1
Middle Schools: 1
Postsecondary Education: 1
Secondary Education: 1
Fu, Jianbin; Wise, Maxwell – ETS Research Report Series, 2012
In the Cognitively Based Assessment of, for, and as Learning ("CBAL"™) research initiative, innovative K-12 prototype tests based on cognitive competency models are developed. This report presents the statistical results of the two CBAL Grade 8 writing tests and two Grade 7 reading tests administered to students in 20 states in spring 2011.…
Descriptors: Cognitive Ability, Grade 8, Writing Tests, Grade 7
Yu, Lei; Moses, Tim; Puhan, Gautam; Dorans, Neil – ETS Research Report Series, 2008
All differential item functioning (DIF) methods require at least a moderate sample size for effective DIF detection. Samples smaller than 200 pose a challenge for DIF analysis. Smoothing can improve the estimation of the population distribution by preserving major features of an observed frequency distribution while eliminating the…
Descriptors: Test Bias, Item Response Theory, Sample Size, Evaluation Criteria
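The second abstract describes smoothing an observed score-frequency distribution to support DIF analysis in small samples. The truncated abstract does not specify the smoothing procedure, so the sketch below is only a minimal illustration under an assumption: it implements polynomial log-linear smoothing of raw score frequencies as a Poisson regression (log link) fitted by Newton-Raphson in plain NumPy. The function name loglinear_smooth, the degree-3 polynomial, and the toy counts are hypothetical and chosen for illustration, not taken from the report.

```python
import numpy as np

def loglinear_smooth(counts, degree=3, n_iter=50):
    """Smooth an observed score-frequency distribution with a polynomial
    log-linear model (Poisson regression with a log link).

    counts : observed frequency at each raw score 0..K
    degree : order of the polynomial in the (standardized) score
    Returns the fitted (smoothed) expected frequencies.
    """
    counts = np.asarray(counts, dtype=float)
    scores = np.arange(len(counts), dtype=float)
    # Standardize the score scale so higher powers stay numerically stable.
    z = (scores - scores.mean()) / scores.std()
    X = np.vander(z, degree + 1, increasing=True)  # columns: 1, z, z^2, ...

    # Newton-Raphson on the Poisson log-likelihood.
    beta = np.zeros(degree + 1)
    beta[0] = np.log(counts.mean() + 1e-8)         # start at the overall mean
    for _ in range(n_iter):
        mu = np.exp(X @ beta)                      # expected frequencies
        grad = X.T @ (counts - mu)                 # score equations
        hess = X.T @ (X * mu[:, None])             # observed information
        step = np.linalg.solve(hess, grad)
        beta += step
        if np.max(np.abs(step)) < 1e-10:
            break
    return np.exp(X @ beta)

# Toy example: a sparse observed distribution from a small sample.
observed = np.array([1, 0, 2, 4, 7, 9, 12, 10, 6, 3, 1, 0, 1])
smoothed = loglinear_smooth(observed, degree=3)
print(np.round(smoothed, 2), "total:", round(smoothed.sum(), 1))
```

At convergence the score equations force the smoothed frequencies to reproduce the observed total and the first few moments of the standardized score, which is the moment-preservation property usually cited for log-linear presmoothing; irregularities in the sparse observed counts are ironed out while the overall shape of the distribution is kept.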