Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 0
Since 2006 (last 20 years): 3
Descriptor
Disabilities: 3
High Stakes Tests: 3
Mathematics Tests: 3
Testing Accommodations: 3
Grade 7: 2
Item Response Theory: 2
Test Bias: 2
Age Differences: 1
Assistive Technology: 1
Comparative Analysis: 1
Context Effect: 1
Author
Engelhard, George, Jr.: 3
Cheong, Yuk Fai: 1
Domaleski, Christopher S.: 1
Fincher, Melissa: 1
Randall, Jennifer: 1
Publication Type
Journal Articles: 3
Reports - Descriptive: 1
Reports - Evaluative: 1
Reports - Research: 1
Education Level
Grade 7: 3
Grade 3: 1
Grade 4: 1
Grade 6: 1
Middle Schools: 1
Location
Georgia: 3
Laws, Policies, & Programs
Assessments and Surveys
Georgia Criterion Referenced…: 3
Randall, Jennifer; Cheong, Yuk Fai; Engelhard, George, Jr. – Educational and Psychological Measurement, 2011
To address whether modifications in test administration influence item functioning for students with disabilities on a high-stakes statewide problem-solving assessment, a sample of 868 students (with and without disabilities) from 74 Georgia schools was randomly assigned to one of three testing conditions (resource guide, calculator, or…
Descriptors: Item Response Theory, Models, Context Effect, Test Bias
Engelhard, George, Jr.; Fincher, Melissa; Domaleski, Christopher S. – Applied Measurement in Education, 2011
This study examines the effects of two test administration accommodations on the mathematics performance of students within the context of a large-scale statewide assessment. The two accommodations were resource guides and calculators. A stratified random sample of schools was selected to represent the demographic…
Descriptors: Testing Accommodations, Disabilities, High Stakes Tests, Program Effectiveness
Engelhard, George, Jr. – Educational and Psychological Measurement, 2009
The major purpose of this study is to describe a conceptual framework for examining differential item functioning (DIF) and differential person functioning (DPF) as types of model-data misfit within the context of assessing students with disabilities. Specifically, DIF and DPF can be viewed through the lens of residual analyses. Residual analyses…
Descriptors: Item Response Theory, Test Bias, Disabilities, Special Needs Students
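A minimal sketch of the residual-based view of DIF and DPF described in the Engelhard (2009) snippet above, assuming a dichotomous Rasch model with person ability \theta_n and item difficulty \delta_i (the abstract does not name a specific IRT model, so the model choice here is an illustrative assumption):

P(x_{ni} = 1 \mid \theta_n, \delta_i) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}

z_{ni} = \frac{x_{ni} - E[x_{ni}]}{\sqrt{\operatorname{Var}(x_{ni})}}, \qquad E[x_{ni}] = P(x_{ni} = 1), \qquad \operatorname{Var}(x_{ni}) = P(x_{ni} = 1)\,\bigl(1 - P(x_{ni} = 1)\bigr)

On this reading, systematic patterns in the standardized residuals z_{ni} aggregated over persons within a subgroup for a fixed item would indicate DIF, while patterns aggregated over items for a fixed person would indicate DPF; the truncated abstract does not specify which summary statistics the study uses.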