Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 2
Since 2006 (last 20 years): 4
Descriptor
Simulation: 7
Student Evaluation: 7
Test Bias: 7
Test Items: 4
Evaluation Methods: 3
Item Response Theory: 3
Achievement Tests: 2
Adaptive Testing: 2
Models: 2
Ability: 1
Academic Ability: 1
Source
Applied Measurement in…: 2
Applied Psychological…: 1
Education Sciences: 1
International Journal of…: 1
ProQuest LLC: 1
Author
Boughton, Keith A.: 1
Cui, Ying: 1
Gierl, Mark J.: 1
Gotzmann, Andrea: 1
Hou, Likun: 1
Loken, Eric: 1
Mousavi, Amin: 1
Reckase, Mark D.: 1
Rulison, Kelly L.: 1
Rutkowski, David: 1
Rutkowski, Leslie: 1
Publication Type
Journal Articles: 5
Reports - Research: 4
Reports - Evaluative: 2
Dissertations/Theses -…: 1
Education Level
Elementary Education: 1
Grade 4: 1
Intermediate Grades: 1
Assessments and Surveys
Program for International…: 1
Mousavi, Amin; Cui, Ying – Education Sciences, 2020
Important decisions about accountability and the placement of students in performance categories are often made on the basis of test scores, so it is important to evaluate the validity of the inferences drawn from test results. One threat to the validity of such inferences is aberrant responding. Several…
Descriptors: Student Evaluation, Educational Testing, Psychological Testing, Item Response Theory
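The abstract's trailing sentence refers to several methods for flagging aberrant responding. One widely used person-fit statistic (an illustrative choice here, not necessarily the one the paper evaluates) is the standardized log-likelihood lz, sketched below under the Rasch model; strongly negative values flag response patterns that are unlikely given the estimated ability.

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def lz_person_fit(responses, theta, difficulties):
    """Standardized log-likelihood person-fit statistic (lz).
    Large negative values flag aberrant response patterns."""
    l0 = exp_l0 = var_l0 = 0.0
    for u, b in zip(responses, difficulties):
        p = rasch_p(theta, b)
        l0 += u * math.log(p) + (1 - u) * math.log(1 - p)          # observed log-likelihood
        exp_l0 += p * math.log(p) + (1 - p) * math.log(1 - p)      # its expectation
        var_l0 += p * (1 - p) * math.log(p / (1 - p)) ** 2         # its variance
    return (l0 - exp_l0) / math.sqrt(var_l0)

# A nominally high-ability examinee who misses two easy items but
# answers two hard ones correctly gets a strongly negative lz.
print(lz_person_fit([0, 0, 1, 1], theta=1.0, difficulties=[-2.0, -1.5, 1.5, 2.0]))
```

The same four items answered in the Guttman-consistent order (easy ones right, hard ones wrong) yield an lz near zero, which is the behavior a screening rule relies on.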
Rutkowski, Leslie; Rutkowski, David; Zhou, Yan – International Journal of Testing, 2016
Using an empirically-based simulation study, we show that typically used methods of choosing an item calibration sample have significant impacts on achievement bias and system rankings. We examine whether recent PISA accommodations, especially for lower performing participants, can mitigate some of this bias. Our findings indicate that standard…
Descriptors: Simulation, International Programs, Adolescents, Student Evaluation
Hou, Likun – ProQuest LLC, 2013
Analyzing examinees' responses with cognitive diagnostic models (CDMs) has the advantage of providing richer diagnostic information. To ensure the validity of the results from these models, differential item functioning (DIF) in CDMs needs to be investigated. In this dissertation, a model-based DIF detection method, the Wald-CDM procedure, is…
Descriptors: Test Bias, Models, Cognitive Processes, Diagnostic Tests
Rulison, Kelly L.; Loken, Eric – Applied Psychological Measurement, 2009
A difficult result to interpret in Computerized Adaptive Tests (CATs) occurs when an ability estimate initially drops and then ascends continuously until the test ends, suggesting that the true ability may be higher than implied by the final estimate. This study explains why this asymmetry occurs and shows that early mistakes by high-ability…
Descriptors: Computer Assisted Testing, Adaptive Testing, Item Response Theory, Academic Ability
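The drop-then-ascend trajectory the abstract describes can be reproduced in a few lines. The sketch below uses an EAP ability estimate under the Rasch model with a standard normal prior (an illustrative estimator; the paper's CAT configuration may differ): an early mistake pulls the provisional estimate below zero, and it then climbs as the high-ability examinee answers every subsequent item correctly.

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def eap_estimate(responses, difficulties, grid=None):
    """Expected a posteriori (EAP) ability estimate under the Rasch model
    with a standard normal prior, evaluated on a discrete theta grid."""
    if grid is None:
        grid = [i / 10.0 for i in range(-40, 41)]
    num = den = 0.0
    for theta in grid:
        like = math.exp(-theta * theta / 2.0)  # unnormalized N(0,1) prior
        for u, b in zip(responses, difficulties):
            p = rasch_p(theta, b)
            like *= p if u else (1 - p)
        num += theta * like
        den += like
    return num / den

# Early mistake on a moderate item, then all correct: the provisional
# estimate starts negative and ascends with each administered item.
items = [0.0, 0.5, 1.0, 1.5, 2.0]
answers = [0, 1, 1, 1, 1]
for k in range(1, len(items) + 1):
    print(round(eap_estimate(answers[:k], items[:k]), 3))
```

The final estimate remains depressed by the early error, which is the asymmetry the study explains.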
Gierl, Mark J.; Gotzmann, Andrea; Boughton, Keith A. – Applied Measurement in Education, 2004
Differential item functioning (DIF) analyses are used to identify items that operate differently between two groups after controlling for ability. The Simultaneous Item Bias Test (SIBTEST) is a popular DIF detection method that matches examinees on a true score estimate of ability. However, in some testing situations, such as test translation and…
Descriptors: True Scores, Simulation, Test Bias, Student Evaluation
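At its core, SIBTEST's effect size is a weighted difference in proportion correct between reference and focal examinees matched on ability strata. The sketch below shows only that core (the weights and proportions are invented, and SIBTEST's regression correction to the true-score matching variable is omitted):

```python
def weighted_dif_effect(p_ref, p_foc, weights):
    """Simplified SIBTEST-style effect size: focal-group-weighted
    difference in proportion correct across matched ability strata.
    Positive values favor the reference group."""
    total = sum(weights)
    return sum(w * (r - f) for r, f, w in zip(p_ref, p_foc, weights)) / total

# Three strata; the reference group outperforms the focal group by
# 10 percentage points in each, so the pooled effect is 0.1.
beta = weighted_dif_effect([0.4, 0.6, 0.8], [0.3, 0.5, 0.7], [50, 30, 20])
print(round(beta, 3))
```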
Wang, Wen-Chung; Su, Ya-Hui – Applied Measurement in Education, 2004
In this study we investigated the effects of the average signed area (ASA) between the item characteristic curves of the reference and focal groups and three test purification procedures on the uniform differential item functioning (DIF) detection via the Mantel-Haenszel (M-H) method through Monte Carlo simulations. The results showed that ASA,…
Descriptors: Test Bias, Student Evaluation, Evaluation Methods, Test Items
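The Mantel-Haenszel method named in the abstract pools a 2x2 table (group by right/wrong) from each score stratum into a common odds ratio. A minimal sketch with toy counts follows; the -2.35 ln(alpha) rescaling to the ETS delta metric is standard, but the data are invented:

```python
import math

def mantel_haenszel_odds_ratio(tables):
    """Mantel-Haenszel common odds ratio across score strata.
    Each table is (right_ref, wrong_ref, right_foc, wrong_foc)."""
    num = den = 0.0
    for a, b, c, d in tables:
        n = a + b + c + d
        num += a * d / n   # reference-right, focal-wrong
        den += b * c / n   # reference-wrong, focal-right
    return num / den

# Three score strata of 100 examinees each; the reference group
# answers correctly more often at every matched score level.
tables = [(40, 10, 30, 20), (30, 20, 20, 30), (20, 30, 10, 40)]
alpha = mantel_haenszel_odds_ratio(tables)
mh_d_dif = -2.35 * math.log(alpha)   # ETS D-DIF metric; negative favors reference
print(round(alpha, 3), round(mh_d_dif, 3))
```

An alpha of 1 indicates no uniform DIF; the toy data above yield alpha well above 1, i.e. an item flagged as favoring the reference group.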
The Effect of Item Choice on Ability Estimation When Using a Simple Logistic Tailored Testing Model.
Reckase, Mark D. – 1975
This paper explores the effects of item choice on ability estimation when using a tailored testing procedure based on the Rasch simple logistic model. Most studies of the simple logistic model imply that ability estimates are totally independent of the items used, regardless of the testing procedure. This paper shows that the ability estimate is…
Descriptors: Ability, Achievement Tests, Adaptive Testing, Individual Differences
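Reckase's point, that under a tailored procedure the ability estimate depends on which items were administered, can be illustrated with a Rasch maximum-likelihood estimator: two examinees with the same raw score (2 of 4) but different routed item sets receive different estimates. A minimal sketch, with invented item difficulties:

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def rasch_mle(responses, difficulties, iters=50):
    """Newton-Raphson maximum-likelihood ability estimate under the
    Rasch model (requires an interior raw score, not 0 or all correct)."""
    theta = 0.0
    for _ in range(iters):
        ps = [rasch_p(theta, b) for b in difficulties]
        grad = sum(u - p for u, p in zip(responses, ps))  # score function
        info = sum(p * (1 - p) for p in ps)               # test information
        theta += grad / info
    return theta

# Same raw score, different tailored item sets, different estimates:
easy_route = rasch_mle([1, 1, 0, 0], [-1.0, -0.5, 0.0, 0.5])
hard_route = rasch_mle([1, 1, 0, 0], [0.0, 0.5, 1.0, 1.5])
print(round(easy_route, 3), round(hard_route, 3))
```

Within a fixed item set the raw score is sufficient for theta under the Rasch model; the dependence arises because tailored testing makes the administered set itself a function of the responses.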