Showing all 11 results
Peer reviewed
Vo, Thao T.; French, Brian F. – Educational Measurement: Issues and Practice, 2021
The use and interpretation of educational and psychological test scores are paramount to individual outcomes and opportunities. Methods for detecting differential item functioning (DIF) are imperative for item analysis when developing and revising assessments, particularly as it pertains to fairness across populations, languages, and cultures. We…
Descriptors: Risk Assessment, Needs Assessment, Test Bias, Youth
Peer reviewed
French, Brian F.; Vo, Thao T. – Journal of Psychoeducational Assessment, 2020
The Washington Assessment of Risk and Needs of Students (WARNS) is a brief self-report measure designed for schools, courts, and youth service providers to identify student behaviors and contexts related to school truancy. Empirical support for WARNS item invariance between ethnic groups is lacking. This study examined differential item…
Descriptors: Truancy, Student Behavior, Test Bias, Measures (Individuals)
Peer reviewed
Gotch, Chad M.; French, Brian F. – Educational Assessment, 2020
The State of Washington requires school districts to file court petitions on students with excessive unexcused absences. The "Washington Assessment of Risks and Needs of Students" (WARNS), a self-report screening instrument developed for use by high school and juvenile court personnel in such situations, purports to measure six facets of…
Descriptors: Risk Assessment, Needs Assessment, Truancy, Measurement Techniques
Peer reviewed
French, Brian F.; Finch, W. Holmes – Journal of Educational Measurement, 2015
SIBTEST is a differential item functioning (DIF) detection method that is accurate and effective with small samples, in the presence of group mean differences, and for assessment of both uniform and nonuniform DIF. The presence of multilevel data in DIF detection has received increased attention. Ignoring such structure can inflate Type I error…
Descriptors: Test Bias, Data, Simulation, Accuracy
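Because this abstract centers on SIBTEST, a minimal sketch of the weighted mean-difference statistic at its core may help orient readers. The sketch omits SIBTEST's regression correction for group mean ability differences, and the function and variable names are illustrative assumptions, not taken from the article.

```python
import numpy as np

def sibtest_like_statistic(item, match, group):
    """Uncorrected SIBTEST-style statistic for one studied item.

    item  : 0/1 responses to the studied item
    match : valid-subtest (matching) scores
    group : 0 = reference, 1 = focal

    Omits the regression correction SIBTEST applies for group mean
    ability differences; shown only to illustrate the stratified
    weighted mean-difference logic.
    """
    item, match, group = map(np.asarray, (item, match, group))
    n_focal = (group == 1).sum()
    beta_hat, var_hat = 0.0, 0.0
    for k in np.unique(match):
        ref = item[(match == k) & (group == 0)]
        foc = item[(match == k) & (group == 1)]
        if len(ref) < 2 or len(foc) < 2:
            continue                              # skip sparse score levels
        p_k = len(foc) / n_focal                  # weight by focal-group proportion at level k
        beta_hat += p_k * (ref.mean() - foc.mean())
        var_hat += p_k**2 * (ref.var(ddof=1) / len(ref) + foc.var(ddof=1) / len(foc))
    z = beta_hat / np.sqrt(var_hat)               # approximate z statistic for DIF
    return beta_hat, z
```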
Peer reviewed
Finch, W. Holmes; Hernández Finch, Maria E.; French, Brian F. – International Journal of Testing, 2016
Differential item functioning (DIF) assessment is key in score validation. When DIF is present, scores may not accurately reflect the construct of interest for some groups of examinees, leading to incorrect conclusions from the scores. Given rising immigration and the increased reliance of educational policymakers on cross-national assessments…
Descriptors: Test Bias, Scores, Native Language, Language Usage
Peer reviewed
French, Brian F.; Finch, W. Holmes – Educational and Psychological Measurement, 2013
Multilevel data structures are ubiquitous in the assessment of differential item functioning (DIF), particularly in large-scale testing programs. There are a handful of DIF procedures available to researchers that appropriately account for multilevel data structures. However, little, if any, work has been completed to extend a popular DIF…
Descriptors: Test Bias, Statistical Analysis, Comparative Analysis, Correlation
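This abstract concerns extending a DIF procedure to multilevel data. As a rough illustration of why clustering matters, the sketch below fits the logistic-regression DIF model with cluster-robust (GEE) standard errors rather than an ordinary single-level fit; this is not the specific multilevel extension studied in the article, and the column names ('item', 'total', 'group', 'cluster') are assumptions for illustration.

```python
import statsmodels.api as sm
import statsmodels.formula.api as smf

def clustered_lr_dif(df):
    """Logistic-regression DIF model with cluster-robust (GEE) standard errors.

    df columns: 'item' (0/1), 'total' (matching score), 'group' (0/1),
    'cluster' (e.g., a school id). Ignoring 'cluster' and fitting an
    ordinary logistic regression typically understates standard errors
    and can inflate Type I error for the DIF terms.
    """
    model = smf.gee(
        "item ~ total + group + total:group",   # group = uniform DIF, total:group = nonuniform DIF
        groups="cluster",
        data=df,
        family=sm.families.Binomial(),
    )
    result = model.fit()
    # Wald-type p-values for the DIF terms under the cluster-robust covariance
    return result.pvalues[["group", "total:group"]]
```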
Peer reviewed
French, Brian F.; Gotch, Chad M. – Journal of Psychoeducational Assessment, 2013
The Brigance Comprehensive Inventory of Basic Skills-II (CIBS-II) is a diagnostic battery intended for children in grades 1 through 6. The aim of this study was to test for item invariance, or differential item functioning (DIF), of the CIBS-II across sex in the standardization sample through the use of item response theory DIF detection…
Descriptors: Gender Differences, Elementary School Students, Test Bias, Item Response Theory
Peer reviewed
French, Brian F.; Finch, W. Holmes – Journal of Educational Measurement, 2010
The purpose of this study was to examine the performance of differential item functioning (DIF) assessment in the presence of a multilevel structure that often underlies data from large-scale testing programs. Analyses were conducted using logistic regression (LR), a popular, flexible, and effective tool for DIF detection. Data were simulated…
Descriptors: Test Bias, Testing Programs, Evaluation, Measurement
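Since this study applies logistic regression (LR) to DIF detection, a brief sketch of the standard nested-model LR tests may be useful. It assumes statsmodels and SciPy; the column names and single-item framing are illustrative and not drawn from the article's simulation design.

```python
import statsmodels.formula.api as smf
from scipy import stats

def lr_dif_tests(df):
    """Nested logistic-regression DIF tests for a single item.

    df columns: 'item' (0/1 responses), 'total' (matching score),
    'group' (0 = reference, 1 = focal). Comparing the nested fits gives
    separate 1-df likelihood-ratio tests for uniform and nonuniform DIF.
    """
    m1 = smf.logit("item ~ total", data=df).fit(disp=0)                         # matching only
    m2 = smf.logit("item ~ total + group", data=df).fit(disp=0)                 # + uniform DIF term
    m3 = smf.logit("item ~ total + group + total:group", data=df).fit(disp=0)   # + nonuniform DIF term
    return {
        "uniform_p": stats.chi2.sf(2 * (m2.llf - m1.llf), df=1),
        "nonuniform_p": stats.chi2.sf(2 * (m3.llf - m2.llf), df=1),
    }
```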
Peer reviewed
Finch, W. Holmes; French, Brian F. – Educational and Psychological Measurement, 2008
A number of statistical methods exist for the detection of differential item functioning (DIF). The performance of DIF methods has been widely studied and generally found to be effective in the detection of both uniform and nonuniform DIF. Anecdotal reports suggest that these techniques may too often incorrectly detect the presence of one type of…
Descriptors: Test Bias, Simulation, Statistical Analysis, Probability
Peer reviewed
French, Brian F.; Maller, Susan J. – Educational and Psychological Measurement, 2007
Two unresolved implementation issues with logistic regression (LR) for differential item functioning (DIF) detection include ability purification and effect size use. Purification is suggested to control inaccuracies in DIF detection that arise when DIF items contaminate the ability estimate. Additionally, effect size use may be beneficial in controlling…
Descriptors: Effect Size, Test Bias, Guidelines, Error of Measurement
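The purification issue raised in this abstract lends itself to a short illustration. The sketch below repeats LR DIF screening, rebuilding the matching total score from items not yet flagged until the flagged set stabilizes; the alpha level, the joint 2-df test, and the column names are assumptions, not the article's procedure.

```python
import statsmodels.formula.api as smf
from scipy import stats

def purified_dif_flags(resp, alpha=0.01, max_iter=10):
    """Iterative ability purification for logistic-regression DIF screening.

    resp : DataFrame of 0/1 item responses plus a 'group' column (0/1).
    Each pass flags items showing DIF and recomputes the matching total
    score from the remaining items, stopping when the flagged set no
    longer changes.
    """
    items = [c for c in resp.columns if c != "group"]
    flagged = set()
    for _ in range(max_iter):
        clean = [c for c in items if c not in flagged] or items   # fall back if all items flagged
        resp = resp.assign(total=resp[clean].sum(axis=1))
        new_flags = set()
        for it in items:
            df = resp[[it, "total", "group"]].rename(columns={it: "item"})
            m0 = smf.logit("item ~ total", data=df).fit(disp=0)
            m2 = smf.logit("item ~ total + group + total:group", data=df).fit(disp=0)
            # joint 2-df likelihood-ratio test for uniform and nonuniform DIF
            if stats.chi2.sf(2 * (m2.llf - m0.llf), df=2) < alpha:
                new_flags.add(it)
        if new_flags == flagged:        # purification has converged
            break
        flagged = new_flags
    return flagged
```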
Peer reviewed
Finch, W. Holmes; French, Brian F. – Educational and Psychological Measurement, 2007
Differential item functioning (DIF) continues to receive attention both in applied and methodological studies. Because DIF can be an indicator of irrelevant variance that can influence test scores, continuing to evaluate and improve the accuracy of detection methods is an essential step in gathering score validity evidence. Methods for detecting…
Descriptors: Item Response Theory, Factor Analysis, Test Bias, Comparative Analysis