Showing all 12 results
Peer reviewed
Finch, W. Holmes – Educational and Psychological Measurement, 2023
Psychometricians have devoted much research and attention to categorical item responses, leading to the development and widespread use of item response theory for the estimation of model parameters and identification of items that do not perform in the same way for examinees from different population subgroups (e.g., differential item functioning…
Descriptors: Test Bias, Item Response Theory, Computation, Methods
Peer reviewed
French, Brian F.; Finch, W. Holmes – Journal of Educational Measurement, 2015
SIBTEST is a differential item functioning (DIF) detection method that is accurate and effective with small samples, in the presence of group mean differences, and for assessment of both uniform and nonuniform DIF. The presence of multilevel data in DIF detection has received increased attention. Ignoring such structure can inflate Type I error…
Descriptors: Test Bias, Data, Simulation, Accuracy
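The SIBTEST statistic referenced in the abstract above is often written as follows (this is the standard Shealy-Stout formulation for uniform DIF, reproduced here for context rather than taken from the article):

    \hat{\beta}_{\mathrm{UNI}} = \sum_{k=0}^{K} \hat{p}_k \left( \bar{Y}^{*}_{Rk} - \bar{Y}^{*}_{Fk} \right), \qquad B = \frac{\hat{\beta}_{\mathrm{UNI}}}{\hat{\sigma}\!\left( \hat{\beta}_{\mathrm{UNI}} \right)}

where \hat{p}_k is the proportion of focal-group examinees in matching-subtest score stratum k, \bar{Y}^{*}_{Rk} and \bar{Y}^{*}_{Fk} are the regression-corrected mean studied-item (or bundle) scores for the reference and focal groups in stratum k, and B is referred to the standard normal distribution.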
Peer reviewed
Finch, W. Holmes – Applied Measurement in Education, 2016
Differential item functioning (DIF) assessment is a crucial component in test construction, serving as the primary way in which instrument developers ensure that measures perform in the same way for multiple groups within the population. When such is not the case, scores may not accurately reflect the trait of interest for all individuals in the…
Descriptors: Test Bias, Monte Carlo Methods, Comparative Analysis, Population Groups
Peer reviewed
Finch, W. Holmes; Hernández Finch, Maria E.; French, Brian F. – International Journal of Testing, 2016
Differential item functioning (DIF) assessment is key in score validation. When DIF is present, scores may not accurately reflect the construct of interest for some groups of examinees, leading to incorrect conclusions from the scores. Given rising immigration and the increased reliance of educational policymakers on cross-national assessments…
Descriptors: Test Bias, Scores, Native Language, Language Usage
Peer reviewed
French, Brian F.; Finch, W. Holmes – Educational and Psychological Measurement, 2013
Multilevel data structures are ubiquitous in the assessment of differential item functioning (DIF), particularly in large-scale testing programs. There are a handful of DIF procedures for researchers to select from that appropriately account for multilevel data structures. However, little, if any, work has been completed to extend a popular DIF…
Descriptors: Test Bias, Statistical Analysis, Comparative Analysis, Correlation
Peer reviewed
Finch, W. Holmes – Applied Psychological Measurement, 2012
Increasingly, researchers interested in identifying potentially biased test items are encouraged to use a confirmatory, rather than exploratory, approach. One such method for confirmatory testing is rooted in differential bundle functioning (DBF), where hypotheses regarding potential differential item functioning (DIF) for sets of items (bundles)…
Descriptors: Test Bias, Test Items, Statistical Analysis, Models
Peer reviewed
Finch, W. Holmes; Hernández Finch, Maria E. – Educational and Psychological Measurement, 2013
The assessment of test data for the presence of differential item functioning (DIF) is a key component of instrument development and validation. Among the many methods that have been used successfully in such analyses is the mixture modeling approach. Using this approach to identify the presence of DIF has been touted as potentially superior for…
Descriptors: Learning Disabilities, Testing Accommodations, Test Bias, Item Response Theory
Peer reviewed
Finch, W. Holmes – Educational and Psychological Measurement, 2011
Missing information is a ubiquitous aspect of data analysis, including responses to items on cognitive and affective instruments. Although the broader statistical literature describes missing data methods, relatively little work has focused on this issue in the context of differential item functioning (DIF) detection. Such prior research has…
Descriptors: Test Bias, Data Analysis, Item Response Theory, Regression (Statistics)
Peer reviewed
French, Brian F.; Finch, W. Holmes – Journal of Educational Measurement, 2010
The purpose of this study was to examine the performance of differential item functioning (DIF) assessment in the presence of a multilevel structure that often underlies data from large-scale testing programs. Analyses were conducted using logistic regression (LR), a popular, flexible, and effective tool for DIF detection. Data were simulated…
Descriptors: Test Bias, Testing Programs, Evaluation, Measurement
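The logistic regression (LR) approach named in the abstract above is easy to sketch. The following is an illustrative example only, not the authors' code: it assumes a hypothetical data frame with columns item (0/1 response), total (matching score), and group (0 = reference, 1 = focal), and uses nested-model likelihood-ratio tests to flag uniform and nonuniform DIF.

    # Illustrative sketch of logistic-regression DIF detection for one dichotomous item,
    # matching on total score; column names are hypothetical, not from the cited study.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy import stats

    def lr_dif_test(df: pd.DataFrame) -> dict:
        # Model 1: matching variable only
        m1 = smf.logit("item ~ total", data=df).fit(disp=0)
        # Model 2: adds group main effect (uniform DIF)
        m2 = smf.logit("item ~ total + group", data=df).fit(disp=0)
        # Model 3: adds group-by-score interaction (nonuniform DIF)
        m3 = smf.logit("item ~ total + group + total:group", data=df).fit(disp=0)
        # Likelihood-ratio chi-square tests between nested models (1 df each)
        uniform_chi2 = 2 * (m2.llf - m1.llf)
        nonuniform_chi2 = 2 * (m3.llf - m2.llf)
        return {
            "uniform_p": stats.chi2.sf(uniform_chi2, df=1),
            "nonuniform_p": stats.chi2.sf(nonuniform_chi2, df=1),
        }

    # Small simulated example with a uniform DIF effect built in
    rng = np.random.default_rng(0)
    n = 2000
    group = rng.integers(0, 2, n)
    theta = rng.normal(0, 1, n)
    logit_p = 1.2 * theta - 0.5 * group               # group main effect -> uniform DIF
    item = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
    total = (theta * 4 + rng.normal(0, 1, n)).round() # crude matching score
    print(lr_dif_test(pd.DataFrame({"item": item, "group": group, "total": total})))

In practice the matching variable would be the observed total (or purified) score rather than a simulated proxy, and effect-size measures would accompany the significance tests.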
Peer reviewed
Mucherah, Winnie; Finch, W. Holmes; Keaikitse, Setlhomo – International Journal of Testing, 2012
Understanding adolescent self-concept is of great concern for educators, mental health professionals, and parents, as research consistently demonstrates that low self-concept is related to a number of problem behaviors and poor outcomes. Thus, accurate measurements of self-concept are key, and the validity of such measurements, including the…
Descriptors: Test Bias, Mental Health Workers, Validity, Self Concept Measures
Peer reviewed
Finch, W. Holmes; French, Brian F. – Educational and Psychological Measurement, 2008
A number of statistical methods exist for the detection of differential item functioning (DIF). The performance of these methods has been widely studied, and they have generally been found to be effective in detecting both uniform and nonuniform DIF. Anecdotal reports suggest that these techniques may too often incorrectly detect the presence of one type of…
Descriptors: Test Bias, Simulation, Statistical Analysis, Probability
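For context on the uniform/nonuniform distinction the abstract above refers to, a standard way to express it (common in the DIF literature, not drawn from this particular article) uses the two-parameter logistic item response function for group g in {R, F}:

    P_g(\theta) = \frac{1}{1 + \exp\left[ -a_g \left( \theta - b_g \right) \right]}

Uniform DIF corresponds to a_R = a_F with b_R \neq b_F: the two item characteristic curves are shifted but do not cross, so one group is disadvantaged at every level of \theta. Nonuniform DIF involves a_R \neq a_F: the curves cross, so the direction or magnitude of the group difference changes across \theta.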
Peer reviewed
Finch, W. Holmes; French, Brian F. – Educational and Psychological Measurement, 2007
Differential item functioning (DIF) continues to receive attention both in applied and methodological studies. Because DIF can be an indicator of irrelevant variance that can influence test scores, continuing to evaluate and improve the accuracy of detection methods is an essential step in gathering score validity evidence. Methods for detecting…
Descriptors: Item Response Theory, Factor Analysis, Test Bias, Comparative Analysis