Publication Date
In 2025 | 0 |
Since 2024 | 1 |
Since 2021 (last 5 years) | 1 |
Since 2016 (last 10 years) | 2 |
Since 2006 (last 20 years) | 19 |
Descriptor
Evaluation Methods | 36 |
Models | 36 |
Test Bias | 36 |
Item Response Theory | 9 |
Measurement Techniques | 8 |
Test Items | 8 |
Simulation | 7 |
Student Evaluation | 7 |
Comparative Analysis | 5 |
Educational Assessment | 5 |
Factor Analysis | 5 |
Author
Wang, Wen-Chung | 3 |
Cho, Sun-Joo | 2 |
Albano, Anthony D. | 1 |
Artur Pokropek | 1 |
Beland, Sebastien | 1 |
Bell, Gregory | 1 |
Berg, Rachelle | 1 |
Bonner, Fred | 1 |
Bottge, Brian A. | 1 |
Carmen Köhler | 1 |
Chajewski, Michael | 1 |
Education Level
Higher Education | 7 |
Secondary Education | 3 |
Postsecondary Education | 2 |
Elementary Secondary Education | 1 |
Two Year Colleges | 1 |
Audience
Administrators | 1 |
Practitioners | 1 |
Location
Asia | 1 |
California | 1 |
Canada | 1 |
India | 1 |
Japan | 1 |
United States | 1 |
Laws, Policies, & Programs
Education for All Handicapped… | 2 |
Assessments and Surveys
Program for International… | 2 |
California Achievement Tests | 1 |
Graduate Record Examinations | 1 |
Medical College Admission Test | 1 |
SAT (College Admission Test) | 1 |
Wechsler Intelligence Scale… | 1 |
Carmen Köhler; Lale Khorramdel; Artur Pokropek; Johannes Hartig – Journal of Educational Measurement, 2024
For assessment scales applied to different groups (e.g., students from different states; patients in different countries), multigroup differential item functioning (MG-DIF) needs to be evaluated in order to ensure that respondents with the same trait level but from different groups have equal response probabilities on a particular item. The…
Descriptors: Measures (Individuals), Test Bias, Models, Item Response Theory
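The abstract above defines DIF as unequal response probabilities for respondents who share the same trait level but belong to different groups. A minimal sketch of that definition under a two-parameter logistic IRT model (all parameter values below are hypothetical illustrations, not estimates from the article):

```python
import numpy as np

def irt_2pl(theta, a, b):
    """Two-parameter logistic item response function."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical item parameters calibrated separately in two groups.
# DIF is present when the curves differ at the same trait level theta.
theta = np.linspace(-3, 3, 7)
p_ref = irt_2pl(theta, a=1.2, b=0.0)    # reference group
p_focal = irt_2pl(theta, a=1.2, b=0.5)  # focal group: item is harder -> uniform DIF

# Toy flagging criterion: largest gap between the two response curves.
max_gap = np.max(np.abs(p_ref - p_focal))
print(max_gap > 0.05)  # True: the item would be flagged under this toy rule
```

Because only the difficulty parameter differs between groups, the gap is the same sign at every trait level, which is the uniform-DIF case; a group difference in the discrimination parameter `a` would make the curves cross (nonuniform DIF).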
Cho, Sun-Joo; Suh, Youngsuk; Lee, Woo-yeol – Educational Measurement: Issues and Practice, 2016
The purpose of this ITEMS module is to provide an introduction to differential item functioning (DIF) analysis using mixture item response models. The mixture item response models for DIF analysis involve comparing item profiles across latent groups, instead of manifest groups. First, an overview of DIF analysis based on latent groups, called…
Descriptors: Test Bias, Research Methodology, Evaluation Methods, Models
Assessment of Differential Item Functioning under Cognitive Diagnosis Models: The DINA Model Example
Li, Xiaomin; Wang, Wen-Chung – Journal of Educational Measurement, 2015
The assessment of differential item functioning (DIF) is routinely conducted to ensure test fairness and validity. Although many DIF assessment methods have been developed in the context of classical test theory and item response theory, they are not applicable for cognitive diagnosis models (CDMs), as the underlying latent attributes of CDMs are…
Descriptors: Test Bias, Models, Cognitive Measurement, Evaluation Methods
Hou, Likun; de la Torre, Jimmy; Nandakumar, Ratna – Journal of Educational Measurement, 2014
Analyzing examinees' responses using cognitive diagnostic models (CDMs) has the advantage of providing diagnostic information. To ensure the validity of the results from these models, differential item functioning (DIF) in CDMs needs to be investigated. In this article, the Wald test is proposed to examine DIF in the context of CDMs. This study…
Descriptors: Test Bias, Models, Simulation, Error Patterns
Albano, Anthony D. – Journal of Educational Measurement, 2013
In many testing programs it is assumed that the context or position in which an item is administered does not have a differential effect on examinee responses to the item. Violations of this assumption may bias item response theory estimates of item and person parameters. This study examines the potentially biasing effects of item position. A…
Descriptors: Test Items, Item Response Theory, Test Format, Questioning Techniques
Wang, Chuang; Kim, Do-Hong; Ng, Kok-Mun – Journal of Psychoeducational Assessment, 2012
This study examined the factorial and item-level invariance of Wong and Law's emotional intelligence scale (WLEIS) in a sample of 375 international students in U.S. universities. Confirmatory factor analysis (CFA) and differential item functioning (DIF) analysis were employed at the test and item level, respectively. International students from…
Descriptors: Factor Analysis, Emotional Intelligence, Test Bias, Models
Liu, Yan; Zumbo, Bruno D. – Educational and Psychological Measurement, 2012
There is a lack of research on the effects of outliers on the decisions about the number of factors to retain in an exploratory factor analysis, especially for outliers arising from unintended and unknowingly included subpopulations. The purpose of the present research was to investigate how outliers from an unintended and unknowingly included…
Descriptors: Factor Analysis, Factor Structure, Evaluation Research, Evaluation Methods
Engelhard, George, Jr.; Wind, Stefanie A.; Kobrin, Jennifer L.; Chajewski, Michael – College Board, 2013
The purpose of this study is to illustrate the use of explanatory models based on Rasch measurement theory to detect systematic relationships between student and item characteristics and achievement differences using differential item functioning (DIF), differential group functioning (DGF), and differential person functioning (DPF) techniques. The…
Descriptors: Test Bias, Evaluation Methods, Measurement Techniques, Writing Evaluation
Woods, Carol M.; Grimm, Kevin J. – Applied Psychological Measurement, 2011
In extant literature, multiple indicator multiple cause (MIMIC) models have been presented for identifying items that display uniform differential item functioning (DIF) only, not nonuniform DIF. This article addresses, for apparently the first time, the use of MIMIC models for testing both uniform and nonuniform DIF with categorical indicators. A…
Descriptors: Test Bias, Testing, Interaction, Item Response Theory
Gomez, Rapson – Journal of Attention Disorders, 2012
Objective: Generalized partial credit model, which is based on item response theory (IRT), was used to test differential item functioning (DIF) for the "Diagnostic and Statistical Manual of Mental Disorders" (4th ed.), inattention (IA), and hyperactivity/impulsivity (HI) symptoms across boys and girls. Method: To accomplish this, parents completed…
Descriptors: Rating Scales, Item Response Theory, Mental Disorders, Attention Deficit Hyperactivity Disorder
Cho, Sun-Joo; Bottge, Brian A.; Cohen, Allan S.; Kim, Seock-Ho – Journal of Special Education, 2011
Current methods for detecting growth of students' problem-solving skills in math focus mainly on analyzing changes in test scores. Score-level analysis, however, may fail to reflect subtle changes that might be evident at the item level. This article demonstrates a method for studying item-level changes using data from a multiwave experiment with…
Descriptors: Test Bias, Group Membership, Mathematics Skills, Ability
Magis, David; Raiche, Gilles; Beland, Sebastien; Gerard, Paul – International Journal of Testing, 2011
We present an extension of the logistic regression procedure to identify dichotomous differential item functioning (DIF) in the presence of more than two groups of respondents. Starting from the usual framework of a single focal group, we propose a general approach to estimate the item response functions in each group and to test for the presence…
Descriptors: Language Skills, Identification, Foreign Countries, Evaluation Methods
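The multigroup logistic-regression idea above can be sketched as a likelihood-ratio test: regress item responses on the matching variable, then ask whether adding group indicators (one dummy per focal group) improves fit. The simulation, coding, and fitting routine below are illustrative assumptions, not the procedure from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_logit(X, y, iters=25):
    """Logistic regression via Newton-Raphson; returns coefficients and log-likelihood."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))
        grad = X.T @ (y - p)
        hess = X.T @ (X * (p * (1 - p))[:, None]) + 1e-8 * np.eye(X.shape[1])
        beta += np.linalg.solve(hess, grad)
    p = 1 / (1 + np.exp(-X @ beta))
    return beta, np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Simulated data: three groups; group 2 sees a harder item (uniform DIF).
n = 600
group = rng.integers(0, 3, n)           # 0 = reference, 1 and 2 = focal groups
theta = rng.normal(size=n)              # matching variable (e.g., rest score)
shift = np.where(group == 2, -1.0, 0.0)
y = (rng.random(n) < 1 / (1 + np.exp(-(theta + shift)))).astype(float)

ones = np.ones(n)
g1, g2 = (group == 1).astype(float), (group == 2).astype(float)
_, ll_full = fit_logit(np.column_stack([ones, theta, g1, g2]), y)
_, ll_null = fit_logit(np.column_stack([ones, theta]), y)

# Likelihood-ratio statistic ~ chi-square with 2 df under the no-DIF null.
lr_stat = 2 * (ll_full - ll_null)
print(lr_stat > 5.99)  # True: exceeds the .05 critical value, so DIF is flagged
```

Nonuniform DIF could be tested the same way by also adding group-by-trait interaction terms to the full model.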
Wang, Wen-Chung; Shih, Ching-Lin; Yang, Chih-Chien – Educational and Psychological Measurement, 2009
This study implements a scale purification procedure onto the standard MIMIC method for differential item functioning (DIF) detection and assesses its performance through a series of simulations. It is found that the MIMIC method with scale purification (denoted as M-SP) outperforms the standard MIMIC method (denoted as M-ST) in controlling…
Descriptors: Test Items, Measures (Individuals), Test Bias, Evaluation Research
Wyse, Adam E.; Mapuranga, Raymond – International Journal of Testing, 2009
Differential item functioning (DIF) analysis is a statistical technique used for ensuring the equity and fairness of educational assessments. This study formulates a new DIF analysis method using the information similarity index (ISI). ISI compares item information functions when data fits the Rasch model. Through simulations and an international…
Descriptors: Test Bias, Evaluation Methods, Test Items, Educational Assessment
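The idea of comparing item information functions under the Rasch model can be illustrated as follows. The overlap ratio used here is a toy similarity measure standing in for the article's ISI, and both difficulty values are hypothetical:

```python
import numpy as np

def rasch_info(theta, b):
    """Rasch item information: p(1 - p) at trait level theta for difficulty b."""
    p = 1 / (1 + np.exp(-(theta - b)))
    return p * (1 - p)

theta = np.linspace(-4, 4, 81)
info_ref = rasch_info(theta, b=0.0)    # difficulty calibrated in reference group
info_focal = rasch_info(theta, b=0.8)  # hypothetical focal-group calibration

# Toy similarity: area shared by the two curves over the area they jointly cover
# (the grid is uniform, so plain sums stand in for the integrals).
# Identical curves give 1; the more the calibrations diverge, the lower the value.
similarity = np.minimum(info_ref, info_focal).sum() / np.maximum(info_ref, info_focal).sum()
print(0 < similarity < 1)  # True: the curves partially, but not fully, overlap
```

Under the Rasch model the information function is fully determined by the difficulty parameter, so a low similarity between group-specific curves points to a difficulty difference, i.e., uniform DIF.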
Koedel, Cory – Economics of Education Review, 2009
This paper examines whether educational production in secondary school involves joint production among teachers across subjects. In doing so, it also provides insights into the reliability of value-added modeling. Teacher value-added to reading test scores is estimated for four different teacher types: English, math, science and social-studies.…
Descriptors: Teacher Role, Reading Tests, English Teachers, Secondary School Teachers