Showing 1 to 15 of 16 results
Peer reviewed
Ercikan, Kadriye; Guo, Hongwen; He, Qiwei – Educational Assessment, 2020
Comparing groups is one of the key uses of large-scale assessment results, which are used to gain insights to inform policy and practice and to examine the comparability of scores and score meaning. Such comparisons typically focus on examinees' final answers and responses to test questions, ignoring response process differences groups may engage…
Descriptors: Data Use, Responses, Comparative Analysis, Test Bias
Peer reviewed
Oliveri, Maria Elena; Ercikan, Kadriye; Lyons-Thomas, Juliette; Holtzman, Steven – Applied Measurement in Education, 2016
Differential item functioning (DIF) analyses have been used as the primary method in large-scale assessments to examine fairness for subgroups. Currently, DIF analyses are conducted using manifest methods, with observed characteristics (gender and race/ethnicity) used to group examinees. Homogeneity of item responses is assumed, denoting that…
Descriptors: Test Bias, Language Minorities, Effect Size, Foreign Countries
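The manifest approach described above is often operationalized as a logistic regression screen. The sketch below, which uses random placeholder data rather than any real assessment, fits nested logistic models per item (matching on total score) and flags DIF with a 2-df likelihood-ratio test, in the spirit of Swaminathan and Rogers (1990); the names `resp` and `group` are hypothetical, and this is not claimed to be the exact method of the study above.

```python
# Minimal sketch of a manifest-group DIF screen via logistic regression,
# in the spirit of Swaminathan & Rogers (1990). All data are random
# placeholders, not real assessment responses.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(0)
resp = rng.integers(0, 2, size=(500, 10))  # 500 examinees x 10 binary items
group = rng.integers(0, 2, size=500)       # manifest groups (e.g., gender)
total = resp.sum(axis=1)                   # matching variable: total score
                                           # (no purification in this sketch)

def dif_lr_test(item):
    """2-df likelihood-ratio test for uniform + nonuniform DIF on one item."""
    y = resp[:, item]
    base = sm.add_constant(total.astype(float))
    full = sm.add_constant(np.column_stack([total, group, total * group]).astype(float))
    m0 = sm.Logit(y, base).fit(disp=0)     # match on ability only
    m1 = sm.Logit(y, full).fit(disp=0)     # add group and score-by-group terms
    g2 = 2 * (m1.llf - m0.llf)
    return g2, chi2.sf(g2, df=2)

for i in range(resp.shape[1]):
    g2, p = dif_lr_test(i)
    print(f"item {i:2d}: G2 = {g2:6.2f}, p = {p:.3f}")
```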
Peer reviewed
Grover, Raman K.; Ercikan, Kadriye – Applied Measurement in Education, 2017
In gender differential item functioning (DIF) research, it is assumed that all members of a gender group have similar item response patterns and that generalizations from the group level to subgroup and individual levels can therefore be made accurately. However, DIF items do not necessarily disadvantage every member of a gender group to the same degree,…
Descriptors: Gender Differences, Test Bias, Socioeconomic Status, Reading Achievement
Peer reviewed
Oliveri, María Elena; Ercikan, Kadriye; Zumbo, Bruno D. – Applied Measurement in Education, 2014
Heterogeneity within English language learners (ELLs) groups has been documented. Previous research on differential item functioning (DIF) analyses suggests that accurate DIF detection rates are reduced greatly when groups are heterogeneous. In this simulation study, we investigated the effects of heterogeneity within linguistic (ELL) groups on…
Descriptors: Test Bias, Accuracy, English Language Learners, Simulation
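To make the heterogeneity effect concrete, here is a hedged simulation sketch; the design and every parameter value are invented for illustration and are not the study's simulation conditions. When only a fraction of the focal group actually experiences DIF on an item, a group-level index such as the Mantel-Haenszel common odds ratio shrinks toward zero as that fraction drops.

```python
# Hedged simulation sketch of within-group heterogeneity diluting DIF
# detection. All parameter values are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n, items, studied = 2000, 20, 0
theta = rng.normal(size=n)                       # examinee abilities
b = rng.normal(scale=0.8, size=items)            # Rasch item difficulties
focal = rng.integers(0, 2, size=n).astype(bool)  # manifest focal group

def simulate(affected_share):
    """Only `affected_share` of the focal group gets a DIF shift on one item."""
    shift = np.zeros(n)
    hit = focal & (rng.random(n) < affected_share)
    shift[hit] = 0.8                             # extra difficulty for affected
    p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
    p[:, studied] = 1 / (1 + np.exp(-(theta - b[studied] - shift)))
    return (rng.random((n, items)) < p).astype(int)

def mh_delta(resp):
    """Mantel-Haenszel common odds ratio on the ETS delta scale."""
    rest = resp.sum(axis=1) - resp[:, studied]   # rest score as the stratifier
    num = den = 0.0
    for s in np.unique(rest):
        m = rest == s
        ref, foc = resp[m & ~focal, studied], resp[m & focal, studied]
        N = m.sum()
        num += ref.sum() * (len(foc) - foc.sum()) / N
        den += (len(ref) - ref.sum()) * foc.sum() / N
    return -2.35 * np.log(num / den)             # negative = against focal group

for share in (1.0, 0.5, 0.25):
    print(f"affected share {share:.2f}: MH delta = {mh_delta(simulate(share)):+.2f}")
```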
Peer reviewed
Ercikan, Kadriye; Chen, Michelle Y.; Lyons-Thomas, Juliette; Goodrich, Shawna; Sandilands, Debra; Roth, Wolff-Michael; Simon, Marielle – International Journal of Testing, 2015
The purpose of this research is to examine the comparability of mathematics and science scores for students from English language backgrounds (ELB) and non-English language backgrounds (NELB). We examine the relationship between English reading proficiency and performance on mathematics and science assessments in Australia, Canada, the United…
Descriptors: Scores, Mathematics Tests, Science Tests, Native Speakers
Peer reviewed
Ercikan, Kadriye; Roth, Wolff-Michael; Simon, Marielle; Sandilands, Debra; Lyons-Thomas, Juliette – Applied Measurement in Education, 2014
Diversity and heterogeneity among language groups have been well documented. Yet most fairness research that focuses on measurement comparability considers linguistic minority students such as English language learners (ELLs) or Francophone students living in minority contexts in Canada as a single group. Our focus in this research is to examine…
Descriptors: Test Bias, Language Minorities, French Canadians, Measurement
Peer reviewed
Oliveri, María Elena; Ercikan, Kadriye; Zumbo, Bruno D.; Lawless, René – International Journal of Testing, 2014
In this study, we contrast results from two differential item functioning (DIF) approaches (manifest and latent class) by the number of items and sources of items identified as DIF using data from an international reading assessment. The latter approach yielded three latent classes, presenting evidence of heterogeneity in examinee response…
Descriptors: Test Bias, Comparative Analysis, Reading Tests, Effect Size
Peer reviewed
Oliveri, Maria Elena; Ercikan, Kadriye; Zumbo, Bruno – International Journal of Testing, 2013
In this study, we investigated differential item functioning (DIF) and its sources using a latent class (LC) modeling approach. Potential sources of LC DIF related to instruction and teacher-related variables were investigated using a substantive approach and three statistical approaches: descriptive discriminant function analysis, multinomial logistic regression,…
Descriptors: Test Bias, Test Items, Multivariate Analysis, Discriminant Analysis
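For readers unfamiliar with the latent class approach, the sketch below fits a two-class mixture of independent Bernoulli items with EM (basic latent class analysis) on simulated placeholder data; class-specific item probabilities that diverge on some items stand in for latent class DIF. Real LC DIF models are considerably richer than this minimal illustration.

```python
# Minimal latent class sketch: EM for a two-class mixture of independent
# Bernoulli items on simulated placeholder data.
import numpy as np

rng = np.random.default_rng(2)
n, items, K = 1000, 8, 2
true_p = np.full((K, items), 0.75)
true_p[1, :3] = 0.35                     # class 2 finds the first 3 items harder
z = rng.integers(0, K, size=n)           # true (unobserved) class memberships
X = (rng.random((n, items)) < true_p[z]).astype(float)

pi = np.full(K, 1.0 / K)                 # class weights
p = rng.uniform(0.3, 0.7, size=(K, items))
for _ in range(200):
    # E-step: posterior class responsibilities given current parameters
    logr = X @ np.log(p).T + (1 - X) @ np.log(1 - p).T + np.log(pi)
    logr -= logr.max(axis=1, keepdims=True)
    r = np.exp(logr)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: update weights and class-conditional item probabilities
    pi = r.mean(axis=0)
    p = np.clip((r.T @ X) / r.sum(axis=0)[:, None], 1e-6, 1 - 1e-6)

print("class weights:", np.round(pi, 2))
print("item-correct probabilities by class (large gaps flag heterogeneity):")
print(np.round(p, 2))
```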
Peer reviewed
Oliveri, Maria E.; Ercikan, Kadriye – Applied Measurement in Education, 2011
In this study, we examine the degree of construct comparability and possible sources of incomparability of the English and French versions of the Programme for International Student Assessment (PISA) 2003 problem-solving measure administered in Canada. Several approaches were used to examine construct comparability at the test- (examination of…
Descriptors: Foreign Countries, English, French, Tests
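One test-level comparability check used in work of this kind is comparing factor loadings across the two language versions. The snippet below computes Tucker's congruence coefficient on simulated placeholder loadings; both the loadings and the choice of this particular index are assumptions for illustration, not details taken from the abstract above.

```python
# Hedged sketch of one test-level comparability check: Tucker's congruence
# coefficient between factor loadings of two language versions.
import numpy as np

rng = np.random.default_rng(4)
load_en = rng.uniform(0.4, 0.8, size=20)             # English-version loadings
load_fr = load_en + rng.normal(scale=0.05, size=20)  # French-version loadings

phi = load_en @ load_fr / np.sqrt((load_en @ load_en) * (load_fr @ load_fr))
print(f"Tucker congruence: {phi:.3f}")  # values near 1.0 suggest comparable structure
```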
Peer reviewed
Roth, Wolff-Michael; Oliveri, Maria Elena; Sandilands, Debra Dallie; Lyons-Thomas, Juliette; Ercikan, Kadriye – International Journal of Science Education, 2013
Even if national and international assessments are designed to be comparable, subsequent psychometric analyses often reveal differential item functioning (DIF). Central to achieving comparability is examining the presence of DIF and, if DIF is found, investigating its sources to ensure that differentially functioning items do not lead to bias…
Descriptors: Test Bias, Evaluation Methods, Protocol Analysis, Science Achievement
Peer reviewed
Oliveri, Maria Elena; Olson, Brent F.; Ercikan, Kadriye; Zumbo, Bruno D. – International Journal of Testing, 2012
In this study, the Canadian English and French versions of the Problem-Solving Measure of the Programme for International Student Assessment 2003 were examined to investigate their degree of measurement comparability at the item- and test-levels. Three methods of differential item functioning (DIF) were compared: parametric and nonparametric item…
Descriptors: Foreign Students, Test Bias, Speech Communication, Effect Size
Peer reviewed
Ercikan, Kadriye; Arim, Rubab; Law, Danielle; Domene, Jose; Gagnon, France; Lacroix, Serge – Educational Measurement: Issues and Practice, 2010
This paper demonstrates and discusses the use of think aloud protocols (TAPs) as an approach for examining and confirming sources of differential item functioning (DIF). The TAPs are used to investigate to what extent surface characteristics of the items that are identified by expert reviews as sources of DIF are supported by empirical evidence…
Descriptors: Test Bias, Protocol Analysis, Cognitive Processes, Expertise
Peer reviewed
Sandilands, Debra; Oliveri, Maria Elena; Zumbo, Bruno D.; Ercikan, Kadriye – International Journal of Testing, 2013
International large-scale assessments of achievement often have a large degree of differential item functioning (DIF) between countries, which can threaten score equivalence and reduce the validity of inferences based on comparisons of group performances. It is important to understand potential sources of DIF to improve the validity of future…
Descriptors: Validity, Measures (Individuals), International Studies, Foreign Countries
Peer reviewed
Wu, Amery D.; Ercikan, Kadriye – International Journal of Testing, 2006
Identifying the sources of differential item functioning (DIF) in international assessments is very challenging, because such sources are often nebulous and intertwined. Even though researchers frequently focus on test translation and content area, few actually go beyond these factors to investigate other cultural sources of DIF. This article…
Descriptors: Test Bias, Cultural Influences, Case Studies, Foreign Countries
Peer reviewed
Mendes-Barnett, Sharon; Ercikan, Kadriye – Applied Measurement in Education, 2006
This study contributes to understanding sources of gender differential item functioning (DIF) on mathematics tests. This study focused on identifying sources of DIF and differential bundle functioning for boys and girls on the British Columbia Principles of Mathematics Exam (Grade 12) using a confirmatory SIBTEST approach based on a…
Descriptors: Gender Differences, Test Bias, Mathematics Tests, Multidimensional Scaling
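SIBTEST itself involves a regression correction that is beyond a short sketch, but the quantity it builds on can be illustrated: a focal-weighted between-group difference in bundle performance, conditional on a matching score (essentially Dorans and Kulick's standardization index, named plainly here as a stand-in). Everything below is simulated placeholder data, and the studied bundle is hypothetical.

```python
# Simplified sketch of the quantity SIBTEST builds on: a focal-weighted
# between-group difference in bundle performance, conditional on a matching
# score. Full SIBTEST adds a regression correction to the matching scores,
# which is omitted here.
import numpy as np

rng = np.random.default_rng(3)
resp = rng.integers(0, 2, size=(800, 12))      # 800 examinees x 12 binary items
female = rng.integers(0, 2, size=800).astype(bool)
bundle = [2, 5, 7]                             # hypothetical studied item bundle

match = np.delete(resp, bundle, axis=1).sum(axis=1)  # matching subtest score
y = resp[:, bundle].sum(axis=1)                      # bundle score

num = wsum = 0.0
for s in np.unique(match):
    m = match == s
    foc, ref = m & female, m & ~female
    if foc.any() and ref.any():
        w = foc.sum()                          # weight strata by focal counts
        num += w * (y[foc].mean() - y[ref].mean())
        wsum += w
print(f"standardized bundle difference: {num / wsum:+.3f}")
```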