Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 2
Since 2006 (last 20 years): 8
Descriptor
Comparative Analysis: 8
Test Bias: 8
Foreign Countries: 7
English: 5
Effect Size: 3
French: 3
Scores: 3
Test Items: 3
Achievement Tests: 2
Evaluation Methods: 2
Grade 4: 2
Source
International Journal of…: 4
Applied Measurement in…: 2
Educational Assessment: 1
International Journal of…: 1
Author
Ercikan, Kadriye: 8
Lyons-Thomas, Juliette: 3
Oliveri, Maria Elena: 3
Zumbo, Bruno D.: 3
Roth, Wolff-Michael: 2
Sandilands, Debra: 2
Chen, Michelle Y.: 1
Goodrich, Shawna: 1
Guo, Hongwen: 1
He, Qiwei: 1
Holtzman, Steven: 1
Publication Type
Journal Articles: 8
Reports - Research: 8
Education Level
Elementary Education: 2
Grade 4: 2
Intermediate Grades: 1
Secondary Education: 1
Location
Canada: 5
United States: 3
Australia: 1
China: 1
Colombia: 1
Hong Kong: 1
Kuwait: 1
North America: 1
Qatar: 1
South Korea: 1
Taiwan: 1
Assessments and Surveys
Program for International…: 5
Progress in International…: 2
Ercikan, Kadriye; Guo, Hongwen; He, Qiwei – Educational Assessment, 2020
Comparing groups is one of the key uses of large-scale assessment results, which are used to gain insights that inform policy and practice and to examine the comparability of scores and score meaning. Such comparisons typically focus on examinees' final answers and responses to test questions, ignoring response process differences groups may engage…
Descriptors: Data Use, Responses, Comparative Analysis, Test Bias
Oliveri, Maria Elena; Ercikan, Kadriye; Lyons-Thomas, Juliette; Holtzman, Steven – Applied Measurement in Education, 2016
Differential item functioning (DIF) analyses have been used as the primary method in large-scale assessments to examine fairness for subgroups. Currently, DIF analyses are conducted using manifest methods, which group examinees by observed characteristics (gender and race/ethnicity). Homogeneity of item responses is assumed denoting that…
Descriptors: Test Bias, Language Minorities, Effect Size, Foreign Countries
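The manifest DIF analyses these abstracts describe compare score-matched examinee groups item by item. A minimal sketch of one standard manifest procedure, the Mantel-Haenszel method, is below; the abstracts do not name the specific method used, and the function name, data layout, and flagging threshold here are illustrative only:

```python
import math
from collections import defaultdict

def mantel_haenszel_dif(item_correct, group, total_score):
    """Mantel-Haenszel DIF statistic for a single item.

    item_correct: 0/1 responses to the studied item
    group: 'ref' or 'focal' label per examinee
    total_score: matching criterion per examinee (e.g. total test score)
    Returns (common odds ratio, ETS delta-MH).
    """
    # One 2x2 table per score level: [A, B, C, D] =
    # [ref correct, ref incorrect, focal correct, focal incorrect]
    strata = defaultdict(lambda: [0, 0, 0, 0])
    for y, g, s in zip(item_correct, group, total_score):
        cell = strata[s]
        if g == 'ref':
            cell[0 if y else 1] += 1
        else:
            cell[2 if y else 3] += 1

    # Mantel-Haenszel common odds ratio across strata
    num = den = 0.0
    for A, B, C, D in strata.values():
        T = A + B + C + D
        if T == 0:
            continue
        num += A * D / T
        den += B * C / T
    alpha = num / den
    # ETS delta metric; |delta| >= 1.5 is a commonly used flag for large DIF
    delta = -2.35 * math.log(alpha)
    return alpha, delta
```

An odds ratio of 1 (delta of 0) indicates no DIF: matched reference and focal examinees have equal odds of answering the item correctly. Note this is exactly the "manifest" framing the 2016 abstract critiques, since grouping relies solely on an observed label.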
Ercikan, Kadriye; Chen, Michelle Y.; Lyons-Thomas, Juliette; Goodrich, Shawna; Sandilands, Debra; Roth, Wolff-Michael; Simon, Marielle – International Journal of Testing, 2015
The purpose of this research is to examine the comparability of mathematics and science scores for students from English language backgrounds (ELB) and non-English language backgrounds (NELB). We examine the relationship between English reading proficiency and performance on mathematics and science assessments in Australia, Canada, the United…
Descriptors: Scores, Mathematics Tests, Science Tests, Native Speakers
Oliveri, María Elena; Ercikan, Kadriye; Zumbo, Bruno D.; Lawless, René – International Journal of Testing, 2014
In this study, we contrast results from two differential item functioning (DIF) approaches (manifest and latent class) by the number of items and sources of items identified as DIF using data from an international reading assessment. The latter approach yielded three latent classes, presenting evidence of heterogeneity in examinee response…
Descriptors: Test Bias, Comparative Analysis, Reading Tests, Effect Size
Oliveri, Maria E.; Ercikan, Kadriye – Applied Measurement in Education, 2011
In this study, we examine the degree of construct comparability and possible sources of incomparability of the English and French versions of the Programme for International Student Assessment (PISA) 2003 problem-solving measure administered in Canada. Several approaches were used to examine construct comparability at the test- (examination of…
Descriptors: Foreign Countries, English, French, Tests
Roth, Wolff-Michael; Oliveri, Maria Elena; Sandilands, Debra Dallie; Lyons-Thomas, Juliette; Ercikan, Kadriye – International Journal of Science Education, 2013
Even if national and international assessments are designed to be comparable, subsequent psychometric analyses often reveal differential item functioning (DIF). Central to achieving comparability is examining the presence of DIF and, if DIF is found, investigating its sources to ensure that differentially functioning items do not lead to bias.…
Descriptors: Test Bias, Evaluation Methods, Protocol Analysis, Science Achievement
Oliveri, Maria Elena; Olson, Brent F.; Ercikan, Kadriye; Zumbo, Bruno D. – International Journal of Testing, 2012
In this study, the Canadian English and French versions of the Problem-Solving Measure of the Programme for International Student Assessment 2003 were examined to investigate their degree of measurement comparability at the item- and test-levels. Three methods of differential item functioning (DIF) were compared: parametric and nonparametric item…
Descriptors: Foreign Students, Test Bias, Speech Communication, Effect Size
Sandilands, Debra; Oliveri, Maria Elena; Zumbo, Bruno D.; Ercikan, Kadriye – International Journal of Testing, 2013
International large-scale assessments of achievement often have a large degree of differential item functioning (DIF) between countries, which can threaten score equivalence and reduce the validity of inferences based on comparisons of group performances. It is important to understand potential sources of DIF to improve the validity of future…
Descriptors: Validity, Measures (Individuals), International Studies, Foreign Countries