Showing all 8 results
Peer reviewed
Rios, Joseph A.; Soland, James – International Journal of Testing, 2022
The objective of the present study was to investigate item-, examinee-, and country-level correlates of rapid guessing (RG) in the context of the 2018 PISA science assessment. Analyzing data from 267,148 examinees across 71 countries showed that over 50% of examinees engaged in RG on an average proportion of one in 10 items. Descriptive…
Descriptors: Foreign Countries, International Assessment, Achievement Tests, Secondary School Students
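The rapid-guessing prevalence figures in the abstract above come from summarizing flagged responses. Purely as a hypothetical sketch (not the authors' pipeline), the summary step might look like the following in Python, assuming a boolean examinee-by-item matrix of RG flags; the flagging step itself is sketched further below.

```python
import numpy as np

# Hypothetical example: rg_flags[i, j] is True when examinee i's response to
# item j was flagged as a rapid guess (see the NT10 sketch later in this list).
rng = np.random.default_rng(0)
rg_flags = rng.random((1000, 30)) < 0.08      # toy data, roughly 8% RG responses

# Proportion of items rapidly guessed, per examinee.
rg_rate = rg_flags.mean(axis=1)

# Share of examinees who rapidly guessed at least once, and their mean RG rate:
# the kind of summary behind statements like "over 50% engaged in RG on about
# one in ten items".
engaged = rg_rate > 0
print(f"examinees with any RG: {engaged.mean():.1%}")
print(f"mean RG rate among them: {rg_rate[engaged].mean():.2f}")
```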
Peer reviewed
Rujun Xu; James Soland – International Journal of Testing, 2024
International surveys are increasingly being used to understand nonacademic outcomes like math and science motivation, and to inform education policy changes within countries. Such instruments assume that the measure works consistently across countries, ethnicities, and languages--that is, they assume measurement invariance. While studies have…
Descriptors: Surveys, Statistical Bias, Achievement Tests, Foreign Countries
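Measurement invariance is normally tested with multiple-group latent variable models; the excerpt does not state the authors' exact procedure. Purely as a crude illustration of the idea, the sketch below compares one-factor loadings of the same scale across two groups using only NumPy; the data, the 0.2 cutoff, and the groups are all hypothetical.

```python
import numpy as np

def first_factor_loadings(x):
    """Loadings on the first principal component of the standardized items
    (a rough one-factor summary, not a full CFA)."""
    z = (x - x.mean(axis=0)) / x.std(axis=0)
    corr = np.corrcoef(z, rowvar=False)
    vals, vecs = np.linalg.eigh(corr)
    v = vecs[:, -1] * np.sqrt(vals[-1])       # loadings on the largest component
    return v if v.sum() >= 0 else -v          # fix sign for comparability

# Toy data: two "country" groups answering the same 6-item motivation scale.
rng = np.random.default_rng(1)
theta_a, theta_b = rng.normal(size=(500, 1)), rng.normal(size=(500, 1))
load = np.array([0.8, 0.7, 0.6, 0.7, 0.5, 0.6])
group_a = theta_a * load + rng.normal(scale=0.5, size=(500, 6))
group_b = theta_b * load + rng.normal(scale=0.5, size=(500, 6))

# Items whose loadings diverge notably across groups (arbitrary 0.2 cutoff)
# would be candidates for non-invariance in a more formal analysis.
diff = np.abs(first_factor_loadings(group_a) - first_factor_loadings(group_b))
print("flagged items:", np.where(diff > 0.2)[0])
```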
Peer reviewed
Michaelides, Michalis P.; Ivanova, Militsa; Nicolaou, Christiana – International Journal of Testing, 2020
The study examined the relationship between examinees' test-taking effort and their accuracy rate on items from the PISA 2015 assessment. The 10% normative threshold method was applied on Science multiple-choice items in the Cyprus sample to detect rapid guessing behavior. Results showed that the extent of rapid guessing across simple and complex…
Descriptors: Accuracy, Multiple Choice Tests, International Assessment, Achievement Tests
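The 10% normative threshold method referenced above flags a response as a rapid guess when its response time falls below 10% of the item's average (normative) response time. A minimal sketch with toy data follows; the 10-second cap is an assumption, added only because such caps are a common convention.

```python
import numpy as np

def nt10_flags(rt, cap=10.0):
    """Flag rapid guesses with a 10% normative threshold.

    rt: response-time matrix (examinees x items), in seconds.
    For each item the threshold is 10% of its mean response time,
    optionally capped (the cap value here is an assumption).
    """
    thresholds = np.minimum(0.10 * rt.mean(axis=0), cap)
    return rt < thresholds        # True where the response looks like a rapid guess

# Toy usage: mostly ordinary response times, with ~5% very fast responses mixed in.
rng = np.random.default_rng(2)
rt = rng.lognormal(mean=3.5, sigma=0.6, size=(200, 20))
rapid = rng.random((200, 20)) < 0.05
rt[rapid] = rng.uniform(0.5, 3.0, rapid.sum())

flags = nt10_flags(rt)
print("overall rapid-guessing rate:", flags.mean().round(3))
```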
Peer reviewed
Rutkowski, Leslie; Rutkowski, David; Zhou, Yan – International Journal of Testing, 2016
Using an empirically-based simulation study, we show that typically used methods of choosing an item calibration sample have significant impacts on achievement bias and system rankings. We examine whether recent PISA accommodations, especially for lower performing participants, can mitigate some of this bias. Our findings indicate that standard…
Descriptors: Simulation, International Programs, Adolescents, Student Evaluation
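The excerpt does not detail the simulation design, but the underlying mechanism can be illustrated with a toy Rasch simulation: calibrating item difficulties on an unrepresentative (here, high-performing) sample shifts the difficulty scale, and that shift feeds into achievement estimates. Everything below is a hypothetical sketch, not the authors' study.

```python
import numpy as np

rng = np.random.default_rng(3)
n, j = 5000, 40
theta = rng.normal(size=n)                      # true abilities
b = rng.normal(size=j)                          # true item difficulties
p = 1 / (1 + np.exp(-(theta[:, None] - b)))     # Rasch response probabilities
x = (rng.random((n, j)) < p).astype(int)        # scored responses

def calibrate(resp):
    """Crude difficulty estimates from classical p-values: b_hat = -logit(p).
    (Operational calibration uses proper IRT estimation; this is only a sketch.)"""
    pj = resp.mean(axis=0).clip(0.01, 0.99)
    return -np.log(pj / (1 - pj))

b_full = calibrate(x)                            # calibration on the full sample
high = theta > np.median(theta)                  # unrepresentative: high performers only
b_high = calibrate(x[high])

# Calibrating on high performers makes every item look easier by roughly the
# calibration sample's ability advantage; that offset carries into ability estimates.
print("mean shift in estimated difficulty:", (b_high - b_full).mean().round(2))
```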
Peer reviewed
Solano-Flores, Guillermo; Wang, Chao; Shade, Chelsey – International Journal of Testing, 2016
We examined multimodality (the representation of information in multiple semiotic modes) in the context of international test comparisons. Using Programme for International Student Assessment (PISA)-2009 data, we examined the correlation of the difficulty of science items and the complexity of their illustrations. We observed statistically…
Descriptors: Semiotics, Difficulty Level, Test Items, Science Tests
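The illustration-complexity coding scheme is defined in the study itself; the correlation step it feeds into is straightforward. A minimal sketch with placeholder item-level variables:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical item-level data: an empirical difficulty estimate and a coded
# complexity score for each item's illustration (the coding scheme comes from
# the study, not from this sketch).
rng = np.random.default_rng(4)
difficulty = rng.normal(size=50)
illustration_complexity = (0.3 * difficulty + rng.normal(size=50)).round(1)

rho, pval = spearmanr(difficulty, illustration_complexity)
print(f"Spearman rho = {rho:.2f}, p = {pval:.3f}")
```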
Peer reviewed
Rindermann, Heiner; Baumeister, Antonia E. E. – International Journal of Testing, 2015
Scholastic tests regard cognitive abilities as domain-specific competences. However, high correlations between competences indicate either high task similarity or a dependence on common factors. The present rating study examined the validity of 12 Programme for International Student Assessment (PISA) and Third or Trends in International…
Descriptors: Test Validity, Test Interpretation, Competence, Reading Tests
Peer reviewed
Le, Luc T. – International Journal of Testing, 2009
This study uses PISA cycle 3 field trial data to investigate the relationships between gender differential item functioning (DIF) across countries and test languages for science items and their formats and the four other dimensions defined in the PISA framework: focus, context, competency, and scientific knowledge. The data used were collected from 60…
Descriptors: Test Bias, Gender Bias, Science Tests, Test Items
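The excerpt does not name the DIF procedure used in this study; as a generic illustration of item-level DIF screening between two groups, a Mantel-Haenszel sketch (a common choice, not necessarily the authors') is shown below with toy data.

```python
import numpy as np

def mh_delta(correct, group, total_score):
    """Mantel-Haenszel common odds ratio and ETS delta for one item.

    correct:     0/1 responses to the studied item
    group:       0 = reference (e.g., boys), 1 = focal (e.g., girls)
    total_score: matching variable, usually the total test score
    Stratify by total score, pool the 2x2 tables, convert to the ETS delta scale.
    """
    num = den = 0.0
    for s in np.unique(total_score):
        m = total_score == s
        a = np.sum(m & (group == 0) & (correct == 1))    # reference right
        b = np.sum(m & (group == 0) & (correct == 0))    # reference wrong
        c = np.sum(m & (group == 1) & (correct == 1))    # focal right
        d = np.sum(m & (group == 1) & (correct == 0))    # focal wrong
        t = a + b + c + d
        if t == 0:
            continue
        num += a * d / t
        den += b * c / t
    alpha = num / den if den > 0 else np.nan
    return alpha, -2.35 * np.log(alpha)                  # ETS MH D-DIF metric

# Toy usage
rng = np.random.default_rng(5)
group = rng.integers(0, 2, 2000)
total_score = rng.integers(0, 21, 2000)
correct = (rng.random(2000) < 0.4 + 0.02 * total_score - 0.05 * group).astype(int)
alpha, delta = mh_delta(correct, group, total_score)
print(f"MH odds ratio = {alpha:.2f}, MH D-DIF = {delta:.2f}")
```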
Peer reviewed
Wyse, Adam E.; Mapuranga, Raymond – International Journal of Testing, 2009
Differential item functioning (DIF) analysis is a statistical technique used for ensuring the equity and fairness of educational assessments. This study formulates a new DIF analysis method using the information similarity index (ISI). ISI compares item information functions when data fits the Rasch model. Through simulations and an international…
Descriptors: Test Bias, Evaluation Methods, Test Items, Educational Assessment
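The information similarity index itself is defined in the article; as background, the quantity it builds on is the standard Rasch item information function, I(θ) = P(θ)(1 - P(θ)). The sketch below computes it and a naive overlap-style comparison between two calibrations of the same item; the overlap measure here is only illustrative, not the authors' ISI.

```python
import numpy as np

def rasch_info(theta, b):
    """Item information under the Rasch model: I(theta) = P(theta) * (1 - P(theta))."""
    p = 1 / (1 + np.exp(-(theta - b)))
    return p * (1 - p)

# Compare the information curves implied by two difficulty estimates of the same
# item (e.g., from reference and focal groups). The overlap ratio below is a
# simple stand-in for a comparison of information functions, not the paper's ISI.
theta = np.linspace(-4, 4, 401)
info_ref = rasch_info(theta, b=0.0)
info_focal = rasch_info(theta, b=0.6)
overlap = np.minimum(info_ref, info_focal).sum() / info_ref.sum()
print(f"shared information relative to the reference curve: {overlap:.2f}")
```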