Publication Date
In 2025 | 0 |
Since 2024 | 1 |
Since 2021 (last 5 years) | 2 |
Since 2016 (last 10 years) | 5 |
Since 2006 (last 20 years) | 8 |
Descriptor
Test Items | 8 |
Foreign Countries | 7 |
Achievement Tests | 5 |
Science Tests | 5 |
Secondary School Students | 5 |
International Assessment | 4 |
Correlation | 3 |
Item Analysis | 3 |
Test Bias | 3 |
Comparative Analysis | 2 |
Computation | 2 |
Source
International Journal of Testing | 8 |
Author
Baumeister, Antonia E. E. | 1 |
Ivanova, Militsa | 1 |
James Soland | 1 |
Le, Luc T. | 1 |
Mapuranga, Raymond | 1 |
Michaelides, Michalis P. | 1 |
Nicolaou, Christiana | 1 |
Rindermann, Heiner | 1 |
Rios, Joseph A. | 1 |
Rujun Xu | 1 |
Rutkowski, David | 1 |
Publication Type
Journal Articles | 8 |
Reports - Research | 6 |
Reports - Evaluative | 2 |
Education Level
Secondary Education | 6 |
Higher Education | 1 |
Postsecondary Education | 1 |
Assessments and Surveys
Program for International Student Assessment | 8 |
Trends in International Mathematics and Science Study | 1 |
Rios, Joseph A.; Soland, James – International Journal of Testing, 2022
The objective of the present study was to investigate item-, examinee-, and country-level correlates of rapid guessing (RG) in the context of the 2018 PISA science assessment. Analyzing data from 267,148 examinees across 71 countries showed that over 50% of examinees engaged in RG on an average proportion of one in 10 items. Descriptive…
Descriptors: Foreign Countries, International Assessment, Achievement Tests, Secondary School Students
Xu, Rujun; Soland, James – International Journal of Testing, 2024
International surveys are increasingly being used to understand nonacademic outcomes like math and science motivation, and to inform education policy changes within countries. Such instruments assume that the measure works consistently across countries, ethnicities, and languages--that is, they assume measurement invariance. While studies have…
Descriptors: Surveys, Statistical Bias, Achievement Tests, Foreign Countries
Michaelides, Michalis P.; Ivanova, Militsa; Nicolaou, Christiana – International Journal of Testing, 2020
The study examined the relationship between examinees' test-taking effort and their accuracy rate on items from the PISA 2015 assessment. The 10% normative threshold method was applied to Science multiple-choice items in the Cyprus sample to detect rapid guessing behavior. Results showed that the extent of rapid guessing across simple and complex…
Descriptors: Accuracy, Multiple Choice Tests, International Assessment, Achievement Tests
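A minimal sketch of the normative-threshold idea named in the Michaelides, Ivanova, and Nicolaou abstract above, assuming the 10% cut is taken on each item's mean response time (the article may use a different normative statistic); the response-time matrix here is simulated, not PISA data.

```python
import numpy as np

# Simulated response-time matrix (seconds): rows = examinees, columns = items.
# A real application would use logged item-level response times.
rng = np.random.default_rng(0)
rt = rng.lognormal(mean=3.0, sigma=0.7, size=(500, 20))

# Normative threshold (NT10): flag a response as a rapid guess when its
# response time falls below 10% of that item's mean response time.
thresholds = 0.10 * rt.mean(axis=0)        # one threshold per item
rapid_guess = rt < thresholds              # boolean examinee-by-item flags

# Summaries analogous to those reported in rapid-guessing studies.
rg_rate_per_examinee = rapid_guess.mean(axis=1)   # share of items guessed rapidly
rg_rate_per_item = rapid_guess.mean(axis=0)       # share of examinees guessing on each item
print(f"overall rapid-guess rate: {rapid_guess.mean():.2%}")
print(f"examinees with any rapid guess: {(rg_rate_per_examinee > 0).mean():.2%}")
```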
Rutkowski, Leslie; Rutkowski, David; Zhou, Yan – International Journal of Testing, 2016
Using an empirically based simulation study, we show that typically used methods of choosing an item calibration sample have significant impacts on achievement bias and system rankings. We examine whether recent PISA accommodations, especially for lower performing participants, can mitigate some of this bias. Our findings indicate that standard…
Descriptors: Simulation, International Programs, Adolescents, Student Evaluation
Solano-Flores, Guillermo; Wang, Chao; Shade, Chelsey – International Journal of Testing, 2016
We examined multimodality (the representation of information in multiple semiotic modes) in the context of international test comparisons. Using Programme for International Student Assessment (PISA) 2009 data, we examined the correlation between the difficulty of science items and the complexity of their illustrations. We observed statistically…
Descriptors: Semiotics, Difficulty Level, Test Items, Science Tests
Rindermann, Heiner; Baumeister, Antonia E. E. – International Journal of Testing, 2015
Scholastic tests regard cognitive abilities as domain-specific competences. However, high correlations between competences indicate either high task similarity or a dependence on common factors. The present rating study examined the validity of 12 Programme for International Student Assessment (PISA) and Third or Trends in International…
Descriptors: Test Validity, Test Interpretation, Competence, Reading Tests
Le, Luc T. – International Journal of Testing, 2009
This study uses PISA cycle 3 field trial data to investigate, for science items, the relationships of gender differential item functioning (DIF) across countries and test languages with item format and the four other dimensions defined in the PISA framework: focus, context, competency, and scientific knowledge. The data used were collected from 60…
Descriptors: Test Bias, Gender Bias, Science Tests, Test Items
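The Le (2009) abstract reports gender DIF analyses but does not name the detection method. The sketch below shows a Mantel-Haenszel DIF statistic for a single dichotomous item, a standard approach offered purely as an illustration of what such an analysis computes; the function, variable names, and simulated data are hypothetical, not the study's procedure.

```python
import numpy as np

def mantel_haenszel_dif(item, total, group):
    """Mantel-Haenszel common odds ratio for one dichotomous item.

    item  : 0/1 responses to the studied item
    total : matching variable (e.g., total test score)
    group : 0 = reference group, 1 = focal group
    Returns the common odds ratio and the ETS delta-scale index (MH D-DIF).
    """
    num = den = 0.0
    for k in np.unique(total):                  # stratify by matching score
        stratum = total == k
        ref = stratum & (group == 0)
        foc = stratum & (group == 1)
        a = np.sum(item[ref] == 1)              # reference correct
        b = np.sum(item[ref] == 0)              # reference incorrect
        c = np.sum(item[foc] == 1)              # focal correct
        d = np.sum(item[foc] == 0)              # focal incorrect
        t = a + b + c + d
        if t == 0:
            continue
        num += a * d / t
        den += b * c / t
    alpha = num / den
    return alpha, -2.35 * np.log(alpha)

# Hypothetical example: 1,000 simulated examinees, one item, no true DIF.
rng = np.random.default_rng(1)
group = rng.integers(0, 2, size=1000)
total = rng.integers(0, 31, size=1000)          # total score, 0-30
item = (rng.random(1000) < 0.3 + 0.02 * total).astype(int)
print(mantel_haenszel_dif(item, total, group))
```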
Wyse, Adam E.; Mapuranga, Raymond – International Journal of Testing, 2009
Differential item functioning (DIF) analysis is a statistical technique used for ensuring the equity and fairness of educational assessments. This study formulates a new DIF analysis method using the information similarity index (ISI). ISI compares item information functions when data fits the Rasch model. Through simulations and an international…
Descriptors: Test Bias, Evaluation Methods, Test Items, Educational Assessment
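The Wyse and Mapuranga abstract does not give the ISI formula, so the sketch below only illustrates the ingredient it names: item information functions under the Rasch model, computed for two hypothetical group-specific difficulty estimates, with a generic overlap ratio standing in for whatever similarity index the authors define.

```python
import numpy as np

def rasch_item_information(theta, b):
    """Rasch item information: I(theta) = P(theta) * (1 - P(theta)),
    where P(theta) = 1 / (1 + exp(-(theta - b)))."""
    p = 1.0 / (1.0 + np.exp(-(theta - b)))
    return p * (1.0 - p)

# Hypothetical difficulty estimates for one item calibrated in two groups.
theta = np.linspace(-4.0, 4.0, 161)
info_ref = rasch_item_information(theta, b=0.0)   # reference-group calibration
info_foc = rasch_item_information(theta, b=0.5)   # focal-group calibration

# Generic similarity summary over the theta grid: shared area under the two
# information curves divided by their combined area (1.0 means identical
# curves). This is a stand-in, not the published ISI definition.
similarity = np.minimum(info_ref, info_foc).sum() / np.maximum(info_ref, info_foc).sum()
print(f"information-curve similarity: {similarity:.3f}")
```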