Hasibe Yahsi Sari; Hulya Kelecioglu – International Journal of Assessment Tools in Education, 2025
The aim of the study is to examine the effect of the ratio of polytomous items on ability estimation under different conditions in multistage tests (MST) assembled from mixed-format tests. The study is simulation-based. In the PISA 2018 application, the ability parameters of the individuals and the item pool were created using the item parameters estimated from…
Descriptors: Test Items, Test Format, Accuracy, Test Length
B. Goecke; S. Weiss; B. Barbot – Journal of Creative Behavior, 2025
The present paper questions the content validity of the eight creativity-related self-report scales available in PISA 2022's context questionnaire and provides a set of considerations for researchers interested in using these indexes. Specifically, we point out some threats to the content validity of these scales (e.g., "creative thinking…
Descriptors: Creativity, Creativity Tests, Questionnaires, Content Validity
Zeynep Uzun; Tuncay Ögretmen – Large-scale Assessments in Education, 2025
This study aimed to evaluate item-model fit by equating the forms of the PISA 2018 mathematics subtest through concurrent common-item equating in samples from Türkiye, the UK, and Italy. The responses given to mathematics subtest Forms 2, 8, and 12 were used in this context. Analyses were performed using the dichotomous Rasch model in the WINSTEPS…
Descriptors: Item Response Theory, Test Items, Foreign Countries, Mathematics Tests
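For context, the dichotomous Rasch model named above is a standard one-parameter IRT model; this is the textbook formulation, not a detail reported in the abstract. The probability that person p answers item i correctly is

$$P(X_{pi} = 1 \mid \theta_p, b_i) = \frac{\exp(\theta_p - b_i)}{1 + \exp(\theta_p - b_i)},$$

where \theta_p is the person's ability and b_i is the item's difficulty. Under concurrent common-item equating, responses from all forms are calibrated in a single run so that the shared items place every form on the same scale.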
Selcuk Acar; Yuyang Shen – Journal of Creative Behavior, 2025
Creativity tests, like creativity itself, vary widely in their structure and use. These differences include instructions, test duration, environments, prompt and response modalities, and the structure of test items. A key factor is task structure, referring to the specificity of the number of responses requested for a given prompt. Classic…
Descriptors: Creativity, Creative Thinking, Creativity Tests, Task Analysis
Paula Álvarez-Huerta; Alexander Muela; Inaki Larrea – Journal of Creative Behavior, 2025
This paper considers the inclusion of creative thinking as an innovative domain within the OECD's PISA (Program for International Student Assessment) and offers a series of reflections on the opportunities and limitations that follow from this. Although this recognition of the importance of creative thinking represents a step forward, the current…
Descriptors: Creative Thinking, Creativity Tests, Achievement Tests, Foreign Countries
Lisa De Bortoli – Australian Council for Educational Research, 2025
This issue of Snapshots explores how student effort may vary depending on the perceived stakes of the assessment. We revisit the self-reported levels of effort that students invested in the PISA test and compare them with the effort they indicated they would have made if the results counted towards their school marks. Australia's results are…
Descriptors: Foreign Countries, Secondary School Students, Achievement Tests, International Assessment
Ersoy Öz; Okan Bulut; Zuhal Fatma Cellat; Hülya Yürekli – Education and Information Technologies, 2025
Predicting student performance in international large-scale assessments (ILSAs) is crucial for understanding educational outcomes on a global scale. ILSAs, such as the Program for International Student Assessment and the Trends in International Mathematics and Science Study, serve as vital tools for policymakers, educators, and researchers to…
Descriptors: Foreign Countries, Achievement Tests, Secondary School Students, International Assessment
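The abstract does not name the prediction method used. Purely as an illustrative sketch, a baseline supervised-learning setup for predicting an achievement score from questionnaire features might look as follows; the feature names, synthetic data, and model choice are assumptions for illustration, not the authors' approach.

# Illustrative baseline only -- synthetic stand-in data, not PISA data.
# A real analysis would use the PISA student files, sampling weights,
# and all ten plausible values per domain.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
n = 2000
ses_index = rng.normal(0, 1, n)            # assumed socio-economic feature
books_at_home = rng.integers(0, 6, n)      # assumed questionnaire feature
score = 470 + 35 * ses_index + 8 * books_at_home + rng.normal(0, 60, n)

X = np.column_stack([ses_index, books_at_home])
X_train, X_test, y_train, y_test = train_test_split(X, score, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"Held-out R^2: {r2_score(y_test, model.predict(X_test)):.2f}")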
Esra Sözer Boz – Education and Information Technologies, 2025
International large-scale assessments provide cross-national data on students' cognitive and non-cognitive characteristics. A critical methodological issue in comparing such cross-national data is establishing measurement invariance, that is, evidence that the construct under investigation is measured equivalently across the compared groups.…
Descriptors: Achievement Tests, International Assessment, Foreign Countries, Secondary School Students
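As general background on the invariance issue raised here (a textbook summary, not content taken from the paper), multi-group factor models are usually compared in nested steps. For item i in group g:

configural: $x_{ig} = \tau_{ig} + \lambda_{ig}\,\xi_g + \varepsilon_{ig}$
metric: additionally $\lambda_{ig} = \lambda_i$ for all g
scalar: additionally $\tau_{ig} = \tau_i$ for all g

where \tau are item intercepts, \lambda are loadings, and \xi_g is the latent construct in group g. Cross-country comparisons of latent means are generally defensible only when at least partial scalar invariance holds.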
Okan Bulut; Guher Gorgun; Hacer Karamese – Journal of Educational Measurement, 2025
The use of multistage adaptive testing (MST) has gradually increased in large-scale testing programs as MST achieves a balanced compromise between linear test design and item-level adaptive testing. MST works on the premise that each examinee gives their best effort when attempting the items, and their responses truly reflect what they know or can…
Descriptors: Response Style (Tests), Testing Problems, Testing Accommodations, Measurement
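The premise that examinees give their best effort is commonly checked by screening item response times for rapid-guessing behavior. A minimal sketch of one such screen follows; the threshold rule, column names, and data layout are illustrative assumptions, not the authors' procedure.

# Flag a response as a rapid guess when its response time falls below a
# per-item threshold (here 10% of the item's median time, one common rule).
import pandas as pd

def flag_rapid_guesses(log: pd.DataFrame, frac: float = 0.10) -> pd.DataFrame:
    """log columns: student_id, item_id, response_time_sec, correct."""
    thresholds = log.groupby("item_id")["response_time_sec"].median() * frac
    out = log.copy()
    out["rapid_guess"] = out["response_time_sec"] < out["item_id"].map(thresholds)
    return out

# Toy usage
log = pd.DataFrame({
    "student_id": [1, 1, 2, 2],
    "item_id": ["A", "B", "A", "B"],
    "response_time_sec": [2.0, 45.0, 30.0, 1.5],
    "correct": [0, 1, 1, 0],
})
print(flag_rapid_guesses(log))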
Marjo Sirén; Sari Sulkunen – Scandinavian Journal of Educational Research, 2025
This study examined which aspects of critical literacy are focused on in the reading literacy assessment for the Programme for International Student Assessment (PISA) 2018 and what kinds of texts are related to the critical literacy items in the test. Based on theory-oriented qualitative content analysis, critical literacy items in PISA…
Descriptors: International Assessment, Achievement Tests, Foreign Countries, Secondary School Students
Jung Yeon Park; Sean Joo; Zikun Li; Hyejin Yoon – Educational Measurement: Issues and Practice, 2025
This study examines potential assessment bias based on students' primary language status in PISA 2018. Specifically, multilingual (MLs) and nonmultilingual (non-MLs) students in the United States are compared with regard to their response time as well as scored responses across three cognitive domains (reading, mathematics, and science).…
Descriptors: Achievement Tests, Secondary School Students, International Assessment, Test Bias
Baptiste Barbot; James C. Kaufman – Journal of Creative Behavior, 2025
The OECD's PISA program assesses 15-year-old students globally in key competencies every 3 years, providing influential data on education quality and spurring policy debates. In the latest cycle, the innovation domain focused on creative thinking, assessing over 140,000 students across 60+ countries, in the largest study of adolescent creativity…
Descriptors: Achievement Tests, Foreign Countries, Secondary School Students, International Assessment
Mark Bray; Abdel Rahamane Baba-Moussa – Compare: A Journal of Comparative and International Education, 2025
This paper examines and builds on an earlier contribution to this journal focusing on private supplementary tutoring -- widely known as shadow education -- in Francophone West and Central Africa. Drawing on wider literature about research methods in this domain, it examines the basis for the numerical estimates presented in the original article…
Descriptors: Foreign Countries, Tutoring, Supplementary Education, Private Education
Umut Atasever; Francis L. Huang; Leslie Rutkowski – Large-scale Assessments in Education, 2025
When analyzing large-scale assessments (LSAs) that use complex sampling designs, it is important to account for probability sampling using weights. However, the use of these weights in multilevel models has been widely debated, particularly regarding their application at different levels of the model. Yet, no consensus has been reached on the best…
Descriptors: Mathematics Tests, International Assessment, Elementary Secondary Education, Foreign Countries
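For reference, the question the authors take up concerns where survey weights enter a two-level model. In standard notation (a generic formulation, not the paper's own), a random-intercept model for student i in school j is

$$y_{ij} = \gamma_{00} + \gamma_{10} x_{ij} + u_{0j} + e_{ij}, \qquad u_{0j} \sim N(0, \tau^2), \quad e_{ij} \sim N(0, \sigma^2),$$

and weighted (pseudo-likelihood) estimation requires a within-school student weight $w_{i|j}$ at level 1 and a school weight $w_j$ at level 2. The debate the abstract refers to concerns how the overall student weight should be decomposed and scaled across these two levels.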
Nils Myszkowski; Martin Storme – Journal of Creative Behavior, 2025
In the PISA 2022 creative thinking test, students provide a response to a prompt, which is then coded by human raters as no credit, partial credit, or full credit. Like many large-scale educational testing frameworks, PISA uses the generalized partial credit model (GPCM) as a response model for these ordinal ratings. In this paper, we show that…
Descriptors: Creative Thinking, Creativity Tests, Scores, Prompting
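For context, the generalized partial credit model named above gives the probability of a rating in category k (out of ordered categories 0, ..., m_i) as, in its standard form,

$$P(X_i = k \mid \theta) = \frac{\exp\left(\sum_{v=1}^{k} a_i(\theta - b_{iv})\right)}{\sum_{c=0}^{m_i} \exp\left(\sum_{v=1}^{c} a_i(\theta - b_{iv})\right)},$$

with the convention that the empty sum for k = 0 equals zero; a_i is the item's discrimination and the b_{iv} are step parameters. For the PISA 2022 creative thinking ratings, the categories correspond to no credit (0), partial credit (1), and full credit (2).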