Showing all 7 results
Peer reviewed
PDF on ERIC
Soysal, Sümeyra – Participatory Educational Research, 2023
Applying a measurement instrument developed in one country to other countries raises a critical question, especially in cross-cultural studies. Confirmatory factor analysis (CFA) is the most widely used method for examining the cross-cultural applicability of measurement instruments. Although CFA is a sophisticated…
Descriptors: Generalization, Cross Cultural Studies, Measurement Techniques, Factor Analysis
Peer reviewed
Direct link
Grund, Simon; Lüdtke, Oliver; Robitzsch, Alexander – Journal of Educational and Behavioral Statistics, 2021
Large-scale assessments (LSAs) use Mislevy's "plausible value" (PV) approach to relate student proficiency to noncognitive variables administered in a background questionnaire. This method requires background variables to be completely observed, a requirement that is seldom fulfilled. In this article, we evaluate and compare the…
Descriptors: Data Analysis, Error of Measurement, Research Problems, Statistical Inference
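The plausible-value approach described above yields several imputed proficiency scores per student, and analyses are combined across them with Rubin's rules (estimate averaged over the M sets; total variance = within-set sampling variance plus a between-set penalty). A minimal sketch of that pooling step, assuming the per-set estimates and their sampling variances have already been computed (the function name and inputs are illustrative, not from the article):

```python
import statistics

def pool_plausible_values(pv_means, pv_sampling_vars):
    """Combine per-plausible-value results with Rubin's rules.

    pv_means: the point estimate computed on each of the M plausible-value sets.
    pv_sampling_vars: the sampling variance of that estimate for each set.
    Returns the pooled estimate and its total variance.
    """
    M = len(pv_means)
    estimate = statistics.fmean(pv_means)            # average over the M sets
    within = statistics.fmean(pv_sampling_vars)      # average sampling variance
    between = statistics.variance(pv_means)          # imputation variance across sets
    total_variance = within + (1 + 1 / M) * between  # Rubin's total variance
    return estimate, total_variance

# Example: five plausible-value analyses of a mean score (illustrative numbers)
est, var = pool_plausible_values([500, 502, 498, 501, 499], [4, 4, 4, 4, 4])
```

The between-set term is what the missing-background-data problem in the article complicates: if background variables used in the conditioning model are themselves incomplete, the imputed sets no longer reflect the intended posterior.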
Ayvalli, Merve; Biçak, Bayram – Online Submission, 2018
The aim of this study is to investigate the measurement invariance of PISA 2012 mathematical literacy across OECD member countries and across gender and region groups in Turkey. Among the cognitive test booklets administered in PISA 2012, booklet 8, which was used in all countries, was selected for this correlational survey study. The study was…
Descriptors: Mathematics Tests, Error of Measurement, Gender Differences, Geographic Regions
Peer reviewed
PDF on ERIC
Sekercioglu, Güçlü; Kogar, Hakan – Novitas-ROYAL (Research on Youth and Language), 2018
The aim of the present study was to examine the measurement invariance (MI) of the reading, mathematics, and science tests across the commonly used test languages. It also aimed to examine the differential item functioning (DIF) of the PISA test, whose original items are in English and French, in terms of the language…
Descriptors: Error of Measurement, Item Response Theory, International Assessment, Achievement Tests
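DIF analyses of the kind this entry describes are commonly run with the Mantel-Haenszel procedure: examinees are stratified by total score, a 2x2 (group x correct) table is built per stratum, and a common odds ratio is pooled across strata. A minimal sketch, assuming dichotomous items and pre-built stratum tables (the function name and data layout are illustrative; the abstract does not state which DIF method was used):

```python
import math

def mantel_haenszel_dif(strata):
    """Mantel-Haenszel common odds ratio and ETS delta for one item.

    strata: list of (A, B, C, D) counts per ability stratum, where
      A = reference-group correct,  B = reference-group incorrect,
      C = focal-group correct,      D = focal-group incorrect.
    Returns (alpha, delta): alpha near 1 (delta near 0) means little DIF.
    """
    num = sum(A * D / (A + B + C + D) for A, B, C, D in strata if A + B + C + D > 0)
    den = sum(B * C / (A + B + C + D) for A, B, C, D in strata if A + B + C + D > 0)
    alpha = num / den                 # pooled (common) odds ratio
    delta = -2.35 * math.log(alpha)   # ETS delta scale (MH D-DIF)
    return alpha, delta

# Example: two score strata with identical odds in both groups (illustrative)
alpha, delta = mantel_haenszel_dif([(30, 10, 15, 5), (20, 20, 10, 10)])
```

In the cross-lingual setting studied here, the "groups" would be language versions of the test rather than demographic groups.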
Peer reviewed
Direct link
Rutkowski, Leslie – Applied Measurement in Education, 2014
Large-scale assessment programs such as the National Assessment of Educational Progress (NAEP), Trends in International Mathematics and Science Study (TIMSS), and Programme for International Student Assessment (PISA) use a sophisticated assessment administration design called matrix sampling that minimizes the testing burden on individual…
Descriptors: Measurement, Testing, Item Sampling, Computation
Peer reviewed
Direct link
Sachse, Karoline A.; Roppelt, Alexander; Haag, Nicole – Journal of Educational Measurement, 2016
Trend estimation in international comparative large-scale assessments relies on measurement invariance between countries. However, cross-national differential item functioning (DIF) has been repeatedly documented. We ran a simulation study using national item parameters, which required trends to be computed separately for each country, to compare…
Descriptors: Comparative Analysis, Measurement, Test Bias, Simulation
Peer reviewed
PDF on ERIC
Carnoy, Martin – National Education Policy Center, 2015
Stanford education professor Martin Carnoy examines four main critiques of how international test results are used in policymaking. Of particular interest are critiques of policy analyses based on the Programme for International Student Assessment (PISA). Using average PISA scores as a comparative measure of student achievement is misleading…
Descriptors: Criticism, Reputation, Test Validity, Error of Measurement