Showing all 6 results
Peer reviewed
Salles, Franck; Dos Santos, Reinaldo; Keskpaik, Saskia – Large-scale Assessments in Education, 2020
During this digital era, France, like many other countries, is undergoing a transition from paper-based to digital assessments in education. There is rising interest in technology-enhanced items, which offer innovative ways to assess traditional competencies as well as to address problem-solving skills, specifically in mathematics…
Descriptors: Foreign Countries, Didacticism, Mathematics Tests, Learning Analytics
Peer reviewed
Jiang, Yang; Gong, Tao; Saldivia, Luis E.; Cayton-Hodges, Gabrielle; Agard, Christopher – Large-scale Assessments in Education, 2021
In 2017, the mathematics assessments that are part of the National Assessment of Educational Progress (NAEP) program underwent a transformation, shifting administration from paper-and-pencil formats to digitally based assessments (DBA). This shift introduced new interactive item types that bring rich process data and tremendous opportunities to…
Descriptors: Data Use, Learning Analytics, Test Items, Measurement
Peer reviewed
Kuang, Huan; Sahin, Fusun – Large-scale Assessments in Education, 2023
Background: Examinees may not make enough effort when responding to test items if the assessment has no consequence for them. These disengaged responses can be problematic in low-stakes, large-scale assessments because they can bias item parameter estimates. However, the amount of bias, and whether this bias is similar across administrations, is…
Descriptors: Test Items, Comparative Analysis, Mathematics Tests, Reaction Time
Peer reviewed
Sahin, Füsun; Colvin, Kimberly F. – Large-scale Assessments in Education, 2020
The item responses of examinees who rapid-guess, that is, who do not spend enough time reading and engaging with an item, will not reflect their true ability on that item. Rapid disengagement refers to rapidly selecting a response to multiple-choice items (i.e., rapid-guessing), omitting items, or providing short, unrelated answers to open-ended items in an…
Descriptors: Guessing (Tests), Item Response Theory, Reaction Time, Learner Engagement
Peer reviewed
Costa, Denise Reis; Chen, Chia-Wen – Large-scale Assessments in Education, 2023
Given the ongoing development of computer-based tasks, there has been increasing interest in modelling students' behaviour indicators from log file data with contextual variables collected via questionnaires. In this work, we apply a latent regression model to analyse the relationship between latent constructs (i.e., performance, speed, and…
Descriptors: Achievement Tests, Secondary School Students, International Assessment, Foreign Countries
Peer reviewed
Fishbein, Bethany; Martin, Michael O.; Mullis, Ina V. S.; Foy, Pierre – Large-scale Assessments in Education, 2018
Background: TIMSS 2019 is the first assessment in the TIMSS transition to a computer-based assessment system, called eTIMSS. The TIMSS 2019 Item Equivalence Study was conducted in advance of the field test in 2017 to examine the potential for mode effects on the psychometric behavior of the TIMSS mathematics and science trend items induced by the…
Descriptors: Mathematics Achievement, Science Achievement, Mathematics Tests, Elementary Secondary Education