Showing 1 to 15 of 24 results
Peer reviewed
Direct link
Jung Yeon Park; Sean Joo; Zikun Li; Hyejin Yoon – Educational Measurement: Issues and Practice, 2025
This study examines potential assessment bias based on students' primary language status in PISA 2018. Specifically, multilingual (MLs) and nonmultilingual (non-MLs) students in the United States are compared with regard to their response time as well as scored responses across three cognitive domains (reading, mathematics, and science).…
Descriptors: Achievement Tests, Secondary School Students, International Assessment, Test Bias
Peer reviewed
Direct link
Ulitzsch, Esther; Domingue, Benjamin W.; Kapoor, Radhika; Kanopka, Klint; Rios, Joseph A. – Educational Measurement: Issues and Practice, 2023
Common response-time-based approaches for non-effortful response behavior (NRB) in educational achievement tests filter responses that are associated with response times below some threshold. These approaches are, however, limited in that they require a binary decision on whether a response is classified as stemming from NRB, thus ignoring…
Descriptors: Reaction Time, Responses, Behavior, Achievement Tests
Peer reviewed
Direct link
Okan Bulut; Guher Gorgun; Hacer Karamese – Journal of Educational Measurement, 2025
The use of multistage adaptive testing (MST) has gradually increased in large-scale testing programs as MST achieves a balanced compromise between linear test design and item-level adaptive testing. MST works on the premise that each examinee gives their best effort when attempting the items, and their responses truly reflect what they know or can…
Descriptors: Response Style (Tests), Testing Problems, Testing Accommodations, Measurement
Peer reviewed
Direct link
Surina He; Xiaoxiao Liu; Ying Cui – Educational Psychology, 2025
The increasing use of low-stakes international assessments highlights the importance of test-taking efforts. Previous studies have used self-reported and response time-based measures to examine this effort. Although differences between these measures have been suggested, their association with performances and potential gender gaps remains…
Descriptors: Achievement Tests, International Assessment, Foreign Countries, Secondary School Students
Peer reviewed
PDF on ERIC
Militsa G. Ivanova; Michalis P. Michaelides – Practical Assessment, Research & Evaluation, 2023
Research on methods for measuring examinee engagement with constructed-response items is limited. The present study used data from the PISA 2018 Reading domain to construct and compare indicators of test-taking effort on constructed-response items: response time, number of actions, the union (combining effortless responses detected by either…
Descriptors: Achievement Tests, Foreign Countries, International Assessment, Secondary School Students
Peer reviewed
PDF on ERIC
Basak Erdem Kara – International Electronic Journal of Elementary Education, 2025
Achievement tests are commonly used in education to evaluate students' academic performance and proficiency in specific subject areas. However, a major problem threatens the validity of achievement test scores: test-taking disengagement. Disengaged respondents provide answers that are inconsistent with their true ability level and can…
Descriptors: International Assessment, Secondary School Students, Foreign Countries, Achievement Tests
Peer reviewed
PDF on ERIC
Bulut, Hatice Cigdem – International Journal of Assessment Tools in Education, 2021
Several studies have been published on disengaged test respondents, and others have analyzed disengaged survey respondents separately. For many large-scale assessments, students answer questionnaire and test items in succession. This study examines the percentage of students who continuously engage in disengaged responding behaviors across…
Descriptors: Reaction Time, Response Style (Tests), Foreign Countries, International Assessment
Peer reviewed
Direct link
Lundgren, Erik; Eklöf, Hanna – International Journal of Testing, 2023
This study aims to assess student motivation to provide valid responses to the PISA student questionnaire. This was done by modeling response times using a three-component finite mixture model, comprising two satisficing response styles (rapid and idle) and one optimizing response style. Each participant's motivation was operationalized as their…
Descriptors: Student Motivation, Reaction Time, Questionnaires, International Assessment
Peer reviewed
Direct link
Liu, Yue; Liu, Hongyun – Journal of Educational and Behavioral Statistics, 2021
The prevalence and serious consequences of noneffortful responses from unmotivated examinees are well-known in educational measurement. In this study, we propose to apply an iterative purification process based on a response time residual method with fixed item parameter estimates to detect noneffortful responses. The proposed method is compared…
Descriptors: Response Style (Tests), Reaction Time, Test Items, Accuracy
Peer reviewed
Direct link
Pools, Elodie – Applied Measurement in Education, 2022
Many low-stakes assessments, such as international large-scale surveys, are administered during time-limited testing sessions and some test-takers are not able to endorse the last items of the test, resulting in not-reached (NR) items. However, because the test has no consequence for the respondents, these NR items can also stem from quitting the…
Descriptors: Achievement Tests, Foreign Countries, International Assessment, Secondary School Students
Peer reviewed
Direct link
Shin, Hyo Jeong; Jewsbury, Paul A.; van Rijn, Peter W. – Large-scale Assessments in Education, 2022
The present paper investigates and examines the conditional dependencies between cognitive responses (RA; Response Accuracy) and process data, in particular, response times (RT) in large-scale educational assessments. Using two prominent large-scale assessments, NAEP and PISA, we examined the RA-RT conditional dependencies within each item in the…
Descriptors: Cognitive Processes, Reaction Time, Educational Assessment, Achievement Tests
Peer reviewed
Direct link
Rios, Joseph A.; Soland, James – International Journal of Testing, 2022
The objective of the present study was to investigate item-, examinee-, and country-level correlates of rapid guessing (RG) in the context of the 2018 PISA science assessment. Analyzing data from 267,148 examinees across 71 countries showed that over 50% of examinees engaged in RG on an average of one in 10 items. Descriptive…
Descriptors: Foreign Countries, International Assessment, Achievement Tests, Secondary School Students
Peer reviewed
Direct link
Han, Areum; Krieger, Florian; Borgonovi, Francesca; Greiff, Samuel – Large-scale Assessments in Education, 2023
Process data are becoming more and more popular in education research. In the field of computer-based assessments of collaborative problem solving (ColPS), process data have been used to identify students' test-taking strategies while working on the assessment, and such data can be used to complement data collected on accuracy and overall…
Descriptors: Behavior Patterns, Cooperative Learning, Problem Solving, Reaction Time
Peer reviewed
Direct link
DeCarlo, Lawrence T. – Journal of Educational Measurement, 2021
In a signal detection theory (SDT) approach to multiple choice exams, examinees are viewed as choosing, for each item, the alternative that is perceived as being the most plausible, with perceived plausibility depending in part on whether or not an item is known. The SDT model is a process model and provides measures of item difficulty, item…
Descriptors: Perception, Bias, Theories, Test Items
Peer reviewed
Direct link
Kuang, Huan; Sahin, Fusun – Large-scale Assessments in Education, 2023
Background: Examinees may not make enough effort when responding to test items if the assessment has no consequence for them. These disengaged responses can be problematic in low-stakes, large-scale assessments because they can bias item parameter estimates. However, the amount of bias, and whether this bias is similar across administrations, is…
Descriptors: Test Items, Comparative Analysis, Mathematics Tests, Reaction Time