Showing 1 to 15 of 52 results
Joanna Williamson – Research Matters, 2025
Teachers, examiners and assessment experts know from experience that some candidates annotate exam questions. "Annotation" includes anything the candidate writes or draws outside of the designated response space, such as underlining, jotting, circling, sketching and calculating. Annotations are of interest because they may evidence…
Descriptors: Mathematics, Tests, Documentation, Secondary Education
Peer reviewed
Esther Ulitzsch; Janine Buchholz; Hyo Jeong Shin; Jonas Bertling; Oliver Lüdtke – Large-scale Assessments in Education, 2024
Common indicator-based approaches to identifying careless and insufficient effort responding (C/IER) in survey data scan response vectors or timing data for aberrances, such as patterns signaling straight lining, multivariate outliers, or signals that respondents rushed through the administered items. Each of these approaches is susceptible to…
Descriptors: Response Style (Tests), Attention, Achievement Tests, Foreign Countries
Peer reviewed
Okan Bulut; Guher Gorgun; Hacer Karamese – Journal of Educational Measurement, 2025
The use of multistage adaptive testing (MST) has gradually increased in large-scale testing programs as MST achieves a balanced compromise between linear test design and item-level adaptive testing. MST works on the premise that each examinee gives their best effort when attempting the items, and their responses truly reflect what they know or can…
Descriptors: Response Style (Tests), Testing Problems, Testing Accommodations, Measurement
Peer reviewed
Saskia van Laar; Jianan Chen; Johan Braeken – Measurement: Interdisciplinary Research and Perspectives, 2024
Questionnaires in educational research assessing students' attitudes and beliefs are low-stakes for the students. As a consequence, students might not always consistently respond to a questionnaire scale but instead provide more random response patterns with no clear link to items' contents. We study inter-individual differences in students'…
Descriptors: Foreign Countries, Response Style (Tests), Grade 8, Secondary School Students
Peer reviewed
Surina He; Xiaoxiao Liu; Ying Cui – Educational Psychology, 2025
The increasing use of low-stakes international assessments highlights the importance of test-taking efforts. Previous studies have used self-reported and response time-based measures to examine this effort. Although differences between these measures have been suggested, their association with performances and potential gender gaps remains…
Descriptors: Achievement Tests, International Assessment, Foreign Countries, Secondary School Students
Peer reviewed
Ulitzsch, Esther; Lüdtke, Oliver; Robitzsch, Alexander – Educational Measurement: Issues and Practice, 2023
Country differences in response styles (RS) may jeopardize cross-country comparability of Likert-type scales. When adjusting for rather than investigating RS is the primary goal, it seems advantageous to impose minimal assumptions on RS structures and leverage information from multiple scales for RS measurement. Using PISA 2015 background…
Descriptors: Response Style (Tests), Comparative Analysis, Achievement Tests, Foreign Countries
Peer reviewed
J. A. Bialo; H. Li – Educational Assessment, 2024
This study evaluated differential item functioning (DIF) in achievement motivation items before and after using anchoring vignettes as a statistical tool to account for group differences in response styles across gender and ethnicity. We applied the nonparametric scoring of the vignettes to motivation items from the 2015 Programme for…
Descriptors: Test Bias, Student Motivation, Achievement Tests, Secondary School Students
Peer reviewed
Bulut, Hatice Cigdem – International Journal of Assessment Tools in Education, 2021
Several studies have been published on disengaged test respondents, and others have analyzed disengaged survey respondents separately. For many large-scale assessments, students answer questionnaire and test items in succession. This study examines the percentage of students who continuously engage in disengaged responding behaviors across…
Descriptors: Reaction Time, Response Style (Tests), Foreign Countries, International Assessment
Peer reviewed
Esther Ulitzsch; Steffi Pohl; Lale Khorramdel; Ulf Kroehne; Matthias von Davier – Journal of Educational and Behavioral Statistics, 2024
Questionnaires are by far the most common tool for measuring noncognitive constructs in psychology and educational sciences. Response bias may pose an additional source of variation between respondents that threatens validity of conclusions drawn from questionnaire data. We present a mixture modeling approach that leverages response time data from…
Descriptors: Item Response Theory, Response Style (Tests), Questionnaires, Secondary School Students
Steven R. Hiner – ProQuest LLC, 2023
The purpose of this study was to determine if there were significant statistical differences between scores on constructed response and computer-scorable questions on an accelerated middle school math placement test in a large urban school district in Ohio, and to ensure that all students have an opportunity to take the test. Five questions on a…
Descriptors: Scores, Middle Schools, Mathematics Tests, Placement Tests
Peer reviewed
Doval, Eduardo; Delicado, Pedro – Journal of Educational and Behavioral Statistics, 2020
We propose new methods for identifying and classifying aberrant response patterns (ARPs) by means of functional data analysis. These methods take the person response function (PRF) of an individual and compare it with the pattern that would correspond to a generic individual of the same ability according to the item-person response surface. ARPs…
Descriptors: Response Style (Tests), Data Analysis, Identification, Classification
Peer reviewed
Liu, Yue; Liu, Hongyun – Journal of Educational and Behavioral Statistics, 2021
The prevalence and serious consequences of noneffortful responses from unmotivated examinees are well-known in educational measurement. In this study, we propose to apply an iterative purification process based on a response time residual method with fixed item parameter estimates to detect noneffortful responses. The proposed method is compared…
Descriptors: Response Style (Tests), Reaction Time, Test Items, Accuracy
Peer reviewed
Boško, Martin; Vonková, Hana; Papajoanu, Ondrej; Moore, Angie – Bulgarian Comparative Education Society, 2023
International large-scale assessments, such as the Programme for International Student Assessment (PISA), are a crucial source of information for education researchers and policymakers. The assessment also includes a student questionnaire; however, the data can be biased by differences in reporting behavior between students. In this paper, we…
Descriptors: Comparative Analysis, Response Style (Tests), Foreign Countries, Institutional Characteristics
Peer reviewed
Rios, Joseph A. – Educational and Psychological Measurement, 2021
Low test-taking effort as a validity threat is common when examinees perceive an assessment context to have minimal personal value. Prior research has shown that in such contexts, subgroups may differ in their effort, which raises two concerns when making subgroup mean comparisons. First, it is unclear how differential effort could influence…
Descriptors: Response Style (Tests), Statistical Analysis, Measurement, Comparative Analysis
Peer reviewed
Lee, HyeSun; Smith, Weldon; Martinez, Angel; Ferris, Heather; Bova, Joe – Applied Measurement in Education, 2021
The aim of the current research was to provide recommendations to facilitate the development and use of anchoring vignettes (AVs) for cross-cultural comparisons in education. Study 1 identified six factors leading to order violations and ties in AV responses based on cognitive interviews with 15-year-old students. The factors were categorized into…
Descriptors: Vignettes, Test Items, Equated Scores, Nonparametric Statistics