Publication Date
In 2025 | 0
Since 2024 | 0
Since 2021 (last 5 years) | 5
Since 2016 (last 10 years) | 7
Since 2006 (last 20 years) | 9

Descriptor
Guessing (Tests) | 9
Reaction Time | 4
Achievement Tests | 3
Response Style (Tests) | 3
Scores | 3
Bias | 2
College Students | 2
Gender Differences | 2
Item Response Theory | 2
Learner Engagement | 2
Mathematics Tests | 2
Source
Educational Assessment | 9

Author
DeMars, Christine E. | 3
Wise, Steven L. | 3
Soland, James | 2
Bi, Sharon Z. | 1
Cronin, John | 1
Deng, Jiayi | 1
Ihlenfeldt, Samuel D. | 1
Im, Sukkeun | 1
Jensen, Nate | 1
Keys, Tran D. | 1
Kuhfeld, Megan | 1
Publication Type
Journal Articles | 9
Reports - Research | 7
Reports - Evaluative | 2
Information Analyses | 1
Tests/Questionnaires | 1

Education Level
Higher Education | 3
Elementary Education | 2
Postsecondary Education | 2
Secondary Education | 2
Grade 8 | 1
Junior High Schools | 1
Middle Schools | 1
Assessments and Surveys
Measures of Academic Progress | 1
Rios, Joseph A.; Deng, Jiayi; Ihlenfeldt, Samuel D. – Educational Assessment, 2022
The present meta-analysis sought to quantify the average degree of aggregated test score distortion due to rapid guessing (RG). Included studies group-administered a low-stakes cognitive assessment, identified RG via response times, and reported the rate of examinees engaging in RG, the percentage of RG responses observed, and/or the degree of…
Descriptors: Guessing (Tests), Testing Problems, Scores, Item Response Theory

Wise, Steven L.; Im, Sukkeun; Lee, Jay – Educational Assessment, 2021
This study investigated test-taking engagement on the Spring 2019 administration of a large-scale state summative assessment. Through the identification of rapid-guessing behavior -- which is a validated indicator of disengagement -- the percentage of Grade 8 test events with meaningful amounts of rapid guessing was 5.5% in mathematics, 6.7% in…
Descriptors: Accountability, Test Results, Guessing (Tests), Summative Evaluation

Waterbury, Glenn Thomas; DeMars, Christine E. – Educational Assessment, 2021
Vertical scaling is used to put tests of different difficulty onto a common metric. The Rasch model is often used to perform vertical scaling, despite its strict functional form. Few, if any, studies have examined anchor item choice when using the Rasch model to vertically scale data that do not fit the model. The purpose of this study was to…
Descriptors: Test Items, Equated Scores, Item Response Theory, Scaling

Ulitzsch, Esther; Penk, Christiane; von Davier, Matthias; Pohl, Steffi – Educational Assessment, 2021
Identifying and considering test-taking effort is of utmost importance for drawing valid inferences on examinee competency in low-stakes tests. Different approaches exist for doing so. The speed-accuracy+engagement model aims at identifying non-effortful test-taking behavior in terms of nonresponse and rapid guessing based on responses and…
Descriptors: Response Style (Tests), Guessing (Tests), Reaction Time, Measurement Techniques

Soland, James; Jensen, Nate; Keys, Tran D.; Bi, Sharon Z.; Wolk, Emily – Educational Assessment, 2019
A vast literature investigates academic disengagement among students, including its ultimate manifestation, dropping out of school. Research also shows that test disengagement can be a problem for many inferences educators and policymakers wish to draw from test scores. However, few studies consider whether academic and test disengagement are…
Descriptors: Achievement Tests, Learner Engagement, Test Wiseness, Attendance

Wise, Steven L.; Kuhfeld, Megan R.; Cronin, John – Educational Assessment, 2022
The arrival of the COVID-19 pandemic had a profound effect on K-12 education. Most schools transitioned to remote instruction, and some used remote testing to assess student learning. Remote testing, however, is less controlled than in-school testing, leading to concerns regarding test-taking engagement. This study compared the disengagement of…
Descriptors: Computer Assisted Testing, COVID-19, Pandemics, Learner Engagement

Soland, James; Kuhfeld, Megan – Educational Assessment, 2019
Considerable research has examined the use of rapid guessing measures to identify disengaged item responses. However, little is known about students who rapidly guess over the course of several tests. In this study, we use achievement test data from six administrations over three years to investigate whether rapid guessing is a stable trait-like…
Descriptors: Testing, Guessing (Tests), Reaction Time, Achievement Tests

Wise, Steven L.; DeMars, Christine E. – Educational Assessment, 2010
Educational program assessment studies often use data from low-stakes tests to provide evidence of program quality. The validity of scores from such tests, however, is potentially threatened by examinee noneffort. This study investigated the extent to which one type of noneffort -- rapid-guessing behavior -- distorted the results from three types of…
Descriptors: Validity, Program Evaluation, Guessing (Tests), Motivation

DeMars, Christine E. – Educational Assessment, 2007
A series of 8 tests was administered to university students over 4 weeks for program assessment purposes. The stakes of these tests were low for students; they received course points based on test completion, not test performance. Tests were administered in a counterbalanced order across 2 administrations. Response time effort, a measure of the…
Descriptors: Reaction Time, Guessing (Tests), Testing Programs, College Students