Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 2
Since 2016 (last 10 years): 5
Since 2006 (last 20 years): 14
Descriptor
Reaction Time: 19
Guessing (Tests): 9
Achievement Tests: 8
Computer Assisted Testing: 7
Item Response Theory: 7
Test Items: 6
Response Style (Tests): 5
Scores: 5
Test Validity: 5
Motivation: 4
Psychometrics: 4
Author
Wise, Steven L.: 19
DeMars, Christine E.: 4
Kong, Xiaojing: 4
Bhola, Dennison S.: 3
Kong, Xiaojing J.: 2
Soland, James: 2
Yang, Sheng-Ta: 2
Beverly, Tanesia: 1
Brinkhuis, Matthieu: 1
Circi, Ruhan: 1
Domingue, Benjamin W.: 1
Publication Type
Journal Articles: 14
Reports - Research: 10
Reports - Evaluative: 8
Speeches/Meeting Papers: 4
Numerical/Quantitative Data: 1
Opinion Papers: 1
Reports - Descriptive: 1
Education Level
Higher Education: 6
Postsecondary Education: 3
Elementary Education: 1
Elementary Secondary Education: 1
Middle Schools: 1
Assessments and Surveys
Major Field Achievement Test…: 1
SAT (College Admission Test): 1
United States Medical…: 1
Feinberg, Richard; Jurich, Daniel; Wise, Steven L. – Applied Measurement in Education, 2021
Previous research on rapid responding tends to implicitly consider examinees as either engaging in solution behavior or purely guessing. However, particularly in a high-stakes testing context, examinees perceiving that they are running out of time may consider the remaining items for less time than necessary to provide a fully informed response,…
Descriptors: High Stakes Tests, Reaction Time, Response Style (Tests), Licensing Examinations (Professions)
Domingue, Benjamin W.; Kanopka, Klint; Stenhaug, Ben; Sulik, Michael J.; Beverly, Tanesia; Brinkhuis, Matthieu; Circi, Ruhan; Faul, Jessica; Liao, Dandan; McCandliss, Bruce; Obradovic, Jelena; Piech, Chris; Porter, Tenelle; Soland, James; Weeks, Jon; Wise, Steven L.; Yeatman, Jason – Journal of Educational and Behavioral Statistics, 2022
The speed-accuracy trade-off (SAT) suggests that time constraints reduce response accuracy. Its relevance in observational settings--where response time (RT) may not be constrained but respondent speed may still vary--is unclear. Drawing on 29 data sets from cognitive tasks, we use a flexible method for identification of the SAT (which…
Descriptors: Accuracy, Reaction Time, Task Analysis, College Entrance Examinations
Wise, Steven L. – Applied Measurement in Education, 2019
The identification of rapid guessing is important to promote the validity of achievement test scores, particularly with low-stakes tests. Effective methods for identifying rapid guesses require reliable threshold methods that are also aligned with test taker behavior. Although several common threshold methods are based on rapid guessing response…
Descriptors: Guessing (Tests), Identification, Reaction Time, Reliability
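Several of the threshold approaches surveyed in this literature set an item-specific cutoff from the item's response-time distribution. As a minimal sketch only, the code below implements a normative-style threshold (a fixed percentage of each item's mean response time with a cap); the 10% and 10-second defaults are illustrative assumptions, not values taken from this article.

```python
import numpy as np

def normative_thresholds(rt, pct=0.10, cap=10.0):
    """Per-item thresholds set to a percentage of each item's mean response
    time, capped at `cap` seconds. The 10% / 10-second defaults are
    illustrative assumptions, not values from the article.

    rt : 2-D array of response times in seconds (examinees x items).
    """
    item_means = np.nanmean(rt, axis=0)        # mean RT per item
    return np.minimum(pct * item_means, cap)   # cap thresholds for long items

def flag_rapid_guesses(rt, thresholds):
    """True wherever a response was faster than its item's threshold."""
    return rt < thresholds[np.newaxis, :]
```

Responses flagged this way are treated as rapid guesses rather than solution behavior in the kinds of analyses these papers describe.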
Soland, James; Wise, Steven L.; Gao, Lingyun – Applied Measurement in Education, 2019
Disengaged responding is a phenomenon that often biases observed scores from achievement tests and surveys in practically and statistically significant ways. This problem has led to the development of methods to detect and correct for disengaged responses on both achievement test and survey scores. One major disadvantage when trying to detect…
Descriptors: Reaction Time, Metadata, Response Style (Tests), Student Surveys
Wise, Steven L. – Educational Measurement: Issues and Practice, 2017
The rise of computer-based testing has brought with it the capability to measure more aspects of a test event than simply the answers selected or constructed by the test taker. One behavior that has drawn much research interest is the time test takers spend responding to individual multiple-choice items. In particular, very short response…
Descriptors: Guessing (Tests), Multiple Choice Tests, Test Items, Reaction Time
Wise, Steven L. – Measurement: Interdisciplinary Research and Perspectives, 2015
The growing presence of computer-based testing has brought with it the capability to routinely capture the time that test takers spend on individual test items. This, in turn, has led to an increased interest in potential applications of response time in measuring intellectual ability and achievement. Goldhammer (this issue) provides a very useful…
Descriptors: Reaction Time, Measurement, Computer Assisted Testing, Achievement Tests
Setzer, J. Carl; Wise, Steven L.; van den Heuvel, Jill R.; Ling, Guangming – Applied Measurement in Education, 2013
Assessment results collected under low-stakes testing situations are subject to effects of low examinee effort. The use of computer-based testing allows researchers to develop new ways of measuring examinee effort, particularly using response times. At the item level, responses can be classified as exhibiting either rapid-guessing behavior or…
Descriptors: Testing, Guessing (Tests), Reaction Time, Test Items
Wise, Steven L.; DeMars, Christine E. – Educational Assessment, 2010
Educational program assessment studies often use data from low-stakes tests to provide evidence of program quality. The validity of scores from such tests, however, is potentially threatened by examinee noneffort. This study investigated the extent to which one type of noneffort--rapid-guessing behavior--distorted the results from three types of…
Descriptors: Validity, Program Evaluation, Guessing (Tests), Motivation
DeMars, Christine E.; Wise, Steven L. – International Journal of Testing, 2010
This investigation examined whether different rates of rapid guessing between groups could lead to detectable levels of differential item functioning (DIF) in situations where the item parameters were the same for both groups. Two simulation studies were designed to explore this possibility. The groups in Study 1 were simulated to reflect…
Descriptors: Guessing (Tests), Test Bias, Motivation, Gender Differences
Wise, Steven L.; Ma, Lingling; Kingsbury, G. Gage; Hauser, Carl – Northwest Evaluation Association, 2010
This study investigated the relationships between when a test is administered and the amount of test-taking effort exhibited by examinees. Three time-related variables were investigated: the time of year the test was administered, the day of the week the test event occurred, and the time of day that the test event occurred. Mean effort did not…
Descriptors: Academic Achievement, Test Wiseness, Investigations, Schematic Studies
Kong, Xiaojing J.; Bhola, Dennison S.; Wise, Steven L. – Online Submission, 2005
In this study four methods were compared for setting a response time threshold that differentiates rapid-guessing behavior from solution behavior when examinees are obliged to complete a low-stakes test. The four methods examined were: (1) a fixed threshold for all test items; (2) thresholds based on item surface features such as the amount of…
Descriptors: Reaction Time, Response Style (Tests), Methods, Achievement Tests
Kong, Xiaojing J.; Wise, Steven L.; Bhola, Dennison S. – Educational and Psychological Measurement, 2007
This study compared four methods for setting item response time thresholds to differentiate rapid-guessing behavior from solution behavior. Thresholds were either (a) common for all test items, (b) based on item surface features such as the amount of reading required, (c) based on visually inspecting response time frequency distributions, or (d)…
Descriptors: Test Items, Reaction Time, Timed Tests, Item Response Theory
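Method (b) above ties the threshold to item surface features such as reading load. The sketch below is a hypothetical illustration of that idea; the reading rate and minimum threshold are assumed values, not the ones used in the study.

```python
def surface_feature_thresholds(item_word_counts, seconds_per_word=0.2, floor=2.0):
    """Illustrative surface-feature thresholds: items requiring more reading
    get a longer minimum plausible solution time. The reading rate
    (0.2 s/word) and 2-second floor are assumptions for illustration.
    """
    return [max(floor, seconds_per_word * words) for words in item_word_counts]
```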
Wise, Steven L.; DeMars, Christine E. – Journal of Educational Measurement, 2006
The validity of inferences based on achievement test scores is dependent on the amount of effort that examinees put forth while taking the test. With low-stakes tests, for which this problem is particularly prevalent, there is a consequent need for psychometric models that can take into account differing levels of examinee effort. This article…
Descriptors: Guessing (Tests), Psychometrics, Inferences, Reaction Time
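The effort-moderated approach described here mixes an ordinary item response function (used when a response reflects solution behavior) with a chance-level probability (used when it is a rapid guess). The sketch below assumes a 2PL item model and a known solution-behavior flag, which is a simplification of the published model.

```python
import math

def effort_moderated_prob(theta, a, b, chance, solution_behavior):
    """P(correct) under an effort-moderated response model: a 2PL curve when
    the response reflects solution behavior, and a chance-level probability
    (e.g., 1 / number of options) when it is a rapid guess. The 2PL form is
    an illustrative simplification.
    """
    if solution_behavior:
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return chance
```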
Wise, Steven L.; DeMars, Christine E.; Kong, Xiaojing – Online Submission, 2005
The validity of inferences based on achievement test scores is dependent on the amount of effort that examinees put forth while taking the test. With low-stakes tests, for which this problem is particularly prevalent, there is a consequent need for psychometric models that can take into account different levels of examinee effort. This article…
Descriptors: Item Response Theory, Mathematical Models, Measurement Techniques, Reaction Time
Wise, Steven L.; Kong, Xiaojing – Online Submission, 2005
When low-stakes assessments are administered to examinees, the degree to which examinees give their best effort is often unclear, complicating the validity and interpretation of the resulting test scores. This study introduces a new method for measuring examinee test-taking effort on computer-based test items based on item response time. This…
Descriptors: Computer Assisted Testing, Reaction Time, Response Style (Tests), Measurement Techniques
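The effort measure introduced in this line of work is commonly described as the proportion of a test taker's responses classified as solution behavior (response time at or above the item's threshold). A minimal sketch, assuming per-item thresholds are already available:

```python
def response_time_effort(response_times, thresholds):
    """Proportion of items answered with solution behavior (RT >= threshold).
    Values near 1.0 suggest full effort; low values suggest frequent rapid
    guessing. Assumes one response time per item for a single examinee.
    """
    solution = [rt >= thr for rt, thr in zip(response_times, thresholds)]
    return sum(solution) / len(solution)
```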