Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 2 |
Since 2016 (last 10 years) | 5 |
Since 2006 (last 20 years) | 10 |
Descriptor
Item Response Theory | 10 |
Reaction Time | 10 |
Statistical Analysis | 10 |
Test Items | 7 |
Cheating | 3 |
Comparative Analysis | 3 |
Computer Assisted Testing | 3 |
Simulation | 3 |
Test Wiseness | 3 |
Bayesian Statistics | 2 |
Computation | 2 |
Author
Man, Kaiwen | 2 |
Ali, Usama S. | 1 |
Bezirhan, Ummugul | 1 |
Chang, Hua-Hua | 1 |
Demirkaya, Onur | 1 |
Harring, Jeffery R. | 1 |
Harring, Jeffrey R. | 1 |
Jansen, Margo G. H. | 1 |
Jensen, Nate | 1 |
Jiao, Hong | 1 |
Johnson, Matthew S. | 1 |
Publication Type
Journal Articles | 8 |
Reports - Research | 8 |
Reports - Descriptive | 1 |
Reports - Evaluative | 1 |
Speeches/Meeting Papers | 1 |
Education Level
Elementary Secondary Education | 1 |
Grade 3 | 1 |
Assessments and Surveys
United States Medical… | 1 |
Demirkaya, Onur; Bezirhan, Ummugul; Zhang, Jinming – Journal of Educational and Behavioral Statistics, 2023
Examinees with item preknowledge tend to obtain inflated test scores that undermine test score validity. With the availability of process data collected in computer-based assessments, research on detecting item preknowledge has advanced by using both item scores and response times. Item revisit patterns of examinees can also be utilized as…
Descriptors: Test Items, Prior Learning, Knowledge Level, Reaction Time
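The abstract above points to item revisit patterns extracted from process data. Purely as an illustration of what such a feature might look like, and not the detection method proposed by Demirkaya, Bezirhan, and Zhang, the Python sketch below counts repeat item visits per examinee from a hypothetical chronological visit log; the log format and field names are assumptions.

    # Hypothetical illustration: counting item revisits from a process-data visit log.
    # The log format and field names are assumptions, not the structure used in the article.
    from collections import defaultdict

    def revisit_counts(visit_log):
        """visit_log: list of (examinee_id, item_id) tuples in chronological order.
        Returns {examinee_id: number of visits beyond the first visit to each item}."""
        seen = defaultdict(set)      # items each examinee has already opened
        revisits = defaultdict(int)  # count of repeat visits per examinee
        for examinee_id, item_id in visit_log:
            if item_id in seen[examinee_id]:
                revisits[examinee_id] += 1
            else:
                seen[examinee_id].add(item_id)
        return dict(revisits)

    # Example: examinee "E1" opens item 3 twice, so one revisit is counted.
    log = [("E1", 1), ("E1", 2), ("E1", 3), ("E1", 3), ("E2", 1)]
    print(revisit_counts(log))  # {'E1': 1}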
Man, Kaiwen; Harring, Jeffrey R. – Educational and Psychological Measurement, 2021
Many approaches have been proposed to jointly analyze item responses and response times to understand behavioral differences between normally and aberrantly behaved test-takers. Biometric information, such as data from eye trackers, can be used to better identify these deviant testing behaviors in addition to more conventional data types. Given…
Descriptors: Cheating, Item Response Theory, Reaction Time, Eye Movements
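Joint analysis of item responses and response times of the kind referenced above is typically set up hierarchically, for example a 2PL model for accuracy paired with a lognormal model for response time and correlated person ability and speed. The Python sketch below simulates data from such a joint model as a hedged illustration; all parameter values are invented, and the biometric (eye-tracking) component discussed in the article is not represented.

    # Minimal simulation sketch of a hierarchical joint model for responses and
    # response times (2PL accuracy + lognormal RT with correlated ability and speed).
    # All parameter values are illustrative; this is not the model fit in the article.
    import numpy as np

    rng = np.random.default_rng(0)
    n_persons, n_items = 500, 20

    # Person parameters: ability (theta) and speed (tau), correlated.
    cov = np.array([[1.0, 0.4], [0.4, 0.25]])
    theta, tau = rng.multivariate_normal([0.0, 0.0], cov, size=n_persons).T

    # Item parameters: discrimination a, difficulty b, time intensity beta, RT precision alpha.
    a = rng.uniform(0.8, 1.8, n_items)
    b = rng.normal(0.0, 1.0, n_items)
    beta = rng.normal(4.0, 0.3, n_items)   # log-seconds
    alpha = rng.uniform(1.5, 2.5, n_items)

    # 2PL response probabilities and lognormal response times.
    p = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b)))
    x = rng.binomial(1, p)                                              # scored responses
    log_t = beta - tau[:, None] + rng.normal(0, 1 / alpha, (n_persons, n_items))
    t = np.exp(log_t)                                                   # response times in seconds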
Sinharay, Sandip; Johnson, Matthew S. – Grantee Submission, 2019
According to Wollack and Schoenig (2018), benefitting from item preknowledge is one of the three broad types of test fraud that occur in educational assessments. We use tools from constrained statistical inference to suggest a new statistic that is based on item scores and response times and can be used to detect the examinees who may have…
Descriptors: Scores, Test Items, Reaction Time, Cheating
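The authors' statistic comes from constrained statistical inference and is not reproduced here. Purely to illustrate the signal it exploits, higher scores together with shorter response times on suspected-compromised items, the sketch below computes a crude per-examinee contrast; the function name, inputs, and example values are all assumptions made for the illustration.

    # Crude illustration only: contrast of accuracy and response time between
    # suspected-compromised and secure items for one examinee. This is NOT the
    # constrained-inference statistic proposed by Sinharay and Johnson (2019).
    import numpy as np

    def preknowledge_contrast(scores, log_times, compromised_mask):
        """scores: 0/1 item scores; log_times: log response times;
        compromised_mask: boolean array marking suspected-compromised items."""
        comp, sec = compromised_mask, ~compromised_mask
        score_gap = scores[comp].mean() - scores[sec].mean()        # larger if preknowledge inflates scores
        speed_gap = log_times[sec].mean() - log_times[comp].mean()  # larger if compromised items are answered faster
        return score_gap, speed_gap

    scores = np.array([1, 1, 1, 1, 0, 1, 0, 1, 0, 0])
    log_times = np.log([12, 9, 10, 11, 60, 55, 70, 48, 66, 59])
    mask = np.array([True] * 4 + [False] * 6)
    print(preknowledge_contrast(scores, log_times, mask))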
Man, Kaiwen; Harring, Jeffery R.; Ouyang, Yunbo; Thomas, Sarah L. – International Journal of Testing, 2018
Many important high-stakes decisions--college admission, academic performance evaluation, and even job promotion--depend on accurate and reliable scores from valid large-scale assessments. However, examinees sometimes cheat by copying answers from other test-takers or practicing with test items ahead of time, which can undermine the effectiveness…
Descriptors: Reaction Time, High Stakes Tests, Test Wiseness, Cheating
Kahraman, Nilüfer – Eurasian Journal of Educational Research, 2014
Problem: Practitioners working with multiple-choice tests have long utilized Item Response Theory (IRT) models to evaluate the performance of test items for quality assurance. The use of similar applications for performance tests, however, is often encumbered due to the challenges encountered in working with complicated data sets in which local…
Descriptors: Item Response Theory, Licensing Examinations (Professions), Performance Based Assessment, Computer Simulation
Jensen, Nate; Rice, Andrew; Soland, James – Educational Evaluation and Policy Analysis, 2018
While most educators assume that not all students try their best on achievement tests, no current research examines if behaviors associated with low test effort, like rapidly guessing on test items, affect teacher value-added estimates. In this article, we examined the prevalence of rapid guessing to determine if this behavior varied by grade,…
Descriptors: Item Response Theory, Value Added Models, Achievement Tests, Test Items
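Rapid guessing of the kind examined above is often operationalized by flagging responses faster than an item-level response-time threshold. The sketch below uses a simple normative threshold, a fixed fraction of each item's median response time; the 10% fraction and the thresholding rule are assumptions for illustration, not necessarily the criterion used in this study.

    # Illustrative rapid-guess flagging with a normative response-time threshold
    # (here 10% of each item's median RT). The fraction is an assumption; it is
    # not necessarily the criterion used by Jensen, Rice, and Soland (2018).
    import numpy as np

    def flag_rapid_guesses(rt, fraction=0.10):
        """rt: (n_examinees, n_items) array of response times in seconds.
        Returns a boolean array marking responses faster than fraction * item median RT."""
        thresholds = fraction * np.median(rt, axis=0)   # one threshold per item
        return rt < thresholds

    rt = np.array([[35.0, 42.0, 2.0],
                   [28.0, 3.5, 55.0],
                   [31.0, 40.0, 60.0]])
    print(flag_rapid_guesses(rt))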
Maris, Gunter; van der Maas, Han – Psychometrika, 2012
Starting from an explicit scoring rule for time limit tasks incorporating both response time and accuracy, and a definite trade-off between speed and accuracy, a response model is derived. Since the scoring rule is interpreted as a sufficient statistic, the model belongs to the exponential family. The various marginal and conditional distributions…
Descriptors: Item Response Theory, Scoring, Reaction Time, Accuracy
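The explicit scoring rule referred to here is usually written as the signed residual time rule; the notation below (accuracy x in {0, 1}, response time t, time limit d) follows common usage rather than quoting the article:

    \[
      S(x, t) = (2x - 1)\,(d - t)
    \]

A fast correct response earns a score near +d, a fast incorrect response a score near -d, and any response at the time limit scores 0, which encodes the stated trade-off between speed and accuracy.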
Ali, Usama S.; Chang, Hua-Hua – ETS Research Report Series, 2014
Adaptive testing is advantageous in that it provides more efficient ability estimates with fewer items than linear testing does. Item-driven adaptive pretesting may also offer similar advantages, and verification of such a hypothesis about item calibration was the main objective of this study. A suitability index (SI) was introduced to adaptively…
Descriptors: Adaptive Testing, Simulation, Pretests Posttests, Test Items
Wang, Shudong; Jiao, Hong – Online Submission, 2011
For decades, researchers and practitioners have devoted considerable effort to studying methods for increasing parameter accuracy, but only recently have researchers begun focusing on improving parameter estimation with a joint model that incorporates response time (RT) and student information as collateral information (CI). Given that many tests are currently…
Descriptors: Reaction Time, Item Response Theory, Computer Assisted Testing, Computation
Jansen, Margo G. H. – Journal of Educational and Behavioral Statistics, 2007
The author considers a latent trait model for the response time on a (set of) pure speed test(s), the multiplicative gamma model (MGM), which is based on the assumption that the test response times are approximately gamma distributed, with known index parameters and scale parameters depending on subject ability and test difficulty parameters. Like…
Descriptors: Reaction Time, Timed Tests, Item Response Theory, Models
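As a rough sketch of the kind of model described, response times that are gamma distributed with a known index and a scale tied to ability and difficulty, the Python snippet below simulates test completion times under one assumed multiplicative parameterization (rate equal to ability divided by difficulty); the exact parameterization of Jansen's MGM may differ.

    # Sketch of a multiplicative gamma response-time model: completion time on test j
    # for person i is gamma distributed with a known index (shape) k_j and a rate that
    # increases with person ability and decreases with test difficulty. The particular
    # parameterization rate = ability / difficulty is an assumption for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    n_persons = 300

    ability = rng.lognormal(mean=0.0, sigma=0.3, size=n_persons)  # person speed/ability > 0
    difficulty = np.array([0.8, 1.0, 1.2, 1.5])                   # test difficulty > 0
    k = np.array([20, 25, 30, 40])                                # known index (e.g., test length)

    rate = ability[:, None] / difficulty[None, :]                 # multiplicative structure
    times = rng.gamma(shape=k[None, :], scale=1.0 / rate)         # (n_persons, n_tests) completion times
    print(times.mean(axis=0))  # expected time is k_j * difficulty_j / ability_i under this parameterization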