Publication Date
In 2025 | 0 |
Since 2024 | 2 |
Since 2021 (last 5 years) | 8 |
Since 2016 (last 10 years) | 17 |
Since 2006 (last 20 years) | 27 |
Descriptor
Reaction Time | 27 |
Item Response Theory | 12 |
Models | 12 |
Test Items | 12 |
Computation | 9 |
Computer Assisted Testing | 9 |
Accuracy | 8 |
Comparative Analysis | 7 |
Statistical Analysis | 7 |
Simulation | 6 |
Bayesian Statistics | 5 |
Source
Journal of Educational and Behavioral Statistics | 27 |
Author
Chang, Hua-Hua | 4 |
van der Linden, Wim J. | 4 |
Ranger, Jochen | 3 |
Wang, Chun | 3 |
Fan, Zhewen | 2 |
Kuhn, Jörg-Tobias | 2 |
Tijmstra, Jesper | 2 |
Avetisyan, Marianna | 1 |
Beverly, Tanesia | 1 |
Bezirhan, Ummugul | 1 |
Bolsinova, Maria | 1 |
Publication Type
Journal Articles | 27 |
Reports - Research | 24 |
Reports - Descriptive | 2 |
Numerical/Quantitative Data | 1 |
Reports - Evaluative | 1 |
Education Level
Higher Education | 4 |
Postsecondary Education | 3 |
Secondary Education | 2 |
Location
Netherlands | 2 |
Australia | 1 |
Austria | 1 |
Belgium | 1 |
Canada | 1 |
Cyprus | 1 |
Czech Republic | 1 |
Denmark | 1 |
Estonia | 1 |
Finland | 1 |
France | 1 |
Assessments and Surveys
Armed Services Vocational Aptitude Battery | 1 |
Program for International Student Assessment | 1 |
SAT (College Admission Test) | 1 |
Nana Kim; Daniel M. Bolt – Journal of Educational and Behavioral Statistics, 2024
Some previous studies suggest that response times (RTs) on rating scale items can be informative about the content trait, but a more recent study suggests they may also be reflective of response styles. The latter result raises questions about the possible consideration of RTs for content trait estimation, as response styles are generally viewed…
Descriptors: Item Response Theory, Reaction Time, Response Style (Tests), Psychometrics
Junhuan Wei; Liufen Luo; Yan Cai; Dongbo Tu – Journal of Educational and Behavioral Statistics, 2024
Response times (RTs) facilitate the quantification of underlying cognitive processes in problem-solving behavior. To provide more comprehensive diagnostic feedback on strategy selection and attribute profiles with multistrategy cognitive diagnosis model (CDM) and utilize additional information for item RTs, this study develops a multistrategy…
Descriptors: Reaction Time, Problem Solving, Selection Criteria, Accuracy
Zhan, Peida; Man, Kaiwen; Wind, Stefanie A.; Malone, Jonathan – Journal of Educational and Behavioral Statistics, 2022
Respondents' problem-solving behaviors reflect complicated cognitive processes that are frequently systematically tied to one another. Biometric data, such as visual fixation counts (FCs), which are an important eye-tracking indicator, can be combined with other types of variables that reflect different aspects of…
Descriptors: Reaction Time, Cognitive Measurement, Eye Movements, Problem Solving
Demirkaya, Onur; Bezirhan, Ummugul; Zhang, Jinming – Journal of Educational and Behavioral Statistics, 2023
Examinees with item preknowledge tend to obtain inflated test scores that undermine test score validity. With the availability of process data collected in computer-based assessments, the research on detecting item preknowledge has progressed on using both item scores and response times. Item revisit patterns of examinees can also be utilized as…
Descriptors: Test Items, Prior Learning, Knowledge Level, Reaction Time
Zhu, Hongyue; Jiao, Hong; Gao, Wei; Meng, Xiangbin – Journal of Educational and Behavioral Statistics, 2023
Change-point analysis (CPA) is a method for detecting abrupt changes in parameter(s) underlying a sequence of random variables. It has been applied to detect examinees' aberrant test-taking behavior by identifying abrupt test performance change. Previous studies utilized maximum likelihood estimations of ability parameters, focusing on detecting…
Descriptors: Bayesian Statistics, Test Wiseness, Behavior Problems, Reaction Time
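The abstract above does not spell out the estimator, but the general idea of change-point analysis on a response-time sequence can be illustrated with a simple CUSUM-style statistic. This is a minimal sketch of generic change-point detection, not the Bayesian procedure of the study; all function and variable names are hypothetical.

```python
import numpy as np

def cusum_changepoint(x):
    """Locate the single most likely mean shift in a 1-D sequence.

    Returns the split index k where the cumulative deviation from the
    overall mean peaks, i.e. where the pre-/post-split means differ most.
    """
    x = np.asarray(x, dtype=float)
    # Cumulative deviations from the overall mean; a mean shift makes
    # this partial-sum path peak at the change point.
    s = np.cumsum(x - x.mean())
    # Exclude the trivial final point, where the sum returns to ~0.
    k = int(np.argmax(np.abs(s[:-1]))) + 1
    return k, float(np.abs(s).max())

# Toy example: log response times drop abruptly after item 10, as might
# happen when an examinee switches to rapid guessing mid-test.
rng = np.random.default_rng(0)
log_rts = np.concatenate([rng.normal(4.0, 0.3, 10),
                          rng.normal(1.0, 0.3, 10)])
k, stat = cusum_changepoint(log_rts)
```

With a shift this large, `k` recovers the true change point at item 10; in practice the peak statistic would be compared against a threshold calibrated under the no-change model.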
Domingue, Benjamin W.; Kanopka, Klint; Stenhaug, Ben; Sulik, Michael J.; Beverly, Tanesia; Brinkhuis, Matthieu; Circi, Ruhan; Faul, Jessica; Liao, Dandan; McCandliss, Bruce; Obradovic, Jelena; Piech, Chris; Porter, Tenelle; Soland, James; Weeks, Jon; Wise, Steven L.; Yeatman, Jason – Journal of Educational and Behavioral Statistics, 2022
The speed-accuracy trade-off (SAT) suggests that time constraints reduce response accuracy. Its relevance in observational settings--where response time (RT) may not be constrained but respondent speed may still vary--is unclear. Using 29 data sets containing data from cognitive tasks, we use a flexible method for identification of the SAT (which…
Descriptors: Accuracy, Reaction Time, Task Analysis, College Entrance Examinations
Kuijpers, Renske E.; Visser, Ingmar; Molenaar, Dylan – Journal of Educational and Behavioral Statistics, 2021
Mixture models have been developed to enable detection of within-subject differences in responses and response times to psychometric test items. To enable mixture modeling of both responses and response times, a distributional assumption is needed for the within-state response time distribution. Since violations of the assumed response time…
Descriptors: Test Items, Responses, Reaction Time, Models
Liu, Yue; Liu, Hongyun – Journal of Educational and Behavioral Statistics, 2021
The prevalence and serious consequences of noneffortful responses from unmotivated examinees are well-known in educational measurement. In this study, we propose to apply an iterative purification process based on a response time residual method with fixed item parameter estimates to detect noneffortful responses. The proposed method is compared…
Descriptors: Response Style (Tests), Reaction Time, Test Items, Accuracy
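The residual-based idea in the entry above can be sketched crudely: if a response's log RT sits far below what is typical for that item, it is a candidate noneffortful (rapid-guessing) response. This stand-in uses raw per-item standardization rather than the study's model-based residuals with fixed item parameters, and omits the iterative purification step; all names are hypothetical.

```python
import numpy as np

def flag_noneffortful(log_rt, z_cut=-2.0):
    """Flag unusually fast responses per item via standardized log RTs.

    log_rt: (persons x items) matrix of log response times.
    A response is flagged when its log RT falls more than |z_cut|
    standard deviations below that item's mean log RT.
    """
    mu = log_rt.mean(axis=0)           # per-item mean log RT
    sd = log_rt.std(axis=0, ddof=1)    # per-item SD of log RT
    z = (log_rt - mu) / sd             # crude standardized residuals
    return z < z_cut                   # True = suspected noneffortful

rng = np.random.default_rng(1)
lrt = rng.normal(3.0, 0.5, size=(50, 4))
lrt[0, :] = 0.5                        # person 0 rapid-guesses every item
flags = flag_noneffortful(lrt)
```

Person 0 is flagged on every item, while effortful responders are flagged only at roughly the nominal tail rate; the purification idea is to re-estimate the per-item baselines after excluding flagged responses and iterate.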
Kang, Hyeon-Ah; Zheng, Yi; Chang, Hua-Hua – Journal of Educational and Behavioral Statistics, 2020
With the widespread use of computers in modern assessment, online calibration has become increasingly popular as a way of replenishing an item pool. The present study discusses online calibration strategies for a joint model of responses and response times. The study proposes likelihood inference methods for item parameter estimation and evaluates…
Descriptors: Adaptive Testing, Computer Assisted Testing, Item Response Theory, Reaction Time
Sinharay, Sandip; van Rijn, Peter W. – Journal of Educational and Behavioral Statistics, 2020
Response time models (RTMs) are of increasing interest in educational and psychological testing. This article focuses on the lognormal model for response times, which is one of the most popular RTMs. Several existing statistics for testing normality and the fit of factor analysis models are repurposed for testing the fit of the lognormal model. A…
Descriptors: Educational Testing, Psychological Testing, Goodness of Fit, Factor Analysis
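The key implication of the lognormal RT model exploited above is that log response times for an item should be normally distributed, so normality checks on the log scale become fit checks for the model. The sketch below uses a simple moment-based skewness check, not one of the repurposed test statistics from the article, and ignores the person-speed structure of the full model; names are hypothetical.

```python
import numpy as np

def log_rt_skewness(rt):
    """Sample skewness of log response times for one item.

    Under the lognormal RT model, log RTs are normal, so skewness near 0
    is consistent with the model; raw RTs themselves are right-skewed.
    """
    x = np.log(np.asarray(rt, dtype=float))
    z = (x - x.mean()) / x.std()
    return float(np.mean(z ** 3))

rng = np.random.default_rng(2)
rt = rng.lognormal(mean=3.0, sigma=0.4, size=2000)  # model-conforming RTs
skew_log = log_rt_skewness(rt)  # near 0: consistent with the model
skew_raw = float(np.mean(((rt - rt.mean()) / rt.std()) ** 3))  # clearly > 0
```

A formal test would compare such a statistic against its null distribution; the contrast between the near-zero log-scale skewness and the strongly positive raw-scale skewness is what makes the log transform the natural scale for checking fit.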
Erps, Ryan C.; Noguchi, Kimihiro – Journal of Educational and Behavioral Statistics, 2020
A new two-sample test for comparing variability measures is proposed. To make the test robust and powerful, a new modified structural zero removal method is applied to the Brown-Forsythe transformation. The t-test-based statistic allows results to be expressed as the ratio of mean absolute deviations from median. Extensive simulation study…
Descriptors: Statistical Analysis, Comparative Analysis, Robustness (Statistics), Sample Size
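The core Brown-Forsythe transformation named above is standard: replace each observation with its absolute deviation from the sample median, then compare the transformed samples with a t-type statistic. This sketch implements that baseline plus the ratio-of-MADs effect measure; it does not reproduce the article's modified structural zero removal step, and the function name is hypothetical.

```python
import numpy as np

def brown_forsythe_two_sample(a, b):
    """Two-sample variability comparison via the Brown-Forsythe transform.

    Each observation is replaced by its absolute deviation from its
    sample's median; a Welch-type t statistic on the transformed data
    then compares variability. Also returns the ratio of mean absolute
    deviations from the median (MAD ratio) as an effect measure.
    """
    za = np.abs(a - np.median(a))
    zb = np.abs(b - np.median(b))
    ma, mb = za.mean(), zb.mean()
    va = za.var(ddof=1) / len(za)
    vb = zb.var(ddof=1) / len(zb)
    t = (ma - mb) / np.sqrt(va + vb)   # Welch-type t statistic
    return float(t), float(ma / mb)    # statistic and MAD ratio

rng = np.random.default_rng(3)
a = rng.normal(0.0, 1.0, 200)   # unit-spread sample
b = rng.normal(0.0, 3.0, 200)   # three times the spread
t, ratio = brown_forsythe_two_sample(a, b)
```

With a threefold spread difference, the statistic is strongly negative and the MAD ratio lands near 1/3, illustrating why the ratio scale is a readable effect-size summary for variability comparisons.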
Choe, Edison M.; Kern, Justin L.; Chang, Hua-Hua – Journal of Educational and Behavioral Statistics, 2018
Despite common operationalization, measurement efficiency of computerized adaptive testing should not only be assessed in terms of the number of items administered but also the time it takes to complete the test. To this end, a recent study introduced a novel item selection criterion that maximizes Fisher information per unit of expected response…
Descriptors: Computer Assisted Testing, Reaction Time, Item Response Theory, Test Items
Wang, Chun; Xu, Gongjun; Shang, Zhuoran; Kuncel, Nathan – Journal of Educational and Behavioral Statistics, 2018
The modern web-based technology greatly popularizes computer-administered testing, also known as online testing. When these online tests are administered continuously within a certain "testing window," many items are likely to be exposed and compromised, posing a type of test security concern. In addition, if the testing time is limited,…
Descriptors: Computer Assisted Testing, Cheating, Guessing (Tests), Item Response Theory
von Davier, Matthias; Khorramdel, Lale; He, Qiwei; Shin, Hyo Jeong; Chen, Haiwen – Journal of Educational and Behavioral Statistics, 2019
International large-scale assessments (ILSAs) transitioned from paper-based assessments to computer-based assessments (CBAs), facilitating the use of new item types and more effective data collection tools. This allows implementation of more complex test designs and collection of process and response time (RT) data. These new data types can be used to…
Descriptors: International Assessment, Computer Assisted Testing, Psychometrics, Item Response Theory
Ranger, Jochen; Kuhn, Jörg-Tobias – Journal of Educational and Behavioral Statistics, 2018
Diffusion-based item response theory models for responses and response times in tests have attracted increased attention recently in psychometrics. Analyzing response time data, however, is delicate as response times are often contaminated by unusual observations. This can have serious effects on the validity of statistical inference. In this…
Descriptors: Item Response Theory, Computation, Robustness (Statistics), Reaction Time