Park, Ryoungsun; Kim, Jiseon; Chung, Hyewon; Dodd, Barbara G. – Educational and Psychological Measurement, 2017
The current study proposes a novel method to predict multistage testing (MST) performance without conducting simulations. The method, called MST test information, is based on an analytic derivation of the standard errors of ability estimates across theta levels. We compared the analytically derived standard errors to simulation results to demonstrate the…
Descriptors: Testing, Performance, Prediction, Error of Measurement
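The core relationship behind an analytic approach like this is the inverse-square-root link between test information and the standard error of the ability estimate, SE(theta) = 1/sqrt(I(theta)). A minimal sketch of that relationship, assuming a 2PL IRT model with made-up item parameters (all values illustrative, not taken from the study):

```python
import numpy as np

# Hypothetical 2PL item parameters for one test module:
# discriminations a and difficulties b (illustrative values only).
a = np.array([1.2, 0.8, 1.5, 1.0])
b = np.array([-0.5, 0.0, 0.5, 1.0])

def test_information(theta, a, b):
    """Sum of 2PL Fisher information across items at ability theta."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))  # response probabilities
    return np.sum(a**2 * p * (1.0 - p))

# Analytic standard error of the ability estimate at each theta level:
# SE(theta) = 1 / sqrt(I(theta)), with no simulation required.
for theta in np.linspace(-2, 2, 5):
    info = test_information(theta, a, b)
    print(f"theta={theta:+.1f}  I={info:.3f}  SE={1/np.sqrt(info):.3f}")
```

Evaluating this curve for each MST routing path gives path-conditional precision directly, which is what makes the comparison against simulation-based standard errors possible.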
Hauser, Carl; Thum, Yeow Meng; He, Wei; Ma, Lingling – Educational and Psychological Measurement, 2015
When conducting item reviews, analysts evaluate an array of statistical and graphical information to assess the fit of a field test (FT) item to an item response theory model. The process can be tedious, particularly when the number of human reviews (HR) to be completed is large. Furthermore, such a process leads to decisions that are susceptible…
Descriptors: Test Items, Item Response Theory, Research Methodology, Decision Making
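One common statistical ingredient in such item-fit reviews is a residual comparison of observed versus model-predicted proportions correct across ability groups. A rough sketch of that kind of check, assuming a 2PL model and simulated responses (the grouping scheme, parameter values, and statistic are illustrative, not the authors' procedure):

```python
import numpy as np

# Illustrative item-fit check: compare observed proportions correct in
# ability groups to the 2PL model curve for one field-test item.
rng = np.random.default_rng(0)
a_true, b_true = 1.1, 0.3            # hypothetical item parameters
theta = rng.normal(size=2000)        # examinee abilities
p_model = 1 / (1 + np.exp(-a_true * (theta - b_true)))
responses = rng.random(2000) < p_model  # simulated item responses

# Group examinees into deciles of theta and accumulate a Pearson-type
# residual between observed and expected proportions correct per group.
edges = np.quantile(theta, np.linspace(0, 1, 11))
groups = np.clip(np.digitize(theta, edges[1:-1]), 0, 9)
chi2 = 0.0
for g in range(10):
    mask = groups == g
    n, obs, exp = mask.sum(), responses[mask].mean(), p_model[mask].mean()
    chi2 += n * (obs - exp) ** 2 / (exp * (1 - exp))
print(f"fit statistic across 10 ability groups: {chi2:.2f}")
```

Automating a screen like this over many field-test items is the kind of step that reduces the volume of human reviews the abstract describes as tedious and error-prone.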