Kim, Sooyeon; Moses, Tim – ETS Research Report Series, 2016
The purpose of this study is to evaluate the extent to which item response theory (IRT) proficiency estimation methods are robust to the presence of aberrant responses under the "GRE"® General Test multistage adaptive testing (MST) design. To that end, a wide range of atypical response behaviors affecting as much as 10% of the test items…
Descriptors: Item Response Theory, Computation, Robustness (Statistics), Response Style (Tests)
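As a rough illustration of the quantity this study examines, the sketch below shows a grid-search maximum likelihood proficiency (theta) estimate under a two-parameter logistic (2PL) IRT model, and how a single aberrant response (an unexpected miss on an easy item) can shift it. The 2PL model, the item parameters, and the grid search are illustrative assumptions only; they are not the estimation methods or the GRE MST design evaluated in the report.

```python
# Minimal sketch: grid-search MLE of IRT proficiency (theta) under a 2PL model.
# Illustration only; not the report's estimation methods or test design.
import math

def p_correct(theta, a, b):
    """2PL probability of a correct response given ability theta,
    discrimination a, and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def log_likelihood(theta, responses, items):
    """Log-likelihood of a 0/1 response vector for the given item parameters."""
    ll = 0.0
    for u, (a, b) in zip(responses, items):
        p = p_correct(theta, a, b)
        ll += u * math.log(p) + (1 - u) * math.log(1 - p)
    return ll

def estimate_theta(responses, items):
    """Crude grid-search MLE of theta, adequate for a demonstration."""
    grid = [x / 100.0 for x in range(-400, 401)]  # theta from -4.0 to 4.0
    return max(grid, key=lambda t: log_likelihood(t, responses, items))

# Hypothetical item parameters (a, b) and two response patterns: a clean one
# and an "aberrant" one in which the easiest item is unexpectedly missed.
items = [(1.2, -1.5), (1.0, -0.5), (0.9, 0.0), (1.1, 0.8), (1.3, 1.5)]
clean = [1, 1, 1, 1, 0]
aberrant = [0, 1, 1, 1, 0]
print(estimate_theta(clean, items), estimate_theta(aberrant, items))
```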
Parshall, Cynthia G.; Kromrey, Jeffrey D.; Harmes, J. Christine; Sentovich, Christina – 2001
Computerized adaptive tests (CATs) are efficient because of their optimal item selection procedures that target maximally informative items at each estimated ability level. However, operational administration of these optimal CATs results in a relatively small subset of items given to examinees too often, while another portion of the item pool is…
Descriptors: Ability, Adaptive Testing, Computer Assisted Testing, Estimation (Mathematics)
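The abstract's premise, that always selecting the maximally informative item concentrates administrations on a small subset of the pool, can be illustrated with the small simulation below. The item pool, the use of true rather than updated ability estimates, and the simulee counts are hypothetical simplifications, not the authors' simulation design or any exposure-control method they evaluate.

```python
# Minimal sketch: maximum-information CAT item selection and the item-exposure
# imbalance it produces. Hypothetical pool and simulees; a real CAT would
# re-estimate ability after each item rather than use the true theta.
import math
import random

def info_2pl(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

random.seed(1)
pool = [(random.uniform(0.5, 2.0), random.gauss(0.0, 1.0)) for _ in range(200)]
exposure = [0] * len(pool)

# Administer a fixed-length test to many simulees, always picking the most
# informative not-yet-used item at the examinee's ability level.
for _ in range(1000):
    theta = random.gauss(0.0, 1.0)
    used = set()
    for _ in range(20):
        best = max((i for i in range(len(pool)) if i not in used),
                   key=lambda i: info_2pl(theta, *pool[i]))
        used.add(best)
        exposure[best] += 1

administered = sum(1 for e in exposure if e > 0)
print(f"{administered}/{len(pool)} items ever used; "
      f"max exposure rate = {max(exposure) / 1000:.2f}")
```

Typically only a fraction of the pool is ever administered while a few items approach an exposure rate of 1.0, which is the imbalance that exposure-control procedures are designed to mitigate.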
