Showing all 3 results
Wang, Lu; Steedle, Jeffrey – ACT, Inc., 2020
In recent ACT mode comparability studies, students testing on laptop or desktop computers earned slightly higher scores on average than students who tested on paper, especially on the ACT® reading and English tests (Li et al., 2017). Equating procedures adjust for such "mode effects" to make ACT scores comparable regardless of testing…
Descriptors: Test Format, Reading Tests, Language Tests, English
Peer reviewed
Green, Bert F. – Applied Psychological Measurement, 2011
This article refutes a recent claim that computer-based tests produce biased scores for very proficient test takers who make mistakes on one or two initial items and that the "bias" can be reduced by using a four-parameter IRT model. Because the same effect occurs with pattern scores on nonadaptive tests, the effect results from IRT scoring, not…
Descriptors: Adaptive Testing, Computer Assisted Testing, Test Bias, Item Response Theory
Dosch, Michael P. – ProQuest LLC, 2010
The general aim of the present retrospective study was to examine the test mode effect, that is, the difference in performance when tests are taken on computer (CBT) or by paper and pencil (PnP). The specific purpose was to examine the degree to which extensive practice in CBT in graduate students in nurse anesthesia would raise scores on a…
Descriptors: Feedback (Response), Graduate Students, Grade Point Average, Nurses