Showing all 3 results
Wang, Lu; Steedle, Jeffrey – ACT, Inc., 2020
In recent ACT mode comparability studies, students testing on laptop or desktop computers earned slightly higher scores on average than students who tested on paper, especially on the ACT® reading and English tests (Li et al., 2017). Equating procedures adjust for such "mode effects" to make ACT scores comparable regardless of testing…
Descriptors: Test Format, Reading Tests, Language Tests, English
Li, Dongmei; Yi, Qing; Harris, Deborah – ACT, Inc., 2017
In preparation for online administration of the ACT® test, ACT conducted studies to examine the comparability of scores between online and paper administrations, including a timing study in fall 2013, a mode comparability study in spring 2014, and a second mode comparability study in spring 2015. This report presents major findings from these…
Descriptors: College Entrance Examinations, Computer Assisted Testing, Comparative Analysis, Test Format
Peer reviewed
Kurby, Christopher A.; Magliano, Joseph P.; Dandotkar, Srikanth; Woehrle, James; Gilliam, Sara; McNamara, Danielle S. – Journal of Educational Computing Research, 2012
This study assessed whether and how self-explanation reading training, provided by iSTART (Interactive Strategy Training for Active Reading and Thinking), improves the effectiveness of comprehension processes. iSTART teaches students how to self-explain and which strategies will most effectively aid comprehension from moment-to-moment. We used…
Descriptors: Computer Assisted Testing, Federal Aid, Control Groups, Experimental Groups