Showing all 3 results
Wang, Lu; Steedle, Jeffrey – ACT, Inc., 2020
In recent ACT mode comparability studies, students testing on laptop or desktop computers earned slightly higher scores on average than students who tested on paper, especially on the ACT® reading and English tests (Li et al., 2017). Equating procedures adjust for such "mode effects" to make ACT scores comparable regardless of testing…
Descriptors: Test Format, Reading Tests, Language Tests, English
Li, Dongmei; Yi, Qing; Harris, Deborah – ACT, Inc., 2017
In preparation for online administration of the ACT® test, ACT conducted studies to examine the comparability of scores between online and paper administrations, including a timing study in fall 2013, a mode comparability study in spring 2014, and a second mode comparability study in spring 2015. This report presents major findings from these…
Descriptors: College Entrance Examinations, Computer Assisted Testing, Comparative Analysis, Test Format
Peer reviewed
Full-text PDF available on ERIC
von Davier, Alina A., Ed.; Liu, Mei, Ed. – ETS Research Report Series, 2006
This report builds on and extends existing research on population invariance to new tests and issues. The authors lay the foundation for a deeper understanding of the use of population invariance measures in a wide variety of practical contexts. The invariance of linear, equipercentile, and IRT equating methods is examined using data from five…
Descriptors: Equated Scores, Statistical Analysis, Data Collection, Test Format