Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 2 |
Since 2016 (last 10 years) | 6 |
Since 2006 (last 20 years) | 9 |
Source
ACT, Inc. | 9 |
Author
Steedle, Jeffrey | 3 |
Li, Dongmei | 2 |
Allen, Jeff | 1 |
Chang, Hua-Hua | 1 |
Cho, YoungWoo | 1 |
Croft, Michelle | 1 |
Gao, Xiaohong | 1 |
Haisfield, Lisa | 1 |
Harris, Deborah | 1 |
Lottridge, Susan | 1 |
Mattern, Krista | 1 |
Publication Type
Reports - Research | 7 |
Numerical/Quantitative Data | 2 |
Information Analyses | 1 |
Reports - Descriptive | 1 |
Education Level
Higher Education | 6 |
Postsecondary Education | 5 |
High Schools | 1 |
Secondary Education | 1 |
Audience
Policymakers | 1 |
Assessments and Surveys
ACT Assessment | 6 |
COMPASS (Computer Assisted… | 1 |
Wang, Shichao; Li, Dongmei; Steedle, Jeffrey – ACT, Inc., 2021
Speeded tests set time limits so that few examinees can reach all items, and power tests allow most test-takers sufficient time to attempt all items. Educational achievement tests are sometimes described as "timed power tests" because the amount of time provided is intended to allow nearly all students to complete the test, yet this…
Descriptors: Timed Tests, Test Items, Achievement Tests, Testing
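The speeded/power distinction in this abstract is often checked with simple completion statistics. The sketch below is a minimal illustration, not the analysis from the report: it assumes a scored response matrix in which unreached items are coded as NaN, and computes two rough speededness indicators.

```python
import numpy as np

# Hypothetical response matrix: rows are examinees, columns are items.
# Unreached items are coded as np.nan; answered items are scored 0/1.
responses = np.array([
    [1, 0, 1, 1, np.nan],
    [1, 1, 1, 1, 1],
    [0, 1, 1, np.nan, np.nan],
])

# Proportion of examinees who reached (responded to) the final item,
# a common rough indicator of how speeded a test is.
reached_last = np.mean(~np.isnan(responses[:, -1]))

# Proportion of examinees who responded to every item.
completed_all = np.mean(~np.isnan(responses).any(axis=1))

print(f"Reached last item: {reached_last:.2f}")     # 1 of 3 examinees
print(f"Completed all items: {completed_all:.2f}")  # 1 of 3 examinees
```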
Wood, Scott; Yao, Erin; Haisfield, Lisa; Lottridge, Susan – ACT, Inc., 2021
For assessment professionals who are also automated scoring (AS) professionals, there is no single set of standards of best practice. This paper reviews the assessment and AS literature to identify key standards of best practice and ethical behavior for AS professionals and codifies those standards in a single resource. Having a unified set of AS…
Descriptors: Standards, Best Practices, Computer Assisted Testing, Scoring
Wang, Lu; Steedle, Jeffrey – ACT, Inc., 2020
In recent ACT mode comparability studies, students testing on laptop or desktop computers earned slightly higher scores on average than students who tested on paper, especially on the ACT® reading and English tests (Li et al., 2017). Equating procedures adjust for such "mode effects" to make ACT scores comparable regardless of testing…
Descriptors: Test Format, Reading Tests, Language Tests, English
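As a rough illustration of the kind of adjustment this abstract refers to, the sketch below applies a simple mean (linear constant) equating shift so that online scores match the paper mean. The data and function name are hypothetical; ACT's operational equating uses more sophisticated methods (e.g., equipercentile or IRT-based equating).

```python
import numpy as np

def mean_equate(online_scores, paper_scores):
    """Shift online scores by the observed difference in mode means.

    Mean equating is the simplest member of the equating family and is
    shown here only to make the idea of a mode-effect adjustment concrete.
    """
    shift = np.mean(paper_scores) - np.mean(online_scores)
    return online_scores + shift

# Hypothetical score samples for the two administration modes.
paper = np.random.default_rng(0).normal(20.0, 5.0, size=1000)
online = np.random.default_rng(1).normal(20.6, 5.0, size=1000)  # small mode effect

adjusted = mean_equate(online, paper)
print(round(np.mean(online) - np.mean(paper), 2))    # gap before adjustment
print(round(np.mean(adjusted) - np.mean(paper), 2))  # approximately 0 after adjustment
```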
Westrick, Paul; Mattern, Krista – ACT, Inc., 2018
Almost two-thirds of students entering a community college and a third of students entering a 4-year college lack basic math and writing skills, and they often find themselves placed in developmental or remedial courses in their first year of college. Unfortunately, students placed into remedial math and English courses often have poorer…
Descriptors: College Readiness, College Freshmen, Student Placement, Remedial Mathematics
Steedle, Jeffrey; Pashley, Peter; Cho, YoungWoo – ACT, Inc., 2020
Three mode comparability studies were conducted on the following Saturday national ACT test dates: October 26, 2019, December 14, 2019, and February 8, 2020. The primary goal of these studies was to evaluate whether ACT scores exhibited mode effects between paper and online testing that would necessitate statistical adjustments to the online…
Descriptors: Test Format, Computer Assisted Testing, College Entrance Examinations, Scores
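One common way to judge whether scores "exhibit mode effects" large enough to warrant adjustment is to compute a standardized mean difference between modes. The sketch below is illustrative only, on simulated data, and is not the analysis reported in these studies.

```python
import numpy as np

def standardized_mean_difference(a, b):
    """Cohen's d using a pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    return (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var)

# Hypothetical paper and online score samples.
rng = np.random.default_rng(42)
paper = rng.normal(20.0, 5.0, 800)
online = rng.normal(20.3, 5.0, 800)

d = standardized_mean_difference(online, paper)
print(f"Mode effect (Cohen's d): {d:.3f}")  # values near 0 suggest a negligible mode effect
```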
Croft, Michelle – ACT, Inc., 2014
Given the transition from paper-and-pencil to computer administration, policymakers must update test security laws and policies to reflect the new threats to administration. There will likely always be a need for paper-and-pencil test administration security laws and policies, given that paper test copies may be needed as an accommodation or may…
Descriptors: Computer Assisted Testing, Computer Security, Information Security, Laws
Zheng, Yi; Nozawa, Yuki; Gao, Xiaohong; Chang, Hua-Hua – ACT, Inc., 2012
Multistage adaptive tests (MSTs) have gained increasing popularity in recent years. MST is a balanced compromise between linear test forms (i.e., paper-and-pencil testing and computer-based testing) and traditional item-level computer-adaptive testing (CAT). It combines the advantages of both. On one hand, MST is adaptive (and therefore more…
Descriptors: Adaptive Testing, Heuristics, Accuracy, Item Banks
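To make the "adaptive between stages" idea concrete, here is a toy two-stage MST router: every examinee takes a routing module, then branches to an easy, medium, or hard second-stage module based on the stage-1 number-correct score. The module names and cutoffs are invented for illustration and are not from the paper.

```python
# Toy two-stage multistage test (MST) router.
# Stage 1: everyone takes the same routing module.
# Stage 2: examinees branch to one of three modules based on the stage-1 score.

STAGE2_MODULES = {
    "easy": ["E1", "E2", "E3"],
    "medium": ["M1", "M2", "M3"],
    "hard": ["H1", "H2", "H3"],
}

def route(stage1_score: int, cut_low: int = 4, cut_high: int = 7) -> str:
    """Return the stage-2 module label for a stage-1 number-correct score."""
    if stage1_score < cut_low:
        return "easy"
    if stage1_score < cut_high:
        return "medium"
    return "hard"

for score in (2, 5, 9):
    module = route(score)
    print(f"Stage-1 score {score} -> stage-2 '{module}' module: {STAGE2_MODULES[module]}")
```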
Li, Dongmei; Yi, Qing; Harris, Deborah – ACT, Inc., 2017
In preparation for online administration of the ACT® test, ACT conducted studies to examine the comparability of scores between online and paper administrations, including a timing study in fall 2013, a mode comparability study in spring 2014, and a second mode comparability study in spring 2015. This report presents major findings from these…
Descriptors: College Entrance Examinations, Computer Assisted Testing, Comparative Analysis, Test Format
Westrick, Paul A.; Allen, Jeff – ACT, Inc., 2014
We examined the validity of using Compass® test scores and high school grade point average (GPA) for placing students in first-year college courses and for identifying students at risk of not succeeding. Consistent with other research, the combination of high school GPA and Compass scores performed better than either measure used alone. Results…
Descriptors: Grade Point Average, College Readiness, College Entrance Examinations, Computer Assisted Testing
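The finding that high school GPA and test scores predict success better jointly than alone is the kind of result a simple model comparison can illustrate. The sketch below fits logistic regressions on simulated data and compares in-sample AUCs; the variables, coefficients, and outcome definition are hypothetical and are not the models or data from the report.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Simulated data: high school GPA, a placement test score, and a binary
# success outcome (e.g., earning a B or higher in the first-year course).
rng = np.random.default_rng(0)
n = 2000
gpa = rng.normal(3.0, 0.5, n)
test = rng.normal(50, 10, n) + 5 * (gpa - 3.0)        # test score correlated with GPA
logit = -6 + 1.2 * gpa + 0.06 * test
success = rng.random(n) < 1 / (1 + np.exp(-logit))

def auc_for(features):
    """Fit a logistic regression and return its in-sample AUC."""
    model = LogisticRegression(max_iter=1000).fit(features, success)
    return roc_auc_score(success, model.predict_proba(features)[:, 1])

print("GPA only:  ", round(auc_for(gpa.reshape(-1, 1)), 3))
print("Test only: ", round(auc_for(test.reshape(-1, 1)), 3))
print("GPA + test:", round(auc_for(np.column_stack([gpa, test])), 3))
```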