Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 1
Since 2016 (last 10 years): 3
Since 2006 (last 20 years): 6
Descriptor
College Entrance Examinations: 9
Computer Assisted Testing: 9
Timed Tests: 9
Adaptive Testing: 3
College Students: 3
Comparative Analysis: 3
Mathematics Tests: 3
Multiple Choice Tests: 3
Reading Tests: 3
Test Format: 3
Test Items: 3
Source
ACT, Inc.: 2
College Board: 1
Educational Testing Service: 1
Educational and Psychological…: 1
Graduate Management Admission…: 1
Grantee Submission: 1
Author
Bridgeman, Brent: 2
Li, Dongmei: 2
Beigman Klebanov, Beata: 1
Buhr, Dianne C.: 1
Burstein, Jill: 1
Cline, Frederick: 1
Close, Catherine N.: 1
Davison, Mark L.: 1
Guo, Fanmin: 1
Han, Kyung T.: 1
Harris, Deborah: 1
Publication Type
Reports - Research: 9
Numerical/Quantitative Data: 2
Speeches/Meeting Papers: 2
Journal Articles: 1
Education Level
Higher Education: 6
Postsecondary Education: 5
Secondary Education: 2
High Schools: 1
Wang, Shichao; Li, Dongmei; Steedle, Jeffrey – ACT, Inc., 2021
Speeded tests set time limits so that few examinees can reach all items, and power tests allow most test-takers sufficient time to attempt all items. Educational achievement tests are sometimes described as "timed power tests" because the amount of time provided is intended to allow nearly all students to complete the test, yet this…
Descriptors: Timed Tests, Test Items, Achievement Tests, Testing
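The speeded-versus-power distinction in the entry above turns on how many examinees reach every item. A minimal sketch of that idea, assuming invented data and a 95% completion rule of thumb (the threshold is an illustration, not taken from the report):

```python
# Hypothetical sketch: classify a time limit as "power-like" or "speeded"
# by the share of examinees who reached all items. Data and the 0.95
# threshold are assumptions for illustration only.

def completion_rate(items_reached, total_items):
    """Fraction of examinees who attempted every item on the test."""
    finished = sum(1 for n in items_reached if n >= total_items)
    return finished / len(items_reached)

def classify(rate, power_threshold=0.95):
    """Label the administration based on the completion rate."""
    return "power-like" if rate >= power_threshold else "speeded"

# Invented data: items reached by ten examinees on a 60-item test.
reached = [60, 60, 58, 60, 60, 60, 60, 59, 60, 60]
rate = completion_rate(reached, 60)
print(rate, classify(rate))  # 0.8 "speeded"
```

Under this sketch, only 8 of 10 examinees finish, so the time limit would be called speeded rather than power-like.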
Li, Dongmei; Yi, Qing; Harris, Deborah – ACT, Inc., 2017
In preparation for online administration of the ACT® test, ACT conducted studies to examine the comparability of scores between online and paper administrations, including a timing study in fall 2013, a mode comparability study in spring 2014, and a second mode comparability study in spring 2015. This report presents major findings from these…
Descriptors: College Entrance Examinations, Computer Assisted Testing, Comparative Analysis, Test Format
Burstein, Jill; McCaffrey, Dan; Beigman Klebanov, Beata; Ling, Guangming – Grantee Submission, 2017
No significant body of research examines writing achievement and the specific skills and knowledge in the writing domain for postsecondary (college) students in the U.S., even though many at-risk students lack the prerequisite writing skills required to persist in their education. This paper addresses this gap through a novel…
Descriptors: Computer Software, Writing Evaluation, Writing Achievement, College Students
Davison, Mark L.; Semmes, Robert; Huang, Lan; Close, Catherine N. – Educational and Psychological Measurement, 2012
Data from 181 college students were used to assess whether math reasoning item response times in computerized testing can provide valid and reliable measures of a speed dimension. The alternate forms reliability of the speed dimension was .85. A two-dimensional structural equation model suggests that the speed dimension is related to the accuracy…
Descriptors: Computer Assisted Testing, Reaction Time, Reliability, Validity
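The .85 alternate-forms reliability cited above is conventionally estimated as the Pearson correlation between scores on two parallel forms. A self-contained sketch with invented speed scores (the data are not from Davison et al.):

```python
# Hedged illustration: alternate-forms reliability as the Pearson
# correlation between two parallel forms. Scores below are invented.
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical speed scores on two alternate forms for six examinees.
form_a = [1.2, 0.8, 1.5, 1.1, 0.9, 1.4]
form_b = [1.1, 0.9, 1.6, 1.0, 0.8, 1.5]
print(round(pearson_r(form_a, form_b), 2))
```

A high correlation between forms indicates that the two forms rank examinees on the speed dimension consistently, which is what the reliability coefficient summarizes.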
Talento-Miller, Eileen; Guo, Fanmin; Han, Kyung T. – Graduate Management Admission Council, 2012
When power tests include a time limit, it is important to assess the possibility of "speededness" for examinees. Past research on differential speededness has examined gender and ethnic subgroups in the United States on paper-and-pencil tests. The needs of a global audience necessitated, and the availability of computer…
Descriptors: College Entrance Examinations, Graduate Study, Business Administration Education, Timed Tests
Bridgeman, Brent; Laitusis, Cara Cahalan; Cline, Frederick – College Board, 2007
The current study used three data sources to estimate time requirements for different item types on the now current SAT Reasoning Test™. First, we estimated times from a computer-adaptive version of the SAT® (SAT CAT) that automatically recorded item times. Second, we observed students as they answered SAT questions under strict time limits and…
Descriptors: College Entrance Examinations, Test Items, Thinking Skills, Computer Assisted Testing
Bridgeman, Brent; McBride, Amanda; Monaghan, William – Educational Testing Service, 2004
Imposing time limits on tests can serve a range of important functions. Time limits are essential, for example, if speed of performance is an integral component of what is being measured, as would be the case when testing such skills as how quickly someone can type. Limiting testing time also helps contain expenses associated with test…
Descriptors: Computer Assisted Testing, Timed Tests, Test Results, Aptitude Tests
Schnipke, Deborah L.; Scrams, David J. – 1999
The availability of item response times made possible by computerized testing represents an entirely new type of information about test items. This study explores the issue of how to represent response-time information in item banks. Empirical response-time distribution functions can be fit with statistical distribution functions with known…
Descriptors: Adaptive Testing, Admission (School), Arithmetic, College Entrance Examinations
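Fitting empirical response-time distributions with known statistical distributions, as the entry above describes, is often done with a lognormal model. A sketch under assumptions (simulated times, location fixed at zero, so the maximum-likelihood parameters are just the mean and standard deviation of the log response times):

```python
# Illustrative sketch, not the study's method: fit a lognormal to
# simulated item response times. With the location parameter fixed at
# zero, the MLE is the mean and SD of the log-transformed times.
import math
import random

def fit_lognormal(times):
    """Return (mu, sigma): MLE parameters of a lognormal with loc = 0."""
    logs = [math.log(t) for t in times]
    mu = sum(logs) / len(logs)
    var = sum((x - mu) ** 2 for x in logs) / len(logs)
    return mu, math.sqrt(var)

random.seed(0)
# Simulated response times (seconds) drawn from lognormal(mu=3.4, sigma=0.5).
times = [random.lognormvariate(3.4, 0.5) for _ in range(5000)]
mu_hat, sigma_hat = fit_lognormal(times)
print(round(mu_hat, 2), round(sigma_hat, 2))  # estimates near 3.4 and 0.5
```

Storing the fitted (mu, sigma) pair per item is far more compact than retaining the full empirical distribution, which is one motivation for representing response-time information this way in item banks.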
Legg, Sue M.; Buhr, Dianne C. – 1990
Possible causes of a 16-point mean score increase for the computer adaptive form of the College Level Academic Skills Test (CLAST) in reading over the paper-and-pencil test (PPT) in reading are examined. The adaptive form of the CLAST was used in a state-wide field test in which reading, writing, and computation scores for approximately 1,000…
Descriptors: Adaptive Testing, College Entrance Examinations, Community Colleges, Comparative Testing