Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 3
Since 2006 (last 20 years): 4
Descriptor
Computer Assisted Testing: 4
Raw Scores: 4
Statistical Analysis: 4
Item Response Theory: 3
Multiple Choice Tests: 3
Reading Tests: 3
Construct Validity: 2
Diagnostic Tests: 2
Elementary School Students: 2
Error Patterns: 2
Factor Analysis: 2
Author
Biancarosa, Gina: 2
Carlson, Sarah E.: 2
Davison, Mark L.: 2
Liu, Bowen: 2
Seipel, Ben: 2
Arce-Ferrer, Alvaro J.: 1
Guzman, Elvira Martinez: 1
Harris, Deborah: 1
Li, Dongmei: 1
Yi, Qing: 1
Publication Type
Reports - Research: 4
Journal Articles: 3
Numerical/Quantitative Data: 1
Education Level
Elementary Education: 2
High Schools: 1
Higher Education: 1
Location
Mexico: 1
Assessments and Surveys
ACT Assessment: 1
Davison, Mark L.; Biancarosa, Gina; Carlson, Sarah E.; Seipel, Ben; Liu, Bowen – Assessment for Effective Intervention, 2018
The computer-administered Multiple-Choice Online Causal Comprehension Assessment (MOCCA) for Grades 3 to 5 has an innovative, 40-item multiple-choice structure in which each distractor corresponds to a comprehension process upon which poor comprehenders have been shown to rely. This structure requires revised thinking about measurement issues…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Pilot Projects, Measurement
Davison, Mark L.; Biancarosa, Gina; Carlson, Sarah E.; Seipel, Ben; Liu, Bowen – Grantee Submission, 2018
Li, Dongmei; Yi, Qing; Harris, Deborah – ACT, Inc., 2017
In preparation for online administration of the ACT® test, ACT conducted studies to examine the comparability of scores between online and paper administrations, including a timing study in fall 2013, a mode comparability study in spring 2014, and a second mode comparability study in spring 2015. This report presents major findings from these…
Descriptors: College Entrance Examinations, Computer Assisted Testing, Comparative Analysis, Test Format
Arce-Ferrer, Alvaro J.; Guzman, Elvira Martinez – Educational and Psychological Measurement, 2009
This study investigates the effect of mode of administration of the Raven Standard Progressive Matrices test on the distribution, accuracy, and meaning of raw scores. A random sample of high school students took counterbalanced paper-and-pencil and computer-based administrations of the test and answered a questionnaire surveying preferences for…
Descriptors: Factor Analysis, Raw Scores, Statistical Analysis, Computer Assisted Testing