Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 3
Since 2016 (last 10 years): 5
Since 2006 (last 20 years): 5
Descriptor
Comparative Analysis: 5
Multiple Choice Tests: 5
Test Format: 5
Item Response Theory: 3
Scores: 3
Test Items: 3
Difficulty Level: 2
Reading Tests: 2
Science Tests: 2
Student Evaluation: 2
Vignettes: 2
Source
Grantee Submission: 5
Author
DeBoer, George E.: 3
Hardcastle, Joseph: 3
Herrmann-Abell, Cari F.: 3
Hildenbrand, Lena: 1
McCarthy, Kathryn S.: 1
McNamara, Danielle S.: 1
O'Reilly, Tenaha: 1
Sabatini, John: 1
Wang, Zuowei: 1
Wiley, Jennifer: 1
Publication Type
Reports - Research: 5
Speeches/Meeting Papers: 4
Education Level
Elementary Education: 1
High Schools: 1
Higher Education: 1
Postsecondary Education: 1
Secondary Education: 1
Location
California: 1
Idaho: 1
Oklahoma: 1
Laws, Policies, & Programs
Assessments and Surveys
ACT Assessment: 1
Flesch Kincaid Grade Level…: 1
Gates MacGinitie Reading Tests: 1
Herrmann-Abell, Cari F.; Hardcastle, Joseph; DeBoer, George E. – Grantee Submission, 2022
As implementation of the "Next Generation Science Standards" moves forward, there is a need for new assessments that can measure students' integrated three-dimensional science learning. The National Research Council has suggested that these assessments be multicomponent tasks that utilize a combination of item formats including…
Descriptors: Multiple Choice Tests, Conditioning, Test Items, Item Response Theory
Herrmann-Abell, Cari F.; Hardcastle, Joseph; DeBoer, George E. – Grantee Submission, 2019
The "Next Generation Science Standards" calls for new assessments that measure students' integrated three-dimensional science learning. The National Research Council has suggested that these assessments utilize a combination of item formats including constructed-response and multiple-choice. In this study, students were randomly assigned…
Descriptors: Science Tests, Multiple Choice Tests, Test Format, Test Items
Wang, Zuowei; O'Reilly, Tenaha; Sabatini, John; McCarthy, Kathryn S.; McNamara, Danielle S. – Grantee Submission, 2021
We compared high school students' performance in a traditional comprehension assessment requiring them to identify key information and draw inferences from single texts, and a scenario-based assessment (SBA) requiring them to integrate, evaluate and apply information across multiple sources. Both assessments focused on a non-academic topic.…
Descriptors: Comparative Analysis, High School Students, Inferences, Reading Tests
Hildenbrand, Lena; Wiley, Jennifer – Grantee Submission, 2021
Many studies have demonstrated that testing students on to-be-learned materials can be an effective learning activity. However, past studies have also shown that some practice test formats are more effective than others. Open-ended recall or short answer practice tests may be effective because the questions prompt deeper processing as students…
Descriptors: Test Format, Outcomes of Education, Cognitive Processes, Learning Activities
Hardcastle, Joseph; Herrmann-Abell, Cari F.; DeBoer, George E. – Grantee Submission, 2017
Can student performance on computer-based tests (CBT) and paper-and-pencil tests (PPT) be considered equivalent measures of student knowledge? States and school districts are grappling with this question, and although studies addressing this question are growing, additional research is needed. We report on the performance of students who took…
Descriptors: Academic Achievement, Computer Assisted Testing, Comparative Analysis, Student Evaluation