Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 1 |
Since 2016 (last 10 years) | 5 |
Since 2006 (last 20 years) | 5 |
Descriptor
Computer Assisted Testing | 5 |
Inferences | 5 |
Elementary School Students | 4 |
Reading Tests | 4 |
Reading Comprehension | 3 |
Scoring | 3 |
Cognitive Processes | 2 |
Diagnostic Tests | 2 |
Progress Monitoring | 2 |
Scores | 2 |
Test Validity | 2 |
Source
Grantee Submission | 5 |
Author
Biancarosa, Gina | 2 |
Carlson, Sarah E. | 2 |
Davison, Mark L. | 2 |
Seipel, Ben | 2 |
Bulut, Okan | 1 |
Butterfuss, Reese | 1 |
Cannon, Joanna E. | 1 |
Chan, Greta | 1 |
Clinton, Virginia | 1 |
Kendeou, Panayiota | 1 |
Kim, Jasmine | 1 |
Publication Type
Reports - Research | 5 |
Journal Articles | 2 |
Speeches/Meeting Papers | 1 |
Assessments and Surveys
Gates MacGinitie Reading Tests | 1 |
Magliano, Joseph P.; Lampi, Jodi P.; Ray, Melissa; Chan, Greta – Grantee Submission, 2020
Coherent mental models for successful comprehension require inferences that establish semantic "bridges" between discourse constituents and "elaborations" that incorporate relevant background knowledge. While it is established that individual differences in the extent to which postsecondary students engage in these processes…
Descriptors: Reading Comprehension, Reading Strategies, Inferences, Reading Tests
Kendeou, Panayiota; McMaster, Kristen L.; Butterfuss, Reese; Kim, Jasmine; Slater, Susan; Bulut, Okan – Grantee Submission, 2020
The overall aim of the current investigation was to develop and validate the initial version of the Minnesota Inference Assessment (MIA). MIA is a web-based measure of inference processes in K-2. MIA leverages the affordances of different media to evaluate inference processes in a nonreading context, using age-appropriate fiction and nonfiction…
Descriptors: Test Construction, Test Validity, Inferences, Computer Assisted Testing
Mercer, Sterett H.; Cannon, Joanna E. – Grantee Submission, 2022
We evaluated the validity of an automated approach to learning progress assessment (aLPA) for English written expression. Participants (n = 105) were students in Grades 2-12 who had parent-identified learning difficulties and received academic tutoring through a community-based organization. Participants completed narrative writing samples in the…
Descriptors: Elementary School Students, Secondary School Students, Learning Problems, Learning Disabilities
Davison, Mark L.; Biancarosa, Gina; Carlson, Sarah E.; Seipel, Ben; Liu, Bowen – Grantee Submission, 2018
The computer-administered Multiple-Choice Online Causal Comprehension Assessment (MOCCA) for Grades 3 to 5 has an innovative, 40-item multiple-choice structure in which each distractor corresponds to a comprehension process upon which poor comprehenders have been shown to rely. This structure requires revised thinking about measurement issues…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Pilot Projects, Measurement
Carlson, Sarah E.; Seipel, Ben; Biancarosa, Gina; Davison, Mark L.; Clinton, Virginia – Grantee Submission, 2019
This demonstration introduces and presents an innovative online cognitive diagnostic assessment, developed to identify the types of cognitive processes that readers use during comprehension; specifically, processes that distinguish between subtypes of struggling comprehenders. Cognitive diagnostic assessments are designed to provide valuable…
Descriptors: Reading Comprehension, Standardized Tests, Diagnostic Tests, Computer Assisted Testing