Showing all 8 results
Joshua B. Gilbert; James S. Kim; Luke W. Miratrix – Annenberg Institute for School Reform at Brown University, 2022
Analyses that reveal how treatment effects vary allow researchers, practitioners, and policymakers to better understand the efficacy of educational interventions. In practice, however, standard statistical methods for detecting Heterogeneous Treatment Effects (HTE) fail to capture the HTE that may exist within outcome measures. In this study, we…
Descriptors: Item Response Theory, Models, Formative Evaluation, Statistical Inference
Kendeou, Panayiota; McMaster, Kristen L.; Butterfuss, Reese; Kim, Jasmine; Slater, Susan; Bulut, Okan – Assessment for Effective Intervention, 2021
The overall aim of the current investigation was to develop and validate the initial version of the Minnesota Inference Assessment (MIA). MIA is a web-based measure of inference processes in Grades K-2. MIA leverages the affordances of different media to evaluate inference processes in a nonreading context, using age-appropriate fiction and…
Descriptors: Test Construction, Test Validity, Inferences, Computer Assisted Testing
Kendeou, Panayiota; McMaster, Kristen L.; Butterfuss, Reese; Kim, Jasmine; Slater, Susan; Bulut, Okan – Grantee Submission, 2020
The overall aim of the current investigation was to develop and validate the initial version of the Minnesota Inference Assessment (MIA). MIA is a web-based measure of inference processes in K-2. MIA leverages the affordances of different media to evaluate inference processes in a nonreading context, using age-appropriate fiction and nonfiction…
Descriptors: Test Construction, Test Validity, Inferences, Computer Assisted Testing
Sterett H. Mercer; Joanna E. Cannon – Grantee Submission, 2022
We evaluated the validity of an automated approach to learning progress assessment (aLPA) for English written expression. Participants (n = 105) were students in Grades 2-12 who had parent-identified learning difficulties and received academic tutoring through a community-based organization. Participants completed narrative writing samples in the…
Descriptors: Elementary School Students, Secondary School Students, Learning Problems, Learning Disabilities
Peer reviewed
Auphan, Pauline; Ecalle, Jean; Magnan, Annie – European Journal of Psychology of Education, 2019
Reading difficulties in school are very challenging for teachers because many different reader subtypes can be present within a single class. Moreover, there are few easy-to-use tools enabling teachers to assess reading ability. According to the Simple View of Reading (Hoover and Gough in "Reading and Writing", 2(2), 127-160, 1990), efficient…
Descriptors: Reading Difficulties, Computer Assisted Testing, Reading Tests, Reading Comprehension
Davison, Mark L.; Biancarosa, Gina; Carlson, Sarah E.; Seipel, Ben; Liu, Bowen – Assessment for Effective Intervention, 2018
The computer-administered Multiple-Choice Online Causal Comprehension Assessment (MOCCA) for Grades 3 to 5 has an innovative, 40-item multiple-choice structure in which each distractor corresponds to a comprehension process upon which poor comprehenders have been shown to rely. This structure requires revised thinking about measurement issues…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Pilot Projects, Measurement
Davison, Mark L.; Biancarosa, Gina; Carlson, Sarah E.; Seipel, Ben; Liu, Bowen – Grantee Submission, 2018
The computer-administered Multiple-Choice Online Causal Comprehension Assessment (MOCCA) for Grades 3 to 5 has an innovative, 40-item multiple-choice structure in which each distractor corresponds to a comprehension process upon which poor comprehenders have been shown to rely. This structure requires revised thinking about measurement issues…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Pilot Projects, Measurement
Peer reviewed
Carlson, Sarah E.; Seipel, Ben; Biancarosa, Gina; Davison, Mark L.; Clinton, Virginia – Grantee Submission, 2019
This demonstration introduces and presents an innovative online cognitive diagnostic assessment, developed to identify the types of cognitive processes that readers use during comprehension; specifically, processes that distinguish between subtypes of struggling comprehenders. Cognitive diagnostic assessments are designed to provide valuable…
Descriptors: Reading Comprehension, Standardized Tests, Diagnostic Tests, Computer Assisted Testing