Joshua B. Gilbert; Luke W. Miratrix; Mridul Joshi; Benjamin W. Domingue – Journal of Educational and Behavioral Statistics, 2025
Analyzing heterogeneous treatment effects (HTEs) plays a crucial role in understanding the impacts of educational interventions. A standard practice for HTE analysis is to examine interactions between treatment status and preintervention participant characteristics, such as pretest scores, to identify how different groups respond to treatment.…
Descriptors: Causal Models, Item Response Theory, Statistical Inference, Psychometrics
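The "standard practice" this abstract describes, examining interactions between treatment status and pretest scores, amounts to a moderated regression. The following is a minimal sketch of that idea in Python with statsmodels; the simulated data and variable names (treat, pretest, posttest) are hypothetical illustrations, not the authors' analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate a simple pretest/posttest experiment (hypothetical data).
rng = np.random.default_rng(0)
n = 500
pretest = rng.normal(size=n)
treat = rng.integers(0, 2, size=n)
# Build in a treatment effect that grows with pretest, i.e., a heterogeneous effect.
posttest = 0.30 * treat + 0.50 * pretest + 0.20 * treat * pretest + rng.normal(size=n)
df = pd.DataFrame({"posttest": posttest, "treat": treat, "pretest": pretest})

# Standard HTE analysis: interact treatment status with the pre-intervention covariate.
# The treat:pretest coefficient estimates how the treatment effect varies with pretest.
fit = smf.ols("posttest ~ treat * pretest", data=df).fit()
print(fit.summary().tables[1])
```

A positive treat:pretest coefficient here would indicate larger treatment effects for students with higher pretest scores; the paper's contribution concerns how measurement properties of the test complicate this kind of inference.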
Joshua B. Gilbert; Luke W. Miratrix; Mridul Joshi; Benjamin W. Domingue – Annenberg Institute for School Reform at Brown University, 2024
Analyzing heterogeneous treatment effects (HTE) plays a crucial role in understanding the impacts of educational interventions. A standard practice for HTE analysis is to examine interactions between treatment status and pre-intervention participant characteristics, such as pretest scores, to identify how different groups respond to treatment.…
Descriptors: Causal Models, Item Response Theory, Statistical Inference, Psychometrics
St. Clair, Travis; Hallberg, Kelly; Cook, Thomas D. – Journal of Educational and Behavioral Statistics, 2016
We explore the conditions under which short, comparative interrupted time-series (CITS) designs represent valid alternatives to randomized experiments in educational evaluations. To do so, we conduct three within-study comparisons, each of which uses a unique data set to test the validity of the CITS design by comparing its causal estimates to…
Descriptors: Research Methodology, Randomized Controlled Trials, Comparative Analysis, Time
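A comparative interrupted time-series design of the kind this abstract evaluates is commonly estimated with a regression that allows the treated series to deviate from the comparison series in level (and, in fuller specifications, slope) after the intervention. Below is a minimal sketch under that assumption, again in Python with statsmodels and hypothetical simulated data; it is not the authors' estimation code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated panel: yearly mean scores for one treated and one comparison series.
rng = np.random.default_rng(1)
years = np.arange(2008, 2016)
cutoff = 2012  # hypothetical year the intervention begins
rows = []
for treated in (0, 1):  # 0 = comparison series, 1 = treated series
    for t in years:
        post = int(t >= cutoff)
        # Shared pre-intervention trend plus a 0.4 level shift for the treated series.
        score = 0.05 * (t - years[0]) + 0.40 * treated * post + rng.normal(scale=0.1)
        rows.append({"score": score, "year": t - cutoff, "post": post, "treated": treated})
df = pd.DataFrame(rows)

# Short CITS regression: group-specific baseline trends plus a group-specific
# post-intervention level change. The post:treated coefficient is the causal estimate.
fit = smf.ols("score ~ year * treated + post * treated", data=df).fit()
print(fit.params["post:treated"])
```

The within-study comparisons described in the abstract ask whether estimates like post:treated from such short CITS models reproduce the benchmark estimates from randomized experiments run on the same populations.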