Showing all 8 results
Peer reviewed
Joshua B. Gilbert; Luke W. Miratrix; Mridul Joshi; Benjamin W. Domingue – Journal of Educational and Behavioral Statistics, 2025
Analyzing heterogeneous treatment effects (HTEs) plays a crucial role in understanding the impacts of educational interventions. A standard practice for HTE analysis is to examine interactions between treatment status and preintervention participant characteristics, such as pretest scores, to identify how different groups respond to treatment.…
Descriptors: Causal Models, Item Response Theory, Statistical Inference, Psychometrics
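The abstract above describes the standard interaction-based approach to HTE analysis. A minimal sketch of that approach is shown below, assuming simulated data and hypothetical variable names (y, treat, pretest); it is not code from the cited study.

```python
# Minimal HTE sketch: regress the outcome on treatment, a pretest score, and
# their interaction. Data and variable names are simulated/hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
pretest = rng.normal(size=n)            # pre-intervention covariate
treat = rng.integers(0, 2, size=n)      # randomized treatment indicator
# Outcome generated with a treatment-by-pretest interaction (the HTE of interest)
y = 0.3 * treat + 0.5 * pretest + 0.2 * treat * pretest + rng.normal(size=n)
df = pd.DataFrame({"y": y, "treat": treat, "pretest": pretest})

# The coefficient on treat:pretest estimates how the treatment effect
# varies with the pre-intervention characteristic.
model = smf.ols("y ~ treat * pretest", data=df).fit()
print(model.summary())
```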
Joshua B. Gilbert; Luke W. Miratrix; Mridul Joshi; Benjamin W. Domingue – Annenberg Institute for School Reform at Brown University, 2024
Analyzing heterogeneous treatment effects (HTE) plays a crucial role in understanding the impacts of educational interventions. A standard practice for HTE analysis is to examine interactions between treatment status and pre-intervention participant characteristics, such as pretest scores, to identify how different groups respond to treatment.…
Descriptors: Causal Models, Item Response Theory, Statistical Inference, Psychometrics
Peer reviewed
Kraft, Matthew A. – Educational Researcher, 2020
Researchers commonly interpret effect sizes by applying benchmarks proposed by Jacob Cohen over a half century ago. However, effects that are small by Cohen's standards are large relative to the impacts of most field-based interventions. These benchmarks also fail to consider important differences in study features, program costs, and scalability.…
Descriptors: Effect Size, Benchmarking, Educational Research, Intervention
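As a concrete illustration of the benchmarks Kraft critiques, the sketch below computes Cohen's d (the standardized mean difference) for two simulated groups and labels it with the conventional 0.2/0.5/0.8 cutoffs; the data are hypothetical, not results from the paper.

```python
# Hedged illustration: Cohen's d for two simulated groups, compared against
# the conventional 0.2 / 0.5 / 0.8 benchmarks discussed in the abstract.
import numpy as np

def cohens_d(group1, group2):
    """Standardized mean difference using the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    pooled_var = ((n1 - 1) * np.var(group1, ddof=1) +
                  (n2 - 1) * np.var(group2, ddof=1)) / (n1 + n2 - 2)
    return (np.mean(group1) - np.mean(group2)) / np.sqrt(pooled_var)

rng = np.random.default_rng(1)
treated = rng.normal(loc=0.15, scale=1.0, size=200)   # hypothetical treatment group
control = rng.normal(loc=0.00, scale=1.0, size=200)   # hypothetical control group

d = cohens_d(treated, control)
if abs(d) < 0.2:
    label = "below Cohen's 'small' benchmark"
elif abs(d) < 0.5:
    label = "small"
elif abs(d) < 0.8:
    label = "medium"
else:
    label = "large"
print(f"d = {d:.2f} ({label} by Cohen's conventions)")
```

Kraft's argument is that an effect labeled "small" by these conventions can still be large relative to the impacts of most field-based educational interventions, so the label alone is a poor guide to practical importance.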
Kraft, Matthew A. – Annenberg Institute for School Reform at Brown University, 2019
Researchers commonly interpret effect sizes by applying benchmarks proposed by Cohen over a half century ago. However, effects that are small by Cohen's standards are large relative to the impacts of most field-based interventions. These benchmarks also fail to consider important differences in study features, program costs, and scalability. In…
Descriptors: Data Interpretation, Effect Size, Intervention, Benchmarking
Peer reviewed
Taylor, Joseph; Kowalski, Susan; Stuhlsatz, Molly; Wilson, Christopher; Spybrook, Jessaca – Society for Research on Educational Effectiveness, 2013
The purpose of this paper is to use both conceptual and statistical approaches to explore publication bias in recent causal effects studies in science education, and to draw from this exploration implications for researchers, journal reviewers, and journal editors. This paper fills a void in the "science education" literature as no…
Descriptors: Science Education, Influences, Bias, Statistical Analysis
Peer reviewed
T. R. Kratochwill; J. Hitchcock; R. H. Horner; J. R. Levin; S. L. Odom; D. M. Rindskopf; W. R. Shadish – What Works Clearinghouse, 2010
In an effort to expand the pool of scientific evidence available for review, the What Works Clearinghouse (WWC) assembled a panel of national experts in single-case design (SCD) and analysis to draft SCD Standards. SCDs are adaptations of interrupted time-series designs and can provide a rigorous experimental evaluation of intervention effects.…
Descriptors: Research Methodology, Standards, Causal Models, Intervention
Peer reviewed
Schochet, Peter Z.; Puma, Mike; Deke, John – National Center for Education Evaluation and Regional Assistance, 2014
This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed…
Descriptors: Statistical Analysis, Evaluation Methods, Educational Research, Intervention
Peer reviewed
Nowak, Christoph; Heinrichs, Nina – Clinical Child and Family Psychology Review, 2008
A meta-analysis encompassing all studies evaluating the impact of the Triple P-Positive Parenting Program on parent and child outcome measures was conducted in an effort to identify variables that moderate the program's effectiveness. Hierarchical linear models (HLM) with three levels of data were employed to analyze effect sizes. The results (N =…
Descriptors: Intervention, Parent Education, Child Rearing, Program Effectiveness
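The cited meta-analysis pooled effect sizes with a three-level hierarchical linear model. As a much simpler stand-in, the sketch below shows a basic DerSimonian-Laird random-effects pooling of hypothetical study-level effect sizes; it illustrates the core idea of inverse-variance weighting with between-study variation, but it is not the model used in the paper, and all numbers are made up.

```python
# Simplified, hypothetical sketch: DerSimonian-Laird random-effects pooling.
# The cited study used a three-level HLM; this two-level version only
# illustrates weighting study effect sizes while allowing between-study variance.
import numpy as np

# Hypothetical study-level effect sizes (d) and their sampling variances.
d = np.array([0.35, 0.50, 0.20, 0.65, 0.40])
v = np.array([0.02, 0.03, 0.015, 0.05, 0.025])

# Fixed-effect weights and heterogeneity statistic Q.
w = 1.0 / v
d_fixed = np.sum(w * d) / np.sum(w)
Q = np.sum(w * (d - d_fixed) ** 2)
k = len(d)

# DerSimonian-Laird estimate of between-study variance (tau^2).
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (k - 1)) / c)

# Random-effects pooled estimate and its standard error.
w_star = 1.0 / (v + tau2)
d_re = np.sum(w_star * d) / np.sum(w_star)
se_re = np.sqrt(1.0 / np.sum(w_star))
print(f"pooled d = {d_re:.2f} (SE = {se_re:.2f}), tau^2 = {tau2:.3f}")
```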