Showing all 12 results
Peer reviewed
Paul Thompson; Kaydee Owen; Richard P. Hastings – International Journal of Research & Method in Education, 2024
Traditionally, cluster randomized controlled trials are analyzed with the average intervention effect as the quantity of interest. However, in populations with higher degrees of heterogeneity, or where the intervention effect may differ across values of a covariate, focusing only on the average may not be optimal. Within education and social science contexts, exploring the variation in…
Descriptors: Randomized Controlled Trials, Intervention, Mathematics Education, Mathematics Skills
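Note: as a generic sketch of the kind of analysis this abstract points to (not the authors' actual specification), heterogeneity of a cluster-level intervention effect across a covariate is commonly explored by adding a treatment-by-covariate interaction to a multilevel model:

    Y_ij = b0 + b1*T_j + b2*X_ij + b3*(T_j * X_ij) + u_j + e_ij

where T_j indicates assignment for cluster j, X_ij is the covariate for pupil i in cluster j, u_j is a cluster random effect, and b3 captures how the intervention effect varies with the covariate.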
Peer reviewed
Kenneth A. Frank; Qinyun Lin; Spiro J. Maroulis – Grantee Submission, 2024
In the complex world of educational policy, causal inferences will be debated. As we review non-experimental designs in educational policy, we focus on how to clarify and sharpen the terms of debate. We begin by presenting the potential outcomes/counterfactual framework and then describe approximations to the counterfactual generated from the…
Descriptors: Causal Models, Statistical Inference, Observation, Educational Policy
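Note: the counterfactual framework named here defines a causal effect by contrasting each unit's potential outcome under the intervention, Y_i(1), with its potential outcome without it, Y_i(0); the average treatment effect is

    ATE = E[Y_i(1) - Y_i(0)]

and non-experimental designs are judged by how well they approximate this unobservable contrast (a standard statement of the framework, not a claim specific to this article).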
Peer reviewed
What Works Clearinghouse, 2022
Education decision makers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time-consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Peer reviewed
May, Henry; Jones, Akisha; Blakeney, Aly – AERA Online Paper Repository, 2019
Using an RD design provides statistically robust estimates while giving researchers an alternative causal estimation tool for educational environments where an RCT may not be feasible. Results from the External Evaluation of the i3 Scale-Up of Reading Recovery show that impact estimates were remarkably similar between a randomized control…
Descriptors: Regression (Statistics), Research Design, Randomized Controlled Trials, Research Methodology
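Note: in a sharp regression discontinuity (RD) design of the kind referenced here, treatment is assigned by whether a running variable X crosses a cutoff c, and the impact is the jump in the outcome at the cutoff,

    tau_RD = lim_{x→c from above} E[Y | X = x] - lim_{x→c from below} E[Y | X = x]

(a generic statement of the estimand, not the specific model used in the i3 evaluation).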
Peer reviewed
Hitchcock, John H.; Johnson, R. Burke; Schoonenboom, Judith – Research in the Schools, 2018
The central purpose of this article is to provide an overview of the many ways in which special educators can generate and think about causal inference to inform policy and practice. Consideration of causality across different lenses can be carried out by engaging in multiple-method and mixed-methods ways of thinking about inference. This article…
Descriptors: Causal Models, Statistical Inference, Special Education, Educational Research
Peer reviewed
Cole, Russell; Deke, John; Seftor, Neil – Society for Research on Educational Effectiveness, 2016
The What Works Clearinghouse (WWC) maintains design standards to identify rigorous, internally valid education research. As education researchers advance new methodologies, the WWC must revise its standards to include an assessment of the new designs. Recently, the WWC has revised standards for two emerging study designs: regression discontinuity…
Descriptors: Educational Research, Research Design, Regression (Statistics), Multivariate Analysis
Peer reviewed
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
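Note: as a minimal textbook sketch (not the report's full derivation), the design-based approach builds the impact estimate and its variance from the randomization itself; in a simple RCT this is the treatment-control difference in means with the Neyman variance estimator,

    est_ATE = mean(Y_T) - mean(Y_C),    Var_hat = s_T^2/n_T + s_C^2/n_C,

which requires no outcome model beyond the design.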
Peer reviewed
Kautz, Tim; Schochet, Peter Z.; Tilley, Charles – National Center for Education Evaluation and Regional Assistance, 2017
A new design-based theory has recently been developed to estimate impacts for randomized controlled trials (RCTs) and basic quasi-experimental designs (QEDs) across a wide range of designs used in social policy research (Imbens & Rubin, 2015; Schochet, 2016). These methods use the potential outcomes framework and known features of study designs…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Peer reviewed
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies (Imbens and Rubin, 2015; Schochet, 2015, 2016). The estimators are derived using the building blocks of experimental designs with minimal assumptions, and are unbiased and normally distributed in large samples…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Peer reviewed
Cheung, Alan C. K.; Slavin, Robert E. – Educational Researcher, 2016
As evidence becomes increasingly important in educational policy, it is essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. A total of 645 studies from 12 recent reviews of evaluations of preschool, reading, mathematics, and science programs were examined. Effect…
Descriptors: Effect Size, Research Methodology, Research Design, Preschool Evaluation
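Note: the effect sizes compared in reviews of this kind are standardized mean differences of the form

    d = (M_treatment - M_control) / SD_pooled

so the same raw difference in means can yield different reported effects depending on how the outcome's pooled standard deviation behaves under a given design and measure (a general property of the metric, not a finding quoted from this study).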
Conn, Katharine – ProQuest LLC, 2014
The aim of this dissertation is to identify effective educational interventions in Sub-Saharan Africa that have an impact on student learning. This is the first meta-analysis in the field of education conducted for Sub-Saharan Africa. This paper takes an in-depth look at twelve different types of education interventions or programs and attempts to not…
Descriptors: African American Achievement, Foreign Countries, Research Methodology, Intervention
Peer reviewed
What Works Clearinghouse, 2011
With its critical assessments of scientific evidence on the effectiveness of education programs, policies, and practices (referred to as "interventions"), and a range of products summarizing this evidence, the What Works Clearinghouse (WWC) is an important part of the Institute of Education Sciences' strategy to use rigorous and relevant…
Descriptors: Standards, Access to Information, Information Management, Guides