Dahlia K. Remler; Gregg G. Van Ryzin – American Journal of Evaluation, 2025
This article reviews the origins and use of the terms quasi-experiment and natural experiment. It demonstrates how the terms conflate whether variation in the independent variable of interest falls short of random with whether researchers find, rather than intervene to create, that variation. Using the lens of assignment--the process driving…
Descriptors: Quasiexperimental Design, Research Design, Experiments, Predictor Variables
Andrew P. Jaciw – American Journal of Evaluation, 2025
By design, randomized experiments (XPs) rule out bias from confounded selection of participants into conditions. Quasi-experiments (QEs) are often considered second-best because they do not share this benefit. However, when results from XPs are used to generalize causal impacts, the benefit from unconfounded selection into conditions may be offset…
Descriptors: Elementary School Students, Elementary School Teachers, Generalization, Test Bias
Sam Sims; Jake Anders; Matthew Inglis; Hugues Lortie-Forgues; Ben Styles; Ben Weidmann – Annenberg Institute for School Reform at Brown University, 2023
Over the last twenty years, education researchers have increasingly conducted randomised experiments with the goal of informing the decisions of educators and policymakers. Such experiments have generally employed broad, consequential, standardised outcome measures in the hope that this would allow decisionmakers to compare effectiveness of…
Descriptors: Educational Research, Research Methodology, Randomized Controlled Trials, Program Effectiveness
What Works Clearinghouse, 2021
The What Works Clearinghouse (WWC) identifies existing research on educational interventions, assesses the quality of the research, and summarizes and disseminates the evidence from studies that meet WWC standards. The WWC aims to provide enough information so educators can use the research to make informed decisions in their settings. This…
Descriptors: Program Effectiveness, Intervention, Educational Research, Educational Quality
Experimental Research into Teaching Innovations: Responding to Methodological and Ethical Challenges
Taber, Keith S. – Studies in Science Education, 2019
Experimental studies are often employed to test the effectiveness of teaching innovations such as new pedagogy, curriculum, or learning resources. This article offers guidance on good practice in developing research designs, and in drawing conclusions from published reports. Randomised control trials potentially support the use of statistical…
Descriptors: Instructional Innovation, Educational Research, Research Design, Research Methodology
What Works Clearinghouse, 2018
Underlying all What Works Clearinghouse (WWC) products are WWC Study Review Guides, which are intended for use by WWC certified reviewers to assess studies against the WWC evidence standards. As part of an ongoing effort to increase transparency, promote collaboration, and encourage widespread use of the WWC standards, the Institute of Education…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Hitchcock, John H.; Johnson, R. Burke; Schoonenboom, Judith – Research in the Schools, 2018
The central purpose of this article is to provide an overview of the many ways in which special educators can generate and think about causal inference to inform policy and practice. Consideration of causality across different lenses can be carried out by engaging in multiple method and mixed methods ways of thinking about inference. This article…
Descriptors: Causal Models, Statistical Inference, Special Education, Educational Research
Jake Anders; Chris Brown; Melanie Ehren; Toby Greany; Rebecca Nelson; Jessica Heal; Bibi Groot; Michael Sanders; Rebecca Allen – Education Endowment Foundation, 2017
Evaluating the impact of complex whole-school interventions (CWSIs) is challenging. However, what evidence there is suggests that school leadership and other elements of whole-school contexts are important for pupils' attainment (Leithwood et al., 2006), suggesting that interventions aimed at changing these have significant potential to improve…
Descriptors: Leadership Styles, Program Implementation, Leadership Responsibility, Program Evaluation
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
What Works Clearinghouse, 2012
What Works Clearinghouse (WWC) quick reviews (QRs) are designed to provide education practitioners and policymakers with timely, preliminary objective assessments of the quality of the research evidence from recently released research papers and reports that have received coverage in the media. These reviews focus primarily on studies of the…
Descriptors: Educational Research, Evidence Based Practice, Research Design, Evaluation Criteria