Publication Date
In 2025 | 0
Since 2024 | 0
Since 2021 (last 5 years) | 5
Since 2016 (last 10 years) | 11
Since 2006 (last 20 years) | 21
Descriptor
Program Effectiveness | 21
Randomized Controlled Trials | 21
Research Design | 21
Intervention | 10
Educational Research | 8
Program Evaluation | 6
Evidence | 5
Outcomes of Education | 5
Pretests Posttests | 5
Research Methodology | 5
Effect Size | 4
Author
Kelcey, Ben | 3
Heppen, Jessica | 2
Jones, Nathan | 2
Phelps, Geoffrey | 2
Spybrook, Jessaca | 2
Adkins, Michael | 1
Anglin, Kylie L. | 1
Annik M. Sorhaindo | 1
Bell, Stephen H. | 1
Ben Kelcey | 1
Blakeney, Aly | 1
Education Level
Secondary Education | 6
Elementary Education | 4
High Schools | 4
Middle Schools | 4
Junior High Schools | 3
Early Childhood Education | 2
Elementary Secondary Education | 1
Grade 3 | 1
Grade 4 | 1
Grade 5 | 1
Intermediate Grades | 1
Audience
Researchers | 2
Policymakers | 1
Practitioners | 1
Zuchao Shen; Ben Kelcey – Society for Research on Educational Effectiveness, 2023
I. Purpose of the Study: Detecting whether interventions work or not (through main effect analysis) can provide empirical evidence regarding the causal linkage between malleable factors (e.g., interventions) and learner outcomes. In complement, moderation analyses help delineate for whom and under what conditions intervention effects are most…
Descriptors: Intervention, Program Effectiveness, Evidence, Research Design
Brown, Seth; Song, Mengli; Cook, Thomas D.; Garet, Michael S. – American Educational Research Journal, 2023
This study examined bias reduction in the eight nonequivalent comparison group designs (NECGDs) that result from combining (a) choice of a local versus non-local comparison group, and analytic use or not of (b) a pretest measure of the study outcome and (c) a rich set of other covariates. Bias was estimated as the difference in causal estimate…
Descriptors: Research Design, Pretests Posttests, Computation, Bias
What Works Clearinghouse, 2021
The What Works Clearinghouse (WWC) identifies existing research on educational interventions, assesses the quality of the research, and summarizes and disseminates the evidence from studies that meet WWC standards. The WWC aims to provide enough information so educators can use the research to make informed decisions in their settings. This…
Descriptors: Program Effectiveness, Intervention, Educational Research, Educational Quality
Wu, Edward; Gagnon-Bartsch, Johann A. – Journal of Educational and Behavioral Statistics, 2021
In paired experiments, participants are grouped into pairs with similar characteristics, and one observation from each pair is randomly assigned to treatment. The resulting treatment and control groups should be well-balanced; however, there may still be small chance imbalances. Building on work for completely randomized experiments, we propose a…
Descriptors: Experiments, Groups, Research Design, Statistical Analysis
What Works Clearinghouse, 2022
Education decisionmakers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time-consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Spybrook, Jessaca; Zhang, Qi; Kelcey, Ben; Dong, Nianbo – Educational Evaluation and Policy Analysis, 2020
Over the past 15 years, we have seen an increase in the use of cluster randomized trials (CRTs) to test the efficacy of educational interventions. These studies are often designed with the goal of determining whether a program works, or answering the "what works" question. Recently, the goals of these studies have expanded to include for whom and under…
Descriptors: Randomized Controlled Trials, Educational Research, Program Effectiveness, Intervention
Wong, Vivian C.; Steiner, Peter M.; Anglin, Kylie L. – Grantee Submission, 2018
Given the widespread use of non-experimental (NE) methods for assessing program impacts, there is a strong need to know whether NE approaches yield causally valid results in field settings. In within-study comparison (WSC) designs, the researcher compares treatment effects from an NE with those obtained from a randomized experiment that shares the…
Descriptors: Evaluation Methods, Program Evaluation, Program Effectiveness, Comparative Analysis
May, Henry; Jones, Akisha; Blakeney, Aly – AERA Online Paper Repository, 2019
Using an RD design provides statistically robust estimates while giving researchers an alternative causal estimation tool for educational environments where an RCT may not be feasible. Results from the External Evaluation of the i3 Scale-Up of Reading Recovery show that impact estimates were remarkably similar between a randomized control…
Descriptors: Regression (Statistics), Research Design, Randomized Controlled Trials, Research Methodology
Connolly, Paul; Keenan, Ciara; Urbanska, Karolina – Educational Research, 2018
Background: The use of randomised controlled trials (RCTs) in education has increased significantly over the last 15 years. However, their use has also been subject to sustained and rather trenchant criticism from significant sections of the education research community. Key criticisms have included the claims that: it is not possible to undertake…
Descriptors: Evidence Based Practice, Randomized Controlled Trials, Educational Research, Educational History
Foundation for Child Development, 2020
As the number of publicly funded early childhood education (ECE) programs increases, policymakers will need empirical evidence to justify the taxpayer investment. Such justification will require a stronger understanding of the essential components of an ECE program's design, as well as solid evidence on which components, or constellations of…
Descriptors: Early Childhood Education, Research Utilization, Outcomes of Education, Educational Research
Murphy, David; Oliver, Mary; Pourhabib, Sanam; Adkins, Michael; Hodgen, Jeremy – Education Endowment Foundation, 2017
This report examines the range of factors that might influence the decision by social care professionals on the use of boarding schools as an intervention option for Children in Need (CiN) or children on a Child Protection Plan (CPP). Attempts to conduct a randomised controlled trial (RCT) failed to recruit participants. Initially, failure to…
Descriptors: Boarding Schools, Caseworkers, Social Work, Intervention
Ruth Maisey; Svetlana Speight; Chris Bonell; Susan Purdon; Peter Keogh; Ivonne Wollny; Annik M. Sorhaindo; Kaye Wellings – Sage Research Methods Cases, 2014
In 2009, the government's Department for Education commissioned a team of researchers at NatCen Social Research to evaluate the effectiveness of the youth development/teenage pregnancy prevention programme 'Teens and Toddlers'. Previous studies had reported positive findings but had not been methodologically rigorous. We…
Descriptors: Youth Programs, Program Evaluation, Adolescents, Toddlers
Valentine, Jeffrey C.; Thompson, Simon G. – Research Synthesis Methods, 2013
Background: Confounding caused by selection bias is often a key difference between non-randomized studies (NRS) and randomized controlled trials (RCTs) of interventions. Key methodological issues: In this third paper of the series, we consider issues relating to the inclusion of NRS in systematic reviews on the effects of interventions. We discuss…
Descriptors: Research Design, Randomized Controlled Trials, Intervention, Bias
Kelcey, Ben; Phelps, Geoffrey; Jones, Nathan – Society for Research on Educational Effectiveness, 2013
Teacher professional development (PD) is seen as critical to improving the quality of US schools (National Commission on Teaching and America's Future, 1997). PD is increasingly viewed as one of the primary levers for improving teaching quality and ultimately student achievement (Correnti, 2007). One factor that is driving interest in PD is…
Descriptors: Faculty Development, Educational Quality, Teacher Effectiveness, Educational Research
Maynard, Brandy R.; Kjellstrand, Elizabeth K.; Thompson, Aaron M. – Society for Research on Educational Effectiveness, 2014
The present study evaluates the effectiveness of Check & Connect (C&C) in a randomly assigned sample of students who were all receiving Communities in Schools (CIS) services. The research questions for the study include: Are there differences in attendance, academics, and behavior for CIS students who also receive C&C compared to…
Descriptors: Dropout Prevention, Dropout Programs, Secondary School Students, Urban Schools