Publication Date
In 2025: 0
Since 2024: 2
Since 2021 (last 5 years): 3
Since 2016 (last 10 years): 11
Since 2006 (last 20 years): 13
Descriptor
Evaluation Methods: 13
Randomized Controlled Trials: 13
Research Design: 13
Educational Research: 4
Intervention: 4
Program Evaluation: 4
Statistical Analysis: 4
Comparative Analysis: 3
Effect Size: 3
Eligibility: 3
Hierarchical Linear Modeling: 3
Source
Society for Research on…: 3
What Works Clearinghouse: 3
American Journal of Evaluation: 2
Asia Pacific Journal of…: 1
European Early Childhood…: 1
Grantee Submission: 1
ProQuest LLC: 1
Research Synthesis Methods: 1
Author
Steiner, Peter M.: 2
Anglin, Kylie L.: 1
Bell, Stephen H.: 1
Canavan, John: 1
Cook, Ronna J.: 1
Dolan, Pat: 1
Fives, Allyn: 1
Gurung, Tara: 1
Heid, Camilla A.: 1
Huey T. Chen: 1
Iachan, Ronaldo: 1
Publication Type
Reports - Research: 6
Journal Articles: 5
Guides - Non-Classroom: 3
Reports - Evaluative: 3
Information Analyses: 2
Dissertations/Theses -…: 1
Education Level
Elementary Education: 2
High Schools: 2
Junior High Schools: 2
Middle Schools: 2
Secondary Education: 2
Early Childhood Education: 1
Grade 10: 1
Grade 11: 1
Grade 3: 1
Grade 5: 1
Grade 8: 1
Location
Texas: 1
Huey T. Chen; Liliana Morosanu; Victor H. Chen – Asia Pacific Journal of Education, 2024
The Campbellian validity typology has been used as a foundation for outcome evaluation and for developing evidence-based interventions for decades. As such, randomized controlled trials were preferred for outcome evaluation. However, some evaluators disagree with the validity typology's argument that randomized controlled trials are the best design…
Descriptors: Evaluation Methods, Systems Approach, Intervention, Evidence Based Practice
Lydia Bradford – ProQuest LLC, 2024
In randomized controlled trials (RCTs), the focus has recently shifted to how an intervention yields positive results on its intended outcome. This aligns with the recent push for implementation science in healthcare (Bauer et al., 2015) but goes beyond it. RCTs have moved to evaluating the theoretical framing of the intervention as well as differing…
Descriptors: Hierarchical Linear Modeling, Mediation Theory, Randomized Controlled Trials, Research Design
What Works Clearinghouse, 2022
Education decision makers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time-consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
What Works Clearinghouse, 2020
The What Works Clearinghouse (WWC) systematic review process is the basis of many of its products, enabling the WWC to use consistent, objective, and transparent standards and procedures in its reviews, while also ensuring comprehensive coverage of the relevant literature. The WWC systematic review process consists of five steps: (1) Developing…
Descriptors: Educational Research, Evaluation Methods, Research Reports, Standards
Wong, Vivian C.; Steiner, Peter M.; Anglin, Kylie L. – Grantee Submission, 2018
Given the widespread use of non-experimental (NE) methods for assessing program impacts, there is a strong need to know whether NE approaches yield causally valid results in field settings. In within-study comparison (WSC) designs, the researcher compares treatment effects from an NE with those obtained from a randomized experiment that shares the…
Descriptors: Evaluation Methods, Program Evaluation, Program Effectiveness, Comparative Analysis
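The within-study comparison logic this entry describes, comparing a non-experimental estimate against an RCT benchmark that shares a target population, can be illustrated with a small simulation. Everything below (variable names, effect size, the single-confounder setup) is an invented sketch, not the authors' actual design:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
true_effect = 2.0

# Covariate that drives both selection into treatment and the outcome.
x = rng.normal(size=n)

# RCT arm: treatment assigned at random, so a simple mean
# difference is unbiased for the true effect.
t_rct = rng.integers(0, 2, size=n)
y_rct = true_effect * t_rct + 1.5 * x + rng.normal(size=n)
rct_est = y_rct[t_rct == 1].mean() - y_rct[t_rct == 0].mean()

# Non-experimental arm: units with higher x self-select into
# treatment, confounding the naive mean difference.
t_ne = (x + rng.normal(size=n) > 0).astype(int)
y_ne = true_effect * t_ne + 1.5 * x + rng.normal(size=n)
naive_est = y_ne[t_ne == 1].mean() - y_ne[t_ne == 0].mean()

# Covariate adjustment (OLS on [1, t, x]) recovers the effect here
# only because x is the sole confounder in this simulation.
X = np.column_stack([np.ones(n), t_ne, x])
beta = np.linalg.lstsq(X, y_ne, rcond=None)[0]
adjusted_est = beta[1]

print(f"RCT benchmark: {rct_est:.2f}")
print(f"NE naive:      {naive_est:.2f}")
print(f"NE adjusted:   {adjusted_est:.2f}")
```

In a WSC, the question is whether the adjusted non-experimental estimate lands close to the RCT benchmark; in this toy setup it does because the confounder is fully observed, which real field settings cannot guarantee.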
What Works Clearinghouse, 2017
The What Works Clearinghouse (WWC) systematic review process is the basis of many of its products, enabling the WWC to use consistent, objective, and transparent standards and procedures in its reviews, while also ensuring comprehensive coverage of the relevant literature. The WWC systematic review process consists of five steps: (1) Developing…
Descriptors: Educational Research, Evaluation Methods, Research Reports, Standards
Fives, Allyn; Canavan, John; Dolan, Pat – European Early Childhood Education Research Journal, 2017
There is significant controversy over what counts as evidence in the evaluation of social interventions. It is increasingly common to use methodological criteria to rank evidence types in a hierarchy, with Randomised Controlled Trials (RCTs) at or near the highest level. Because of numerous challenges to a hierarchical approach, this article…
Descriptors: Evaluation Methods, Evaluation Research, Randomized Controlled Trials, Ethics
Westlund, Erik; Stuart, Elizabeth A. – American Journal of Evaluation, 2017
This article discusses the nonuse, misuse, and proper use of pilot studies in experimental evaluation research. The authors first show that there is little theoretical, practical, or empirical guidance available to researchers who seek to incorporate pilot studies into experimental evaluation research designs. The authors then discuss how pilot…
Descriptors: Use Studies, Pilot Projects, Evaluation Research, Experiments
Steiner, Peter M.; Wong, Vivian – Society for Research on Educational Effectiveness, 2016
Despite recent emphasis on the use of randomized controlled trials (RCTs) for evaluating education interventions, in most areas of education research, observational methods remain the dominant approach for assessing program effects. Over the last three decades, the within-study comparison (WSC) design has emerged as a method for evaluating the…
Descriptors: Randomized Controlled Trials, Comparative Analysis, Research Design, Evaluation Methods
Tipton, Elizabeth; Yeager, David; Iachan, Ronaldo – Society for Research on Educational Effectiveness, 2016
Questions regarding the generalizability of results from educational experiments have been at the forefront of methods development over the past five years. This work has focused on methods for estimating the effect of an intervention in a well-defined inference population (e.g., Tipton, 2013; O'Muircheartaigh and Hedges, 2014); methods for…
Descriptors: Behavioral Sciences, Behavioral Science Research, Intervention, Educational Experiments
Westine, Carl D. – American Journal of Evaluation, 2016
Little is known empirically about intraclass correlations (ICCs) for multisite cluster randomized trial (MSCRT) designs, particularly in science education. In this study, ICCs suitable for science achievement studies using a three-level (students in schools in districts) MSCRT design that blocks on district are estimated and examined. Estimates of…
Descriptors: Efficiency, Evaluation Methods, Science Achievement, Correlation
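For intuition about the intraclass correlations this entry estimates, a minimal two-level sketch (the study itself uses a three-level students-in-schools-in-districts design; the simplification and all parameter values below are assumptions for illustration) recovers the ICC from simulated balanced data via one-way ANOVA:

```python
import numpy as np

rng = np.random.default_rng(1)
n_clusters, m = 200, 30          # schools, students per school
sigma2_b, sigma2_w = 0.2, 0.8    # between/within variance (true ICC = 0.2)

# Balanced two-level data: y_ij = u_j + e_ij
u = rng.normal(0, np.sqrt(sigma2_b), size=n_clusters)
y = u[:, None] + rng.normal(0, np.sqrt(sigma2_w), size=(n_clusters, m))

# One-way ANOVA estimator of the ICC for a balanced design:
# ICC = (MSB - MSW) / (MSB + (m - 1) * MSW)
cluster_means = y.mean(axis=1)
grand_mean = y.mean()
msb = m * ((cluster_means - grand_mean) ** 2).sum() / (n_clusters - 1)
msw = ((y - cluster_means[:, None]) ** 2).sum() / (n_clusters * (m - 1))
icc = (msb - msw) / (msb + (m - 1) * msw)

print(f"estimated ICC = {icc:.3f}")  # close to the true 0.2
```

Plausible ICC values like these drive power calculations for cluster-randomized trials, which is why empirical estimates such as Westine's matter for planning.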
Robertson, Clare; Ramsay, Craig; Gurung, Tara; Mowatt, Graham; Pickard, Robert; Sharma, Pawana – Research Synthesis Methods, 2014
We describe our experience of using a modified version of the Cochrane risk of bias (RoB) tool for randomised and non-randomised comparative studies. Objectives: (1) To assess time to complete RoB assessment; (2) To assess inter-rater agreement; and (3) To explore the association between RoB and treatment effect size. Methods: Cochrane risk of…
Descriptors: Risk, Randomized Controlled Trials, Research Design, Comparative Analysis
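Inter-rater agreement of the kind this entry assesses is commonly quantified with a chance-corrected statistic such as Cohen's kappa; the ratings below are fabricated purely to show the computation and do not reproduce the paper's data:

```python
# Hypothetical "low"/"high" risk-of-bias judgments from two raters
# across 12 studies (illustrative values only).
rater1 = ["low", "low", "high", "low", "high", "high",
          "low", "low", "high", "low", "low", "high"]
rater2 = ["low", "high", "high", "low", "high", "low",
          "low", "low", "high", "low", "high", "high"]

labels = ["low", "high"]
n = len(rater1)

# Observed agreement: proportion of studies where raters match.
po = sum(a == b for a, b in zip(rater1, rater2)) / n

# Expected chance agreement from each rater's marginal frequencies.
pe = sum((rater1.count(l) / n) * (rater2.count(l) / n) for l in labels)

kappa = (po - pe) / (1 - pe)
print(f"observed agreement = {po:.2f}, Cohen's kappa = {kappa:.2f}")
```

Here the raters agree on 9 of 12 studies (0.75), but half of that agreement is expected by chance, leaving kappa at 0.50, a reminder that raw agreement overstates reliability.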
Bell, Stephen H.; Puma, Michael J.; Cook, Ronna J.; Heid, Camilla A. – Society for Research on Educational Effectiveness, 2013
Access to Head Start has been shown to improve children's preschool experiences and school readiness on selected factors through the end of 1st grade. Two more years of follow-up, through the end of 3rd grade, can now be examined to determine whether these effects continue into the middle elementary grades. The statistical design and impact…
Descriptors: Evaluation Methods, Data Analysis, Randomized Controlled Trials, Sampling