Showing all 8 results
Peer reviewed
Li, Wei; Dong, Nianbo; Maynard, Rebecca; Spybrook, Jessaca; Kelcey, Ben – Journal of Research on Educational Effectiveness, 2023
Cluster randomized trials (CRTs) are commonly used to evaluate educational interventions, particularly their effectiveness. Recently there has been greater emphasis on using these trials to explore cost-effectiveness. However, methods for establishing the power of cluster randomized cost-effectiveness trials (CRCETs) are limited. This study…
Descriptors: Research Design, Statistical Analysis, Randomized Controlled Trials, Cost Effectiveness
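The power question these trials raise can be illustrated with a textbook normal-approximation calculation. This is a hedged sketch under standard assumptions (equal allocation, two arms, a single standardized effect size), not the cost-effectiveness-specific method the paper develops; all parameter names are illustrative.

```python
# Hedged sketch: approximate power for a two-arm, two-level cluster
# randomized trial using the normal approximation. Illustrative only;
# not the CRCET method from the paper above.
from statistics import NormalDist

def crt_power(delta, n_clusters, cluster_size, icc, alpha=0.05):
    """delta: standardized effect size; icc: intraclass correlation;
    n_clusters: total clusters, split equally across two arms."""
    nd = NormalDist()
    # Variance inflation from clustering (the "design effect")
    design_effect = 1.0 + (cluster_size - 1) * icc
    # Approximate variance of the effect-size estimate under equal allocation
    var = 4.0 * design_effect / (n_clusters * cluster_size)
    z_crit = nd.inv_cdf(1.0 - alpha / 2.0)
    return nd.cdf(delta / var ** 0.5 - z_crit)

# Same total sample: a higher ICC erodes power
print(crt_power(0.25, n_clusters=40, cluster_size=20, icc=0.05))
print(crt_power(0.25, n_clusters=40, cluster_size=20, icc=0.20))
```

The design-effect term is why adding clusters usually buys more power than adding subjects within clusters.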
Peer reviewed
Kristin Porter; Luke Miratrix; Kristen Hunter – Society for Research on Educational Effectiveness, 2021
Background: Researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple testing procedures (MTPs)…
Descriptors: Statistical Analysis, Hypothesis Testing, Computer Software, Randomized Controlled Trials
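Two of the most common multiple testing procedures can be sketched in a few lines. This is an illustrative implementation of the standard Bonferroni and Benjamini-Hochberg adjustments, not the specific MTPs or software the paper evaluates.

```python
# Illustrative sketch (not from the paper): two common multiple testing
# procedures applied to p-values from several hypothetical outcome tests.

def bonferroni(pvals, alpha=0.05):
    """Reject H0_i when p_i <= alpha / m; controls the family-wise error rate."""
    m = len(pvals)
    return [p <= alpha / m for p in pvals]

def benjamini_hochberg(pvals, alpha=0.05):
    """Step-up procedure controlling the false discovery rate."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0  # largest rank whose p-value clears its step-up threshold
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k = rank
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k:
            reject[i] = True
    return reject

pvals = [0.001, 0.012, 0.031, 0.04, 0.2]
print(bonferroni(pvals))          # conservative: rejects fewer hypotheses
print(benjamini_hochberg(pvals))  # less conservative under FDR control
```

The contrast shows the trade-off the abstract alludes to: stricter error control costs power across the family of tests.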
Peer reviewed
Su, Yu-Xuan; Tu, Yu-Kang – Research Synthesis Methods, 2018
Network meta-analysis compares multiple treatments in terms of their efficacy and harm by including evidence from randomized controlled trials. Most clinical trials use a parallel design, where patients are randomly allocated to different treatments and receive only one treatment. However, some trials use within-person designs such as split-body,…
Descriptors: Network Analysis, Meta Analysis, Randomized Controlled Trials, Research Design
Peer reviewed
PDF on ERIC
Dong, Nianbo; Spybrook, Jessaca; Kelcey, Ben – Society for Research on Educational Effectiveness, 2017
The purpose of this paper is to present results of recent advances in power analyses to detect moderator effects in Cluster Randomized Trials (CRTs). The paper focuses on demonstrating the software PowerUp!-Moderator and provides a resource for researchers seeking to design CRTs with adequate power to detect the moderator effects of…
Descriptors: Computer Software, Research Design, Randomized Controlled Trials, Statistical Analysis
Peer reviewed
Lohr, Sharon L.; Zhu, Xiaoshu – Sociological Methods & Research, 2017
Many randomized experiments in the social sciences allocate subjects to treatment arms at the time the subjects enroll. Desirable features of the mechanism used to assign subjects to treatment arms are often (1) equal numbers of subjects in intervention and control arms, (2) balanced allocation for population subgroups and across covariates, (3)…
Descriptors: Social Science Research, Randomized Controlled Trials, Research Design, Computer Software
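One mechanism with the first desirable feature, equal arm sizes under sequential enrollment, is permuted-block randomization. This is a generic textbook sketch, not necessarily the mechanism the article proposes.

```python
# Hypothetical sketch of permuted-block randomization: subjects enrolling
# sequentially are assigned within shuffled blocks, so arm sizes never
# drift far apart. Illustrative only.
import random

def permuted_blocks(n_subjects, block_size=4, seed=0):
    """Assign n_subjects to treatment ('T') or control ('C') in shuffled blocks."""
    rng = random.Random(seed)
    arms = []
    while len(arms) < n_subjects:
        block = ["T"] * (block_size // 2) + ["C"] * (block_size // 2)
        rng.shuffle(block)  # random order within the block
        arms.extend(block)
    return arms[:n_subjects]

print(permuted_blocks(20))  # exactly 10 'T' and 10 'C'
```

Balancing across covariates and subgroups, the second and third features listed above, requires richer schemes such as stratified or covariate-adaptive randomization.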
Thoemmes, Felix; Liao, Wang; Jin, Ze – Journal of Educational and Behavioral Statistics, 2017
This article describes the analysis of regression-discontinuity designs (RDDs) using the R packages rdd, rdrobust, and rddtools. We discuss similarities and differences between these packages and provide directions on how to use them effectively. We use real data from the Carolina Abecedarian Project to show how an analysis of an RDD can be…
Descriptors: Regression (Statistics), Research Design, Robustness (Statistics), Computer Software
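The core estimand those R packages target can be sketched without them: in a sharp RDD, fit a regression on each side of the cutoff and take the jump between the fits at the cutoff. This is a minimal global-linear sketch on synthetic data, not the local-polynomial, bandwidth-selected analysis that rdd, rdrobust, and rddtools implement.

```python
# Illustrative sharp-RDD sketch (not the rdd/rdrobust/rddtools workflow):
# separate linear fits on each side of the cutoff; the jump between the
# two fitted lines at the cutoff estimates the treatment effect.

def linfit(xs, ys):
    """Ordinary least squares intercept and slope for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def sharp_rdd(x, y, cutoff=0.0):
    """Difference of the two fitted lines evaluated at the cutoff."""
    left = [(xi, yi) for xi, yi in zip(x, y) if xi < cutoff]
    right = [(xi, yi) for xi, yi in zip(x, y) if xi >= cutoff]
    a_l, b_l = linfit([p[0] for p in left], [p[1] for p in left])
    a_r, b_r = linfit([p[0] for p in right], [p[1] for p in right])
    return (a_r + b_r * cutoff) - (a_l + b_l * cutoff)

# Synthetic data with a built-in jump of 2.0 at x = 0
x = [-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0]
y = [xi + (2.0 if xi >= 0 else 0.0) for xi in x]
print(sharp_rdd(x, y))  # recovers the jump
```

Real analyses add local weighting, bandwidth choice, and robust inference, which is precisely where the three packages differ.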
Peer reviewed
Debray, Thomas P. A.; Moons, Karel G. M.; van Valkenhoef, Gert; Efthimiou, Orestis; Hummel, Noemi; Groenwold, Rolf H. H.; Reitsma, Johannes B. – Research Synthesis Methods, 2015
Individual participant data (IPD) meta-analysis is an increasingly used approach for synthesizing and investigating treatment effect estimates. Over the past few years, numerous methods for conducting an IPD meta-analysis (IPD-MA) have been proposed, often making different assumptions and modeling choices while addressing a similar research…
Descriptors: Meta Analysis, Outcomes of Treatment, Research Methodology, Literature Reviews
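One of the simplest IPD-MA approaches is the "two-stage" route: estimate the treatment effect within each study, then pool with inverse-variance weights. The sketch below shows only that second, fixed-effect pooling stage with hypothetical numbers; the paper surveys many variants (one-stage models, random effects, and others).

```python
# Hedged sketch of the second stage of a two-stage IPD meta-analysis:
# per-study effect estimates combined by fixed-effect inverse-variance
# weighting. Illustrative only.
def pool_fixed_effect(estimates, variances):
    """Return the inverse-variance weighted mean and its variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    pooled = sum(w * e for w, e in zip(weights, estimates)) / total
    return pooled, 1.0 / total

# Two hypothetical study estimates: the more precise study dominates
print(pool_fixed_effect([0.30, 0.10], [0.01, 0.04]))
```

One-stage approaches instead model all participants jointly with study-level terms, which is where the assumption and modeling-choice differences the abstract mentions arise.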
Peer reviewed
PDF on ERIC
Cheung, Alan; Slavin, Robert – Society for Research on Educational Effectiveness, 2016
As evidence-based reform becomes increasingly important in educational policy, it is becoming essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. The purpose of this study was to examine how methodological features such as types of publication, sample sizes, and…
Descriptors: Effect Size, Evidence Based Practice, Educational Change, Educational Policy