Porter, Kristin; Miratrix, Luke; Hunter, Kristen – Society for Research on Educational Effectiveness, 2021
Background: Researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple testing procedures (MTPs)…
Descriptors: Statistical Analysis, Hypothesis Testing, Computer Software, Randomized Controlled Trials
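The truncated abstract does not name the specific procedures the authors study, but a canonical MTP is the Holm step-down adjustment, which controls the familywise error rate across a set of tests. The sketch below is a generic illustration (not necessarily the authors' method), using statsmodels; the p-values are hypothetical.

```python
# A minimal sketch of one common multiple testing procedure (MTP):
# the Holm step-down adjustment. Illustrative only; the specific
# procedures studied in the paper are not named in the abstract.
import numpy as np
from statsmodels.stats.multitest import multipletests

# Hypothetical p-values from tests of one intervention on five outcomes.
p_values = np.array([0.004, 0.021, 0.038, 0.012, 0.251])

# Unadjusted testing at alpha = .05 would reject four of the five nulls,
# inflating the chance of at least one spurious finding.
reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="holm")

for p, p_adj, r in zip(p_values, p_adjusted, reject):
    print(f"raw p = {p:.3f}  adjusted p = {p_adj:.3f}  reject H0: {r}")
```

After the Holm adjustment, only the two smallest p-values lead to rejection, illustrating how an MTP trades some power for protection against spurious findings.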
Dong, Nianbo; Spybrook, Jessaca; Kelcey, Ben – Society for Research on Educational Effectiveness, 2017
This paper presents recent advances in power analysis for detecting moderator effects in Cluster Randomized Trials (CRTs), focusing on a demonstration of the software PowerUp!-Moderator. It provides a resource for researchers seeking to design CRTs with adequate power to detect the moderator effects of…
Descriptors: Computer Software, Research Design, Randomized Controlled Trials, Statistical Analysis
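As a rough stand-in for what such a tool computes, the sketch below implements a textbook power calculation for a binary cluster-level moderator in a two-level CRT, using the standard variance formula from the CRT design literature. It is an assumption-laden illustration, not PowerUp!-Moderator's exact implementation, and all parameter values are hypothetical.

```python
# Sketch: power to detect a standardized cluster-level moderator effect
# in a two-level CRT. Standard formula from the CRT design literature;
# not PowerUp!-Moderator's exact computation.
import math
from scipy.stats import nct, t as t_dist

def moderator_power(delta, J, n, rho, P=0.5, Q=0.5,
                    R2_2=0.0, R2_1=0.0, alpha=0.05):
    """Two-sided power for a standardized moderator effect `delta`,
    with a binary cluster-level moderator.

    J: number of clusters; n: students per cluster; rho: intraclass
    correlation; P: proportion of clusters treated; Q: proportion of
    clusters with moderator = 1; R2_2 / R2_1: variance explained by
    covariates at levels 2 / 1.
    """
    var = (rho * (1 - R2_2) / (P * (1 - P) * Q * (1 - Q) * J)
           + (1 - rho) * (1 - R2_1) / (P * (1 - P) * Q * (1 - Q) * J * n))
    lam = delta / math.sqrt(var)  # noncentrality parameter
    df = J - 4                    # intercept, treatment, moderator, interaction
    t_crit = t_dist.ppf(1 - alpha / 2, df)
    # Power under the noncentral t distribution (two-sided test).
    return 1 - nct.cdf(t_crit, df, lam) + nct.cdf(-t_crit, df, lam)

# e.g., 60 clusters of 25 students, ICC = .20, moderator effect of 0.25 SD.
print(round(moderator_power(delta=0.25, J=60, n=25, rho=0.20), 3))
```

Note how even a design that is well powered for a main effect can be badly underpowered for a moderator effect of the same size, which is the practical motivation for moderator-specific power tools.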
Cheung, Alan; Slavin, Robert – Society for Research on Educational Effectiveness, 2016
As evidence-based reform becomes increasingly important in educational policy, it is becoming essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. The purpose of this study was to examine how methodological features such as types of publication, sample sizes, and…
Descriptors: Effect Size, Evidence Based Practice, Educational Change, Educational Policy
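Since the study turns on comparing reported effect sizes across methodological features, a brief reminder of the underlying quantity may help: the sketch below computes a standardized mean difference (Cohen's d with Hedges' small-sample correction) from group summary statistics. The numbers are made up; this is the generic estimator, not the authors' analysis.

```python
# Sketch: standardized mean difference (Cohen's d) with Hedges'
# small-sample bias correction, from group summary statistics.
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    # Pooled standard deviation across treatment and control groups.
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                   / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp
    # Hedges' correction factor, approximately 1 - 3 / (4*df - 1).
    df = n_t + n_c - 2
    return d * (1 - 3 / (4 * df - 1))

# Hypothetical summary statistics from a program evaluation.
print(round(hedges_g(105.0, 100.0, 15.0, 14.0, 120, 118), 3))
```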