Publication Date
In 2025 | 0
Since 2024 | 0
Since 2021 (last 5 years) | 0
Since 2016 (last 10 years) | 3
Since 2006 (last 20 years) | 7
Descriptor
Intervention | 7
Sample Size | 7
Statistical Inference | 7
Effect Size | 4
Statistical Analysis | 4
Computation | 3
Educational Research | 3
Regression (Statistics) | 3
Research Design | 3
Statistical Bias | 3
Comparative Analysis | 2
Source
AERA Online Paper Repository | 1
Journal of Educational and Behavioral Statistics | 1
Journal of Experimental Education | 1
Journal of Research on Educational Effectiveness | 1
National Center for Education Evaluation and Regional Assistance | 1
Structural Equation Modeling: A Multidisciplinary Journal | 1
What Works Clearinghouse | 1
Author
Andrade, Alejandro | 1
Beretvas, S. Natasha | 1
Bloom, Howard S. | 1
Chen, Li-Ting | 1
Coffman, Donna L. | 1
Deke, John | 1
Ferron, John M. | 1
Hanauer, Matthew James | 1
Kautz, Tim | 1
Moeyaert, Mariola | 1
Schochet, Peter Z. | 1
Publication Type
Journal Articles | 4
Reports - Research | 3
Reports - Descriptive | 2
Guides - Non-Classroom | 1
Numerical/Quantitative Data | 1
Reports - Evaluative | 1
Speeches/Meeting Papers | 1
Audience
Researchers | 1
Chen, Li-Ting; Andrade, Alejandro; Hanauer, Matthew James – AERA Online Paper Repository, 2017
Single-case design (SCD) is a repeated-measures research approach for studying the effect of an intervention, and its importance is increasingly recognized in education and psychology. We propose a Bayesian approach for estimating intervention effects in SCD. Bayesian inference does not rely on large-sample theory and thus is particularly…
Descriptors: Bayesian Statistics, Research Design, Case Studies, Intervention
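A minimal sketch of the kind of Bayesian estimate described above, assuming a conjugate normal-normal model with known observation noise and hypothetical phase data; it is an illustration, not the authors' model, and it ignores serial dependence in the repeated measures.

```python
# Minimal sketch: conjugate Bayesian estimate of a phase difference in a
# single-case design. Data, prior, and noise SD are all hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical repeated measures: baseline (A) and intervention (B) phases.
baseline = np.array([3.0, 4.0, 3.5, 4.5, 3.8])
treatment = np.array([6.0, 7.5, 6.8, 7.2, 6.9, 7.0])

sigma = 1.0                         # assumed known observation SD
prior_mean, prior_sd = 0.0, 10.0    # weakly informative prior on each phase mean

def phase_posterior(y):
    """Conjugate normal posterior for one phase mean."""
    n = len(y)
    post_var = 1.0 / (1.0 / prior_sd**2 + n / sigma**2)
    post_mean = post_var * (prior_mean / prior_sd**2 + y.sum() / sigma**2)
    return post_mean, np.sqrt(post_var)

m_a, s_a = phase_posterior(baseline)
m_b, s_b = phase_posterior(treatment)

# Posterior of the intervention effect: difference of two independent normals.
draws = rng.normal(m_b - m_a, np.sqrt(s_a**2 + s_b**2), size=10_000)
print(f"posterior mean effect: {m_b - m_a:.2f}")
print(f"95% credible interval: {np.percentile(draws, [2.5, 97.5]).round(2)}")
```

Because the prior is conjugate, the posterior is available in closed form and no sampler is needed; a fuller SCD model would also handle trend and autocorrelation.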
Bloom, Howard S.; Spybrook, Jessaca – Journal of Research on Educational Effectiveness, 2017
Multisite trials, which are being used with increasing frequency in education and evaluation research, provide an exciting opportunity for learning about how the effects of interventions or programs are distributed across sites. In particular, these studies can produce rigorous estimates of a cross-site mean effect of program assignment…
Descriptors: Program Effectiveness, Program Evaluation, Sample Size, Evaluation Research
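As a rough illustration of a cross-site summary, the sketch below combines hypothetical site-level impact estimates using a precision-weighted mean and a DerSimonian-Laird moment estimate of between-site variance; this is a generic estimator, not necessarily the one the article develops.

```python
# Minimal sketch on made-up numbers: precision-weighted cross-site mean impact
# plus a moment (DerSimonian-Laird) estimate of between-site impact variance.
import numpy as np

# Hypothetical site-level impact estimates and their standard errors.
impacts = np.array([0.12, 0.05, 0.20, -0.03, 0.10, 0.15])
se = np.array([0.04, 0.05, 0.03, 0.06, 0.04, 0.05])

w = 1.0 / se**2
fixed_mean = np.sum(w * impacts) / np.sum(w)

# Moment estimator of the cross-site variance of true impacts (tau^2).
q = np.sum(w * (impacts - fixed_mean) ** 2)
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(impacts) - 1)) / c)

# Random-effects weights give the cross-site mean effect of program assignment.
w_re = 1.0 / (se**2 + tau2)
mean_re = np.sum(w_re * impacts) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
print(f"cross-site mean impact: {mean_re:.3f} (SE {se_re:.3f}), tau^2 = {tau2:.4f}")
```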
Deke, John; Wei, Thomas; Kautz, Tim – National Center for Education Evaluation and Regional Assistance, 2017
Evaluators of education interventions are increasingly designing studies to detect impacts much smaller than the 0.20 standard deviations that Cohen (1988) characterized as "small." While the need to detect smaller impacts is based on compelling arguments that such impacts are substantively meaningful, the drive to detect smaller impacts…
Descriptors: Intervention, Educational Research, Research Problems, Statistical Bias
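Standard power arithmetic shows why targeting impacts well below 0.20 standard deviations changes study design; the sketch below is the textbook two-arm approximation, not the report's own calculations.

```python
# Standard two-arm power arithmetic: per-arm sample size needed to detect a
# standardized impact with 80% power at alpha = .05.
from scipy.stats import norm

def n_per_arm(effect_sd, alpha=0.05, power=0.80):
    """Approximate n per arm for an individually randomized two-arm trial."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_power = norm.ppf(power)
    return 2 * (z_alpha + z_power) ** 2 / effect_sd**2

for delta in (0.20, 0.10, 0.05):
    print(f"impact of {delta:.2f} SD -> roughly {n_per_arm(delta):,.0f} per arm")
```

Halving the target impact roughly quadruples the required sample, which illustrates why the drive toward smaller detectable impacts raises the design questions the report takes up.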
Ugille, Maaike; Moeyaert, Mariola; Beretvas, S. Natasha; Ferron, John M.; Van den Noortgate, Wim – Journal of Experimental Education, 2014
A multilevel meta-analysis can combine the results of several single-subject experimental design studies. However, the estimated effects are biased if the effect sizes are standardized and the number of measurement occasions is small. In this study, the authors investigated 4 approaches to correct for this bias. First, the standardized effect…
Descriptors: Effect Size, Statistical Bias, Sample Size, Regression (Statistics)
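For orientation, the sketch below applies Hedges' generic small-sample correction to a standardized mean difference computed from a short series of measurement occasions; it is illustrative only and not necessarily one of the four approaches the authors compared.

```python
# One generic correction, shown only for illustration: Hedges' small-sample
# adjustment to a standardized mean difference from short measurement series.
import numpy as np

def hedges_g(y_baseline, y_treatment):
    """Cohen's d and its bias-corrected version (Hedges' g)."""
    a, b = np.asarray(y_baseline, float), np.asarray(y_treatment, float)
    n_a, n_b = len(a), len(b)
    pooled_sd = np.sqrt(((n_a - 1) * a.var(ddof=1) + (n_b - 1) * b.var(ddof=1))
                        / (n_a + n_b - 2))
    d = (b.mean() - a.mean()) / pooled_sd
    correction = 1 - 3 / (4 * (n_a + n_b - 2) - 1)   # shrinks d toward zero
    return d, correction * d

d, g = hedges_g([3.1, 3.4, 2.9, 3.6], [5.0, 5.5, 4.8, 5.2])
print(f"uncorrected d = {d:.2f}, bias-corrected g = {g:.2f}")
```

With only a handful of measurement occasions the correction factor is noticeably below 1, which is the small-sample bias problem the study addresses.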
Coffman, Donna L. – Structural Equation Modeling: A Multidisciplinary Journal, 2011
Mediation is usually assessed by a regression-based or structural equation modeling (SEM) approach that we refer to as the classical approach. This approach relies on the assumption that there are no confounders that influence both the mediator, "M", and the outcome, "Y". This assumption holds if individuals are randomly…
Descriptors: Structural Equation Models, Simulation, Regression (Statistics), Probability
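The classical, regression-based approach the abstract refers to can be sketched as two regressions and a product of coefficients; the simulated data and variable names below are hypothetical, and the indirect-effect estimate is only unbiased under the stated no-confounding assumption.

```python
# Sketch of the classical product-of-coefficients mediation estimate on
# simulated data; unbiased only when nothing confounds the M-Y relationship.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.binomial(1, 0.5, n).astype(float)    # randomized treatment
m = 0.5 * x + rng.normal(size=n)             # mediator M
y = 0.4 * m + 0.3 * x + rng.normal(size=n)   # outcome Y

def ols(design, response):
    coef, *_ = np.linalg.lstsq(design, response, rcond=None)
    return coef

ones = np.ones(n)
a = ols(np.column_stack([ones, x]), m)[1]          # X -> M path
b = ols(np.column_stack([ones, x, m]), y)[2]       # M -> Y path, adjusting for X
direct = ols(np.column_stack([ones, x, m]), y)[1]  # direct effect of X on Y

print(f"indirect (mediated) effect a*b = {a * b:.3f}, direct effect = {direct:.3f}")
```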
Schochet, Peter Z. – Journal of Educational and Behavioral Statistics, 2010
Pretest-posttest experimental designs often are used in randomized controlled trials (RCTs) in the education field to improve the precision of the estimated treatment effects. For logistical reasons, however, pretest data often are collected after random assignment, so that including them in the analysis could bias the posttest impact estimates. Thus,…
Descriptors: Pretests Posttests, Scores, Intervention, Scientific Methodology
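A quick simulation illustrates the precision argument for pretest covariates; this generic ANCOVA-style comparison uses made-up data and is not Schochet's analysis of when post-assignment pretests bias the impact estimate.

```python
# Generic ANCOVA-style comparison on simulated data: adjusting for a pretest
# covariate shrinks the standard error of the posttest impact estimate.
import numpy as np

rng = np.random.default_rng(2)
n = 400
treat = rng.binomial(1, 0.5, n).astype(float)
pretest = rng.normal(size=n)
posttest = 0.2 * treat + 0.7 * pretest + rng.normal(scale=0.7, size=n)

def ols_with_se(design, response):
    """OLS coefficients and conventional standard errors."""
    coef, *_ = np.linalg.lstsq(design, response, rcond=None)
    resid = response - design @ coef
    dof = len(response) - design.shape[1]
    cov = np.linalg.inv(design.T @ design) * (resid @ resid) / dof
    return coef, np.sqrt(np.diag(cov))

ones = np.ones(n)
b_raw, se_raw = ols_with_se(np.column_stack([ones, treat]), posttest)
b_adj, se_adj = ols_with_se(np.column_stack([ones, treat, pretest]), posttest)

print(f"unadjusted impact:       {b_raw[1]:.3f} (SE {se_raw[1]:.3f})")
print(f"pretest-adjusted impact: {b_adj[1]:.3f} (SE {se_adj[1]:.3f})")
```

The adjustment buys precision only when the pretest is genuinely predictive of the outcome and, as the abstract notes, is safe only if the pretest collected after random assignment cannot itself reflect the treatment.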
What Works Clearinghouse, 2014
This "What Works Clearinghouse Procedures and Standards Handbook (Version 3.0)" provides a detailed description of the standards and procedures of the What Works Clearinghouse (WWC). The remaining chapters of this Handbook are organized to take the reader through the basic steps that the WWC uses to develop a review protocol, identify…
Descriptors: Educational Research, Guides, Intervention, Classification