Showing all 11 results
Peer reviewed
Ethan R. Van Norman; David A. Klingbeil; Adelle K. Sturgell – Grantee Submission, 2024
Single-case experimental designs (SCEDs) have been used with increasing frequency to identify evidence-based interventions in education. The purpose of this study was to explore how several procedural characteristics, including within-phase variability (i.e., measurement error), number of baseline observations, and number of intervention…
Descriptors: Research Design, Case Studies, Effect Size, Error of Measurement
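The effect-size issue this abstract raises can be illustrated with a minimal sketch (a hypothetical illustration, not the authors' procedure): a standardized mean difference between the two phases of an AB single-case series, where greater within-phase variability (measurement error) inflates the pooled standard deviation and shrinks the resulting estimate.

```python
import statistics

def ab_effect_size(baseline, intervention):
    """Standardized mean difference between the phases of a single
    AB series, using the pooled within-phase standard deviation.
    More within-phase noise -> larger pooled SD -> smaller estimate."""
    mean_a = statistics.mean(baseline)
    mean_b = statistics.mean(intervention)
    var_a = statistics.variance(baseline)
    var_b = statistics.variance(intervention)
    n_a, n_b = len(baseline), len(intervention)
    sd_pooled = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (mean_b - mean_a) / sd_pooled

# Example: a stable baseline and a clearly elevated intervention phase.
d = ab_effect_size([2, 3, 2, 3, 2], [6, 7, 6, 7, 6])
```

Rerunning the same comparison with noisier phase data (simulated measurement error) demonstrates the attenuation the study investigates.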
Peer reviewed
Timothy Lycurgus; Daniel Almirall – Society for Research on Educational Effectiveness, 2024
Background: Education scientists are increasingly interested in constructing interventions that are adaptive over time to suit the evolving needs of students, classrooms, or schools. Such "adaptive interventions" (also referred to as dynamic treatment regimens or dynamic instructional regimes) determine which treatment should be offered…
Descriptors: Educational Research, Research Design, Randomized Controlled Trials, Intervention
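The core idea of an adaptive intervention can be sketched as a simple decision rule (a toy illustration under assumed names and thresholds, not a rule from this paper): every unit starts on a first-stage treatment, and second-stage treatment depends on observed response.

```python
def adaptive_intervention(stage1_response, responder_threshold=0.5):
    """Hypothetical two-stage adaptive intervention: units whose
    first-stage response meets the threshold stay the course;
    non-responders are switched to an intensified option."""
    if stage1_response >= responder_threshold:
        return "continue first-stage treatment"
    return "augment with intensified support"
```

Sequential multiple-assignment randomized trials (SMARTs) are the experimental designs typically used to compare such embedded decision rules.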
Ella Patrona; John Ferron; Arnold Olszewski; Elizabeth Kelley; Howard Goldstein – Grantee Submission, 2022
Purpose: Systematic reviews of literature are routinely conducted to identify practices that are effective in addressing educational and clinical problems. One complication, however, is how best to combine data from both group experimental design (GED) studies and single-case experimental design (SCED) studies. Percent of Goal Obtained (PoGO) has…
Descriptors: Preschool Children, Vocabulary Development, Intervention, Error of Measurement
Ella Patrona; John Ferron; Arnold Olszewski; Elizabeth Kelley; Howard Goldstein – Journal of Speech, Language, and Hearing Research, 2022
Purpose: Systematic reviews of literature are routinely conducted to identify practices that are effective in addressing educational and clinical problems. One complication, however, is how best to combine data from both group experimental design (GED) studies and single-case experimental design (SCED) studies. Percent of Goal Obtained (PoGO) has…
Descriptors: Preschool Children, Vocabulary Development, Intervention, Error of Measurement
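Percent of Goal Obtained, as described in the PoGO literature, expresses the change achieved relative to the change needed to reach a pre-set goal, which lets group-design and single-case results land on a common scale. A minimal sketch (the exact estimation details in these studies may differ):

```python
def percent_of_goal_obtained(baseline_mean, final_mean, goal):
    """PoGO: observed change as a percentage of the change required
    to move from the baseline level to the goal level."""
    return 100.0 * (final_mean - baseline_mean) / (goal - baseline_mean)

# Example: baseline vocabulary score 10, end-of-study score 25, goal 30.
pogo = percent_of_goal_obtained(10, 25, 30)
```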
Peer reviewed
Stallasch, Sophie E.; Lüdtke, Oliver; Artelt, Cordula; Brunner, Martin – Journal of Research on Educational Effectiveness, 2021
To plan cluster-randomized trials with sufficient statistical power to detect intervention effects on student achievement, researchers need multilevel design parameters, including measures of between-classroom and between-school differences and the amounts of variance explained by covariates at the student, classroom, and school level. Previous…
Descriptors: Foreign Countries, Randomized Controlled Trials, Intervention, Educational Research
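The role these design parameters play can be shown with a standard minimum-detectable-effect-size (MDES) approximation for a two-level cluster-randomized trial (a textbook-style sketch, not the paper's own computations; the 2.8 multiplier is a common approximation for alpha = .05 two-tailed with 80% power and many clusters):

```python
import math

def mdes_two_level(J, n, icc, r2_cluster=0.0, r2_student=0.0,
                   p_treat=0.5, multiplier=2.8):
    """Approximate MDES for a trial randomizing J clusters (e.g.,
    schools) of n students each. `icc` is the between-cluster share
    of outcome variance; the r2 terms are variance explained by
    covariates at each level."""
    denom = p_treat * (1 - p_treat) * J
    var_term = (icc * (1 - r2_cluster) / denom
                + (1 - icc) * (1 - r2_student) / (denom * n))
    return multiplier * math.sqrt(var_term)

# Example: 40 schools, 25 students each, ICC = .20, no covariates.
mdes = mdes_two_level(40, 25, 0.2)
```

Adding a strong cluster-level covariate (raising `r2_cluster`) shrinks the MDES, which is exactly why compendia of empirically estimated design parameters are useful for planning.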
Peer reviewed
Porter, Kristin E.; Reardon, Sean F.; Unlu, Fatih; Bloom, Howard S.; Cimpian, Joseph R. – Journal of Research on Educational Effectiveness, 2017
A valuable extension of the single-rating regression discontinuity design (RDD) is a multiple-rating RDD (MRRDD). To date, four main methods have been used to estimate average treatment effects at the multiple treatment frontiers of an MRRDD: the "surface" method, the "frontier" method, the "binding-score" method, and…
Descriptors: Regression (Statistics), Intervention, Quasiexperimental Design, Simulation
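One of the estimation approaches named here, the binding-score method, can be sketched in a few lines (a hypothetical formulation; the assignment-rule direction, treated when any rating falls below its cutoff, is an assumption for illustration): multiple ratings are collapsed into the single rating that "binds" assignment, and a standard single-rating RDD is then run on that score.

```python
def binding_score(ratings, cutoffs):
    """Collapse multiple ratings into one assignment score by
    centering each rating at its cutoff and taking the minimum;
    that minimum is the rating that determines treatment."""
    centered = [r - c for r, c in zip(ratings, cutoffs)]
    return min(centered)

def assigned_to_treatment(ratings, cutoffs):
    """Treated if any rating falls below its cutoff, i.e. the
    binding score is negative."""
    return binding_score(ratings, cutoffs) < 0
```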
Peer reviewed
Moeyaert, Mariola; Ugille, Maaike; Ferron, John M.; Onghena, Patrick; Heyvaert, Mieke; Beretvas, S. Natasha; Van den Noortgate, Wim – School Psychology Quarterly, 2015
The purpose of this study is to illustrate the multilevel meta-analysis of results from single-subject experimental designs of different types, including AB phase designs, multiple-baseline designs, ABAB reversal designs, and alternating treatment designs. Current methodological work on the meta-analysis of single-subject experimental designs…
Descriptors: Intervention, Multivariate Analysis, Meta Analysis, Research Design
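The simplest building block of the meta-analysis described here is inverse-variance pooling of case-level effects (a sketch of the basic fixed-effect step only; the multilevel models in the article generalize it by adding between-case and between-study variance components):

```python
def inverse_variance_pool(effects, variances):
    """Fixed-effect pooled estimate: weight each effect by the
    inverse of its sampling variance, then average."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return pooled, se

# Example: two equally precise single-case effects.
pooled, se = inverse_variance_pool([0.5, 1.0], [0.1, 0.1])
```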
Peer reviewed
PDF on ERIC
Deke, John; Wei, Thomas; Kautz, Tim – National Center for Education Evaluation and Regional Assistance, 2017
Evaluators of education interventions are increasingly designing studies to detect impacts much smaller than the 0.20 standard deviations that Cohen (1988) characterized as "small." While the need to detect smaller impacts is based on compelling arguments that such impacts are substantively meaningful, the drive to detect smaller impacts…
Descriptors: Intervention, Educational Research, Research Problems, Statistical Bias
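The arithmetic behind the design pressure this report describes is simple: with other design features held fixed, required sample size scales with the inverse square of the minimum detectable effect, so halving the MDES roughly quadruples the sample. A one-line sketch (illustrative, not the report's own calculation):

```python
def scale_sample_for_mdes(current_n, current_mdes, target_mdes):
    """Required sample size to move from one MDES to another,
    holding all other design features constant: n scales with
    (MDES_old / MDES_new) squared."""
    return current_n * (current_mdes / target_mdes) ** 2

# Moving from detecting 0.20 SD to 0.10 SD quadruples the sample.
needed = scale_sample_for_mdes(100, 0.20, 0.10)
```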
Porter, Kristin E.; Reardon, Sean F.; Unlu, Fatih; Bloom, Howard S.; Robinson-Cimpian, Joseph P. – MDRC, 2014
A valuable extension of the single-rating regression discontinuity design (RDD) is a multiple-rating RDD (MRRDD). To date, four main methods have been used to estimate average treatment effects at the multiple treatment frontiers of an MRRDD: the "surface" method, the "frontier" method, the "binding-score" method, and…
Descriptors: Regression (Statistics), Research Design, Quasiexperimental Design, Research Methodology
Peer reviewed
PDF on ERIC
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2009
This paper examines the estimation of two-stage clustered RCT designs in education research using the Neyman causal inference framework that underlies experiments. The key distinction between the considered causal models is whether potential treatment and control group outcomes are considered to be fixed for the study population (the…
Descriptors: Control Groups, Causal Models, Statistical Significance, Computation
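The Neyman framework's workhorse estimator can be sketched for the simplest (non-clustered) case: a difference in means with the standard variance estimator, which is conservative for the finite-population estimand because the covariance between a unit's treatment and control potential outcomes is not identifiable. (A textbook sketch, not the paper's clustered derivations.)

```python
import statistics

def neyman_diff_in_means(treated, control):
    """Difference-in-means impact estimate with Neyman's variance
    estimator s_t^2/n_t + s_c^2/n_c (conservative under the
    fixed-population causal model)."""
    diff = statistics.mean(treated) - statistics.mean(control)
    var = (statistics.variance(treated) / len(treated)
           + statistics.variance(control) / len(control))
    return diff, var ** 0.5
```

The clustered designs the paper analyzes apply the same logic at the level of school or classroom means.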
Peer reviewed
Jenson, William R.; Clark, Elaine; Kircher, John C.; Kristjansson, Sean D. – Psychology in the Schools, 2007
Evidence-based practice approaches to interventions have come of age and promise to provide a new standard of excellence for school psychologists. This article describes several definitions of evidence-based practice and the problems associated with traditional statistical analyses that rely on rejection of the null hypothesis for the…
Descriptors: School Psychologists, Statistical Analysis, Hypothesis Testing, Intervention
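One common single-case alternative to null-hypothesis rejection in the school psychology literature is a non-overlap effect size such as Percentage of Non-overlapping Data (PND). A sketch of PND for illustration (not necessarily the specific measure this article advocates):

```python
def pnd(baseline, intervention, improvement="increase"):
    """Percentage of Non-overlapping Data: the share of
    intervention-phase points more extreme than the most extreme
    baseline point, in the direction of expected improvement."""
    if improvement == "increase":
        threshold = max(baseline)
        exceed = sum(1 for x in intervention if x > threshold)
    else:
        threshold = min(baseline)
        exceed = sum(1 for x in intervention if x < threshold)
    return 100.0 * exceed / len(intervention)

# Example: 3 of 4 intervention points exceed the baseline maximum.
score = pnd([2, 3, 4], [5, 6, 4, 7])
```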