Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 1
Since 2006 (last 20 years): 5
Author
Spybrook, Jessaca: 5
Kelcey, Ben: 2
Dong, Nianbo: 1
Jones, Nathan: 1
Phelps, Geoffrey: 1
Raudenbush, Stephen W.: 1
Westine, Carl: 1
Zhang, Jiaqi: 1
Zhang, Qi: 1
Publication Type
Reports - Research: 4
Journal Articles: 3
Information Analyses: 1
Numerical/Quantitative Data: 1
Reports - Evaluative: 1
Education Level
Elementary Education: 2
Elementary Secondary Education: 2
Junior High Schools: 2
Middle Schools: 2
Secondary Education: 2
Early Childhood Education: 1
High Schools: 1
Preschool Education: 1
Location
Texas: 1
Laws, Policies, & Programs
Assessments and Surveys
What Works Clearinghouse Rating
Spybrook, Jessaca; Zhang, Qi; Kelcey, Ben; Dong, Nianbo – Educational Evaluation and Policy Analysis, 2020
Over the past 15 years, we have seen an increase in the use of cluster randomized trials (CRTs) to test the efficacy of educational interventions. These studies are often designed with the goal of determining whether a program works, that is, answering the "what works" question. Recently, the goals of these studies have expanded to include for whom and under…
Descriptors: Randomized Controlled Trials, Educational Research, Program Effectiveness, Intervention
Spybrook, Jessaca – Journal of Experimental Education, 2014
The Institute of Education Sciences has funded more than 100 experiments to evaluate educational interventions in an effort to generate scientific evidence of program effectiveness on which to base education policy and practice. In general, these studies are designed with the goal of having adequate statistical power to detect the average…
Descriptors: Intervention, Educational Research, Research Methodology, Statistical Analysis
Westine, Carl; Spybrook, Jessaca – Society for Research on Educational Effectiveness, 2013
The capacity of the field to conduct power analyses for group randomized trials (GRTs) of educational interventions has improved over the past decade (Authors, 2009). However, a power analysis depends on estimates of design parameters. Hence it is critical to build the empirical base of design parameters for GRTs across a variety of outcomes and…
Descriptors: Randomized Controlled Trials, Research Design, Correlation, Program Effectiveness
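As context for why these design parameter estimates matter (this is the standard two-level formula from the power analysis literature, not a result taken from the paper): for a GRT with J clusters of n students each, proportion P of clusters assigned to treatment, and intraclass correlation ρ, the minimum detectable effect size is approximately

MDES ≈ M_(J−2) · sqrt( (ρ + (1 − ρ)/n) / (P(1 − P) · J) )

where the multiplier M_(J−2) ≈ t(1 − α/2, J − 2) + t(1 − β, J − 2) for a two-tailed test at level α with power 1 − β. Because the (1 − ρ)/n term shrinks as clusters grow while ρ does not, the MDES is governed largely by ρ and J, which is why empirical estimates of intraclass correlations across a variety of outcomes are needed for credible power analyses.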
Kelcey, Ben; Spybrook, Jessaca; Zhang, Jiaqi; Phelps, Geoffrey; Jones, Nathan – Society for Research on Educational Effectiveness, 2015
With research indicating substantial differences among teachers in terms of their effectiveness (Nye, Konstantopoulos, & Hedges, 2004), a major focus of recent research in education has been on improving teacher quality through professional development (Desimone, 2009; Institute of Education Sciences [IES], 2012; Measures of Effective…
Descriptors: Teacher Effectiveness, Faculty Development, Program Design, Educational Research
Spybrook, Jessaca; Raudenbush, Stephen W. – Educational Evaluation and Policy Analysis, 2009
This article examines the power analyses for the first wave of group-randomized trials funded by the Institute of Education Sciences. Specifically, it assesses the precision and technical accuracy of the studies. The authors identified the appropriate experimental design and estimated the minimum detectable standardized effect size (MDES) for each…
Descriptors: Research Design, Research Methodology, Effect Size, Correlation
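A minimal sketch of the kind of MDES calculation these studies describe, using the standard two-level formula noted above; the function name and parameter values are illustrative and not drawn from the authors' work.

from scipy.stats import t

def mdes_two_level(J, n, icc, P=0.5, alpha=0.05, power=0.80):
    """Approximate minimum detectable effect size (in standard deviation
    units) for a two-level cluster randomized trial with J clusters of n
    students each, intraclass correlation `icc`, and proportion P of
    clusters assigned to treatment (no covariates)."""
    df = J - 2  # degrees of freedom for the cluster-level treatment test
    multiplier = t.ppf(1 - alpha / 2, df) + t.ppf(power, df)
    variance_ratio = (icc + (1 - icc) / n) / (P * (1 - P) * J)
    return multiplier * variance_ratio ** 0.5

# Illustrative example: 40 schools, 60 students per school, ICC = 0.15
print(round(mdes_two_level(J=40, n=60, icc=0.15), 3))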