Showing all 12 results
Peer reviewed
Direct link
Li, Wei; Dong, Nianbo; Maynard, Rebecca; Spybrook, Jessaca; Kelcey, Ben – Journal of Research on Educational Effectiveness, 2023
Cluster randomized trials (CRTs) are commonly used to evaluate the effectiveness of educational interventions. Recently, there has been greater emphasis on using these trials to explore cost-effectiveness as well. However, methods for establishing the power of cluster randomized cost-effectiveness trials (CRCETs) are limited. This study…
Descriptors: Research Design, Statistical Analysis, Randomized Controlled Trials, Cost Effectiveness
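The power calculations this entry refers to follow well-documented CRT machinery. As a rough, generic sketch only (the authors' cost-effectiveness extension is not reproduced here), the following Python approximates two-sided power for a balanced two-level CRT with treatment assigned at the cluster level; all design values (J, n, rho, delta) are hypothetical.

```python
from scipy.stats import norm

def crt_power(J, n, rho, delta, alpha=0.05):
    """Approximate two-sided power for a balanced two-level cluster
    randomized trial with treatment assigned at the cluster level.

    J     -- total number of clusters (half treated, half control)
    n     -- individuals per cluster
    rho   -- intraclass correlation coefficient (ICC)
    delta -- standardized treatment effect size
    """
    # Variance of the estimated effect in standardized units:
    # the 4/J allocation factor times the design effect rho + (1 - rho)/n.
    var_effect = 4.0 * (rho + (1.0 - rho) / n) / J
    lam = delta / var_effect ** 0.5          # noncentrality on the z scale
    z_crit = norm.ppf(1.0 - alpha / 2.0)
    return norm.sf(z_crit - lam) + norm.cdf(-z_crit - lam)

# Hypothetical design: 40 schools, 25 students each, ICC = 0.15,
# target effect of 0.25 standard deviations.
print(round(crt_power(J=40, n=25, rho=0.15, delta=0.25), 3))
```

The normal approximation slightly overstates power when the number of clusters is small; a t reference distribution with J - 2 degrees of freedom is the usual refinement.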
Peer reviewed
Direct link
Spybrook, Jessaca; Anderson, Dustin; Maynard, Rebecca – Journal of Research on Educational Effectiveness, 2019
The Society for Research on Educational Effectiveness, with funding from the Institute of Education Sciences (IES), has been working to develop and implement a registry for education studies, the Registry of Efficacy and Effectiveness Studies (REES) (https://www.sree.org/pages/registry.php). REES aims to increase transparency, rigor, and…
Descriptors: Educational Research, Databases, Research Design, Database Design
Spybrook, Jessaca; Zhang, Qi; Kelcey, Ben; Dong, Nianbo – Educational Evaluation and Policy Analysis, 2020
Over the past 15 years, we have seen an increase in the use of cluster randomized trials (CRTs) to test the efficacy of educational interventions. These studies are often designed with the goal of determining whether a program works, or answering the what works question. Recently, the goals of these studies expanded to include for whom and under…
Descriptors: Randomized Controlled Trials, Educational Research, Program Effectiveness, Intervention
Peer reviewed
Direct link
Kelcey, Ben; Spybrook, Jessaca; Dong, Nianbo; Bai, Fangxing – Journal of Research on Educational Effectiveness, 2020
Professional development for teachers is regarded as one of the principal pathways through which we can understand and cultivate effective teaching and improve student outcomes. A critical component of studies that seek to improve teaching through professional development is the detailed assessment of the intermediate teacher development processes…
Descriptors: Faculty Development, Educational Research, Randomized Controlled Trials, Research Design
Peer reviewed
Direct link
Dong, Nianbo; Kelcey, Benjamin; Spybrook, Jessaca – Journal of Experimental Education, 2018
Researchers are often interested in whether the effects of an intervention differ conditional on individual- or group-moderator variables such as children's characteristics (e.g., gender), teacher's background (e.g., years of teaching), and school's characteristics (e.g., urbanicity); that is, the researchers seek to examine for whom and under what…
Descriptors: Statistical Analysis, Randomized Controlled Trials, Intervention, Effect Size
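The moderator ("for whom and under what conditions") analyses described above typically reduce to a treatment-by-moderator interaction in a multilevel model. A minimal simulated sketch, not tied to the authors' power formulas; the design values and effect sizes below are invented for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulate a two-level design: 40 clusters of 25 individuals, treatment
# assigned at the cluster level, a binary individual-level moderator.
J, n = 40, 25
cluster = np.repeat(np.arange(J), n)
treat = np.repeat(rng.permutation([0, 1] * (J // 2)), n)
mod = rng.integers(0, 2, size=J * n)
u = rng.normal(0, 0.4, size=J)[cluster]           # cluster random effect
# True effect is 0.2 when mod == 0 and 0.4 when mod == 1,
# so the interaction (moderator effect) is 0.2.
y = 0.2 * treat + 0.2 * treat * mod + u + rng.normal(0, 1, size=J * n)

df = pd.DataFrame({"y": y, "treat": treat, "mod": mod, "cluster": cluster})

# The treat:mod coefficient estimates how the treatment effect differs
# across moderator levels; the mixed model accounts for clustering.
fit = smf.mixedlm("y ~ treat * mod", df, groups=df["cluster"]).fit()
print(fit.summary())
```

Because moderator effects are estimated with less precision than main effects, trials powered only for the average effect are often underpowered for this interaction, which is what motivates dedicated power formulas.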
Peer reviewed
Direct link
Westine, Carl D.; Unlu, Fatih; Taylor, Joseph; Spybrook, Jessaca; Zhang, Qi; Anderson, Brent – Journal of Research on Educational Effectiveness, 2020
Experimental research in education and training programs typically involves administering treatment to whole groups of individuals. As such, researchers rely on the estimation of design parameter values to conduct power analyses to efficiently plan their studies to detect desired effects. In this study, we present design parameter estimates from a…
Descriptors: Outcome Measures, Science Education, Mathematics Education, Intervention
Peer reviewed
Direct link
Spybrook, Jessaca; Hedges, Larry; Borenstein, Michael – Journal of Research on Educational Effectiveness, 2014
Research designs in which clusters are the unit of randomization are quite common in the social sciences. Given the multilevel nature of these studies, their power analyses are more complex than those for simple individually randomized trials. Tools are now available to help researchers conduct power analyses for cluster randomized…
Descriptors: Statistical Analysis, Research Design, Vocabulary, Coding
Peer reviewed
Direct link
Kelcey, Ben; Spybrook, Jessaca; Phelps, Geoffrey; Jones, Nathan; Zhang, Jiaqi – Journal of Experimental Education, 2017
We develop a theoretical and empirical basis for the design of teacher professional development studies. We build on previous work by (a) developing estimates of intraclass correlation coefficients for teacher outcomes using two- and three-level data structures, (b) developing estimates of the variance explained by covariates, and (c) modifying…
Descriptors: Faculty Development, Research Design, Teacher Effectiveness, Correlation
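Intraclass correlation coefficients like those this entry describes are typically estimated from variance components. A generic sketch of the one-way ANOVA estimator for balanced two-level data; the numbers below are simulated, not the paper's teacher-outcome estimates.

```python
import numpy as np

def anova_icc(data):
    """One-way ANOVA estimator of the ICC for balanced two-level data.

    data -- array of shape (J, n): J clusters, n units per cluster.
    """
    J, n = data.shape
    cluster_means = data.mean(axis=1)
    grand_mean = data.mean()
    msb = n * np.sum((cluster_means - grand_mean) ** 2) / (J - 1)
    msw = np.sum((data - cluster_means[:, None]) ** 2) / (J * (n - 1))
    # Between-cluster variance component, truncated at 0 if MSB < MSW.
    tau2 = max((msb - msw) / n, 0.0)
    return tau2 / (tau2 + msw)

# Hypothetical check: simulate data with a true ICC of 0.2 / (0.2 + 0.8).
rng = np.random.default_rng(1)
J, n, tau2, sigma2 = 200, 20, 0.2, 0.8
data = rng.normal(0, tau2 ** 0.5, (J, 1)) + rng.normal(0, sigma2 ** 0.5, (J, n))
print(round(anova_icc(data), 3))  # should land near 0.2
```

In practice, estimates like these feed directly into the design-effect term of CRT power calculations, which is why compendia of empirical ICCs are valuable for planning.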
Peer reviewed
Direct link
Spybrook, Jessaca – Journal of Experimental Education, 2014
The Institute of Education Sciences has funded more than 100 experiments to evaluate educational interventions in an effort to generate scientific evidence of program effectiveness on which to base education policy and practice. In general, these studies are designed with the goal of having adequate statistical power to detect the average…
Descriptors: Intervention, Educational Research, Research Methodology, Statistical Analysis
Peer reviewed
Direct link
Spybrook, Jessaca; Puente, Anne Cullen; Lininger, Monica – Journal of Research on Educational Effectiveness, 2013
This article examines changes in the research design, sample size, and precision between the planning phase and implementation phase of group randomized trials (GRTs) funded by the Institute of Education Sciences. Thirty-eight GRTs funded between 2002 and 2006 were examined. Three studies revealed changes in the experimental design. Ten studies…
Descriptors: Educational Research, Research Design, Sample Size, Accuracy
Peer reviewed
Direct link
Spybrook, Jessaca; Raudenbush, Stephen W. – Educational Evaluation and Policy Analysis, 2009
This article examines the power analyses for the first wave of group-randomized trials funded by the Institute of Education Sciences. Specifically, it assesses the precision and technical accuracy of the studies. The authors identified the appropriate experimental design and estimated the minimum detectable standardized effect size (MDES) for each…
Descriptors: Research Design, Research Methodology, Effect Size, Correlation
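The minimum detectable effect size (MDES) referenced in this entry is conventionally computed as a degrees-of-freedom-based multiplier times the standard error of the standardized effect estimate. A sketch under simplifying assumptions (balanced two-level CRT, cluster-level treatment, optional cluster-level covariate R²); the inputs are illustrative, not values from the reviewed studies.

```python
from scipy.stats import t

def crt_mdes(J, n, rho, R2=0.0, alpha=0.05, power=0.80):
    """Minimum detectable effect size for a balanced two-level CRT.

    J   -- total clusters (half treated); n -- units per cluster
    rho -- ICC; R2 -- proportion of between-cluster variance explained
           by a cluster-level covariate (shrinks the detectable effect)
    """
    df = J - 2 - (1 if R2 > 0 else 0)   # lose a df per cluster covariate
    multiplier = t.ppf(1 - alpha / 2, df) + t.ppf(power, df)
    se = (4.0 * (rho * (1 - R2) + (1 - rho) / n) / J) ** 0.5
    return multiplier * se

# Illustrative inputs: 40 schools, 25 students each, ICC = 0.15.
print(round(crt_mdes(J=40, n=25, rho=0.15), 3))          # no covariates
print(round(crt_mdes(J=40, n=25, rho=0.15, R2=0.5), 3))  # pretest R2 = 0.5
```

The second call illustrates why a good cluster-level pretest matters: explaining half the between-cluster variance noticeably lowers the effect size a fixed design can detect.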
Peer reviewed
Direct link
Spybrook, Jessaca – Journal of Research on Educational Effectiveness, 2008
This study examines the reporting of power analyses in the group randomized trials funded by the Institute of Education Sciences from 2002 to 2006. A detailed power analysis provides critical information that allows reviewers to (a) replicate the power analysis and (b) assess whether the parameters used in the power analysis are reasonable.…
Descriptors: Statistical Analysis, Correlation, Research Methodology, Research Design