Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 0
Since 2006 (last 20 years): 3
Descriptor
Program Evaluation: 3
Research Design: 3
Correlation: 2
Educational Research: 2
Effect Size: 2
Evaluation Research: 2
Program Effectiveness: 2
Sample Size: 2
Statistical Analysis: 2
Algebra: 1
Comparative Analysis: 1
Author
Spybrook, Jessaca: 3
Cullen, Anne: 1
Jones, Nathan: 1
Kelcey, Ben: 1
Lininger, Monica: 1
Phelps, Geoffrey: 1
Raudenbush, Stephen W.: 1
Zhang, Jiaqi: 1
Publication Type
Reports - Research: 2
Journal Articles: 1
Reports - Evaluative: 1
Education Level
Elementary Education: 1
Junior High Schools: 1
Middle Schools: 1
Secondary Education: 1
Kelcey, Ben; Spybrook, Jessaca; Zhang, Jiaqi; Phelps, Geoffrey; Jones, Nathan – Society for Research on Educational Effectiveness, 2015
With research indicating substantial differences among teachers in terms of their effectiveness (Nye, Konstantopoulos, & Hedges, 2004), a major focus of recent research in education has been on improving teacher quality through professional development (Desimone, 2009; Institute of Education Sciences [IES], 2012; Measures of Effective…
Descriptors: Teacher Effectiveness, Faculty Development, Program Design, Educational Research
Spybrook, Jessaca; Lininger, Monica; Cullen, Anne – Society for Research on Educational Effectiveness, 2011
The purpose of this study is to extend the work of Spybrook and Raudenbush (2009) and examine how the research designs and sample sizes changed from the planning phase to the implementation phase in the first wave of studies funded by IES. The authors examine the impact of these design changes in terms of the change in the precision of the study from the…
Descriptors: Evaluation Criteria, Sampling, Research Design, Planning
Spybrook, Jessaca; Raudenbush, Stephen W. – Educational Evaluation and Policy Analysis, 2009
This article examines the power analyses for the first wave of group-randomized trials funded by the Institute of Education Sciences. Specifically, it assesses the precision and technical accuracy of the studies. The authors identified the appropriate experimental design and estimated the minimum detectable standardized effect size (MDES) for each…
Descriptors: Research Design, Research Methodology, Effect Size, Correlation
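Two of the abstracts above concern power analyses and the minimum detectable effect size (MDES) for group-randomized trials. As a point of reference only, the sketch below applies the standard approximation for a two-level cluster-randomized trial with equal cluster sizes and no covariates; the function name and the parameter values in the example are illustrative assumptions, not figures taken from the studies listed here.

```python
# Minimal sketch (not code from the cited studies): MDES for a two-level
# cluster-randomized trial, treatment assigned at the cluster level.
from scipy.stats import t

def mdes_cluster_rct(J, n, rho, P=0.5, alpha=0.05, power=0.80):
    """Minimum detectable standardized effect size.

    J     -- number of clusters (e.g., schools)
    n     -- individuals per cluster (e.g., students)
    rho   -- intraclass correlation (share of variance between clusters)
    P     -- proportion of clusters assigned to treatment
    alpha -- two-tailed significance level
    power -- desired statistical power
    """
    df = J - 2  # degrees of freedom for a cluster-level treatment effect, no covariates
    multiplier = t.ppf(1 - alpha / 2, df) + t.ppf(power, df)
    variance = rho / (P * (1 - P) * J) + (1 - rho) / (P * (1 - P) * J * n)
    return multiplier * variance ** 0.5

# Illustrative example: 40 schools, 60 students each, ICC = 0.20, balanced assignment
print(round(mdes_cluster_rct(J=40, n=60, rho=0.20), 2))  # about 0.42 standard deviations
```

In this approximation the between-cluster term rho / (P(1-P)J) usually dominates, which is why the number of clusters, rather than the number of students per cluster, drives the precision of group-randomized trials.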