Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 1
Since 2016 (last 10 years): 7
Since 2006 (last 20 years): 17
Descriptor
Program Evaluation: 25
Research Design: 25
Sample Size: 25
Educational Research: 11
Program Effectiveness: 11
Sampling: 10
Research Methodology: 9
Effect Size: 8
Statistical Analysis: 8
Intervention: 7
Evaluation Methods: 5
Author
Slavin, Robert E.: 3
Cheung, Alan C. K.: 2
Deke, John: 2
Kautz, Tim: 2
Schochet, Peter Z.: 2
Slavin, Robert: 2
Spybrook, Jessaca: 2
Wei, Thomas: 2
Bell, Stephen H.: 1
Bickman, Leonard: 1
Burghardt, John: 1
Publication Type
Journal Articles: 10
Reports - Research: 10
Reports - Evaluative: 8
Guides - Non-Classroom: 4
Reports - Descriptive: 2
Books: 1
Numerical/Quantitative Data: 1
Speeches/Meeting Papers: 1
Education Level
Elementary Education: 4
Elementary Secondary Education: 4
Grade 3: 2
Grade 4: 2
Grade 5: 2
Grade 6: 2
Grade 7: 2
Grade 8: 2
Early Childhood Education: 1
Junior High Schools: 1
Middle Schools: 1
Audience
Researchers: 2
Policymakers: 1
Location
Florida: 2
Laws, Policies, & Programs
Assessments and Surveys
What Works Clearinghouse Rating
Deke, John; Wei, Thomas; Kautz, Tim – Journal of Research on Educational Effectiveness, 2021
Evaluators of education interventions are increasingly designing studies to detect impacts much smaller than the 0.20 standard deviations that Cohen characterized as "small." While the need to detect smaller impacts is based on compelling arguments that such impacts are substantively meaningful, the drive to detect smaller impacts may…
Descriptors: Intervention, Program Evaluation, Sample Size, Randomized Controlled Trials
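A rough sense of the sample sizes implied by chasing smaller impacts: the sketch below is illustrative only, uses conventional defaults (two-sided α = 0.05, 80% power) rather than any parameters from the article, and computes the minimum detectable effect size (MDES) for a simple two-arm, individual-level trial at several sample sizes.

```python
# Illustrative only: MDES (in standard deviation units) for a simple
# two-arm, individual-level RCT at conventional defaults
# (two-sided alpha = 0.05, 80% power). These settings are assumptions,
# not parameters taken from Deke, Wei, and Kautz (2021).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for n_per_arm in (100, 400, 1600, 6400):
    # Leaving effect_size unspecified tells solve_power to solve for it.
    mdes = analysis.solve_power(nobs1=n_per_arm, alpha=0.05, power=0.80,
                                ratio=1.0, alternative='two-sided')
    print(f"n per arm = {n_per_arm:5d}  ->  MDES ~ {mdes:.3f} SD")
```

Roughly, halving the detectable effect requires about four times the sample, which is why targets well below 0.20 standard deviations drive up study size so quickly.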
Wolf, Rebecca; Morrison, Jennifer; Inns, Amanda; Slavin, Robert; Risman, Kelsey – Journal of Research on Educational Effectiveness, 2020
Rigorous evidence of program effectiveness has become increasingly important with the 2015 passage of the Every Student Succeeds Act (ESSA). One question that has not yet been fully explored is whether program evaluations carried out or commissioned by developers produce larger effect sizes than evaluations conducted by independent third parties.…
Descriptors: Program Evaluation, Program Effectiveness, Effect Size, Sample Size
Deke, John; Wei, Thomas; Kautz, Tim – Society for Research on Educational Effectiveness, 2018
Evaluators of education interventions increasingly need to design studies to detect impacts much smaller than the 0.20 standard deviations that Cohen (1988) characterized as "small." For example, an evaluation of Response to Intervention from the Institute of Education Sciences (IES) detected impacts ranging from 0.13 to 0.17 standard…
Descriptors: Intervention, Program Evaluation, Sample Size, Randomized Controlled Trials
What Works Clearinghouse, 2018
Underlying all What Works Clearinghouse (WWC) products are WWC Study Review Guides, which are intended for use by WWC-certified reviewers to assess studies against the WWC evidence standards. As part of an ongoing effort to increase transparency, promote collaboration, and encourage widespread use of the WWC standards, the Institute of Education…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Louie, Josephine; Rhoads, Christopher; Mark, June – American Journal of Evaluation, 2016
Interest in the regression discontinuity (RD) design as an alternative to randomized control trials (RCTs) has grown in recent years. There is little practical guidance, however, on conditions that would lead to a successful RD evaluation or the utility of studies with underpowered RD designs. This article describes the use of RD design to…
Descriptors: Regression (Statistics), Program Evaluation, Algebra, Supplementary Education
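For readers unfamiliar with the design, the sketch below simulates a regression discontinuity and estimates the impact at the cutoff with a local linear fit on each side; the cutoff, bandwidth, and data-generating process are invented for illustration and are unrelated to the algebra program evaluated in the article.

```python
# Minimal regression discontinuity (RD) sketch on simulated data.
# The cutoff, bandwidth, and data-generating process are invented;
# they do not come from Louie, Rhoads, and Mark (2016).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, cutoff, true_effect = 2000, 50.0, 0.15

score = rng.uniform(0, 100, n)             # running (assignment) variable
treated = (score < cutoff).astype(float)   # e.g., low scorers receive the program
outcome = 0.01 * score + true_effect * treated + rng.normal(0, 1, n)

# Local linear regression within a bandwidth of the cutoff, allowing
# different slopes on each side of it.
bandwidth = 10.0
near = np.abs(score - cutoff) <= bandwidth
centered = score[near] - cutoff
X = sm.add_constant(np.column_stack(
    [treated[near], centered, treated[near] * centered]))
fit = sm.OLS(outcome[near], X).fit()
print(f"estimated impact at the cutoff: {fit.params[1]:.3f} (SE {fit.bse[1]:.3f})")
```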
What Works Clearinghouse, 2016
This document provides step-by-step instructions on how to complete the Study Review Guide (SRG, Version S3, V2) for single-case designs (SCDs). Reviewers will complete an SRG for every What Works Clearinghouse (WWC) review. A completed SRG should be a reviewer's independent assessment of the study, relative to the criteria specified in the review…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Harvill, Eleanor L.; Peck, Laura R.; Bell, Stephen H. – American Journal of Evaluation, 2013
Using exogenous characteristics to identify endogenous subgroups, the approach discussed in this method note creates symmetric subsets within treatment and control groups, allowing the analysis to take advantage of an experimental design. In order to maintain treatment--control symmetry, however, prior work has posited that it is necessary to use…
Descriptors: Experimental Groups, Control Groups, Research Design, Sampling
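The logic of symmetric subsets can be illustrated with a small, hypothetical sketch: because the subgroup rule uses only a characteristic measured before random assignment, the same rule partitions both the treatment and control groups, and simple mean differences within each subgroup remain experimental. All variable names and data below are invented.

```python
# Hypothetical sketch of subgroup impacts defined by an exogenous
# (pre-randomization) characteristic, so the identical rule partitions
# both experimental arms. Data and variable names are invented.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 4000
df = pd.DataFrame({
    "treat": rng.integers(0, 2, n),           # random assignment
    "baseline_score": rng.normal(0, 1, n),    # measured before randomization
})
df["outcome"] = 0.2 * df["treat"] + 0.5 * df["baseline_score"] + rng.normal(0, 1, n)

# Symmetric subsets: the rule depends only on the baseline variable.
df["subgroup"] = np.where(df["baseline_score"] < 0, "low_baseline", "high_baseline")

impacts = (df.groupby(["subgroup", "treat"])["outcome"].mean()
             .unstack("treat"))
impacts["impact"] = impacts[1] - impacts[0]   # treatment minus control mean
print(impacts)
```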
Cheung, Alan; Slavin, Robert – Society for Research on Educational Effectiveness, 2016
As evidence-based reform becomes increasingly important in educational policy, it is becoming essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. The purpose of this study was to examine how methodological features such as types of publication, sample sizes, and…
Descriptors: Effect Size, Evidence Based Practice, Educational Change, Educational Policy
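One common way to examine such questions is a weighted regression of study-level effect sizes on study characteristics; the sketch below shows the mechanics on fabricated data and does not reproduce the authors' analysis.

```python
# Sketch of a weighted regression of study effect sizes on methodological
# features (sample size, published vs. unpublished). The study-level data
# are fabricated to show the mechanics; they are not results from
# Cheung and Slavin (2016).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
k = 60  # number of hypothetical studies
studies = pd.DataFrame({
    "effect_size": rng.normal(0.20, 0.15, k),
    "n_students": rng.integers(50, 3000, k),
    "published": rng.integers(0, 2, k),
})
# Weight each study by the inverse of an approximate effect-size variance
# (about 4/n for a standardized mean difference with equal-sized groups).
studies["var_es"] = 4.0 / studies["n_students"]

model = smf.wls("effect_size ~ np.log10(n_students) + published",
                data=studies, weights=1.0 / studies["var_es"]).fit()
print(model.summary().tables[1])
```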
Kelcey, Ben; Spybrook, Jessaca; Zhang, Jiaqi; Phelps, Geoffrey; Jones, Nathan – Society for Research on Educational Effectiveness, 2015
With research indicating substantial differences among teachers in terms of their effectiveness (Nye, Konstantopoulos, & Hedges, 2004), a major focus of recent research in education has been on improving teacher quality through professional development (Desimone, 2009; Institute of Education Sciences [IES], 2012; Measures of Effective…
Descriptors: Teacher Effectiveness, Faculty Development, Program Design, Educational Research
Spybrook, Jessaca; Lininger, Monica; Cullen, Anne – Society for Research on Educational Effectiveness, 2011
The purpose of this study is to extend the work of Spybrook and Raudenbush (2009) and examine how the research designs and sample sizes changed from the planning phase to the implementation phase in the first wave of studies funded by IES. The authors examine how these changes affected the precision of the study from the…
Descriptors: Evaluation Criteria, Sampling, Research Design, Planning
Cheung, Alan C. K.; Slavin, Robert E. – Center for Research and Reform in Education, 2012
This review examines the effectiveness of educational technology applications in improving the reading achievement of struggling readers in elementary schools. The review applies consistent inclusion standards to focus on studies that met high methodological standards. A total of 20 studies based on about 7,000 students in grades K-6 were included…
Descriptors: Reading Achievement, Educational Technology, Reading Difficulties, Reading Programs
Cheung, Alan C. K.; Slavin, Robert E. – Center for Research and Reform in Education, 2012
The purpose of this review is to learn from rigorous evaluations of alternative technology applications how features of using technology programs and characteristics of their evaluations affect reading outcomes for students in grades K-12. The review applies consistent inclusion standards to focus on studies that met high methodological standards.…
Descriptors: Reading Achievement, Elementary Secondary Education, Educational Technology, Meta Analysis
Slavin, Robert E. – Educational Researcher, 2008
Syntheses of research on educational programs have taken on increasing policy importance. Procedures for performing such syntheses must therefore produce reliable, unbiased, and meaningful information on the strength of evidence behind each program. Because evaluations of any given program are few in number, syntheses of program evaluations must…
Descriptors: Research Design, Program Evaluation, Effect Size, Synthesis
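Because any one program usually has only a few evaluations, syntheses typically pool a handful of effect sizes per program; below is a minimal fixed-effect, inverse-variance-weighted pooling sketch with made-up numbers that are not drawn from any review listed here.

```python
# Minimal fixed-effect (inverse-variance weighted) pooling of a handful
# of effect sizes for one hypothetical program. The estimates and
# standard errors are made up for illustration.
import numpy as np

effect_sizes = np.array([0.12, 0.25, 0.08])      # hypothetical study estimates
standard_errors = np.array([0.10, 0.15, 0.06])   # hypothetical standard errors

weights = 1.0 / standard_errors**2
pooled = np.sum(weights * effect_sizes) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
print(f"pooled effect size: {pooled:.3f} (SE {pooled_se:.3f})")
```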
Toby, Megan; Jaciw, Andrew; Ma, Boya; Lipton, Akiko – Empirical Education Inc., 2011
PCI Education conducted a three-year longitudinal study to determine the comparative effectiveness of the "PCI Reading Program" ("PCI") for students with severe disabilities as implemented in Florida's Brevard Public Schools and Miami-Dade County Public Schools. The primary question addressed by the study is whether students…
Descriptors: Reading Programs, Disabilities, Urban Schools, Public Schools
Schochet, Peter Z. – Journal of Educational and Behavioral Statistics, 2008
This article examines theoretical and empirical issues related to the statistical power of impact estimates for experimental evaluations of education programs. The author considers designs where random assignment is conducted at the school, classroom, or student level, and employs a unified analytic framework using statistical methods from the…
Descriptors: Elementary School Students, Research Design, Standardized Tests, Program Evaluation
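How strongly power depends on the level of random assignment can be seen from the standard large-sample approximation to the minimum detectable effect size of a two-level, school-randomized design; the intraclass correlation and sample sizes in the sketch below are placeholders, not values from the article.

```python
# Approximate MDES (in SD units) for a two-level design with random
# assignment at the school level and a 50/50 split of schools, using the
# common large-sample formula
#   MDES ~ (z_{1-alpha/2} + z_power) * sqrt(4 * (rho + (1 - rho)/n) / J).
# The ICC and sample sizes below are placeholders, not values from
# Schochet (2008).
from scipy.stats import norm

def cluster_mdes(n_schools, students_per_school, icc, alpha=0.05, power=0.80):
    multiplier = norm.ppf(1 - alpha / 2) + norm.ppf(power)   # about 2.80
    variance = 4.0 * (icc + (1.0 - icc) / students_per_school) / n_schools
    return multiplier * variance ** 0.5

for schools in (20, 40, 80):
    mdes = cluster_mdes(schools, students_per_school=60, icc=0.15)
    print(f"{schools} schools -> MDES ~ {mdes:.3f} SD")
```

In this illustration, even 80 schools leaves the MDES near a quarter of a standard deviation, which is why the level at which randomization occurs matters so much for power.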