Hallberg, Kelly; Williams, Ryan; Swanlund, Andrew; Eno, Jared – Educational Researcher, 2018
Short comparative interrupted time series (CITS) designs are increasingly being used in education research to assess the effectiveness of school-level interventions. These designs can be implemented relatively inexpensively, often drawing on publicly available data on aggregate school performance. However, the validity of this approach hinges on…
Descriptors: Educational Research, Research Methodology, Comparative Analysis, Time
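The mean-change logic behind a CITS estimate can be illustrated with a toy calculation. All numbers below are made up, and a full CITS analysis would additionally model and project each group's baseline trend rather than just comparing period means:

```python
# Hypothetical school-level mean scores for three pre- and three
# post-intervention years (illustrative numbers only).
treated_pre, treated_post = [62.0, 63.0, 64.0], [70.0, 71.0, 72.0]
comp_pre, comp_post = [61.0, 62.0, 63.0], [64.0, 65.0, 66.0]

def mean(xs):
    return sum(xs) / len(xs)

# Simple mean-change estimate: the treated schools' pre-post change
# minus the comparison schools' change over the same window.
effect = (mean(treated_post) - mean(treated_pre)) - (mean(comp_post) - mean(comp_pre))
print(effect)  # 8.0 - 3.0 = 5.0
```

The comparison series is what distinguishes CITS from a plain interrupted time series: it nets out changes (e.g. a statewide test redesign) that would otherwise be mistaken for the intervention's effect.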
Coalition for Evidence-Based Policy, 2014
This guide is addressed to policy officials, program providers, and researchers who are seeking to: (1) identify and implement social programs backed by valid evidence of effectiveness; or (2) sponsor or conduct an evaluation to determine whether a program is effective. The guide provides a brief overview of which studies can produce valid…
Descriptors: Program Effectiveness, Program Design, Evidence, Social Work
Cheung, Alan; Slavin, Robert – Society for Research on Educational Effectiveness, 2016
As evidence-based reform becomes increasingly important in educational policy, it is becoming essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. The purpose of this study was to examine how methodological features such as types of publication, sample sizes, and…
Descriptors: Effect Size, Evidence Based Practice, Educational Change, Educational Policy
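One of the methodological quantities at issue in studies like this, the standardized mean-difference effect size (Cohen's d with a pooled standard deviation), can be computed as follows. The scores are invented for illustration:

```python
from statistics import mean, stdev

# Hypothetical posttest scores (illustrative numbers only).
treatment = [10.0, 12.0, 14.0]
control = [8.0, 10.0, 12.0]

n1, n2 = len(treatment), len(control)
# Pooled standard deviation across the two groups.
pooled_sd = (((n1 - 1) * stdev(treatment) ** 2 +
              (n2 - 1) * stdev(control) ** 2) / (n1 + n2 - 2)) ** 0.5
d = (mean(treatment) - mean(control)) / pooled_sd
print(round(d, 3))  # 1.0
```

Because d is scaled by the sample's dispersion, design features such as small or restricted samples can inflate it, which is exactly the kind of artifact the study examines.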
Schochet, Peter Z.; Puma, Mike; Deke, John – National Center for Education Evaluation and Regional Assistance, 2014
This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed…
Descriptors: Statistical Analysis, Evaluation Methods, Educational Research, Intervention
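The simplest version of the question this report addresses, whether the treatment-control gap differs across subgroups, can be sketched as a per-subgroup difference in means. The records and subgroup labels below are hypothetical, and the report's methods are far more rigorous (interaction terms, multilevel models, corrections for multiple comparisons):

```python
from collections import defaultdict

# Hypothetical student records: (subgroup, treated?, outcome).
records = [
    ("ELL", True, 75), ("ELL", True, 77), ("ELL", False, 70), ("ELL", False, 72),
    ("non-ELL", True, 80), ("non-ELL", True, 82), ("non-ELL", False, 79), ("non-ELL", False, 81),
]

def subgroup_impacts(records):
    # Per subgroup: [treated sum, treated n, control sum, control n].
    sums = defaultdict(lambda: [0.0, 0, 0.0, 0])
    for group, treated, y in records:
        s = sums[group]
        if treated:
            s[0] += y; s[1] += 1
        else:
            s[2] += y; s[3] += 1
    # Impact per subgroup = treated mean minus control mean.
    return {g: s[0] / s[1] - s[2] / s[3] for g, s in sums.items()}

impacts = subgroup_impacts(records)
print(impacts)  # {'ELL': 5.0, 'non-ELL': 1.0}
```

A gap like the 5.0 vs. 1.0 above is only suggestive; deciding whether such a difference is real rather than noise is precisely what the methods surveyed in the report are for.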
Stromsdorfer, Ernst; Blalock, Ann Bonar – 1986
This guide is intended to assist states and service delivery areas (SDAs) in addressing the new oversight responsibilities and opportunities stipulated by the Job Training Partnership Act (JTPA) with respect to net impact evaluations. It is divided into two main parts. The first part, which deals with issues in evaluating costs and benefits,…
Descriptors: Comparative Analysis, Cost Effectiveness, Cost Estimates, Educational Benefits
Madhere, Serge – 1986
One of the most appropriate quasi-experimental approaches to compensatory education is the regression-discontinuity design. However, it remains underutilized, in part because of the need to clarify the link between the mathematical model and administrative decision-making. This paper explains the derivation of a program efficiency index congruent…
Descriptors: Compensatory Education, Cutting Scores, Effect Size, Elementary Education
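A regression-discontinuity estimate of the kind this paper builds on can be sketched by fitting a line on each side of the assignment cutoff and taking the gap between the two fits at the cutoff. The data below are simulated with a known +4 program effect and no noise, so the sketch recovers it exactly; real analyses must also choose bandwidths and check for manipulation of the running variable:

```python
cutoff = 50.0

# Simulated (pretest, posttest) pairs: students scoring below the
# cutoff receive the compensatory program, modeled as a constant +4 boost.
def true_outcome(pretest):
    return 20.0 + 0.8 * pretest + (4.0 if pretest < cutoff else 0.0)

data = [(float(x), true_outcome(x)) for x in range(40, 60)]

def fit_line(points):
    """Ordinary least squares slope and intercept for (x, y) pairs."""
    xs, ys = zip(*points)
    n = len(points)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in points)
             / sum((x - xbar) ** 2 for x in xs))
    return slope, ybar - slope * xbar

below = [p for p in data if p[0] < cutoff]   # program participants
above = [p for p in data if p[0] >= cutoff]  # comparison students

s_b, i_b = fit_line(below)
s_a, i_a = fit_line(above)
# Treatment effect at the cutoff: gap between the two fitted lines there.
effect = (s_b * cutoff + i_b) - (s_a * cutoff + i_a)
print(round(effect, 6))  # 4.0
```

The design's appeal for compensatory education is that assignment by cutting score, normally a threat to validity, becomes the identification strategy itself.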