Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 7
Since 2006 (last 20 years): 13
Publication Type
Reports - Descriptive: 13
Journal Articles: 7
Guides - Non-Classroom: 3
Information Analyses: 1
Showing all 13 results
Hallberg, Kelly; Williams, Ryan; Swanlund, Andrew; Eno, Jared – Educational Researcher, 2018
Short comparative interrupted time series (CITS) designs are increasingly being used in education research to assess the effectiveness of school-level interventions. These designs can be implemented relatively inexpensively, often drawing on publicly available data on aggregate school performance. However, the validity of this approach hinges on…
Descriptors: Educational Research, Research Methodology, Comparative Analysis, Time
Peer reviewed
PDF on ERIC Download full text
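As an illustration of the design named in this entry, here is a minimal CITS sketch on simulated data. Everything below (the school counts, the cutoff, the effect size, the variable names) is hypothetical and not taken from the study; the point is only the shape of the impact contrast.

```python
# Hypothetical simulation of a school-level CITS analysis: annual scores
# for treated and comparison schools, with the impact identified by the
# treated-by-post interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for school in range(40):
    treated = int(school < 20)              # first 20 schools adopt the program
    for year in range(-5, 5):               # years centered on the intervention
        post = int(year >= 0)
        score = (50 + 0.5 * year            # shared secular trend
                 + 2.0 * treated            # stable baseline group difference
                 + 3.0 * treated * post     # true program effect
                 + rng.normal(0, 2))
        rows.append(dict(score=score, year=year, post=post, treated=treated))

df = pd.DataFrame(rows)
fit = smf.ols("score ~ year + treated + post + treated:post", data=df).fit()
print(fit.params["treated:post"])           # near the simulated effect of 3
```

The treated:post coefficient is the CITS impact: the treated schools' post-intervention deviation from their pre-period trend, net of the same deviation among the comparison schools.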
What Works Clearinghouse, 2018
Underlying all What Works Clearinghouse (WWC) products are WWC Study Review Guides, which are intended for use by WWC certified reviewers to assess studies against the WWC evidence standards. As part of an ongoing effort to increase transparency, promote collaboration, and encourage widespread use of the WWC standards, the Institute of Education…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Peer reviewed
Direct link
Wing, Coady; Bello-Gomez, Ricardo A. – American Journal of Evaluation, 2018
Treatment effect estimates from a "regression discontinuity design" (RDD) have high internal validity. However, the arguments that support the design apply to a subpopulation that is narrower and usually different from the population of substantive interest in evaluation research. The disconnect between the RDD population and the…
Descriptors: Regression (Statistics), Research Design, Validity, Evaluation Methods
Peer reviewed
PDF on ERIC Download full text
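To make the external-validity point in this entry concrete, here is a minimal sharp-RDD sketch on simulated data; the running variable, cutoff, bandwidth, and effect size are all hypothetical.

```python
# Hypothetical simulation of a sharp RDD: assignment flips deterministically
# where the running variable crosses zero, and the effect is the jump there.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
running = rng.uniform(-1, 1, n)          # e.g., a pretest score centered at the cutoff
treated = (running >= 0).astype(float)   # deterministic assignment at the cutoff
outcome = 1.0 + 0.8 * running + 2.0 * treated + rng.normal(0, 1, n)  # true jump = 2

# Local linear fit within a bandwidth, separate slopes on each side of the cutoff.
h = 0.25
keep = np.abs(running) <= h
X = sm.add_constant(np.column_stack([treated, running, treated * running])[keep])
fit = sm.OLS(outcome[keep], X).fit()
print(fit.params[1])                     # estimated discontinuity, near the true 2
```

The fitted jump is a credible estimate, but only for units whose running variable lies near the cutoff, which is exactly the narrow subpopulation the abstract refers to.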
Jake Anders; Chris Brown; Melanie Ehren; Toby Greany; Rebecca Nelson; Jessica Heal; Bibi Groot; Michael Sanders; Rebecca Allen – Education Endowment Foundation, 2017
Evaluating the impact of complex whole-school interventions (CWSIs) is challenging. However, what evidence there is suggests that school leadership and other elements of whole-school contexts are important for pupils' attainment (Leithwood et al., 2006), suggesting that interventions aimed at changing these have significant potential to improve…
Descriptors: Leadership Styles, Program Implementation, Leadership Responsibility, Program Evaluation
Peer reviewed
PDF on ERIC Download full text
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Peer reviewed
Direct link
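As a flavor of what "minimal assumptions" means in this entry, the simplest design-based estimator for an RCT is a difference in group means with a variance justified by the randomization itself. A hypothetical sketch:

```python
# Hypothetical RCT data: the design-based impact estimate is the simple
# difference in group means, and its standard error follows from random
# assignment alone (the Neyman variance bound), not from an outcome model.
import numpy as np

rng = np.random.default_rng(2)
y1 = rng.normal(12, 3, size=150)   # outcomes, randomly assigned treatment group
y0 = rng.normal(10, 3, size=150)   # outcomes, control group

impact = y1.mean() - y0.mean()
se = np.sqrt(y1.var(ddof=1) / len(y1) + y0.var(ddof=1) / len(y0))
print(f"impact = {impact:.2f}, SE = {se:.2f}")
```

Nothing in the calculation relies on modeling the outcome; the inference rests on the experimental design, which is the appeal of these methods.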
Vanwesenbeeck, Ine; Westeneng, Judith; de Boer, Thilly; Reinders, Jo; van Zorge, Ruth – Sex Education: Sexuality, Society and Learning, 2016
Today, more than half of the world population is under the age of 25 years and one in four is under age 18. The urgency of expanding access to Comprehensive Sexuality Education (CSE), notably for children and young people in Africa and Asia, is greater than ever before. However, many challenges to the implementation and delivery of CSE in resource…
Descriptors: Sex Education, Program Implementation, Program Effectiveness, Low Income
Peer reviewed
Direct link
Sample McMeeking, Laura B.; Basile, Carole; Cobb, R. Brian – Evaluation and Program Planning, 2012
Theory-based evaluation (TBE) is an evaluation method that shows how a program will work under certain conditions and has been supported as a viable, evidence-based option in cases where randomized trials or high-quality quasi-experiments are not feasible. Despite the model's widely accepted theoretical appeal, there are few examples of its…
Descriptors: Program Evaluation, Integrity, Evaluation Methods, Research Projects
MDRC, 2016
Community colleges that are exploring ways to dramatically improve outcomes for their students frequently seek a better understanding of the relationship between two "branded" approaches receiving significant publicity: Accelerated Study in Associate Programs (ASAP) and guided pathways. ASAP was created by the City University of New York…
Descriptors: Community Colleges, Two Year College Students, Acceleration (Education), Models
Coalition for Evidence-Based Policy, 2014
This guide is addressed to policy officials, program providers, and researchers who are seeking to: (1) identify and implement social programs backed by valid evidence of effectiveness; or (2) sponsor or conduct an evaluation to determine whether a program is effective. The guide provides a brief overview of which studies can produce valid…
Descriptors: Program Effectiveness, Program Design, Evidence, Social Work
Peer reviewed
Direct link
Lane, Forrest C.; To, Yen M.; Shelley, Kyna; Henson, Robin K. – Career and Technical Education Research, 2012
Researchers may be interested in examining the impact of programs that prepare youth and adults for successful careers but unable to implement experimental designs with true randomization of participants. As a result, these studies can be compromised by underlying factors that impact group selection and thus lead to potentially biased results.…
Descriptors: Vocational Education, Educational Research, Research Methodology, Research Design
Peer reviewed
PDF on ERIC Download full text
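The abstract above is truncated, but the selection problem it names is commonly addressed with propensity score methods. Here is a minimal 1:1 nearest-neighbor matching sketch on hypothetical simulated data, offered as an illustration of the general technique rather than the authors' procedure.

```python
# Hypothetical non-randomized study: selection into treatment depends on
# observed covariates, so a naive mean comparison is biased. Matching on
# estimated propensity scores recovers the simulated effect.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(3)
n = 1000
x = rng.normal(size=(n, 2))                      # observed covariates
p_true = 1 / (1 + np.exp(-(x[:, 0] + 0.5 * x[:, 1])))
t = rng.binomial(1, p_true)                      # selection depends on covariates
y = 2.0 * t + x[:, 0] + rng.normal(size=n)       # true effect = 2

# Estimate propensity scores from the observed covariates.
ps = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]

# Match each treated unit to the nearest control on the propensity score.
treated, controls = np.where(t == 1)[0], np.where(t == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[controls].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
matched = controls[idx.ravel()]

print(y[treated].mean() - y[matched].mean())     # near the simulated effect of 2
```

Because x[:, 0] drives both selection and the outcome, the unmatched difference in means overstates the effect; matching on the score removes most of that bias.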
Schochet, Peter Z.; Puma, Mike; Deke, John – National Center for Education Evaluation and Regional Assistance, 2014
This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed…
Descriptors: Statistical Analysis, Evaluation Methods, Educational Research, Intervention
Peer reviewed
Direct link
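One of the simpler methods in the literature this report summarizes can be sketched in a few lines: testing whether an impact differs across subgroups by interacting the treatment indicator with a subgroup indicator. The data and names below are hypothetical.

```python
# Hypothetical RCT with a binary subgroup: the t:g interaction tests
# whether the treatment impact differs between the two subgroups.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2000
t = rng.binomial(1, 0.5, n)            # random assignment
g = rng.binomial(1, 0.4, n)            # subgroup membership (hypothetical, e.g., ELL status)
y = 1.5 * t + 1.0 * t * g + 0.5 * g + rng.normal(size=n)   # impact is 1.5 vs. 2.5

df = pd.DataFrame(dict(y=y, t=t, g=g))
fit = smf.ols("y ~ t * g", data=df).fit()
print(fit.params["t:g"], fit.pvalues["t:g"])   # estimated differential impact and its p-value
```

A significant t:g coefficient is evidence that the impact differs between the subgroups; interpreting many such tests at once raises the multiple-comparisons and power issues this literature is concerned with.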
Llosa, Lorena; Slayton, Julie – Language Teaching Research, 2009
The purpose of this paper is to discuss how program evaluation can be conducted and communicated in ways that meaningfully affect the education of English language learners (ELLs) in US schools. First, the paper describes the Waterford Early Reading Program Evaluation, a large-scale evaluation of a reading intervention implemented in schools with…
Descriptors: Urban Schools, Program Evaluation, Early Reading, Reading Programs
Peer reviewed
Direct link
Schweigert, Francis J. – American Journal of Evaluation, 2006
In the present climate of public accountability, there is increasing demand to show "what works" and what return is gained for the public from investments to improve communities. This increasing demand for accountability is being met with growing confidence in the field of philanthropy during the past 10 years that the impact or effectiveness of…
Descriptors: Evaluation Methods, Private Financial Support, Accountability, Philanthropic Foundations