Showing all 13 results
Peer reviewed | Full text (PDF on ERIC)
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
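To make the design-based idea concrete, here is a minimal sketch on simulated data: the impact estimate is just the difference in randomized group means, with a conservative variance estimator that relies only on randomization, not on an outcome model. The data, the sample size, and the 0.5 "true" impact are invented for illustration, not taken from the report.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated RCT data (hypothetical): treatment shifts the outcome
# by 0.5 on average.
n = 400
z = rng.integers(0, 2, size=n)            # random assignment indicator
y = 1.0 + 0.5 * z + rng.normal(0, 1, n)   # observed outcomes

# Design-based (Neyman) impact estimate: difference in group means.
y1, y0 = y[z == 1], y[z == 0]
impact = y1.mean() - y0.mean()

# Conservative variance estimator; no assumptions beyond randomization.
se = np.sqrt(y1.var(ddof=1) / len(y1) + y0.var(ddof=1) / len(y0))
ci = (impact - 1.96 * se, impact + 1.96 * se)
print(f"impact = {impact:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

Because the estimator is a simple mean difference under random assignment, it is unbiased and approximately normal in large samples, which is what licenses the normal-theory confidence interval above.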
Peer reviewed | Full text (PDF on ERIC)
Kautz, Tim; Schochet, Peter Z.; Tilley, Charles – National Center for Education Evaluation and Regional Assistance, 2017
A new design-based theory has recently been developed to estimate impacts for randomized controlled trials (RCTs) and basic quasi-experimental designs (QEDs) for a wide range of designs used in social policy research (Imbens & Rubin, 2015; Schochet, 2016). These methods use the potential outcomes framework and known features of study designs…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Peer reviewed | Full text (PDF on ERIC)
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies (Imbens and Rubin, 2015; Schochet, 2015, 2016). The estimators are derived using the building blocks of experimental designs with minimal assumptions, and are unbiased and normally distributed in large samples…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Peer reviewed | Direct link
Mueller, Christoph Emanuel; Gaus, Hansjoerg – American Journal of Evaluation, 2015
In this article, we test an alternative approach to creating a counterfactual basis for estimating individual and average treatment effects. Instead of using control/comparison groups or before-measures, the so-called Counterfactual as Self-Estimated by Program Participants (CSEPP) relies on program participants' self-estimations of their own…
Descriptors: Intervention, Research Design, Research Methodology, Program Evaluation
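The arithmetic behind the CSEPP idea is straightforward: each participant's treatment effect is the observed outcome minus that participant's own self-estimated counterfactual, and the average of those differences plays the role of the average treatment effect on the treated. A minimal sketch, with entirely hypothetical survey numbers:

```python
# Hypothetical post-program survey data: each participant reports an
# observed outcome and a self-estimated counterfactual ("what my score
# would have been without the program") -- the CSEPP idea.
observed       = [72, 65, 80, 58, 77]
self_estimated = [60, 63, 70, 55, 68]

# Individual treatment effects and their average.
effects = [o - c for o, c in zip(observed, self_estimated)]
att = sum(effects) / len(effects)
print(effects, att)  # [12, 2, 10, 3, 9] 7.2
```

The method's validity rests on participants estimating their counterfactual state accurately, which is precisely the assumption the authors set out to test.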
Peer reviewed | Full text (PDF on ERIC)
Bloom, Howard S.; Porter, Kristin E.; Weiss, Michael J.; Raudenbush, Stephen – Society for Research on Educational Effectiveness, 2013
To date, evaluation research and policy analysis have focused mainly on average program impacts and paid little systematic attention to their variation. Recently, the growing number of multi-site randomized trials being planned and conducted has made it increasingly feasible to study "cross-site" variation in impacts. Important…
Descriptors: Research Methodology, Policy, Evaluation Research, Randomized Controlled Trials
Peer reviewed | Direct link
Mueller, Christoph Emanuel; Gaus, Hansjoerg; Rech, Joerg – American Journal of Evaluation, 2014
This article proposes an innovative approach to estimating the counterfactual without the necessity of generating information from either a control group or a before-measure. Building on the idea that program participants are capable of estimating the hypothetical state they would be in had they not participated, the basics of the Roy-Rubin model…
Descriptors: Research Design, Program Evaluation, Research Methodology, Models
Peer reviewed | Direct link
Spybrook, Jessaca; Raudenbush, Stephen W. – Educational Evaluation and Policy Analysis, 2009
This article examines the power analyses for the first wave of group-randomized trials funded by the Institute of Education Sciences. Specifically, it assesses the precision and technical accuracy of the studies. The authors identified the appropriate experimental design and estimated the minimum detectable standardized effect size (MDES) for each…
Descriptors: Research Design, Research Methodology, Effect Size, Correlation
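The minimum detectable effect size (MDES) for a two-level group-randomized trial can be sketched with the standard formula: the MDES shrinks with the number of clusters and grows with the intraclass correlation. The function below is a generic textbook version, and the example numbers are illustrative; neither is taken from the article's reanalysis of the IES trials.

```python
import math

def mdes(j, m, icc, p=0.5, multiplier=2.8):
    """MDES for a two-level group-randomized trial: j clusters of size m,
    intraclass correlation icc, proportion p of clusters treated.
    The 2.8 multiplier corresponds to 80% power with a two-tailed 5%
    test and large degrees of freedom (standard approximation)."""
    return multiplier * math.sqrt((icc + (1 - icc) / m) / (p * (1 - p) * j))

# E.g., 40 schools of 60 students each, ICC = 0.15:
print(round(mdes(j=40, m=60, icc=0.15), 3))
```

Note that because the intraclass correlation term does not shrink with cluster size, adding clusters improves precision far more than adding students within clusters.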
Rosenthal, James A. – Springer, 2011
Written by a social worker for social work students, this is a nuts and bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes numerous examples, data sets, and issues that students will encounter in social work practice. The first section introduces basic concepts and terms to…
Descriptors: Statistics, Data Interpretation, Social Work, Social Science Research
Puma, Michael; Bell, Stephen; Cook, Ronna; Heid, Camilla; Shapiro, Gary; Broene, Pam; Jenkins, Frank; Fletcher, Philip; Quinn, Liz; Friedman, Janet; Ciarico, Janet; Rohacek, Monica; Adams, Gina; Spier, Elizabeth – Administration for Children & Families, 2010
This Technical Report is designed to provide technical detail to support the analysis and findings presented in the "Head Start Impact Study Final Report" (U.S. Department of Health and Human Services, January 2010). Chapter 1 provides an overview of the Head Start Impact Study and its findings. Chapter 2 provides technical information on the…
Descriptors: Preschool Children, Disadvantaged Youth, Low Income Groups, Kindergarten
Peer reviewed
Stover, John; Bertrand, Jane T.; Shelton, James D. – Evaluation Review, 2000
Presents conversion factors to be used to translate the quality of the respective contraception methods distributed to a single measure of protection for calculating couple-years of protection in family planning studies. Discusses the implications for the evaluation of family planning programs. (SLD)
Descriptors: Computation, Contraception, Evaluation Methods, Family Planning
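The couple-years of protection (CYP) computation the article addresses is a weighted sum: units of each method distributed are converted to years of protection by a method-specific factor. The conversion factors below are illustrative round numbers, not necessarily the values the article presents.

```python
# Illustrative conversion factors (hypothetical):
# methods quoted as "units per CYP" are divided; methods quoted as
# "CYP per unit" (long-acting methods) are multiplied.
units_per_cyp = {"condoms": 120, "pill_cycles": 15}
cyp_per_unit  = {"iud_insertions": 3.5}

distributed = {"condoms": 24_000, "pill_cycles": 3_000, "iud_insertions": 400}

cyp = (sum(distributed[m] / f for m, f in units_per_cyp.items())
       + sum(distributed[m] * f for m, f in cyp_per_unit.items()))
print(cyp)  # 200 + 200 + 1400 = 1800.0 couple-years of protection
```

The choice of factors matters: long-acting methods dominate the total, which is why the article's discussion of factor quality bears directly on program evaluation.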
Peer reviewed | Direct link
Friesner, Donald E.; Peck, Laura R. – Journal of MultiDisciplinary Evaluation, 2007
The evaluation literature on Lesbian, Gay, Bisexual, Transgender, and Queer (LGBTQ) youth programs largely lacks quantitative studies, likely in part because of the challenge of using experimental or quasi-experimental evaluation designs. This paper proposes the creative use of a national data set to overcome the problem of estimating a…
Descriptors: Youth Programs, Program Evaluation, Suicide, Metropolitan Areas
Peer reviewed | Direct link
Stuart, Elizabeth A. – Educational Researcher, 2007
Education researchers, practitioners, and policymakers alike are committed to identifying interventions that teach students more effectively. Growing emphasis on evaluation and accountability has heightened the desire for sound evaluations of these interventions; at the same time, school-level data have become increasingly available. This article…
Descriptors: Research Methodology, Computation, Causal Models, Intervention
Schochet, Peter Z. – Mathematica Policy Research, Inc., 2005
This paper examines issues related to the statistical power of impact estimates for experimental evaluations of education programs. The focus is on "group-based" experimental designs, because many studies of education programs involve random assignment at the group level (for example, at the school or classroom level) rather than at the student…
Descriptors: Statistical Analysis, Evaluation Methods, Program Evaluation, Research Design
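The core of the group-randomization power problem the paper examines is the design effect: randomizing schools or classrooms rather than students inflates the variance of the impact estimate by a factor driven by the intraclass correlation. A minimal sketch using the standard formula, with illustrative numbers not taken from the paper:

```python
def design_effect(m, icc):
    """Variance inflation from randomizing groups of size m rather than
    individuals, given intraclass correlation icc (standard formula:
    1 + (m - 1) * icc)."""
    return 1 + (m - 1) * icc

# 25-student classrooms with ICC = 0.10: each classroom carries the
# information of only m / deff independent students.
deff = design_effect(25, 0.10)
print(deff, 25 / deff)
```

Even a modest intraclass correlation of 0.10 more than triples the variance here, which is why group-randomized studies need many more total students than student-level experiments to detect the same impact.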