Showing all 13 results
Peer reviewed
Porter, Kristin E.; Reardon, Sean F.; Unlu, Fatih; Bloom, Howard S.; Cimpian, Joseph R. – Journal of Research on Educational Effectiveness, 2017
A valuable extension of the single-rating regression discontinuity design (RDD) is a multiple-rating RDD (MRRDD). To date, four main methods have been used to estimate average treatment effects at the multiple treatment frontiers of an MRRDD: the "surface" method, the "frontier" method, the "binding-score" method, and…
Descriptors: Regression (Statistics), Intervention, Quasiexperimental Design, Simulation
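The abstract above names the "binding-score" method among the MRRDD estimators it studies. As a rough illustration only (an assumption-laden sketch, not the authors' implementation), the snippet below collapses multiple ratings into a single binding score under the rule "treated if any rating falls below its cutoff" and then runs an ordinary single-rating local-linear RDD on that score; all function and variable names are illustrative.

```python
import numpy as np
import statsmodels.api as sm

def binding_score_rdd(ratings, cutoffs, y, bandwidth):
    """Hedged sketch of a 'binding-score' approach to a multiple-rating RDD.

    Assumes the assignment rule is: treated when ANY rating falls below its
    cutoff.  The binding score collapses the ratings into a single centered
    rating (the signed distance of the rating closest to its cutoff), after
    which a standard single-rating local-linear RDD is estimated.
    """
    centered = ratings - cutoffs             # distance of each rating from its own cutoff
    score = centered.min(axis=1)             # binding score: the most binding (smallest) distance
    treated = (score < 0).astype(float)      # treated if any rating is below its cutoff

    keep = np.abs(score) <= bandwidth        # restrict to observations near the frontier
    X = np.column_stack([
        treated[keep],                       # treatment indicator
        score[keep],                         # slope of outcome in the rating on the untreated side
        treated[keep] * score[keep],         # allows a different slope on the treated side
    ])
    X = sm.add_constant(X)
    fit = sm.OLS(y[keep], X).fit()
    return fit.params[1]                     # coefficient on the treatment indicator
```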
Peer reviewed
Weiss, Michael J.; Bloom, Howard S.; Verbitsky-Savitz, Natalya; Gupta, Himani; Vigil, Alma E.; Cullinan, Daniel N. – Journal of Research on Educational Effectiveness, 2017
Multisite trials, in which individuals are randomly assigned to alternative treatment arms within sites, offer an excellent opportunity to estimate the cross-site average effect of treatment assignment (intent to treat or ITT) "and" the amount by which this impact varies across sites. Although both of these statistics are substantively…
Descriptors: Randomized Controlled Trials, Evidence, Models, Intervention
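The two estimands this abstract highlights, the cross-site average ITT effect and its variation across sites, are commonly written as parameters of a two-level random-coefficient model. The display below is a generic sketch of that setup, not necessarily the exact parameterization used in the article:

```latex
% Individual i in site j, with treatment-assignment indicator T_{ij}:
\[
Y_{ij} = \alpha_j + B_j T_{ij} + \varepsilon_{ij},
\qquad B_j = \beta + b_j, \quad b_j \sim N(0, \tau^2),
\]
% where \beta is the cross-site average ITT effect and \tau is the
% cross-site standard deviation of the site-level ITT effects.
```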
Porter, Kristin E.; Reardon, Sean F.; Unlu, Fatih; Bloom, Howard S.; Robinson-Cimpian, Joseph P. – MDRC, 2014
A valuable extension of the single-rating regression discontinuity design (RDD) is a multiple-rating RDD (MRRDD). To date, four main methods have been used to estimate average treatment effects at the multiple treatment frontiers of an MRRDD: the "surface" method, the "frontier" method, the "binding-score" method, and…
Descriptors: Regression (Statistics), Research Design, Quasiexperimental Design, Research Methodology
Peer reviewed
Bloom, Howard S. – Journal of Research on Educational Effectiveness, 2012
This article provides a detailed discussion of the theory and practice of modern regression discontinuity (RD) analysis for estimating the effects of interventions or treatments. Part 1 briefly chronicles the history of RD analysis and summarizes its past applications. Part 2 explains how in theory an RD analysis can identify an average effect of…
Descriptors: Regression (Statistics), Research Design, Cutting Scores, Computation
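For readers skimming this entry, the canonical sharp-RD setup the article discusses can be summarized in a single estimating equation; the display below is a standard textbook sketch rather than the article's own specification:

```latex
% Rating r_i, cutoff c, treatment assigned deterministically at the cutoff:
\[
T_i = \mathbf{1}(r_i \ge c), \qquad
Y_i = \alpha + \beta\, T_i + f(r_i - c) + \varepsilon_i ,
\]
% where f(.) is a smooth function of the centered rating (often linear with
% separate slopes on each side, fit within a bandwidth around c) and \beta
% identifies the average treatment effect for subjects at the cutoff.
```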
Peer reviewed
Bloom, Howard S.; Porter, Kristin E.; Weiss, Michael J.; Raudenbush, Stephen – Society for Research on Educational Effectiveness, 2013
To date, evaluation research and policy analysis have focused mainly on average program impacts and paid little systematic attention to their variation. Recently, the growing number of multi-site randomized trials being planned and conducted makes it increasingly feasible to study "cross-site" variation in impacts. Important…
Descriptors: Research Methodology, Policy, Evaluation Research, Randomized Controlled Trials
Bloom, Howard S.; Porter, Kristin E. – Society for Research on Educational Effectiveness, 2012
In recent years, the regression discontinuity design (RDD) has gained widespread recognition as a quasi-experimental method that, when used correctly, can produce internally valid estimates of the causal effects of a treatment, a program, or an intervention (hereafter referred to as treatment effects). In an RDD study, subjects or groups of subjects…
Descriptors: Regression (Statistics), Research Design, Computation, Generalizability Theory
Bloom, Howard S. – MDRC, 2006
This chapter examines the core analytic elements of randomized experiments for social research. Its goal is to provide a compact discussion for faculty members, graduate students, and applied researchers of the design and analysis of randomized experiments for measuring the impacts of social or educational interventions. Design issues considered…
Descriptors: Research Methodology, Research Design, Experiments, Social Science Research
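As a minimal companion to this chapter's topic, the sketch below computes the simplest randomized-experiment impact estimate, the treatment-control difference in mean outcomes with a conventional standard error. It is illustrative only and all names are assumptions; the chapter's actual treatment of design and analysis goes well beyond this.

```python
import numpy as np
from scipy import stats

def impact_estimate(y_treat, y_control):
    """Difference in mean outcomes between randomized treatment and control
    groups, with a conventional standard error and two-tailed p-value."""
    diff = y_treat.mean() - y_control.mean()
    se = np.sqrt(y_treat.var(ddof=1) / len(y_treat) +
                 y_control.var(ddof=1) / len(y_control))
    df = len(y_treat) + len(y_control) - 2
    p = 2 * stats.t.sf(abs(diff / se), df)
    return diff, se, p
```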
Peer reviewed
Garet, Michael S.; Cronen, Stephanie; Eaton, Marian; Kurki, Anja; Ludwig, Meredith; Jones, Wehmah; Uekawa, Kazuaki; Falk, Audrey; Bloom, Howard S.; Doolittle, Fred; Zhu, Pei; Sztejnberg, Laura – National Center for Education Evaluation and Regional Assistance, 2008
To help states and districts make informed decisions about the professional development (PD) they implement to improve reading instruction, the U.S. Department of Education commissioned the Early Reading PD Interventions Study to examine the impact of two research-based PD interventions for reading instruction: (1) a content-focused teacher…
Descriptors: Early Reading, Reading Instruction, Professional Development, Intervention
Peer reviewed
Bloom, Howard S. – Evaluation Review, 2002
Introduces a new approach for measuring the impact of whole-school reforms. The approach, based on "short" interrupted time-series analysis, is explained; its statistical procedures are outlined; and its use in the evaluation of a major whole-school reform, Accelerated Schools, is described (H. Bloom and others, 2001). (SLD)
Descriptors: Educational Change, Elementary Education, Evaluation Methods, Research Design
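To make the "short" interrupted time-series idea concrete, the sketch below fits a linear trend to a school's pre-reform cohort means, projects it forward, and reads the post-reform deviations from that projection as the estimated reform effect. This is a deliberately simplified illustration under assumed inputs, not the published procedure, whose statistical machinery is more elaborate.

```python
import numpy as np

def short_its_deviation(baseline_scores, followup_scores):
    """Project a linear baseline trend forward and return the deviation of
    each post-reform cohort from that projection (illustrative only)."""
    t_base = np.arange(len(baseline_scores))
    slope, intercept = np.polyfit(t_base, baseline_scores, 1)  # linear baseline trend

    t_follow = np.arange(len(baseline_scores),
                         len(baseline_scores) + len(followup_scores))
    projected = intercept + slope * t_follow                   # projected counterfactual
    return followup_scores - projected                         # deviation = estimated effect per follow-up year
```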
Peer reviewed
Bloom, Howard S. – Evaluation Review, 1995
A simple way to assess the statistical power of experimental designs, based on the concept of a minimum detectable effect, is described. How to compute minimum detectable effects and how to apply the method to the assessment of alternative experimental designs are illustrated. (SLD)
Descriptors: Estimation (Mathematics), Evaluation Methods, Experiments, Power (Statistics)
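The minimum detectable effect (MDE) logic described in this abstract has a compact normal-approximation form: the MDE is a multiple of the impact estimator's standard error, where the multiplier depends only on the significance level and the target power (about 2.8 for a two-tailed .05 test with 80 percent power). The snippet below is a sketch of that calculation; the article's own treatment is more careful about degrees of freedom.

```python
from scipy.stats import norm

def minimum_detectable_effect(se_impact, alpha=0.05, power=0.80, two_tailed=True):
    """Smallest true effect detectable with the given power and significance
    level, as a multiple of the impact estimator's standard error
    (normal approximation; exact multipliers use t distributions)."""
    z_alpha = norm.ppf(1 - alpha / 2) if two_tailed else norm.ppf(1 - alpha)
    z_power = norm.ppf(power)
    return (z_alpha + z_power) * se_impact   # multiplier is about 2.8 at the defaults
```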
Peer reviewed
Bloom, Howard S. – Evaluation Review, 1987
This article presents lessons learned from an innovative employment and training program for dislocated workers. It provides specific information about program design and serves as a prototype for how social experimentation can be used by state and local governments. (Author/LMO)
Descriptors: Adults, Dislocated Workers, Employment Practices, Program Evaluation
Bloom, Howard S.; Richburg-Hayes, Lashawn; Black, Alison Rebeck – MDRC, 2005
This paper examines how controlling statistically for baseline covariates (especially pretests) improves the precision of studies that randomize schools to measure the impacts of educational interventions on student achievement. Part I of the paper introduces the concepts, issues, and options involved. Parts II and III present empirical findings…
Descriptors: Program Effectiveness, Reading Achievement, Mathematics Achievement, Research Methodology
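The precision gain this paper documents is usually expressed through a minimum-detectable-effect-size (MDES) formula for designs that randomize schools; the display below is a hedged sketch in standard notation and may not match the paper's exact expression:

```latex
% J schools of n students, proportion P assigned to treatment,
% intraclass correlation \rho, and baseline covariates (e.g., a pretest)
% explaining shares R_2^2 and R_1^2 of the school- and student-level
% outcome variance:
\[
\text{MDES} \approx M \,
\sqrt{\frac{\rho\,(1 - R_2^2)}{P(1-P)\,J} \;+\;
      \frac{(1-\rho)\,(1 - R_1^2)}{P(1-P)\,J\,n}} ,
\]
% where M is a multiplier near 2.8 for 80 percent power and a .05 two-tailed
% test.  A strong school-level pretest raises R_2^2 and shrinks the dominant
% first term roughly by a factor of sqrt(1 - R_2^2).
```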
Bloom, Howard S.; Michalopoulos, Charles; Hill, Carolyn J.; Lei, Ying – 2002
A study explored which nonexperimental comparison group methods provide the most accurate estimates of the impacts of mandatory welfare-to-work programs and whether the best methods work well enough to substitute for random assignment experiments. Findings were compared for nonexperimental comparison groups and statistical adjustment procedures…
Descriptors: Adult Education, Comparative Analysis, Control Groups, Error of Measurement
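The study's core logic, comparing nonexperimental estimates against an experimental benchmark, can be sketched in a few lines; the snippet below uses a simple regression-adjusted comparison group as a stand-in nonexperimental method and treats its gap from the experimental estimate as bias. All names are illustrative assumptions; the study itself evaluates a range of comparison-group and adjustment methods.

```python
import numpy as np
import statsmodels.api as sm

def nonexperimental_bias(y_treat, y_control, y_comparison, x_treat, x_comparison):
    """Gap between a regression-adjusted comparison-group estimate and the
    experimental benchmark, taken as the estimated bias of the
    nonexperimental method (illustrative only)."""
    experimental = y_treat.mean() - y_control.mean()        # unbiased benchmark from random assignment

    # Regression-adjusted comparison-group estimate: treatment indicator plus covariates.
    y = np.concatenate([y_treat, y_comparison])
    d = np.concatenate([np.ones(len(y_treat)), np.zeros(len(y_comparison))])
    X = sm.add_constant(np.column_stack([d, np.vstack([x_treat, x_comparison])]))
    nonexperimental = sm.OLS(y, X).fit().params[1]          # coefficient on the treatment indicator

    return nonexperimental - experimental                   # estimated bias of the nonexperimental method
```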