Publication Date
  In 2025: 0
  Since 2024: 0
  Since 2021 (last 5 years): 0
  Since 2016 (last 10 years): 0
  Since 2006 (last 20 years): 1
Descriptor
  Program Evaluation: 3
  Research Design: 3
  Research Methodology: 3
  Statistical Analysis: 3
  Adult Education: 1
  Bias: 1
  Comparative Analysis: 1
  Computation: 1
  Control Groups: 1
  Data Analysis: 1
  Data Collection: 1
Source
  MDRC: 1
  Society for Research on Educational Effectiveness: 1
Author
  Bloom, Howard S.: 3
  Black, Alison Rebeck: 1
  Hill, Carolyn J.: 1
  Lei, Ying: 1
  Michalopoulos, Charles: 1
  Porter, Kristin E.: 1
  Raudenbush, Stephen: 1
  Richburg-Hayes, Lashawn: 1
  Weiss, Michael J.: 1
Publication Type
  Reports - Research: 2
  Information Analyses: 1
  Numerical/Quantitative Data: 1
  Reports - Evaluative: 1
Audience
  Policymakers: 1
  Practitioners: 1
Location
Laws, Policies, & Programs
Assessments and Surveys
What Works Clearinghouse Rating
Bloom, Howard S.; Porter, Kristin E.; Weiss, Michael J.; Raudenbush, Stephen – Society for Research on Educational Effectiveness, 2013
To date, evaluation research and policy analysis have focused mainly on average program impacts and paid little systematic attention to their variation. The growing number of multi-site randomized trials now being planned and conducted makes it increasingly feasible to study "cross-site" variation in impacts. Important…
Descriptors: Research Methodology, Policy, Evaluation Research, Randomized Controlled Trials
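The Bloom, Porter, Weiss, and Raudenbush paper above concerns cross-site variation in impacts from multi-site randomized trials. As a rough illustration only, and not the authors' procedure, the sketch below fits a multilevel model with a random treatment slope by site to simulated data; the number of sites, the true mean impact of 0.20, and the cross-site impact SD of 0.10 are all hypothetical values.

```python
# Illustrative only: simulated multi-site trial with hypothetical parameter values.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_sites, n_per_site = 20, 200
site = np.repeat(np.arange(n_sites), n_per_site)
treat = rng.integers(0, 2, size=site.size)

# Site-specific true impacts: mean 0.20, cross-site SD 0.10 (assumed, not from the paper)
site_impact = rng.normal(0.20, 0.10, size=n_sites)
y = site_impact[site] * treat + rng.normal(0.0, 1.0, size=site.size)

df = pd.DataFrame({"y": y, "treat": treat, "site": site})

# Random intercept and random treatment slope by site; the estimated variance of the
# "treat" random effect corresponds to the cross-site variation in impacts.
model = smf.mixedlm("y ~ treat", df, groups=df["site"], re_formula="~treat")
print(model.fit().summary())
```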
Bloom, Howard S.; Richburg-Hayes, Lashawn; Black, Alison Rebeck – MDRC, 2005
This paper examines how controlling statistically for baseline covariates (especially pretests) improves the precision of studies that randomize schools to measure the impacts of educational interventions on student achievement. Part I of the paper introduces the concepts, issues, and options involved. Parts II and III present empirical findings…
Descriptors: Program Effectiveness, Reading Achievement, Mathematics Achievement, Research Methodology
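The Bloom, Richburg-Hayes, and Black paper above concerns how baseline covariates, especially pretests, improve precision when schools are randomized. The sketch below is a generic minimum-detectable-effect-size calculation of the kind used for such cluster-randomized designs, not the paper's own analysis; the number of schools, students per school, intraclass correlation, and R-squared values are hypothetical, and the 2.8 multiplier is a common approximation for 80 percent power at a two-tailed alpha of .05.

```python
# Illustrative only: generic MDES formula for a school-randomized design,
# showing how a school-level covariate R^2 (e.g., a school-mean pretest) shrinks the MDES.
import math

def mdes(J, n, rho, P=0.5, R2_between=0.0, R2_within=0.0, multiplier=2.8):
    """J schools, n students per school, intraclass correlation rho, proportion P
    assigned to treatment; multiplier ~2.8 approximates 80% power at alpha = .05
    (two-tailed) when the number of schools is not too small."""
    between = rho * (1 - R2_between) / (P * (1 - P) * J)
    within = (1 - rho) * (1 - R2_within) / (P * (1 - P) * J * n)
    return multiplier * math.sqrt(between + within)

# Without vs. with a school-level pretest assumed to explain 60% of between-school variance
print(mdes(J=40, n=60, rho=0.15))                  # about 0.36 standard deviations
print(mdes(J=40, n=60, rho=0.15, R2_between=0.6))  # about 0.24 standard deviations
```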
Bloom, Howard S.; Michalopoulos, Charles; Hill, Carolyn J.; Lei, Ying – 2002
A study explored which nonexperimental comparison group methods provide the most accurate estimates of the impacts of mandatory welfare-to-work programs and whether the best methods work well enough to substitute for random assignment experiments. Findings were compared for nonexperimental comparison groups and statistical adjustment procedures…
Descriptors: Adult Education, Comparative Analysis, Control Groups, Error of Measurement
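The Bloom, Michalopoulos, Hill, and Lei study above compares nonexperimental comparison-group estimators against experimental benchmarks. The sketch below shows one common nonexperimental adjustment, inverse-propensity weighting, applied to simulated data; it is a generic illustration under assumed data and does not reproduce the specific estimators or welfare-to-work data examined in the study.

```python
# Illustrative only: simulated data and a generic inverse-propensity-weighting estimator.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
x = rng.normal(size=(n, 3))                                   # baseline covariates
p_participate = 1 / (1 + np.exp(-(x @ np.array([0.8, -0.5, 0.3]))))
d = rng.binomial(1, p_participate)                            # nonrandom program participation
y = 0.25 * d + x @ np.array([1.0, 0.5, -0.7]) + rng.normal(size=n)  # true impact 0.25 (assumed)

# Estimate propensity scores, then form the normalized (Hajek) IPW impact estimate.
ps = LogisticRegression(max_iter=1000).fit(x, d).predict_proba(x)[:, 1]
treated_mean = np.sum(d * y / ps) / np.sum(d / ps)
control_mean = np.sum((1 - d) * y / (1 - ps)) / np.sum((1 - d) / (1 - ps))
print(treated_mean - control_mean)  # should land near the assumed 0.25 impact
```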