Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 2 |
Since 2016 (last 10 years) | 10 |
Since 2006 (last 20 years) | 14 |
Descriptor
Computation | 14 |
Randomized Controlled Trials | 14 |
Regression (Statistics) | 14 |
Intervention | 8 |
Statistical Analysis | 7 |
Educational Research | 6 |
Evaluation Methods | 6 |
Program Effectiveness | 6 |
Models | 5 |
Program Evaluation | 5 |
Quasiexperimental Design | 5 |
Source
Society for Research on… | 4 |
National Center for Education… | 3 |
Journal of Educational and… | 2 |
What Works Clearinghouse | 2 |
Grantee Submission | 1 |
Journal of Research on… | 1 |
ProQuest LLC | 1 |
Author
Schochet, Peter Z. | 4 |
Hallberg, Kelly | 2 |
Yoon, HyeonJin | 2 |
Chan, Wendy | 1 |
Cook, Thomas D. | 1 |
Figlio, David | 1 |
Hansen, Ben B. | 1 |
Hedges, Larry V. | 1 |
Henderson, Brit | 1 |
Jin, Ze | 1 |
Kautz, Tim | 1 |
Publication Type
Reports - Research | 7 |
Journal Articles | 3 |
Dissertations/Theses -… | 2 |
Guides - Non-Classroom | 2 |
Reports - Descriptive | 2 |
Book/Product Reviews | 1 |
Numerical/Quantitative Data | 1 |
Reports - Evaluative | 1 |
Education Level
Elementary Education | 3 |
Early Childhood Education | 2 |
Kindergarten | 2 |
Primary Education | 2 |
Elementary Secondary Education | 1 |
Higher Education | 1 |
Postsecondary Education | 1 |
Audience
Researchers | 1 |
Assessments and Surveys
Indiana Statewide Testing for… | 1 |
Schochet, Peter Z. – Society for Research on Educational Effectiveness, 2021
Background: When RCTs are not feasible and time series data are available, panel data methods can be used to estimate treatment effects on outcomes by exploiting variation in policies and conditions over time and across locations. A complication with these methods, however, is that treatment timing often varies across the sample, for example, due…
Descriptors: Statistical Analysis, Computation, Randomized Controlled Trials, COVID-19
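The panel-data approach this abstract sketches rests on the difference-in-differences contrast. A minimal two-location, two-period illustration (all numbers invented, not from the study):

```python
# Hypothetical mean outcomes for one location treated in period 2 and
# one never-treated location (illustrative values only).
treated_pre, treated_post = 10.0, 14.0
control_pre, control_post = 9.0, 11.0

# Difference-in-differences: the treated location's change over time,
# net of the control location's change, removes shared time trends.
did = (treated_post - treated_pre) - (control_post - control_pre)
print(did)  # 2.0
```

With staggered adoption across locations, the abstract's complication, this simple contrast must be computed carefully across treatment-timing groups rather than as one pooled regression.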
Miratrix, Luke W.; Weiss, Michael J.; Henderson, Brit – Journal of Research on Educational Effectiveness, 2021
Researchers face many choices when conducting large-scale multisite individually randomized control trials. One of the most common quantities of interest in multisite RCTs is the overall average effect. Even this quantity is non-trivial to define and estimate. The researcher can target the average effect across individuals or sites. Furthermore,…
Descriptors: Computation, Randomized Controlled Trials, Error of Measurement, Regression (Statistics)
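The abstract's point that even the "overall average effect" is non-trivial to define can be made concrete: the average across sites and the average across individuals generally differ. A sketch with hypothetical site impacts and enrollments:

```python
import numpy as np

# Hypothetical site-level impact estimates and site enrollments.
site_effects = np.array([0.20, 0.10, 0.40])
site_sizes = np.array([100, 300, 50])

# Estimand 1: average effect across sites (each site counts equally).
site_avg = site_effects.mean()

# Estimand 2: average effect across individuals (sites weighted by size).
person_avg = np.average(site_effects, weights=site_sizes)

print(site_avg, person_avg)  # ~0.2333 vs ~0.1556
```

Because the large site here has a small effect, the person-average is pulled well below the site-average; the researcher must choose which estimand the study targets.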
Sales, Adam C.; Hansen, Ben B. – Journal of Educational and Behavioral Statistics, 2020
Conventionally, regression discontinuity analysis contrasts a univariate regression's limits as its independent variable, "R," approaches a cut point, "c," from either side. Alternative methods target the average treatment effect in a small region around "c," at the cost of an assumption that treatment assignment,…
Descriptors: Regression (Statistics), Computation, Statistical Inference, Robustness (Statistics)
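The conventional contrast of regression limits at the cut point can be sketched on simulated data; the cutoff, true jump, bandwidth, and noise level below are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated running variable R with cutoff c = 0; treatment is assigned
# when R >= c, with a true discontinuity of 2.0 at the cutoff.
c = 0.0
R = rng.uniform(-1, 1, 2000)
Y = 1.0 + 0.5 * R + 2.0 * (R >= c) + rng.normal(0, 0.1, R.size)

# Local linear fits within a bandwidth h on each side, each evaluated
# at c: the RD estimate is the gap between the two regression limits.
h = 0.5
left_mask = (R < c) & (R > c - h)
right_mask = (R >= c) & (R < c + h)
left = np.polyfit(R[left_mask], Y[left_mask], 1)
right = np.polyfit(R[right_mask], Y[right_mask], 1)
rd_estimate = np.polyval(right, c) - np.polyval(left, c)
print(rd_estimate)  # close to the true jump of 2.0
```

Packages built for this purpose additionally choose the bandwidth in a data-driven way and adjust inference for that choice; the fixed h here is purely illustrative.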
Yoon, HyeonJin – ProQuest LLC, 2018
In basic regression discontinuity (RD) designs, causal inference is limited to the local area near a single cutoff. To strengthen the generality of the RD treatment estimate, a design with multiple cutoffs along the assignment variable continuum can be applied. The availability of multiple cutoffs allows estimation of a pooled average treatment…
Descriptors: Regression (Statistics), Program Evaluation, Computation, Statistical Analysis
Yoon, HyeonJin – Grantee Submission, 2018
In basic regression discontinuity (RD) designs, causal inference is limited to the local area near a single cutoff. To strengthen the generality of the RD treatment estimate, a design with multiple cutoffs along the assignment variable continuum can be applied. The availability of multiple cutoffs allows estimation of a pooled average treatment…
Descriptors: Regression (Statistics), Program Evaluation, Computation, Statistical Analysis
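One common way to pool cutoff-specific RD estimates into a single average (a generic precision-weighting sketch, not necessarily the estimator used in this dissertation) weights each estimate by its inverse variance; the estimates and standard errors below are hypothetical:

```python
import numpy as np

# Hypothetical RD impact estimates at three cutoffs, with their SEs.
est = np.array([1.8, 2.2, 2.0])
se = np.array([0.4, 0.5, 0.25])

# Inverse-variance (precision) weights favor the better-estimated cutoffs.
w = 1.0 / se**2
pooled = np.average(est, weights=w)
pooled_se = np.sqrt(1.0 / w.sum())
print(pooled, pooled_se)
```

The pooled estimate lands nearest the most precisely estimated cutoff, and its standard error is smaller than any single cutoff's — the gain in generality and precision that motivates multiple-cutoff designs.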
Thoemmes, Felix; Liao, Wang; Jin, Ze – Journal of Educational and Behavioral Statistics, 2017
This article describes the analysis of regression-discontinuity designs (RDDs) using the R packages rdd, rdrobust, and rddtools. We discuss similarities and differences between these packages and provide directions on how to use them effectively. We use real data from the Carolina Abecedarian Project to show how an analysis of an RDD can be…
Descriptors: Regression (Statistics), Research Design, Robustness (Statistics), Computer Software
What Works Clearinghouse, 2020
The What Works Clearinghouse (WWC) is an initiative of the U.S. Department of Education's Institute of Education Sciences (IES), which was established under the Education Sciences Reform Act of 2002. It is an important part of IES's strategy to use rigorous and relevant research, evaluation, and statistics to improve the nation's education system.…
Descriptors: Educational Research, Evaluation Methods, Evidence, Statistical Significance
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
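The basic design-based building block for an RCT is the difference in group means paired with the conservative Neyman variance estimator from the potential-outcomes framework; the outcome values below are invented for illustration:

```python
import numpy as np

# Hypothetical outcomes for treatment and control groups in an RCT.
y_treat = np.array([12.0, 15.0, 11.0, 14.0, 13.0])
y_control = np.array([10.0, 9.0, 11.0, 10.0])

# Design-based impact estimate: simple difference in means.
impact = y_treat.mean() - y_control.mean()

# Conservative Neyman variance: s_t^2 / n_t + s_c^2 / n_c.
var = (y_treat.var(ddof=1) / y_treat.size
       + y_control.var(ddof=1) / y_control.size)
se = np.sqrt(var)
print(impact, se)  # 3.0 and about 0.816
```

The appeal the abstract notes — minimal assumptions and good statistical properties — comes from the fact that only the randomization itself, not an outcome model, justifies this estimator.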
Kautz, Tim; Schochet, Peter Z.; Tilley, Charles – National Center for Education Evaluation and Regional Assistance, 2017
A new design-based theory has recently been developed to estimate impacts for randomized controlled trials (RCTs) and basic quasi-experimental designs (QEDs) for a wide range of designs used in social policy research (Imbens & Rubin, 2015; Schochet, 2016). These methods use the potential outcomes framework and known features of study designs…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies (Imbens and Rubin, 2015; Schochet, 2015, 2016). The estimators are derived using the building blocks of experimental designs with minimal assumptions, and are unbiased and normally distributed in large samples…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Tipton, Elizabeth; Hallberg, Kelly; Hedges, Larry V.; Chan, Wendy – Society for Research on Educational Effectiveness, 2015
Policy-makers are frequently interested in understanding how effective a particular intervention may be for a specific (and often broad) population. In many fields, particularly education and social welfare, the ideal form of these evaluations is a large-scale randomized experiment. Recent research has highlighted that sites in these large-scale…
Descriptors: Generalization, Program Effectiveness, Sample Size, Computation
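The generalization problem the abstract raises — experimental sites rarely mirror the target population — is often addressed by reweighting subgroup effects to population shares. A simplified post-stratification sketch with invented shares and effects:

```python
# Hypothetical composition of the experimental sample vs. the target
# population, and subgroup-specific effects (all values illustrative).
sample_share = {"urban": 0.8, "rural": 0.2}
pop_share = {"urban": 0.5, "rural": 0.5}
effect = {"urban": 0.30, "rural": 0.10}

# Naive sample average reflects the sample's (unrepresentative) mix.
naive = sum(sample_share[g] * effect[g] for g in effect)

# Generalized estimate reweights subgroup effects to population shares.
generalized = sum(pop_share[g] * effect[g] for g in effect)
print(naive, generalized)  # 0.26 vs 0.20
```

When effects vary across strata, as here, the sample average overstates (or understates) the population-average effect unless the reweighting is done.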
Hallberg, Kelly; Cook, Thomas D.; Figlio, David – Society for Research on Educational Effectiveness, 2013
The goal of this paper is to provide guidance for applied education researchers in using multi-level data to study the effects of interventions implemented at the school level. Two primary approaches are currently employed in observational studies of the effect of school-level interventions. One approach employs intact school matching: matching…
Descriptors: Matched Groups, Intervention, Randomized Controlled Trials, Elementary Schools
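Intact school matching pairs each whole treated school with the most similar comparison school on school-level aggregates. A toy nearest-neighbor sketch on a single invented covariate (school mean baseline score):

```python
import numpy as np

# Hypothetical school-level mean baseline scores (illustrative).
treated_means = np.array([0.52, 0.61])
control_means = np.array([0.40, 0.55, 0.63, 0.70])

# For each treated school, pick the comparison school whose mean is
# closest; real applications match on many aggregates at once.
matches = [int(np.argmin(np.abs(control_means - t))) for t in treated_means]
print(matches)  # [1, 2]
```

The alternative approach the paper contrasts — matching individual students across schools — trades the cleanliness of whole-school comparisons for larger pools of potential matches.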
Schochet, Peter Z. – Society for Research on Educational Effectiveness, 2013
In randomized control trials (RCTs) of educational interventions, there is a growing literature on impact estimation methods to adjust for missing student outcome data using such methods as multiple imputation, the construction of nonresponse weights, casewise deletion, and maximum likelihood methods (see, for example, Allison, 2002; Graham, 2009;…
Descriptors: Control Groups, Experimental Groups, Educational Research, Data Analysis
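Nonresponse weighting, one of the adjustments the abstract lists, reweights responders by the inverse of their group's estimated response rate so that responders stand in for similar nonresponders. All values below are invented:

```python
import numpy as np

# Hypothetical students: a baseline group indicator, whether each
# student's outcome was observed, and the outcomes (NaN if missing).
x = np.array([0, 0, 1, 1, 1, 0, 1, 0])
responded = np.array([1, 0, 1, 1, 0, 1, 0, 1], dtype=bool)
y = np.array([5.0, np.nan, 8.0, 7.0, np.nan, 6.0, np.nan, 5.0])

# Weight = 1 / (estimated response rate within the student's group).
weights = np.zeros(x.size)
for g in (0, 1):
    in_g = x == g
    rate = responded[in_g].mean()
    weights[in_g & responded] = 1.0 / rate

# Weighted mean of observed outcomes adjusts for differential response.
wmean = np.average(y[responded], weights=weights[responded])
print(wmean)
```

Here group 1 responds at a lower rate, so its responders get larger weights and the adjusted mean sits above the unweighted responder mean — the basic mechanics behind the nonresponse-weight adjustment compared in this paper.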
What Works Clearinghouse, 2011
With its critical assessments of scientific evidence on the effectiveness of education programs, policies, and practices (referred to as "interventions"), and a range of products summarizing this evidence, the What Works Clearinghouse (WWC) is an important part of the Institute of Education Sciences' strategy to use rigorous and relevant…
Descriptors: Standards, Access to Information, Information Management, Guides