Showing 1 to 15 of 19 results
Sam Sims; Jake Anders; Matthew Inglis; Hugues Lortie-Forgues; Ben Styles; Ben Weidmann – Annenberg Institute for School Reform at Brown University, 2023
Over the last twenty years, education researchers have increasingly conducted randomised experiments with the goal of informing the decisions of educators and policymakers. Such experiments have generally employed broad, consequential, standardised outcome measures in the hope that this would allow decisionmakers to compare the effectiveness of…
Descriptors: Educational Research, Research Methodology, Randomized Controlled Trials, Program Effectiveness
Peer reviewed
PDF on ERIC Download full text
What Works Clearinghouse, 2022
Education decisionmakers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Peer reviewed
Direct link
Cameron, Ewan – Journal of Education Policy, 2020
In the 2016-2017 school year, the Liberian government launched Partnership Schools For Liberia (PSL), a pilot program in which the management of 93 primary schools was transferred to 8 private contractors. The pilot owed much to the importation of Western policy models; it was facilitated by the British organisation ARK and involved BIA, a private…
Descriptors: Foreign Countries, Partnerships in Education, Privatization, Democracy
Peer reviewed
Direct link
Hallberg, Kelly; Williams, Ryan; Swanlund, Andrew – Journal of Research on Educational Effectiveness, 2020
More aggregate data on school performance are available than ever before, opening up new possibilities for applied researchers interested in assessing the effectiveness of school-level interventions quickly and at a relatively low cost by implementing comparative interrupted time series (CITS) designs. We examine the extent to which effect…
Descriptors: Data Use, Research Methodology, Program Effectiveness, Design
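As context for the CITS design named above (a standard textbook specification, not drawn from this abstract), such designs are conventionally estimated with a segmented regression of the form

\[
Y_{jt} = \beta_0 + \beta_1 T_t + \beta_2 P_t + \beta_3 T_t P_t + \beta_4 D_j + \beta_5 D_j T_t + \beta_6 D_j P_t + \beta_7 D_j T_t P_t + \varepsilon_{jt},
\]

where \(T_t\) is time, \(P_t\) indicates post-intervention periods, and \(D_j\) indicates treated schools; \(\beta_6\) and \(\beta_7\) capture the treated group's post-intervention deviation in level and trend relative to the comparison group's deviation from its own baseline trend.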
Peer reviewed
Direct link
May, Henry; Jones, Akisha; Blakeney, Aly – AERA Online Paper Repository, 2019
Using a regression discontinuity (RD) design provides statistically robust estimates while giving researchers an alternative causal estimation tool for educational environments where an RCT may not be feasible. Results from the External Evaluation of the i3 Scale-Up of Reading Recovery show that impact estimates were remarkably similar between a randomized control…
Descriptors: Regression (Statistics), Research Design, Randomized Controlled Trials, Research Methodology
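As context for the RD design named above (the standard sharp-RD definition, not drawn from this abstract), the design identifies the treatment effect at the cutoff \(c\) of a running variable \(X\):

\[
\tau_{RD} = \lim_{x \downarrow c} \mathbb{E}[Y \mid X = x] - \lim_{x \uparrow c} \mathbb{E}[Y \mid X = x],
\]

which is why RD estimates are expected to agree with RCT benchmarks for units near the cutoff.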
Peer reviewed
Direct link
Joyce, Kathryn E.; Cartwright, Nancy – American Educational Research Journal, 2020
This article addresses the gap between what works in research and what works in practice. Currently, research in evidence-based education policy and practice focuses on randomized controlled trials. These can support causal ascriptions ("It worked") but provide little basis for local effectiveness predictions ("It will work…
Descriptors: Theory Practice Relationship, Educational Policy, Evidence Based Practice, Educational Research
Peer reviewed
Direct link
Moerbeek, Mirjam; Safarkhani, Maryam – Journal of Educational and Behavioral Statistics, 2018
Data from cluster randomized trials do not always have a pure hierarchical structure. For instance, students are nested within schools that may be crossed by neighborhoods, and soldiers are nested within army units that may be crossed by mental health-care professionals. It is important that the random cross-classification is taken into account…
Descriptors: Randomized Controlled Trials, Classification, Research Methodology, Military Personnel
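As context for the cross-classification discussed above (a standard two-way random-effects specification, not drawn from this abstract), a student \(i\) nested in school \(j\) and crossed with neighborhood \(k\) can be modeled as

\[
Y_{i(jk)} = \beta_0 + u_j + v_k + e_{i(jk)}, \qquad u_j \sim N(0,\sigma_u^2),\; v_k \sim N(0,\sigma_v^2),\; e_{i(jk)} \sim N(0,\sigma_e^2),
\]

so that variance (and hence power) calculations account for both random factors rather than treating the design as purely hierarchical.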
Peer reviewed
Direct link
Gopalan, Maithreyi; Rosinger, Kelly; Ahn, Jee Bin – Review of Research in Education, 2020
In the past few decades, we have seen a rapid proliferation in the use of quasi-experimental research designs in education research. This trend, stemming in part from the "credibility revolution" in the social sciences, particularly economics, is notable alongside the increasing use of randomized controlled trials in the drive toward…
Descriptors: Educational Research, Quasiexperimental Design, Research Problems, Research Methodology
Peer reviewed
Direct link
Uwimpuhwe, Germaine; Singh, Akansha; Higgins, Steve; Coux, Mickael; Xiao, ZhiMin; Shkedy, Ziv; Kasim, Adetayo – Journal of Experimental Education, 2022
Educational stakeholders are keen to know the magnitude and importance of different interventions. However, the way evidence is communicated to support understanding of an intervention's effectiveness is controversial. Typically, studies in education have used the standardised mean difference as a measure of the impact of interventions. This…
Descriptors: Program Effectiveness, Intervention, Multivariate Analysis, Bayesian Statistics
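As context for the standardised mean difference mentioned above (the conventional definition, not drawn from this abstract):

\[
d = \frac{\bar{Y}_T - \bar{Y}_C}{s_{\text{pooled}}}, \qquad s_{\text{pooled}} = \sqrt{\frac{(n_T - 1) s_T^2 + (n_C - 1) s_C^2}{n_T + n_C - 2}},
\]

i.e., the difference between treatment and control group means expressed in pooled standard deviation units.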
Peer reviewed
PDF on ERIC Download full text
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
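As context for the design-based methods named above (the basic Neyman difference-in-means building block, not a summary of the full report), under complete randomization the impact estimator and a conservative variance estimator are

\[
\hat{\tau} = \bar{Y}_T - \bar{Y}_C, \qquad \widehat{\mathrm{Var}}(\hat{\tau}) = \frac{s_T^2}{n_T} + \frac{s_C^2}{n_C},
\]

with inference grounded in the potential outcomes framework and the randomization itself, rather than in a model of the outcome data.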
Peer reviewed
PDF on ERIC Download full text
Kautz, Tim; Schochet, Peter Z.; Tilley, Charles – National Center for Education Evaluation and Regional Assistance, 2017
A new design-based theory has recently been developed to estimate impacts for randomized controlled trials (RCTs) and basic quasi-experimental designs (QEDs) for a wide range of designs used in social policy research (Imbens & Rubin, 2015; Schochet, 2016). These methods use the potential outcomes framework and known features of study designs…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Peer reviewed
PDF on ERIC Download full text
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies (Imbens & Rubin, 2015; Schochet, 2015, 2016). The estimators are derived using the building blocks of experimental designs with minimal assumptions, and are unbiased and normally distributed in large samples…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Graham, Carolyn W.; West, Michael D.; Bourdon, Jessica L.; Inge, Katherine J.; Seward, Hannah E. – Campbell Collaboration, 2016
Individuals who sustain a traumatic brain injury (TBI) often struggle to obtain competitive employment afterward, commonly as a result of the post-injury difficulties they exhibit (Andelic, Stevens, Sigurdardottir, Arango-Lasprilla, & Roe, 2009; Mansfield et al., 2015). The currently reported unemployment rate for people with TBI is…
Descriptors: Head Injuries, Brain, Job Skills, Intervention
Ruth Maisey; Svetlana Speight; Chris Bonell; Susan Purdon; Peter Keogh; Ivonne Wollny; Annik M. Sorhaindo; Kaye Wellings – Sage Research Methods Cases, 2014
In 2009, the UK government's Department for Education commissioned a team of researchers at NatCen Social Research to evaluate the effectiveness of the youth development/teenage pregnancy prevention programme 'Teens and Toddlers'. Previous studies had reported positive findings but had not been methodologically rigorous. We…
Descriptors: Youth Programs, Program Evaluation, Adolescents, Toddlers
Peer reviewed
Direct link
Valentine, Jeffrey C.; Thompson, Simon G. – Research Synthesis Methods, 2013
Background: Confounding caused by selection bias is often a key difference between non-randomized studies (NRS) and randomized controlled trials (RCTs) of interventions. Key methodological issues: In this third paper of the series, we consider issues relating to the inclusion of NRS in systematic reviews on the effects of interventions. We discuss…
Descriptors: Research Design, Randomized Controlled Trials, Intervention, Bias