Showing 1 to 15 of 30 results
A. Brooks Bowden – AERA Open, 2023
Although experimental evaluations have been labeled the "gold standard" of evidence for policy (U.S. Department of Education, 2003), evaluations without an analysis of costs are not sufficient for policymaking (Monk, 1995; Ross et al., 2007). Funding organizations now require cost-effectiveness data in most evaluations of effects. Yet,…
Descriptors: Cost Effectiveness, Program Evaluation, Economics, Educational Finance
Sam Sims; Jake Anders; Matthew Inglis; Hugues Lortie-Forgues; Ben Styles; Ben Weidmann – Annenberg Institute for School Reform at Brown University, 2023
Over the last twenty years, education researchers have increasingly conducted randomised experiments with the goal of informing the decisions of educators and policymakers. Such experiments have generally employed broad, consequential, standardised outcome measures in the hope that this would allow decisionmakers to compare effectiveness of…
Descriptors: Educational Research, Research Methodology, Randomized Controlled Trials, Program Effectiveness
Peer reviewed
Kelly Hallberg; Andrew Swanlund; Ryan Williams – Society for Research on Educational Effectiveness, 2021
Background: The COVID-19 pandemic and the subsequent public health response led to an unprecedented disruption in educational instruction in the U.S. and around the world. Many schools quickly moved to virtual learning for the bulk of the 2020 spring term and many states cancelled annual assessments of student learning. The 2020-21 school year…
Descriptors: Research Problems, Educational Research, Research Design, Randomized Controlled Trials
Peer reviewed
Full text available on ERIC (PDF)
What Works Clearinghouse, 2022
Education decisionmakers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Dora Gicheva; Julie Edmunds; Beth Thrift; Marie Hull; Jeremy Bray – Sage Research Methods Cases, 2020
Our research team worked with Wake Technical Community College to evaluate an intervention to redesign the delivery of a set of introductory online courses. To obtain unbiased estimates of the treatment effects of the intervention, we conducted a randomized control trial in which students who enrolled in the study courses were assigned randomly to…
Descriptors: Randomized Controlled Trials, Educational Research, Electronic Learning, Introductory Courses
Peer reviewed
Wrigley, Terry; McCusker, Sean – Educational Research and Evaluation, 2019
This paper examines the insistent claims by advocates of evidence-based teaching that it is a rigorous scientific approach. The paper questions the view that randomised controlled trials and meta-analyses are the only truly scientific methods in educational research. It suggests these claims are often based on a rhetorical appeal which relies on…
Descriptors: Evidence Based Practice, Educational Research, Research Methodology, Athletics
Peer reviewed
Cameron, Ewan – Journal of Education Policy, 2020
In the 2016-2017 school year, the Liberian government launched Partnership Schools For Liberia (PSL), a pilot program in which the management of 93 primary schools was transferred to 8 private contractors. The pilot owed much to the importation of western policy models and was facilitated by the British organisation ARK and involved BIA, a private…
Descriptors: Foreign Countries, Partnerships in Education, Privatization, Democracy
Peer reviewed
Full text available on ERIC (PDF)
Sean Demack – Education Endowment Foundation, 2019
Cluster Randomized Trial (CRT) designs are inherently multilevel and reflect the hierarchical structure of schools and the wider education system. To capture this multilevel nature, CRTs are commonly analysed using multilevel (or hierarchical) linear models. It is fairly common for CRT designs and analyses to include school and individual/pupil…
Descriptors: Educational Research, Randomized Controlled Trials, Research Design, Hierarchical Linear Modeling
Peer reviewed
Hallberg, Kelly; Williams, Ryan; Swanlund, Andrew – Journal of Research on Educational Effectiveness, 2020
More aggregate data on school performance is available than ever before, opening up new possibilities for applied researchers interested in assessing the effectiveness of school-level interventions quickly and at a relatively low cost by implementing comparative interrupted times series (CITS) designs. We examine the extent to which effect…
Descriptors: Data Use, Research Methodology, Program Effectiveness, Design
Peer reviewed
Gore, Jennifer M. – Australian Educational Researcher, 2017
The field of educational research encompasses a vast array of paradigmatic and methodological perspectives. Arguably, this range has both expanded and limited our achievements in the name of educational research. In Australia, the ascendancy of certain research perspectives has profoundly shaped the field and its likely future. We (are expected…
Descriptors: Foreign Countries, Randomized Controlled Trials, Educational Research, Research Methodology
Peer reviewed
Taber, Keith S. – Studies in Science Education, 2019
Experimental studies are often employed to test the effectiveness of teaching innovations such as new pedagogy, curriculum, or learning resources. This article offers guidance on good practice in developing research designs, and in drawing conclusions from published reports. Random control trials potentially support the use of statistical…
Descriptors: Instructional Innovation, Educational Research, Research Design, Research Methodology
Peer reviewed
May, Henry; Jones, Akisha; Blakeney, Aly – AERA Online Paper Repository, 2019
Using an RD design provides statistically robust estimates while allowing researchers a different causal estimation tool to be used in educational environments where an RCT may not be feasible. Results from External Evaluation of the i3 Scale-Up of Reading Recovery show that impact estimates were remarkably similar between a randomized control…
Descriptors: Regression (Statistics), Research Design, Randomized Controlled Trials, Research Methodology
Peer reviewed
Joyce, Kathryn E.; Cartwright, Nancy – American Educational Research Journal, 2020
This article addresses the gap between what works in research and what works in practice. Currently, research in evidence-based education policy and practice focuses on randomized controlled trials. These can support causal ascriptions ("It worked") but provide little basis for local effectiveness predictions ("It will work…
Descriptors: Theory Practice Relationship, Educational Policy, Evidence Based Practice, Educational Research
Peer reviewed
Gopalan, Maithreyi; Rosinger, Kelly; Ahn, Jee Bin – Review of Research in Education, 2020
In the past few decades, we have seen a rapid proliferation in the use of quasi-experimental research designs in education research. This trend, stemming in part from the "credibility revolution" in the social sciences, particularly economics, is notable along with the increasing use of randomized controlled trials in the strive toward…
Descriptors: Educational Research, Quasiexperimental Design, Research Problems, Research Methodology
Peer reviewed
Uwimpuhwe, Germaine; Singh, Akansha; Higgins, Steve; Coux, Mickael; Xiao, ZhiMin; Shkedy, Ziv; Kasim, Adetayo – Journal of Experimental Education, 2022
Educational stakeholders are keen to know the magnitude and importance of different interventions. However, the way evidence is communicated to support understanding of the effectiveness of an intervention is controversial. Typically studies in education have used the standardised mean difference as a measure of the impact of interventions. This…
Descriptors: Program Effectiveness, Intervention, Multivariate Analysis, Bayesian Statistics