Showing 1 to 15 of 116 results
Heather C. Hill; Kathleen Lynch; Joshua R. Polanin – Annenberg Institute for School Reform at Brown University, 2024
Inconsistent reporting of critical facets of classroom interventions and their related impact evaluations hinders the field's ability to describe and synthesize the existing evidence base. In this essay, we present a set of reporting guidelines intended to steer authors of classroom intervention studies toward providing more systematic reporting…
Descriptors: Disclosure, Guidelines, Intervention, Educational Research
Peer reviewed
Direct link
Catherine Sheehan; Judith Butler; Cian O' Neill – Early Childhood Education Journal, 2025
Advances in the field of Early Childhood Education & Care (ECEC) are catalysing an important paradigm shift in the understanding of childhood trauma throughout the life course. While there is little dispute regarding the unique role of ECEC practitioners in providing critical support to children who experience trauma, international provisions…
Descriptors: Trauma, Trauma Informed Approach, Training, Early Childhood Education
Peer reviewed
Direct link
Troyer, Margaret – Journal of Research in Reading, 2022
Background: Randomised controlled trials (RCTs) have long been considered the gold standard in education research. Federal funds are allocated to evaluations that meet What Works Clearinghouse standards; RCT designs are required in order to meet these standards without reservations. Schools seek out interventions that are research based, in other…
Descriptors: Educational Research, Randomized Controlled Trials, Adolescents, Reading Instruction
Peer reviewed
Direct link
Deke, John; Wei, Thomas; Kautz, Tim – Journal of Research on Educational Effectiveness, 2021
Evaluators of education interventions are increasingly designing studies to detect impacts much smaller than the 0.20 standard deviations that Cohen characterized as "small." While the need to detect smaller impacts is based on compelling arguments that such impacts are substantively meaningful, the drive to detect smaller impacts may…
Descriptors: Intervention, Program Evaluation, Sample Size, Randomized Controlled Trials
Hugh, Maria L.; Ahlers, Kaitlyn; Joshi, Mahima; Locke, Jill – Grantee Submission, 2021
Purpose of Review: The purpose of this review is to provide an update on the recent research (2016-2021) that evaluates the effectiveness of school-implemented interventions for students with autism (3-21 years old) from preschool to high school. Recent Findings: Overall, the recent literature demonstrated that there are EBPs that help students…
Descriptors: Intervention, Preschool Education, Elementary Secondary Education, Autism Spectrum Disorders
Peer reviewed
Direct link
Apunyo, Robert; White, Howard; Otike, Caroline; Katairo, Thomas; Puerto, Sussana; Gardiner, Drew; Kinengyere, Alison A.; Eyers, John; Saran, Ashrita; Obuku, Ekwaro A. – Campbell Systematic Reviews, 2021
The research question guiding the production of the youth employment evidence and gap map (EGM) is stated as follows: What is the nature and extent of the evidence base of impact evaluations and systematic reviews on youth employment programmes in the world? The primary objective is to catalogue impact evaluations and systematic reviews on…
Descriptors: Intervention, Youth Programs, Employment Programs, Evidence Based Practice
Peer reviewed
PDF on ERIC Download full text
What Works Clearinghouse, 2022
Education decisionmakers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Anthony Clairmont – ProQuest LLC, 2020
Educational researchers are often tasked with postulating and measuring unknown processes that account for observed outcomes. Logically, this requires inductive reasoning about which constructs are relevant to the situation we seek to understand, prior to attempts at measurement. Exploratory mixed methods designs, along with a few classic designs…
Descriptors: Measurement, Program Evaluation, Intervention, Disproportionate Representation
Peer reviewed
Direct link
Kraft, Matthew A. – Educational Researcher, 2020
Researchers commonly interpret effect sizes by applying benchmarks proposed by Jacob Cohen over a half century ago. However, effects that are small by Cohen's standards are large relative to the impacts of most field-based interventions. These benchmarks also fail to consider important differences in study features, program costs, and scalability.…
Descriptors: Effect Size, Benchmarking, Educational Research, Intervention
Peer reviewed
Direct link
Lortie-Forgues, Hugues; Inglis, Matthew – Educational Researcher, 2019
There are a growing number of large-scale educational randomized controlled trials (RCTs). Considering their expense, it is important to reflect on the effectiveness of this approach. We assessed the magnitude and precision of effects found in those large-scale RCTs commissioned by the UK-based Education Endowment Foundation and the U.S.-based…
Descriptors: Randomized Controlled Trials, Educational Research, Effect Size, Program Evaluation
Peer reviewed
Direct link
Cotter, Robin; Kiser, Stacey; Rasmussen, Jed; Vemu, Sheela – New Directions for Community Colleges, 2022
Biology education research (BER), currently conducted mostly at four-year colleges and universities, is changing the culture of teaching biology and improving student success. We are community college faculty participating in the NSF-funded CC Bio INSITES network, getting training and support in BER to ask questions to improve student success in…
Descriptors: Biology, Science Instruction, College Faculty, Community Colleges
Peer reviewed
Direct link
Lindsay M. Fallon; Emily R. DeFouw; Sadie C. Cathcart; Talia S. Berkman; Patrick Robinson-Link; Breda V. O'Keeffe; George Sugai – Journal of Behavioral Education, 2022
School discipline disproportionality has long been documented in educational research, primarily impacting Black/African American and non-White Hispanic/Latinx students. In response, federal policymakers have encouraged educators to change their disciplinary practice, emphasizing that more proactive support is critical to promoting students'…
Descriptors: Discipline, Student Behavior, Behavior Modification, Social Development
Peer reviewed
Direct link
Heather C. Hill; Anna Erickson – Educational Researcher, 2019
Poor program implementation constitutes one explanation for null results in trials of educational interventions. For this reason, researchers often collect data about implementation fidelity when conducting such trials. In this article, we document whether and how researchers report and measure program fidelity in recent cluster-randomized trials.…
Descriptors: Fidelity, Program Implementation, Program Effectiveness, Intervention
Peer reviewed
PDF on ERIC Download full text
What Works Clearinghouse, 2017
An aspect of a study is considered a confounding factor if it is not possible to tell whether the difference in outcomes is due to the intervention, the confounding factor, or both. In What Works Clearinghouse (WWC) study reviews, certified reviewers look for a specific type of confounding factor: those that occur when a component of the study…
Descriptors: Research Methodology, Intervention, Control Groups, Experimental Groups
Peer reviewed
PDF on ERIC Download full text
Deke, John; Wei, Thomas; Kautz, Tim – Society for Research on Educational Effectiveness, 2018
Evaluators of education interventions increasingly need to design studies to detect impacts much smaller than the 0.20 standard deviations that Cohen (1988) characterized as "small." For example, an evaluation of Response to Intervention from the Institute of Education Sciences (IES) detected impacts ranging from 0.13 to 0.17 standard…
Descriptors: Intervention, Program Evaluation, Sample Size, Randomized Controlled Trials