Laws, Policies, & Programs: No Child Left Behind Act 2001 (1)
What Works Clearinghouse Rating: Meets WWC Standards with or without Reservations (1)
Showing 1 to 15 of 80 results
Peer reviewed
Direct link
Dahlia K. Remler; Gregg G. Van Ryzin – American Journal of Evaluation, 2025
This article reviews the origins and use of the terms quasi-experiment and natural experiment. It demonstrates how the terms conflate whether variation in the independent variable of interest falls short of random with whether researchers find, rather than intervene to create, that variation. Using the lens of assignment--the process driving…
Descriptors: Quasiexperimental Design, Research Design, Experiments, Predictor Variables
Peer reviewed
Direct link
Heining Cham; Hyunjung Lee; Igor Migunov – Asia Pacific Education Review, 2024
The randomized controlled trial (RCT) is the primary experimental design in education research due to its strong internal validity for causal inference. However, in situations where RCTs are not feasible or ethical, quasi-experiments are alternatives for establishing causal inference. This paper serves as an introduction to several quasi-experimental…
Descriptors: Causal Models, Educational Research, Quasiexperimental Design, Research Design
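As one illustration of the family of designs such an introduction typically covers, below is a minimal difference-in-differences sketch in Python on hypothetical two-group, two-period data. It is offered purely as an editorial example of a common quasi-experimental design, not as a method taken from Cham, Lee, and Migunov (2024).

    # Minimal difference-in-differences sketch: hypothetical two-group,
    # two-period data; the effect size (1.5) and all variable names are
    # assumptions made for illustration only.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 400
    group = rng.integers(0, 2, n)   # 1 = exposed to the intervention
    period = rng.integers(0, 2, n)  # 1 = post-intervention period
    y = 2 + 1.0 * group + 0.5 * period + 1.5 * group * period + rng.normal(0, 1, n)

    def cell_mean(g, p):
        # mean outcome in one group-by-period cell
        return y[(group == g) & (period == p)].mean()

    # DiD: change over time in the exposed group minus change in the comparison group
    did = (cell_mean(1, 1) - cell_mean(1, 0)) - (cell_mean(0, 1) - cell_mean(0, 0))
    print(f"difference-in-differences estimate: {did:.2f}")  # ~1.5 by construction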
Peer reviewed
PDF on ERIC Download full text
What Works Clearinghouse, 2021
The What Works Clearinghouse (WWC) identifies existing research on educational interventions, assesses the quality of the research, and summarizes and disseminates the evidence from studies that meet WWC standards. The WWC aims to provide enough information so educators can use the research to make informed decisions in their settings. This…
Descriptors: Program Effectiveness, Intervention, Educational Research, Educational Quality
Peer reviewed
Direct link
Toste, Jessica R.; Logan, Jessica A. R.; Shogren, Karrie A.; Boyd, Brian A. – Exceptional Children, 2023
Group design research studies can provide evidence to draw conclusions about "what works," "for whom," and "under what conditions" in special education. The quality indicators introduced by Gersten and colleagues (2005) have contributed to increased rigor in group design research, which has provided substantial…
Descriptors: Research Design, Educational Research, Special Education, Educational Indicators
Luke W. Miratrix – Grantee Submission, 2022
We are sometimes forced to use the Interrupted Time Series (ITS) design as an identification strategy for potential policy change, such as when we only have a single treated unit and cannot obtain comparable controls. For example, with recent county- and state-wide criminal justice reform efforts, where judicial bodies have changed bail setting…
Descriptors: Causal Models, Case Studies, Quasiexperimental Design, Monte Carlo Methods
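The record above concerns the interrupted time series (ITS) design for a single treated unit. Below is a minimal segmented-regression ITS sketch in Python on simulated monthly data with an assumed change point; it illustrates only the basic design, not the estimation and simulation framework developed in Miratrix (2022).

    # Minimal interrupted time series (ITS) sketch for a single treated unit.
    # The series length, change point, and effect sizes are illustrative
    # assumptions, not values from the paper.
    import numpy as np

    rng = np.random.default_rng(0)
    n_months, change_point = 60, 36
    t = np.arange(n_months)
    post = (t >= change_point).astype(float)

    # Simulated outcome: pre-period trend plus a level drop after the policy change.
    y = 50 + 0.2 * t - 8.0 * post + rng.normal(0, 2, n_months)

    # Segmented regression: intercept, secular trend, post-change level shift,
    # and post-change slope change.
    X = np.column_stack([np.ones(n_months), t, post, post * (t - change_point)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(f"estimated level change at the interruption: {beta[2]:.2f}")
    print(f"estimated slope change after the interruption: {beta[3]:.3f}")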
Peer reviewed
Direct link
van Velzen, Joke H. – Research in the Schools, 2019
Research on teaching effectiveness, especially studies concerning teaching approaches and instructional strategies, rarely shows conclusively the effectiveness of an intervention. Moreover, these (quasi-)experimental studies on teaching effectiveness often involve distal measures of teaching components. In this article, the focus is on another likely…
Descriptors: Quasiexperimental Design, Educational Research, Research Design, Teacher Behavior
Peer reviewed
Direct link
Andrew P. Jaciw – American Journal of Evaluation, 2025
By design, randomized experiments (XPs) rule out bias from confounded selection of participants into conditions. Quasi-experiments (QEs) are often considered second-best because they do not share this benefit. However, when results from XPs are used to generalize causal impacts, the benefit from unconfounded selection into conditions may be offset…
Descriptors: Elementary School Students, Elementary School Teachers, Generalization, Test Bias
Peer reviewed
Direct link
Taber, Keith S. – Studies in Science Education, 2019
Experimental studies are often employed to test the effectiveness of teaching innovations such as new pedagogy, curriculum, or learning resources. This article offers guidance on good practice in developing research designs, and in drawing conclusions from published reports. Randomised controlled trials potentially support the use of statistical…
Descriptors: Instructional Innovation, Educational Research, Research Design, Research Methodology
Sam Sims; Jake Anders; Matthew Inglis; Hugues Lortie-Forgues; Ben Styles; Ben Weidmann – Annenberg Institute for School Reform at Brown University, 2023
Over the last twenty years, education researchers have increasingly conducted randomised experiments with the goal of informing the decisions of educators and policymakers. Such experiments have generally employed broad, consequential, standardised outcome measures in the hope that this would allow decisionmakers to compare effectiveness of…
Descriptors: Educational Research, Research Methodology, Randomized Controlled Trials, Program Effectiveness
Peer reviewed
PDF on ERIC Download full text
What Works Clearinghouse, 2018
Underlying all What Works Clearinghouse (WWC) products are WWC Study Review Guides, which are intended for use by WWC certified reviewers to assess studies against the WWC evidence standards. As part of an ongoing effort to increase transparency, promote collaboration, and encourage widespread use of the WWC standards, the Institute of Education…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Peer reviewed
Direct link
Wing, Coady; Bello-Gomez, Ricardo A. – American Journal of Evaluation, 2018
Treatment effect estimates from a "regression discontinuity design" (RDD) have high internal validity. However, the arguments that support the design apply to a subpopulation that is narrower and usually different from the population of substantive interest in evaluation research. The disconnect between RDD population and the…
Descriptors: Regression (Statistics), Research Design, Validity, Evaluation Methods
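For readers unfamiliar with the design discussed above, below is a minimal sharp regression discontinuity sketch in Python: separate local linear fits on each side of a cutoff within a fixed bandwidth, on hypothetical data. The resulting estimate applies only to units near the cutoff, which is the narrow-subpopulation issue the abstract raises; the data, cutoff, and bandwidth are assumptions for illustration, not material from Wing and Bello-Gomez (2018).

    # Minimal sharp regression discontinuity (RDD) sketch on hypothetical data.
    import numpy as np

    rng = np.random.default_rng(1)
    n, cutoff, bandwidth = 2000, 0.0, 0.5
    running = rng.uniform(-1, 1, n)              # running (assignment) variable
    treated = (running >= cutoff).astype(float)  # sharp assignment rule
    y = 1.0 + 0.8 * running + 2.0 * treated + rng.normal(0, 1, n)

    def fit_at_cutoff(mask):
        # local linear fit on one side; returns the predicted outcome at the cutoff
        X = np.column_stack([np.ones(mask.sum()), running[mask] - cutoff])
        beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
        return beta[0]

    in_bw = np.abs(running - cutoff) <= bandwidth
    below = in_bw & (running < cutoff)
    above = in_bw & (running >= cutoff)
    print(f"RDD estimate at the cutoff: {fit_at_cutoff(above) - fit_at_cutoff(below):.2f}")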
Peer reviewed
Direct link
Watson, Scott B.; Barthlow, Michelle J. – Science Teacher, 2020
Ms. Jones stared at the stack of biology quizzes and wondered what went wrong. She thought about the lesson plans and wondered what she should have done differently. Ms. Jones is not alone in wondering how to improve student learning and classroom instruction. To improve student achievement, educators must determine what is working and what is…
Descriptors: Action Research, Science Teachers, Science Instruction, Instructional Improvement
Peer reviewed
PDF on ERIC Download full text
Jake Anders; Chris Brown; Melanie Ehren; Toby Greany; Rebecca Nelson; Jessica Heal; Bibi Groot; Michael Sanders; Rebecca Allen – Education Endowment Foundation, 2017
Evaluating the impact of complex whole-school interventions (CWSIs) is challenging. However, what evidence there is suggests that school leadership and other elements of whole-school contexts are important for pupils' attainment (Leithwood et al., 2006), suggesting that interventions aimed at changing these have significant potential to improve…
Descriptors: Leadership Styles, Program Implementation, Leadership Responsibility, Program Evaluation
Peer reviewed
Direct link
Haegele, Justin A.; Hodge, Samuel R. – Physical Educator, 2015
Emerging professionals, particularly senior-level undergraduate and graduate students in kinesiology who have an interest in physical education for individuals with and without disabilities, should understand the basic assumptions of the quantitative research paradigm. Knowledge of basic assumptions is critical for conducting, analyzing, and…
Descriptors: Statistical Analysis, Educational Research, Physical Education, Adapted Physical Education
Peer reviewed
PDF on ERIC Download full text
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
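As a toy illustration of the design-based perspective described above, the following Python sketch computes the difference-in-means impact estimate and the conservative Neyman variance for a completely randomized trial on hypothetical data. Schochet (2017) develops estimators for far more general settings (blocked, clustered, and quasi-experimental designs), so this is only a minimal sketch, not the report's methodology.

    # Minimal design-based sketch for a completely randomized trial:
    # difference-in-means impact estimate with the conservative Neyman variance.
    # Sample size and effect size are assumptions for illustration.
    import numpy as np

    rng = np.random.default_rng(2)
    n, n_treated = 200, 100
    z = np.zeros(n)
    z[:n_treated] = 1
    rng.shuffle(z)                          # random assignment to conditions
    y = 10 + 1.5 * z + rng.normal(0, 3, n)  # observed outcomes

    y1, y0 = y[z == 1], y[z == 0]
    impact = y1.mean() - y0.mean()
    var_neyman = y1.var(ddof=1) / len(y1) + y0.var(ddof=1) / len(y0)
    print(f"impact estimate: {impact:.2f}  (design-based SE ~ {np.sqrt(var_neyman):.2f})")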