Showing 1 to 15 of 22 results
Peer reviewed
David Rutkowski; Leslie Rutkowski; Greg Thompson; Yusuf Canbolat – Large-scale Assessments in Education, 2024
This paper scrutinizes the increasing trend of using international large-scale assessment (ILSA) data for causal inferences in educational research, arguing that such inferences are often tenuous. We explore the complexities of causality within ILSAs, highlighting the methodological constraints that challenge the validity of causal claims derived…
Descriptors: International Assessment, Data Use, Causal Models, Educational Research
Peer reviewed
Peter Z. Schochet – Journal of Educational and Behavioral Statistics, 2025
Random encouragement designs evaluate treatments that aim to increase participation in a program or activity. These randomized controlled trials (RCTs) can also assess the mediated effects of participation itself on longer term outcomes using a complier average causal effect (CACE) estimation framework. This article considers power analysis…
Descriptors: Statistical Analysis, Computation, Causal Models, Research Design
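The power-analysis issue raised in the Schochet abstract above can be illustrated with a back-of-envelope calculation. This is a minimal sketch, not the article's procedure: it assumes an individually randomized two-arm design with no clustering or covariates, and all parameter values are hypothetical. The key point is that the minimum detectable effect size (MDES) for the complier average causal effect inflates the intent-to-treat MDES by one over the compliance rate.

```python
# Minimal sketch (not the article's procedure): minimum detectable effect size
# (MDES) in outcome SD units for an individually randomized encouragement
# design, ignoring clustering and covariates. The CACE MDES inflates the ITT
# MDES by 1 / compliance_rate. All parameter values are hypothetical.
from statistics import NormalDist

def mdes_itt(n_total, prop_encouraged=0.5, alpha=0.05, power=0.80):
    """MDES for the intent-to-treat contrast under simple random assignment."""
    z = NormalDist()
    multiplier = z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)
    se = (1.0 / (n_total * prop_encouraged * (1 - prop_encouraged))) ** 0.5
    return multiplier * se

def mdes_cace(n_total, compliance_rate, **kwargs):
    """MDES for the complier average causal effect (Bloom-style adjustment)."""
    return mdes_itt(n_total, **kwargs) / compliance_rate

print(round(mdes_itt(1000), 3))        # ~0.177 SD for the ITT effect
print(round(mdes_cace(1000, 0.4), 3))  # ~0.443 SD when only 40% take up the program
```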
Peer reviewed
Heining Cham; Hyunjung Lee; Igor Migunov – Asia Pacific Education Review, 2024
The randomized controlled trial (RCT) is the primary experimental design in education research due to its strong internal validity for causal inference. However, in situations where RCTs are not feasible or ethical, quasi-experiments are alternatives for establishing causal inference. This paper serves as an introduction to several quasi-experimental…
Descriptors: Causal Models, Educational Research, Quasiexperimental Design, Research Design
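As a concrete illustration of the kind of quasi-experimental reasoning introduced above (a toy sketch, not drawn from the article, since the truncated abstract does not specify which designs it covers), a difference-in-differences estimate compares pre/post changes across a treated and an untreated group; the means below are invented.

```python
# Toy difference-in-differences estimate on hypothetical pre/post mean test
# scores for a treated and a comparison school. The estimate has a causal
# interpretation only under the parallel-trends assumption.
pre_treated, post_treated = 62.0, 71.0
pre_control, post_control = 60.0, 64.0

did_estimate = (post_treated - pre_treated) - (post_control - pre_control)
print(did_estimate)  # 5.0: the treated school's gain beyond the comparison gain
```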
Peer reviewed
Joshua Weidlich; Ben Hicks; Hendrik Drachsler – Educational Technology Research and Development, 2024
Researchers tasked with understanding the effects of educational technology innovations face the challenge of providing evidence of causality. Given the complexities of studying learning in authentic contexts interwoven with technological affordances, conducting tightly controlled randomized experiments is not always feasible or desirable. Today,…
Descriptors: Educational Research, Educational Technology, Research Design, Structural Equation Models
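One common route to the causal evidence discussed above, when randomization is infeasible, is covariate adjustment under an assumed causal model. The sketch below is not from the article; it simulates a single hypothetical confounder (prior ability) driving both technology use and the outcome, and shows regression adjustment recovering the simulated effect.

```python
# Not the authors' method: regression adjustment for a single assumed
# confounder z (e.g., prior ability) that drives both technology use x and the
# outcome y. Data are simulated; the true effect of x on y is 2.0.
import numpy as np

rng = np.random.default_rng(0)
n = 5_000
z = rng.normal(size=n)                                  # confounder
x = (z + rng.normal(size=n) > 0).astype(float)          # non-random "treatment"
y = 2.0 * x + 1.5 * z + rng.normal(size=n)              # outcome

naive = y[x == 1].mean() - y[x == 0].mean()             # confounded comparison
design = np.column_stack([np.ones(n), x, z])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)       # OLS: intercept, x, z

print(round(naive, 2))    # well above 2.0 because of confounding
print(round(coef[1], 2))  # close to 2.0 after adjusting for z
```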
Peer reviewed
Peter Schochet – Society for Research on Educational Effectiveness, 2024
Random encouragement designs are randomized controlled trials (RCTs) that test interventions aimed at increasing participation in a program or activity whose take-up is not universal. In these RCTs, instead of randomizing individuals or clusters directly into treatment and control groups to participate in a program or activity, the randomization…
Descriptors: Statistical Analysis, Computation, Causal Models, Research Design
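The core logic of the encouragement design described above can be sketched with the standard Wald (instrumental-variables) estimator, which divides the intent-to-treat effect on the outcome by the effect of encouragement on actual participation. This is a generic illustration on simulated data, not Schochet's analysis.

```python
# Generic Wald/IV sketch of the CACE in an encouragement design, on simulated
# data with a true participation effect of 3.0 and ~50% compliance.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
encouraged = rng.integers(0, 2, size=n)            # randomized encouragement
complier = rng.random(n) < 0.5                     # latent complier status
participated = np.where(complier, encouraged, 0)   # never-takers stay out
outcome = 3.0 * participated + rng.normal(size=n)

itt_y = outcome[encouraged == 1].mean() - outcome[encouraged == 0].mean()
itt_d = participated[encouraged == 1].mean() - participated[encouraged == 0].mean()
print(round(itt_y, 2), round(itt_d, 2), round(itt_y / itt_d, 2))  # ~1.5, ~0.5, ~3.0
```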
Peer reviewed
Ting Ye; Ted Westling; Lindsay Page; Luke Keele – Grantee Submission, 2024
The clustered observational study (COS) design is the observational study counterpart to the clustered randomized trial. In a COS, a treatment is assigned to intact groups, and all units within each group are exposed to the treatment. However, the treatment is non-randomly assigned. COSs are common in both education and health services research. In…
Descriptors: Nonparametric Statistics, Identification, Causal Models, Multivariate Analysis
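To make the COS structure concrete (this is an illustration of the design only, not the identification results or estimators the paper develops), the sketch below simulates treatment assigned non-randomly to whole clusters and adjusts cluster-mean outcomes for a cluster-level covariate.

```python
# Illustration of the COS structure only (not the paper's estimators): treatment
# varies at the cluster (e.g., school) level and is non-randomly assigned, so a
# simple analysis adjusts cluster-mean outcomes for a cluster-level covariate.
# Simulated data; the true treatment effect on cluster means is 1.0.
import numpy as np

rng = np.random.default_rng(2)
n_clusters = 200
baseline = rng.normal(size=n_clusters)                                # cluster covariate
treated = (baseline + rng.normal(size=n_clusters) > 0).astype(float)  # non-random
cluster_mean_y = treated + 0.8 * baseline + rng.normal(scale=0.3, size=n_clusters)

design = np.column_stack([np.ones(n_clusters), treated, baseline])
coef, *_ = np.linalg.lstsq(design, cluster_mean_y, rcond=None)
print(round(coef[1], 2))  # adjusted treated-vs-control difference, near 1.0
```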
Peer reviewed
Pitkäniemi, Harri – Educational Process: International Journal, 2020
Recently, inspirational articles on research methodology have been written on the development of the mixed-methods approach, an area of study that concerns methodological trends in the construction of research designs. One may ask whether it is possible to construct a notional piece of investigation, potentially highlighting a research design that…
Descriptors: Mixed Methods Research, Research Design, Educational Research, Causal Models
Hill, Heather C.; Mancenido, Zid; Loeb, Susanna – Annenberg Institute for School Reform at Brown University, 2021
Despite calls for more evidence regarding the effectiveness of teacher education practices, causal research in the field remains rare. One reason is that we lack designs and measurement approaches that appropriately meet the challenges of causal inference in the context of teacher education programs. This article provides a framework for how to…
Descriptors: Educational Research, Educational Practices, Program Effectiveness, Teacher Education Programs
Kaplan, Avi; Cromley, Jennifer; Perez, Tony; Dai, Ting; Mara, Kyle; Balsai, Michael – Educational Researcher, 2020
In this commentary, we complement other constructive critiques of educational randomized controlled trials (RCTs) by calling attention to the commonly ignored role of context in causal mechanisms undergirding educational phenomena. We argue that evidence for the central role of context in causal mechanisms challenges the assumption that RCT findings…
Descriptors: Context Effect, Educational Research, Randomized Controlled Trials, Causal Models
Kaplan, Avi; Cromley, Jennifer; Perez, Tony; Dai, Ting; Mara, Kyle; Balsai, Michael – Grantee Submission, 2020
In this commentary, we complement other constructive critiques of educational randomized controlled trials (RCTs) by calling attention to the commonly ignored role of context in causal mechanisms undergirding educational phenomena. We argue that evidence for the central role of context in causal mechanisms challenges the assumption that RCT findings…
Descriptors: Context Effect, Educational Research, Randomized Controlled Trials, Causal Models
Peer reviewed
What Works Clearinghouse, 2022
Education decisionmakers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Kraft, Matthew A. – Annenberg Institute for School Reform at Brown University, 2019
Researchers commonly interpret effect sizes by applying benchmarks proposed by Cohen over a half century ago. However, effects that are small by Cohen's standards are large relative to the impacts of most field-based interventions. These benchmarks also fail to consider important differences in study features, program costs, and scalability. In…
Descriptors: Data Interpretation, Effect Size, Intervention, Benchmarking
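For reference, the effect size at issue is a standardized mean difference (Cohen's d); the sketch below computes it for hypothetical treatment and control scores. Cohen's conventional labels (0.2 small, 0.5 medium, 0.8 large) are the benchmarks the paper argues map poorly onto field-based education interventions.

```python
# Cohen's d (standardized mean difference) on hypothetical test scores, using
# the pooled standard deviation.
import numpy as np

treat = np.array([74.0, 81.0, 69.0, 77.0, 85.0, 72.0])
control = np.array([70.0, 75.0, 68.0, 71.0, 79.0, 69.0])

n_t, n_c = len(treat), len(control)
pooled_var = ((n_t - 1) * treat.var(ddof=1) +
              (n_c - 1) * control.var(ddof=1)) / (n_t + n_c - 2)
cohens_d = (treat.mean() - control.mean()) / pooled_var ** 0.5
print(round(cohens_d, 2))  # ~0.84 for these made-up scores
```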
Peer reviewed
Motz, Benjamin A.; Carvalho, Paulo F.; de Leeuw, Joshua R.; Goldstone, Robert L. – Journal of Learning Analytics, 2018
To identify the ways teachers and educational systems can improve learning, researchers need to make causal inferences. Analyses of existing datasets play an important role in detecting causal patterns, but conducting experiments also plays an indispensable role in this research. In this article, we advocate for experiments to be embedded in real…
Descriptors: Causal Models, Statistical Inference, Inferences, Educational Experiments
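One practical ingredient of the embedded experiments advocated above is stable, reproducible random assignment of students to conditions inside a live course. The sketch below is a generic illustration, not the authors' tooling; the student IDs, experiment label, and condition names are hypothetical.

```python
# Deterministic, reproducible assignment of students to conditions inside a
# live course. IDs, the experiment label, and condition names are hypothetical.
import hashlib

CONDITIONS = ("retrieval_practice", "restudy")

def assign(student_id: str, experiment: str = "unit3-review") -> str:
    """Hash the student ID with an experiment label so assignment is stable
    across sessions but independent across experiments."""
    digest = hashlib.sha256(f"{experiment}:{student_id}".encode()).hexdigest()
    return CONDITIONS[int(digest, 16) % len(CONDITIONS)]

for sid in ("s001", "s002", "s003"):
    print(sid, assign(sid))
```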
Peer reviewed
Sandoval, William – Journal of the Learning Sciences, 2014
Design research is strongly associated with the learning sciences community, and in the 2 decades since its conception it has become broadly accepted. Yet within and without the learning sciences there remains confusion about how to do design research, with most scholarship on the approach describing what it is rather than how to do it. This…
Descriptors: Concept Mapping, Educational Research, Instructional Design, Research Design
Peer reviewed
Kelcey, Ben; Phelps, Geoffrey; Jones, Nathan – Society for Research on Educational Effectiveness, 2013
Teacher professional development (PD) is seen as critical to improving the quality of US schools (National Commission on Teaching and America's Future, 1997). PD is increasingly viewed as one of the primary levers for improving teaching quality and ultimately student achievement (Correnti, 2007). One factor that is driving interest in PD is…
Descriptors: Faculty Development, Educational Quality, Teacher Effectiveness, Educational Research