Showing 1 to 15 of 27 results
Peer reviewed
Lucy Beasant; Alba Realpe; Sarah Douglas; Lorcan Kenny; Dheeraj Rai; Nicola Mills – Autism: The International Journal of Research and Practice, 2024
The purpose of this study is to explore the views of autistic adults on randomised controlled trials, specifically on processes such as randomisation and blinding, to understand the barriers and facilitators for recruiting autistic people to randomised controlled trials involving medications. We conducted one-to-one interviews with 49 autistic…
Descriptors: Autism Spectrum Disorders, Adults, Attitudes, Randomized Controlled Trials
A. Brooks Bowden – AERA Open, 2023
Although experimental evaluations have been labeled the "gold standard" of evidence for policy (U.S. Department of Education, 2003), evaluations without an analysis of costs are not sufficient for policymaking (Monk, 1995; Ross et al., 2007). Funding organizations now require cost-effectiveness data in most evaluations of effects. Yet,…
Descriptors: Cost Effectiveness, Program Evaluation, Economics, Educational Finance
Peer reviewed
Paul Thompson; Kaydee Owen; Richard P. Hastings – International Journal of Research & Method in Education, 2024
Traditionally, cluster randomized controlled trials are analyzed with the average intervention effect as the quantity of interest. However, in populations with higher degrees of heterogeneity, or where the intervention effect may differ across values of a covariate, this may not be optimal. Within education and social science contexts, exploring the variation in…
Descriptors: Randomized Controlled Trials, Intervention, Mathematics Education, Mathematics Skills
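The shift described above, from the average effect to how the effect varies with a covariate, can be sketched with a toy simulation (hypothetical data, not drawn from the study itself): an ordinary least squares fit with a treatment-by-covariate interaction recovers how the effect changes with the covariate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
treat = rng.integers(0, 2, n)          # 0/1 treatment indicator
x = rng.normal(0.0, 1.0, n)            # standardised baseline covariate

# Outcome: average effect 0.3, but the effect grows with x (interaction 0.2)
y = 0.5 + 0.3 * treat + 0.4 * x + 0.2 * treat * x + rng.normal(0.0, 1.0, n)

# OLS with an interaction term recovers the effect variation
X = np.column_stack([np.ones(n), treat, x, treat * x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # ~ [0.5, 0.3, 0.4, 0.2]
```

The interaction coefficient (here ~0.2) is the quantity of interest when the average effect alone is not informative.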
Peer reviewed
Emma Law; Isabel Smith – Research Ethics, 2024
During the COVID-19 pandemic, the race to find an effective vaccine or treatment saw an 'extraordinary number' of clinical trials being conducted. While there were some key success stories, not all trials produced results that informed patient care. There was a significant amount of waste in clinical research during the pandemic which is said to…
Descriptors: Ethics, Research Methodology, Integrity, COVID-19
Peer reviewed
Bulus, Metin; Dong, Nianbo – Journal of Experimental Education, 2021
Sample size determination in multilevel randomized trials (MRTs) and multilevel regression discontinuity designs (MRDDs) can be complicated due to multilevel structure, monetary restrictions, differing marginal costs per treatment and control units, and range restrictions in sample size at one or more levels. These issues have sparked a set of…
Descriptors: Sampling, Research Methodology, Costs, Research Design
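The cost trade-off this abstract describes can be illustrated with a standard result from the optimal design literature (a sketch, not necessarily the authors' exact formulas): under a budget constraint, the cost-minimising cluster size in a two-level trial depends on the cluster-to-individual cost ratio and the intraclass correlation.

```python
import math

def optimal_cluster_size(cost_cluster, cost_individual, icc):
    """Cost-minimising individuals per cluster in a two-level
    cluster-randomised trial: n* = sqrt((c2/c1) * (1 - rho) / rho)."""
    return math.sqrt((cost_cluster / cost_individual) * (1 - icc) / icc)

def clusters_for_power(delta, icc, n, power_z=0.84, alpha_z=1.96):
    """Approximate total clusters (balanced two-arm design) needed to
    detect standardised effect `delta` with ~80% power at alpha = .05."""
    var_per_cluster = 4 * (icc + (1 - icc) / n)
    return (alpha_z + power_z) ** 2 * var_per_cluster / delta ** 2

# Hypothetical costs: 200 per cluster, 20 per individual, ICC = 0.10
n_star = optimal_cluster_size(cost_cluster=200, cost_individual=20, icc=0.10)
print(round(n_star, 1))  # ≈ 9.5 individuals per cluster
print(math.ceil(clusters_for_power(delta=0.25, icc=0.10, n=10)))  # ≈ 96 clusters
```

Range restrictions of the kind the abstract mentions would cap `n_star` or the cluster count at feasible values before re-solving for power.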
Peer reviewed
Kenneth A. Frank; Qinyun Lin; Spiro J. Maroulis – Grantee Submission, 2024
In the complex world of educational policy, causal inferences will be debated. As we review non-experimental designs in educational policy, we focus on how to clarify and focus the terms of debate. We begin by presenting the potential outcomes/counterfactual framework and then describe approximations to the counterfactual generated from the…
Descriptors: Causal Models, Statistical Inference, Observation, Educational Policy
Peer reviewed
Kelly Hallberg; Andrew Swanlund; Ryan Williams – Society for Research on Educational Effectiveness, 2021
Background: The COVID-19 pandemic and the subsequent public health response led to an unprecedented disruption in educational instruction in the U.S. and around the world. Many schools quickly moved to virtual learning for the bulk of the 2020 spring term and many states cancelled annual assessments of student learning. The 2020-21 school year…
Descriptors: Research Problems, Educational Research, Research Design, Randomized Controlled Trials
Peer reviewed
What Works Clearinghouse, 2022
Education decisionmakers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Peer reviewed
Sean Demack – Education Endowment Foundation, 2019
Cluster Randomized Trial (CRT) designs are inherently multilevel and reflect the hierarchical structure of schools and the wider education system. To capture this multilevel nature, CRTs are commonly analysed using multilevel (or hierarchical) linear models. It is fairly common for CRT designs and analyses to include school and individual/pupil…
Descriptors: Educational Research, Randomized Controlled Trials, Research Design, Hierarchical Linear Modeling
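The multilevel structure described above can be illustrated with a minimal simulation (hypothetical numbers, not from the report): pupils nested in schools, with the intraclass correlation (ICC) recovered by the one-way ANOVA moment estimator.

```python
import numpy as np

rng = np.random.default_rng(42)
J, n = 100, 30                               # schools, pupils per school

# True variance components: school-level 0.2, pupil-level 0.8 => ICC = 0.2
u = rng.normal(0.0, np.sqrt(0.2), J)         # school random effects
y = u[:, None] + rng.normal(0.0, np.sqrt(0.8), (J, n))

# One-way ANOVA (moment) estimator of the intraclass correlation
msb = n * y.mean(axis=1).var(ddof=1)         # between-school mean square
msw = ((y - y.mean(axis=1, keepdims=True)) ** 2).sum() / (J * (n - 1))
icc_hat = (msb - msw) / (msb + (n - 1) * msw)
print(round(icc_hat, 2))                     # close to the true ICC of 0.2
```

A full CRT analysis would fit a multilevel linear model with a treatment indicator at the school level; the ICC is the quantity that drives the design effect and hence the required sample size.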
Peer reviewed
Taber, Keith S. – Studies in Science Education, 2019
Experimental studies are often employed to test the effectiveness of teaching innovations such as new pedagogy, curriculum, or learning resources. This article offers guidance on good practice in developing research designs and in drawing conclusions from published reports. Randomised controlled trials potentially support the use of statistical…
Descriptors: Instructional Innovation, Educational Research, Research Design, Research Methodology
Peer reviewed
May, Henry; Jones, Akisha; Blakeney, Aly – AERA Online Paper Repository, 2019
Using an RD design provides statistically robust estimates and gives researchers an alternative causal estimation tool for educational environments where an RCT may not be feasible. Results from the External Evaluation of the i3 Scale-Up of Reading Recovery show that impact estimates were remarkably similar between a randomized control…
Descriptors: Regression (Statistics), Research Design, Randomized Controlled Trials, Research Methodology
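The logic of a sharp regression discontinuity estimate can be sketched in a few lines (a toy simulation, not the Reading Recovery data): fit a line on each side of the cutoff within a bandwidth and take the difference in predicted outcomes at the cutoff.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4000
score = rng.uniform(-1.0, 1.0, n)          # running variable, cutoff at 0
treated = (score < 0).astype(float)        # e.g. low scorers get the program
y = 1.0 + 0.5 * score + 0.4 * treated + rng.normal(0.0, 0.5, n)

def outcome_at_cutoff(mask):
    """Linear fit on one side of the cutoff; intercept = prediction at 0."""
    X = np.column_stack([np.ones(mask.sum()), score[mask]])
    b, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
    return b[0]

h = 0.5                                    # bandwidth around the cutoff
left = (score < 0) & (score > -h)
right = (score >= 0) & (score < h)
rd_estimate = outcome_at_cutoff(left) - outcome_at_cutoff(right)
print(round(rd_estimate, 2))               # ≈ 0.4, the simulated effect
```

The estimate is local to the cutoff, which is exactly why within-study comparisons against an RCT, as in the abstract above, are informative about its validity.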
Peer reviewed
What Works Clearinghouse, 2018
Underlying all What Works Clearinghouse (WWC) products are WWC Study Review Guides, which are intended for use by WWC certified reviewers to assess studies against the WWC evidence standards. As part of an ongoing effort to increase transparency, promote collaboration, and encourage widespread use of the WWC standards, the Institute of Education…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Peer reviewed
Cole, Russell; Deke, John; Seftor, Neil – Society for Research on Educational Effectiveness, 2016
The What Works Clearinghouse (WWC) maintains design standards to identify rigorous, internally valid education research. As education researchers advance new methodologies, the WWC must revise its standards to include an assessment of the new designs. Recently, the WWC has revised standards for two emerging study designs: regression discontinuity…
Descriptors: Educational Research, Research Design, Regression (Statistics), Multivariate Analysis
Peer reviewed
Weijer, Charles; Taljaard, Monica; Grimshaw, Jeremy M.; Edwards, Sarah J. L.; Eccles, Martin P. – Research Ethics, 2015
Owing to unique features of their design, cluster randomized trials complicate the interpretation of standard ethics guidelines. The recently published Ottawa statement on the ethical design and conduct of cluster randomized trials provides researchers and research ethics committees with detailed guidance on the design, conduct, and review of…
Descriptors: Ethics, Research Methodology, Research Design, Committees
Peer reviewed
Steiner, Peter M.; Wong, Vivian – Society for Research on Educational Effectiveness, 2016
Despite recent emphasis on the use of randomized control trials (RCTs) for evaluating education interventions, in most areas of education research, observational methods remain the dominant approach for assessing program effects. Over the last three decades, the within-study comparison (WSC) design has emerged as a method for evaluating the…
Descriptors: Randomized Controlled Trials, Comparative Analysis, Research Design, Evaluation Methods