Showing all 10 results
Peer reviewed
Claire Allen-Platt; Clara-Christina Gerstner; Robert Boruch; Alan Ruby – Society for Research on Educational Effectiveness, 2021
Background/Context: When a researcher tests an educational program, product, or policy in a randomized controlled trial (RCT) and detects a significant effect on an outcome, the intervention is usually classified as something that "works." When the expected effects are not found, however, there is seldom an orderly and transparent…
Descriptors: Educational Assessment, Randomized Controlled Trials, Evidence, Educational Research
Peer reviewed
Qi, Cathy H.; Barton, Erin E.; Collier, Margo; Lin, Yi-Ling; Montoya, Charisse – Focus on Autism and Other Developmental Disabilities, 2018
The purpose of this systematic review was to synthesize 22 single-case research design (SCRD) studies on social stories intervention for individuals with autism spectrum disorder (ASD). We used the What Works Clearinghouse (WWC) SCRD standards to analyze study rigor and evidence of a causal relation. We calculated four nonoverlap indices to…
Descriptors: Intervention, Research Design, Pervasive Developmental Disorders, Autism
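The Qi et al. abstract above mentions calculating four nonoverlap indices for single-case data, but the excerpt does not name them. As a hedged illustration only, the sketch below computes one commonly used nonoverlap index, Nonoverlap of All Pairs (NAP), for a hypothetical baseline/intervention series; the data and choice of index are assumptions, not taken from the article.

```python
from itertools import product

def nap(baseline, intervention):
    """Nonoverlap of All Pairs (NAP): the proportion of baseline/intervention
    pairs in which the intervention value exceeds the baseline value,
    with ties counted as half. Assumes higher scores indicate improvement."""
    pairs = list(product(baseline, intervention))
    wins = sum(1.0 for b, t in pairs if t > b)
    ties = sum(0.5 for b, t in pairs if t == b)
    return (wins + ties) / len(pairs)

# Hypothetical single-case data (illustrative values only).
baseline_phase = [2, 3, 2, 4, 3]
intervention_phase = [5, 6, 4, 7, 6, 5]
print(f"NAP = {nap(baseline_phase, intervention_phase):.2f}")
```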
Peer reviewed
Wong, Stephen E.; O'Driscoll, Janice – Journal of Teaching in Social Work, 2017
A course teaching graduate social work students to use an evidence-based model and to evaluate their own practice was replicated and evaluated. Students conducted a project in which they reviewed published research to achieve a clinical goal, applied quantitative measures for ongoing assessment, implemented evidence-based interventions, and…
Descriptors: Social Work, Professional Education, Graduate Students, Caseworker Approach
Peer reviewed
VanHoudnos, Nathan M.; Greenhouse, Joel B. – Journal of Educational and Behavioral Statistics, 2016
When cluster randomized experiments are analyzed as if units were independent, test statistics for treatment effects can be anticonservative. Hedges proposed a correction for such tests by scaling them to control their Type I error rate. This article generalizes the Hedges correction from a posttest-only experimental design to more common designs…
Descriptors: Statistical Analysis, Randomized Controlled Trials, Error of Measurement, Scaling
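The VanHoudnos and Greenhouse abstract refers to the Hedges correction, which rescales a test statistic computed as if units were independent so that its Type I error rate is controlled under clustering. As a rough, hedged sketch of the underlying idea only (not the exact correction in the article, which also adjusts the degrees of freedom), the snippet below deflates a naive t statistic by the square root of the design effect 1 + (n - 1)ρ for clusters of size n and intraclass correlation ρ.

```python
import math

def deflate_t(naive_t, cluster_size, icc):
    """Deflate a t statistic computed under an (incorrect) independence
    assumption by the square root of the design effect 1 + (n - 1) * icc.
    This captures only the variance-inflation part of the problem; the
    published correction also adjusts the degrees of freedom."""
    design_effect = 1.0 + (cluster_size - 1) * icc
    return naive_t / math.sqrt(design_effect)

# Hypothetical example: a seemingly significant naive t of 2.4 from a
# cluster-randomized trial with 25 students per classroom and ICC = 0.15.
print(f"adjusted t = {deflate_t(2.4, 25, 0.15):.2f}")
```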
Peer reviewed
Kourea, Lefki; Lo, Ya-yu – International Journal of Research & Method in Education, 2016
Improving academic, behavioural, and social outcomes of students through empirical research has been a firm commitment among researchers, policy-makers, and other professionals in education across Europe and the United States (U.S.). To assist in building scientific evidence, executive bodies such as the European Commission and the Institute for…
Descriptors: Evidence Based Practice, Validity, Randomized Controlled Trials, Research Methodology
Peer reviewed
Jitendra, Asha K.; Nelson, Gena; Pulles, Sandra M.; Kiss, Allyson J.; Houseworth, James – Exceptional Children, 2016
The purpose of the present review was to evaluate the quality of the research and evidence base for representation of problems as a strategy to enhance the mathematical performance of students with learning disabilities and those at risk for mathematics difficulties. The authors evaluated 25 experimental and quasiexperimental studies according to…
Descriptors: Mathematics Instruction, Mathematics Skills, Mathematics Achievement, Learning Problems
Peer reviewed
PDF on ERIC
Cheung, Alan; Slavin, Robert – Society for Research on Educational Effectiveness, 2016
As evidence-based reform becomes increasingly important in educational policy, it is becoming essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. The purpose of this study was to examine how methodological features such as types of publication, sample sizes, and…
Descriptors: Effect Size, Evidence Based Practice, Educational Change, Educational Policy
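Cheung and Slavin's question of how methodological features relate to reported effect sizes is typically examined with meta-regression, i.e., regressing study effect sizes on study-level moderators with inverse-variance weights. The sketch below shows a minimal fixed-effect version for a single hypothetical moderator (publication type); the data, weights, and model are illustrative assumptions, not results from the paper.

```python
import numpy as np

# Hypothetical effect sizes, inverse-variance weights, and a dummy for
# published (1) vs. unpublished (0) reports -- illustrative values only.
d = np.array([0.35, 0.20, 0.50, 0.10, 0.42, 0.15])
w = np.array([40.0, 120.0, 25.0, 200.0, 60.0, 150.0])  # approx. 1 / Var(d)
published = np.array([1, 0, 1, 0, 1, 0])

# Weighted least squares: regress effect size on the moderator.
X = np.column_stack([np.ones_like(d), published])
W = np.diag(w)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ d)
print(f"intercept (unpublished mean d): {beta[0]:.3f}")
print(f"publication-type difference:    {beta[1]:.3f}")
```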
Green, Sheridan – ProQuest LLC, 2013
Increasing demands for design rigor and an emphasis on evidence-based practice on a national level indicated a need for further guidance related to successful implementation of randomized studies in education. Rigorous and meaningful experimental research and its conclusions help establish a valid theoretical and evidence base for educational…
Descriptors: Federal Programs, Early Childhood Education, Preschool Children, Disadvantaged Youth
McMillan, James H.; Schumacher, Sally – Pearson, 2010
This substantially revised text provides a comprehensive, highly accessible, and student-friendly introduction to the principles, concepts, and methods currently used in educational research. This text provides a balanced combination of quantitative and qualitative methods and enables students to master skills in reading, understanding,…
Descriptors: Educational Research, Inquiry, Evidence Based Practice, Mixed Methods Research
Rosenthal, James A. – Springer, 2011
Written by a social worker for social work students, this is a nuts and bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes numerous examples, data sets, and issues that students will encounter in social work practice. The first section introduces basic concepts and terms to…
Descriptors: Statistics, Data Interpretation, Social Work, Social Science Research