Showing all 10 results
Peer reviewed
Direct link
Zuchao Shen; Ben Kelcey – Society for Research on Educational Effectiveness, 2023
I. Purpose of the Study: Detecting whether interventions work (through main effect analysis) can provide empirical evidence regarding the causal linkage between malleable factors (e.g., interventions) and learner outcomes. In complement, moderation analyses help delineate for whom and under what conditions intervention effects are most…
Descriptors: Intervention, Program Effectiveness, Evidence, Research Design
Peer reviewed
Direct link
Hallberg, Kelly; Williams, Ryan; Swanlund, Andrew – Journal of Research on Educational Effectiveness, 2020
More aggregate data on school performance are available than ever before, opening up new possibilities for applied researchers interested in assessing the effectiveness of school-level interventions quickly and at relatively low cost by implementing comparative interrupted time series (CITS) designs. We examine the extent to which effect…
Descriptors: Data Use, Research Methodology, Program Effectiveness, Design
Peer reviewed
PDF on ERIC Download full text
What Works Clearinghouse, 2018
Underlying all What Works Clearinghouse (WWC) products are WWC Study Review Guides, which are intended for use by WWC certified reviewers to assess studies against the WWC evidence standards. As part of an ongoing effort to increase transparency, promote collaboration, and encourage widespread use of the WWC standards, the Institute of Education…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Peer reviewed
Direct link
Karras-Jean Gilles, Juliana; Astuto, Jennifer; Gjicali, Kalina; Allen, LaRue – American Journal of Evaluation, 2019
Secondary data analysis was employed to scrutinize factors affecting sample retention in a randomized evaluation of an early childhood intervention. Retention was measured by whether data were collected at 3 points over 2 years. The participants were diverse, immigrant, and U.S.-born families of color from urban, low-income communities. We…
Descriptors: Early Childhood Education, Intervention, Persistence, Recruitment
Peer reviewed
PDF on ERIC Download full text
What Works Clearinghouse, 2016
This document provides step-by-step instructions on how to complete the Study Review Guide (SRG, Version S3, V2) for single-case designs (SCDs). Reviewers will complete an SRG for every What Works Clearinghouse (WWC) review. A completed SRG should be a reviewer's independent assessment of the study, relative to the criteria specified in the review…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Peer reviewed
Direct link
Pettus-Davis, Carrie; Howard, Matthew Owen; Dunnigan, Allison; Scheyett, Anna M.; Roberts-Lewis, Amelia – Research on Social Work Practice, 2016
Randomized controlled trials (RCTs) are rarely used to evaluate social and behavioral interventions designed for prisoners being released. Objective: We use a pilot RCT of a social support intervention (Support Matters) as a case example to discuss obstacles and strategies for conducting RCT intervention evaluations that span prison and community…
Descriptors: Institutionalized Persons, Correctional Institutions, Intervention, Sampling
Peer reviewed
Direct link
Miciak, Jeremy; Taylor, W. Pat; Stuebing, Karla K.; Fletcher, Jack M.; Vaughn, Sharon – Journal of Research on Educational Effectiveness, 2016
An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated…
Descriptors: Educational Research, Research Design, Intervention, Statistical Analysis
Peer reviewed
Direct link
Joyce, Ted; Remler, Dahlia K.; Jaeger, David A.; Altindag, Onur; O'Connell, Stephen D.; Crockett, Sean – Journal of Policy Analysis and Management, 2017
Randomized experiments provide unbiased estimates of treatment effects, but are costly and time consuming. We demonstrate how a randomized experiment can be leveraged to measure selection bias by conducting a subsequent observational study that is identical in every way except that subjects choose their treatment--a quasi-doubly randomized…
Descriptors: Randomized Controlled Trials, Quasiexperimental Design, Selection Criteria, Selection Tools
Peer reviewed
PDF on ERIC Download full text
Tipton, Elizabeth; Yeager, David; Iachan, Ronaldo – Society for Research on Educational Effectiveness, 2016
Questions regarding the generalizability of results from educational experiments have been at the forefront of methods development over the past five years. This work has focused on methods for estimating the effect of an intervention in a well-defined inference population (e.g., Tipton, 2013; O'Muircheartaigh and Hedges, 2014); methods for…
Descriptors: Behavioral Sciences, Behavioral Science Research, Intervention, Educational Experiments
Peer reviewed
PDF on ERIC Download full text
Tipton, Elizabeth; Fellers, Lauren; Caverly, Sarah; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Ruiz de Castillo, Veronica – Society for Research on Educational Effectiveness, 2015
Randomized experiments are commonly used to evaluate if particular interventions improve student achievement. While these experiments can establish that a treatment actually "causes" changes, typically the participants are not randomly selected from a well-defined population and therefore the results do not readily generalize. Three…
Descriptors: Site Selection, Randomized Controlled Trials, Educational Experiments, Research Methodology