Publication Date
  In 2025: 0
  Since 2024: 0
  Since 2021 (last 5 years): 1
  Since 2016 (last 10 years): 9
  Since 2006 (last 20 years): 10
Descriptor
  Intervention: 10
  Randomized Controlled Trials: 10
  Sampling: 10
  Research Methodology: 6
  Research Design: 5
  Sample Size: 4
  Statistical Analysis: 4
  Program Evaluation: 3
  Quasiexperimental Design: 3
  Barriers: 2
  Case Studies: 2
Source
  Society for Research on…: 3
  Journal of Research on…: 2
  What Works Clearinghouse: 2
  American Journal of Evaluation: 1
  Journal of Policy Analysis…: 1
  Research on Social Work…: 1
Author
  Tipton, Elizabeth: 2
  Allen, LaRue: 1
  Altindag, Onur: 1
  Astuto, Jennifer: 1
  Ben Kelcey: 1
  Borman, Geoffrey: 1
  Caverly, Sarah: 1
  Crockett, Sean: 1
  Dunnigan, Allison: 1
  Fellers, Lauren: 1
  Fletcher, Jack M.: 1
Publication Type
  Reports - Research: 8
  Journal Articles: 5
  Guides - Non-Classroom: 2
  Reports - Descriptive: 1
Education Level
  Early Childhood Education: 1
  Elementary Education: 1
  Grade 4: 1
  Grade 9: 1
  High Schools: 1
  Higher Education: 1
  Intermediate Grades: 1
  Junior High Schools: 1
  Middle Schools: 1
  Postsecondary Education: 1
  Secondary Education: 1
Location
  New York (New York): 1
  North Carolina: 1
Assessments and Surveys
  Gates MacGinitie Reading Tests: 1
Zuchao Shen; Ben Kelcey – Society for Research on Educational Effectiveness, 2023
I. Purpose of the Study: Detecting whether interventions work or not (through main effect analysis) can provide empirical evidence regarding the causal linkage between malleable factors (e.g., interventions) and learner outcomes. In complement, moderation analyses help delineate for whom and under what conditions intervention effects are most…
Descriptors: Intervention, Program Effectiveness, Evidence, Research Design
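The moderation analysis described in the Shen and Kelcey abstract above is, in its simplest form, a treatment-by-moderator interaction in a regression model. The sketch below illustrates that idea on simulated data; the variable names, effect sizes, and use of statsmodels are assumptions for illustration, not details of the cited study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
treat = rng.integers(0, 2, n)        # 1 = assigned to the intervention, 0 = control
prior = rng.normal(0, 1, n)          # hypothetical moderator (e.g., prior achievement)
# Simulated outcome: a 0.3 SD main effect plus 0.2 SD of extra benefit per unit of the moderator
y = 0.3 * treat + 0.2 * treat * prior + 0.5 * prior + rng.normal(0, 1, n)

df = pd.DataFrame({"y": y, "treat": treat, "prior": prior})
# The 'treat:prior' coefficient is the moderation (interaction) test;
# 'treat' alone is the main effect at the moderator's zero point.
fit = smf.ols("y ~ treat * prior", data=df).fit()
print(fit.summary())
```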
Hallberg, Kelly; Williams, Ryan; Swanlund, Andrew – Journal of Research on Educational Effectiveness, 2020
More aggregate data on school performance is available than ever before, opening up new possibilities for applied researchers interested in assessing the effectiveness of school-level interventions quickly and at a relatively low cost by implementing comparative interrupted time series (CITS) designs. We examine the extent to which effect…
Descriptors: Data Use, Research Methodology, Program Effectiveness, Design
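A comparative interrupted time series design, as referenced in the Hallberg, Williams, and Swanlund abstract, contrasts the pre/post deflection in a treated group's series with the deflection in a comparison group's series. Below is a minimal sketch under assumed school-level yearly data; the cutoff year, effect size, and model specification are illustrative, not taken from the article.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
cutoff = 2015                                   # hypothetical first post-intervention year
rows = []
for treated in (0, 1):
    for year in range(2010, 2020):
        post = int(year >= cutoff)
        # Both groups share a gentle upward trend; treated schools gain a
        # simulated 0.25 level shift after the cutoff.
        score = 0.02 * (year - cutoff) + 0.25 * treated * post + rng.normal(0, 0.05)
        rows.append({"score": score, "time": year - cutoff, "post": post, "treated": treated})

df = pd.DataFrame(rows)
# 'post:treated' is the CITS estimate of the level change in the treated group,
# net of whatever deflection the comparison group shows at the cutoff.
fit = smf.ols("score ~ time * treated + post * treated", data=df).fit()
print(fit.params.filter(like="treated"))
```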
What Works Clearinghouse, 2018
Underlying all What Works Clearinghouse (WWC) products are WWC Study Review Guides, which are intended for use by WWC-certified reviewers to assess studies against the WWC evidence standards. As part of an ongoing effort to increase transparency, promote collaboration, and encourage widespread use of the WWC standards, the Institute of Education…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Karras-Jean Gilles, Juliana; Astuto, Jennifer; Gjicali, Kalina; Allen, LaRue – American Journal of Evaluation, 2019
Secondary data analysis was employed to scrutinize factors affecting sample retention in a randomized evaluation of an early childhood intervention. Retention was measured by whether data were collected at 3 points over 2 years. The participants were diverse, immigrant, and U.S.-born families of color from urban, low-income communities. We…
Descriptors: Early Childhood Education, Intervention, Persistence, Recruitment
What Works Clearinghouse, 2016
This document provides step-by-step instructions on how to complete the Study Review Guide (SRG, Version S3, V2) for single-case designs (SCDs). Reviewers will complete an SRG for every What Works Clearinghouse (WWC) review. A completed SRG should be a reviewer's independent assessment of the study, relative to the criteria specified in the review…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Pettus-Davis, Carrie; Howard, Matthew Owen; Dunnigan, Allison; Scheyett, Anna M.; Roberts-Lewis, Amelia – Research on Social Work Practice, 2016
Randomized controlled trials (RCTs) are rarely used to evaluate social and behavioral interventions designed for releasing prisoners. Objective: We use a pilot RCT of a social support intervention (Support Matters) as a case example to discuss obstacles and strategies for conducting RCT intervention evaluations that span prison and community…
Descriptors: Institutionalized Persons, Correctional Institutions, Intervention, Sampling
Miciak, Jeremy; Taylor, W. Pat; Stuebing, Karla K.; Fletcher, Jack M.; Vaughn, Sharon – Journal of Research on Educational Effectiveness, 2016
An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated…
Descriptors: Educational Research, Research Design, Intervention, Statistical Analysis
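The Miciak et al. abstract points to an interaction between pretest covariates and sample selection: screening participants on the pretest restricts its range, which attenuates the pretest-posttest correlation and with it the power benefit of covariance adjustment. The short simulation below illustrates that attenuation; all distributions and cutoffs are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
pre = rng.normal(0, 1, n)
post = 0.7 * pre + rng.normal(0, np.sqrt(1 - 0.7 ** 2), n)   # population correlation ~ .7

low_scorers = pre < -1.0    # hypothetical screening rule: only struggling readers enroll
for label, keep in (("full population", np.ones(n, dtype=bool)), ("screened sample", low_scorers)):
    r = np.corrcoef(pre[keep], post[keep])[0, 1]
    # 1 - r^2 is the share of outcome variance the pretest covariate fails to remove;
    # it grows as range restriction weakens the correlation.
    print(f"{label:16s}  r = {r:.2f}   residual variance share = {1 - r ** 2:.2f}")
```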
Joyce, Ted; Remler, Dahlia K.; Jaeger, David A.; Altindag, Onur; O'Connell, Stephen D.; Crockett, Sean – Journal of Policy Analysis and Management, 2017
Randomized experiments provide unbiased estimates of treatment effects, but are costly and time consuming. We demonstrate how a randomized experiment can be leveraged to measure selection bias by conducting a subsequent observational study that is identical in every way except that subjects choose their treatment--a quasi-doubly randomized…
Descriptors: Randomized Controlled Trials, Quasiexperimental Design, Selection Criteria, Selection Tools
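The design sketched in the Joyce et al. abstract pairs a randomized arm with an otherwise identical arm in which subjects choose their treatment, so the gap between the two treatment-control contrasts estimates the selection bias an observational study would incur. The toy simulation below mimics that logic; the selection mechanism and effect size are assumptions made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
ability = rng.normal(0, 1, n)     # unobserved confounder in the choice arm
true_effect = 0.20

# Randomized arm: assignment is independent of ability.
treat_r = rng.integers(0, 2, n)
y_r = true_effect * treat_r + 0.5 * ability + rng.normal(0, 1, n)
randomized_contrast = y_r[treat_r == 1].mean() - y_r[treat_r == 0].mean()

# Choice arm: higher-ability subjects are more likely to opt into treatment.
treat_c = rng.binomial(1, 1 / (1 + np.exp(-ability)))
y_c = true_effect * treat_c + 0.5 * ability + rng.normal(0, 1, n)
choice_contrast = y_c[treat_c == 1].mean() - y_c[treat_c == 0].mean()

print(f"randomized contrast    : {randomized_contrast:.3f}")
print(f"self-selected contrast : {choice_contrast:.3f}")
print(f"implied selection bias : {choice_contrast - randomized_contrast:.3f}")
```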
Tipton, Elizabeth; Yeager, David; Iachan, Ronaldo – Society for Research on Educational Effectiveness, 2016
Questions regarding the generalizability of results from educational experiments have been at the forefront of methods development over the past five years. This work has focused on methods for estimating the effect of an intervention in a well-defined inference population (e.g., Tipton, 2013; O'Muircheartaigh and Hedges, 2014); methods for…
Descriptors: Behavioral Sciences, Behavioral Science Research, Intervention, Educational Experiments
Tipton, Elizabeth; Fellers, Lauren; Caverly, Sarah; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Ruiz de Castillo, Veronica – Society for Research on Educational Effectiveness, 2015
Randomized experiments are commonly used to evaluate if particular interventions improve student achievement. While these experiments can establish that a treatment actually "causes" changes, typically the participants are not randomly selected from a well-defined population and therefore the results do not readily generalize. Three…
Descriptors: Site Selection, Randomized Controlled Trials, Educational Experiments, Research Methodology
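Both Tipton entries above concern generalizing experimental results beyond the convenience samples in which they were estimated. One common device in that literature is to model the probability that a unit appears in the trial given population covariates and then reweight the trial units toward the inference population. The sketch below shows only that reweighting step, with simulated covariates and scikit-learn's logistic regression; it is not a reconstruction of the specific recruitment or selection methods compared in either paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n_pop, n_trial = 20_000, 1_000
pop_x = rng.normal(0.0, 1.0, (n_pop, 2))       # covariates for the inference population
trial_x = rng.normal(0.5, 1.0, (n_trial, 2))   # trial sites differ systematically on both

X = np.vstack([pop_x, trial_x])
in_trial = np.concatenate([np.zeros(n_pop), np.ones(n_trial)])

# Sampling propensity: P(unit is in the trial | covariates).
ps = LogisticRegression().fit(X, in_trial).predict_proba(trial_x)[:, 1]
weights = (1 - ps) / ps                        # odds weights pull the trial toward the population

# Simulated site-level effects that depend on the first covariate, so the
# convenience sample overstates the population-average effect.
site_effects = 0.30 + 0.10 * trial_x[:, 0] + rng.normal(0, 0.05, n_trial)
print("unweighted trial average :", round(float(site_effects.mean()), 3))
print("reweighted toward pop.   :", round(float(np.average(site_effects, weights=weights)), 3))
```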