Showing 1 to 15 of 76 results
Peer reviewed
Alexander D. Latham; David A. Klingbeil – Grantee Submission, 2024
The visual analysis of data presented in time-series graphs is common in single-case design (SCD) research and applied practice in school psychology. A growing body of research suggests that visual analysts' ratings are often influenced by construct-irrelevant features, including Y-axis truncation and compression of the number of data points per…
Descriptors: Intervention, School Psychologists, Graphs, Evaluation Methods
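The Y-axis effect described above is easy to see for oneself. The sketch below plots the same hypothetical AB data twice, once with a full Y-axis and once truncated; all values are invented for illustration, and the apparent size of the phase change differs noticeably between the two panels.

```python
import matplotlib.pyplot as plt

# Hypothetical AB single-case data: 5 baseline and 8 intervention sessions.
baseline = [42, 45, 43, 44, 46]
intervention = [48, 50, 49, 52, 51, 53, 52, 54]
sessions = range(1, len(baseline) + len(intervention) + 1)
values = baseline + intervention

fig, (ax_full, ax_trunc) = plt.subplots(1, 2, figsize=(9, 3.5), sharex=True)
for ax, ylim, title in [(ax_full, (0, 60), "Full Y-axis (0-60)"),
                        (ax_trunc, (40, 56), "Truncated Y-axis (40-56)")]:
    ax.plot(sessions, values, marker="o", color="black")
    ax.axvline(len(baseline) + 0.5, linestyle="--", color="gray")  # phase-change line
    ax.set_ylim(*ylim)
    ax.set_title(title)
    ax.set_xlabel("Session")
ax_full.set_ylabel("Responses per session")
plt.tight_layout()
plt.show()
```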
Peer reviewed
Huey T. Chen; Liliana Morosanu; Victor H. Chen – Asia Pacific Journal of Education, 2024
The Campbellian validity typology has been used as a foundation for outcome evaluation and for developing evidence-based interventions for decades. As such, randomized controlled trials were preferred for outcome evaluation. However, some evaluators disagree with the validity typology's argument that randomized controlled trials are the best design…
Descriptors: Evaluation Methods, Systems Approach, Intervention, Evidence Based Practice
Peer reviewed
Toluchuri Shalini Shanker Rao; Kaushal Kumar Bhagat – Educational Technology Research and Development, 2024
Computational thinking (CT) has received growing interest as a research subject in the last decade, with research contributions attempting to capitalize on the benefits that CT may provide. This study included a systematic analysis aimed at revealing current trends in CT research, identifying educational interventions, and emerging assessment…
Descriptors: Computation, Thinking Skills, Educational Research, Skill Development
Peer reviewed
Manolov, Rumen; Tanious, René; Fernández-Castilla, Belén – Journal of Applied Behavior Analysis, 2022
In science in general, and in single-case experimental designs in particular, replication of the effects of the intervention within and/or across participants or experiments is crucial for establishing causality and for assessing the generality of the intervention effect. Specific developments and proposals for assessing whether an effect has been…
Descriptors: Intervention, Behavioral Science Research, Replication (Evaluation), Research Design
Peer reviewed
What Works Clearinghouse, 2022
Education decisionmakers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Peer reviewed
Ashley R. Gibbs; Christopher A. Tullis – Review Journal of Autism and Developmental Disorders, 2021
An increasing number of investigations have described various interventions to promote the emergence of derived responding within the relation of sameness for individuals with autism and other intellectual and developmental disabilities. Systematic searches identified 53 studies published since 2013 that met inclusion criteria. These studies were…
Descriptors: Autism Spectrum Disorders, Intellectual Disability, Developmental Disabilities, Literature Reviews
Daniels, Katherine Nelson – ProQuest LLC, 2018
Traditional pre-test (TpT)/post-test (PT) and retrospective pre-test (RpT)/post-test (PT) designs are used to collect data on self-reported measures to assess the magnitude of change that occurs from interventions. If measurement invariance does not exist across the measurement occasions within these research designs, it is inappropriate to…
Descriptors: Pretests Posttests, Evaluation Methods, Intervention, Program Evaluation
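One way measurement invariance can fail across these designs is response shift: respondents recalibrate their internal scale after the intervention, so the traditional and retrospective pretests no longer measure change on the same metric. Below is a minimal simulation of that scenario; all parameters are invented, and this illustrates the mechanism rather than the study's actual invariance analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# True latent skill before and after a hypothetical intervention.
true_pre = rng.normal(50, 10, n)
true_post = true_pre + 5                      # true gain of 5 points

# Traditional pretest (TpT): rated on the respondent's original scale.
tpt = true_pre + rng.normal(0, 3, n)

# After the intervention, respondents recalibrate and judge their earlier
# skill more harshly (a shift of -4 in this invented scenario).
shift = -4
rpt = true_pre + shift + rng.normal(0, 3, n)   # retrospective pretest (RpT)
post = true_post + shift + rng.normal(0, 3, n)

print(f"TpT/PT change estimate: {np.mean(post - tpt):5.2f}")  # gain + shift: biased
print(f"RpT/PT change estimate: {np.mean(post - rpt):5.2f}")  # shift cancels out
```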
Peer reviewed
Radley, Keith C.; Dart, Evan H.; Wright, Sarah J. – School Psychology Quarterly, 2018
Research based on single-case designs (SCD) is frequently used in educational settings to evaluate the effect of an intervention on student behavior. Visual analysis is the primary method of evaluating SCD, despite research noting concerns regarding the reliability of the procedure. Recent research suggests that characteristics of the graphic…
Descriptors: Graphs, Evaluation Methods, Data, Intervention
Zimmerman, Kathleen N.; Ledford, Jennifer R.; Severini, Katherine E.; Pustejovsky, James E.; Barton, Erin E.; Lloyd, Blair P. – Grantee Submission, 2018
Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional…
Descriptors: Research Design, Evaluation Methods, Synthesis, Validity
Zimmerman, Kathleen N.; Pustejovsky, James E.; Ledford, Jennifer R.; Barton, Erin E.; Severini, Katherine E.; Lloyd, Blair P. – Grantee Submission, 2018
Varying methods for evaluating the outcomes of single case research designs (SCD) are currently used in reviews and meta-analyses of interventions. Quantitative effect size measures are often presented alongside visual analysis conclusions. Six measures across two classes--overlap measures (percentage non-overlapping data, improvement rate…
Descriptors: Research Design, Evaluation Methods, Synthesis, Intervention
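Of the overlap measures named above, percentage of non-overlapping data (PND; Scruggs, Mastropieri, & Casto, 1987) is the simplest to compute: the share of intervention-phase points that fall beyond the most extreme baseline point in the therapeutic direction. A minimal sketch with invented AB data:

```python
def pnd(baseline, intervention, increase_is_improvement=True):
    """Percentage of non-overlapping data: the percentage of intervention-phase
    points that exceed the most extreme baseline point in the therapeutic
    direction."""
    if increase_is_improvement:
        threshold = max(baseline)
        non_overlapping = sum(1 for y in intervention if y > threshold)
    else:
        threshold = min(baseline)
        non_overlapping = sum(1 for y in intervention if y < threshold)
    return 100.0 * non_overlapping / len(intervention)

# Hypothetical AB data for one participant.
baseline = [3, 5, 4, 4, 6]
intervention = [7, 8, 6, 9, 10, 9]
print(f"PND = {pnd(baseline, intervention):.1f}%")  # 5 of 6 points exceed 6 -> 83.3%
```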
Peer reviewed
Fives, Allyn; Canavan, John; Dolan, Pat – European Early Childhood Education Research Journal, 2017
There is significant controversy over what counts as evidence in the evaluation of social interventions. It is increasingly common to use methodological criteria to rank evidence types in a hierarchy, with Randomised Controlled Trials (RCTs) at or near the highest level. Because of numerous challenges to a hierarchical approach, this article…
Descriptors: Evaluation Methods, Evaluation Research, Randomized Controlled Trials, Ethics
Peer reviewed
Porter, Kristin E.; Reardon, Sean F.; Unlu, Fatih; Bloom, Howard S.; Cimpian, Joseph R. – Journal of Research on Educational Effectiveness, 2017
A valuable extension of the single-rating regression discontinuity design (RDD) is a multiple-rating RDD (MRRDD). To date, four main methods have been used to estimate average treatment effects at the multiple treatment frontiers of an MRRDD: the "surface" method, the "frontier" method, the "binding-score" method, and…
Descriptors: Regression (Statistics), Intervention, Quasiexperimental Design, Simulation
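As a rough illustration of one of the four methods, the binding-score approach collapses the centered ratings into a single running variable (here the minimum, for a rule that treats anyone below either cutoff) and then applies a standard single-rating RDD. The simulation below uses an entirely invented data-generating process and an arbitrary bandwidth:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Two ratings (e.g., math and reading scores); treatment if below either cutoff.
math, reading = rng.uniform(0, 100, n), rng.uniform(0, 100, n)
cut_math, cut_reading = 40, 50
treated = (math < cut_math) | (reading < cut_reading)

# Outcome with a constant treatment effect of 5 (invented process).
outcome = 0.3 * math + 0.2 * reading + 5 * treated + rng.normal(0, 5, n)

# Binding-score method: collapse centered ratings into one running variable.
binding = np.minimum(math - cut_math, reading - cut_reading)  # treated iff < 0

# Local linear fit on each side of the pooled cutoff (bandwidth arbitrary).
h = 10
left = (binding >= -h) & (binding < 0)    # treated side
right = (binding >= 0) & (binding <= h)   # control side
fit_left = np.polyfit(binding[left], outcome[left], 1)
fit_right = np.polyfit(binding[right], outcome[right], 1)
effect = np.polyval(fit_left, 0) - np.polyval(fit_right, 0)
print(f"Estimated average effect at the frontier: {effect:.2f} (true = 5)")
```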
Peer reviewed
Wei, Yan; Lombardi, Allison; Simonsen, Brandi; Coyne, Michael; Faggella-Luby, Michael; Freeman, Jennifer; Kearns, Devin – Learning Disabilities: A Multidisciplinary Journal, 2017
A single-subject AB multiple-baseline design across participants was utilized to investigate the effectiveness of the Revised Tier Three Instructional Planning (T-TIP) tool on teacher lesson planning, with a focus on corrective and elaborative feedback within intensive literacy instructional settings in secondary schools. Findings revealed that…
Descriptors: Reading Instruction, Instructional Development, Lesson Plans, Feedback (Response)
Peer reviewed
Higgins, Julian P. T.; Ramsay, Craig; Reeves, Barnaby C.; Deeks, Jonathan J.; Shea, Beverley; Valentine, Jeffrey C.; Tugwell, Peter; Wells, George – Research Synthesis Methods, 2013
Non-randomized studies may provide valuable evidence on the effects of interventions. They are the main source of evidence on the intended effects of some types of interventions and often provide the only evidence about the effects of interventions on long-term outcomes, rare events or adverse effects. Therefore, systematic reviews on the effects…
Descriptors: Research Methodology, Intervention, Program Effectiveness, Program Evaluation
Peer reviewed
Tipton, Elizabeth; Yeager, David; Iachan, Ronaldo – Society for Research on Educational Effectiveness, 2016
Questions regarding the generalizability of results from educational experiments have been at the forefront of methods development over the past five years. This work has focused on methods for estimating the effect of an intervention in a well-defined inference population (e.g., Tipton, 2013; O'Muircheartaigh and Hedges, 2014); methods for…
Descriptors: Behavioral Sciences, Behavioral Science Research, Intervention, Educational Experiments
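The core reweighting idea behind generalizing an experimental estimate to a defined inference population can be shown in a few lines: weight stratum-specific effect estimates by population shares rather than sample shares. All numbers below are invented for illustration.

```python
import numpy as np

# Stratum-specific effect estimates from a hypothetical experiment,
# with each stratum's share of the sample and of the target population.
strata = ["urban", "suburban", "rural"]
effects = np.array([0.30, 0.15, 0.05])       # estimated effect per stratum
sample_share = np.array([0.60, 0.30, 0.10])  # experiment over-represents urban schools
pop_share = np.array([0.25, 0.35, 0.40])     # target inference population

sate = effects @ sample_share  # naive sample-weighted average
pate = effects @ pop_share     # population-weighted (post-stratified) estimate
print(f"Sample-weighted effect:     {sate:.3f}")
print(f"Population-weighted effect: {pate:.3f}")
```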