Showing 1 to 15 of 116 results
Peer reviewed
What Works Clearinghouse, 2022
Education decisionmakers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Hallberg, Kelly; Williams, Ryan; Swanlund, Andrew; Eno, Jared – Educational Researcher, 2018
Short comparative interrupted time series (CITS) designs are increasingly being used in education research to assess the effectiveness of school-level interventions. These designs can be implemented relatively inexpensively, often drawing on publicly available data on aggregate school performance. However, the validity of this approach hinges on…
Descriptors: Educational Research, Research Methodology, Comparative Analysis, Time
Peer reviewed
What Works Clearinghouse, 2018
Underlying all What Works Clearinghouse (WWC) products are WWC Study Review Guides, which are intended for use by WWC certified reviewers to assess studies against the WWC evidence standards. As part of an ongoing effort to increase transparency, promote collaboration, and encourage widespread use of the WWC standards, the Institute of Education…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Peer reviewed
What Works Clearinghouse, 2016
This document provides step-by-step instructions on how to complete the Study Review Guide (SRG, Version S3, V2) for single-case designs (SCDs). Reviewers will complete an SRG for every What Works Clearinghouse (WWC) review. A completed SRG should be a reviewer's independent assessment of the study, relative to the criteria specified in the review…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Coalition for Evidence-Based Policy, 2014
This guide is addressed to policy officials, program providers, and researchers who are seeking to: (1) identify and implement social programs backed by valid evidence of effectiveness; or (2) sponsor or conduct an evaluation to determine whether a program is effective. The guide provides a brief overview of which studies can produce valid…
Descriptors: Program Effectiveness, Program Design, Evidence, Social Work
Peer reviewed
Akers, Lauren; Resch, Alexandra; Berk, Jillian – National Center for Education Evaluation and Regional Assistance, 2014
This guide for district and school leaders shows how to recognize opportunities to embed randomized controlled trials (RCTs) into planned policies or programs. Opportunistic RCTs can generate strong evidence for informing education decisions--with minimal added cost and disruption. The guide also outlines the key steps to conduct RCTs and responds…
Descriptors: School Districts, Educational Research, Guides, Program Evaluation
Peer reviewed
Institute of Education Sciences, 2013
In January 2011, a Joint Committee of representatives from the U.S. Department of Education (ED) and the U.S. National Science Foundation (NSF) began work to establish cross-agency guidelines for improving the quality, coherence, and pace of knowledge development in science, technology, engineering and mathematics (STEM) education. Although the…
Descriptors: STEM Education, Research and Development, Intervention, Educational Improvement
Unger, Karen V. – Substance Abuse and Mental Health Services Administration, 2011
Key stakeholders who implement Supported Education may find themselves asking two questions: (1) Has Supported Education been implemented as planned?; and (2) Has Supported Education resulted in the expected outcomes? Asking these two questions and using the answers to help improve Supported Education are critical for ensuring the success of one's…
Descriptors: Disabilities, Evidence, Outcome Measures, Quality Control
Phillips, Rob; McNaught, Carmel; Kennedy, Gregor – Routledge, Taylor & Francis Group, 2011
How can the average educator who teaches online, without experience in evaluating emerging technologies, build on what is successful and modify what is not? Written for educators who feel ill-prepared when required to evaluate e-learning initiatives, "Evaluating e-Learning" offers step-by-step guidance for developing an evaluation plan of…
Descriptors: Electronic Learning, Educational Research, Online Courses, Web Based Instruction
Cicchinelli, Louis F.; Barley, Zoe – Mid-continent Research for Education and Learning (McREL), 2010
This guide is designed to help districts and schools evaluate reform efforts related to the American Recovery and Reinvestment Act (ARRA), including School Improvement Grants. It includes practical, how-to tips for conducting program evaluations, whether one takes a "do-it-yourself" approach or seeks outside assistance to evaluate reform…
Descriptors: Administrator Guides, Program Evaluation, Evaluation Methods, Educational Change
Rosenthal, James A. – Springer, 2011
Written by a social worker for social work students, this is a nuts and bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes numerous examples, data sets, and issues that students will encounter in social work practice. The first section introduces basic concepts and terms to…
Descriptors: Statistics, Data Interpretation, Social Work, Social Science Research
Coalition for Evidence-Based Policy, 2007
The purpose of this Guide is to provide education officials, program providers, and others seeking to sponsor a rigorous program evaluation with practical advice on finding a capable evaluator. More specifically, this Guide offers practical advice on finding an evaluator to conduct a rigorous evaluation to measure the effect of an educational…
Descriptors: Evaluators, Program Evaluation, Program Effectiveness, Evaluation Methods
Deno, Stanley L.; And Others – 1979
The monograph describes an approach to special education research that addresses both the promise of immediate payoff for decision makers inherent in program evaluation/validation, and the need to identify effective intervention variables of model programs that could be incorporated into other service settings. The approach, termed "program…
Descriptors: Disabilities, Learning Disabilities, Program Evaluation, Research Methodology
Coalition for Evidence-Based Policy, 2007
The purpose of this Guide is to advise researchers, policymakers, and others on when it is possible to conduct a high-quality randomized controlled trial in education at reduced cost. Well-designed randomized controlled trials are recognized as the gold standard for evaluating the effectiveness of an intervention (i.e., program or practice) in…
Descriptors: Costs, Scores, Data, Research Design
Peer reviewed
Wolery, Mark – Topics in Early Childhood Special Education, 1987
The paper maintains that impact evaluation studies of programs serving handicapped infants and preschoolers should be restricted to scientifically defensible investigations. Specific types of impact studies and an evaluation process are suggested, and it is recommended that program leaders focus evaluation activities on the status of project…
Descriptors: Disabilities, Evaluation Methods, Infants, Intervention