Showing 1 to 15 of 70 results
Peer reviewed; PDF full text on ERIC
What Works Clearinghouse, 2022
Education decisionmakers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Peer reviewed; PDF full text on ERIC
What Works Clearinghouse, 2018
Underlying all What Works Clearinghouse (WWC) products are WWC Study Review Guides, which are intended for use by WWC certified reviewers to assess studies against the WWC evidence standards. As part of an ongoing effort to increase transparency, promote collaboration, and encourage widespread use of the WWC standards, the Institute of Education…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Peer reviewed; PDF full text on ERIC
Faria, Ann-Marie; Hawkinson, Laura; Metzger, Ivan; Bouacha, Nora; Cantave, Michelle – Regional Educational Laboratory Midwest, 2017
A quality rating and improvement system (QRIS) is a voluntary state assessment system that uses multidimensional data on early childhood education programs to rate program quality, support quality improvement efforts, and provide information to families about the quality of available early childhood education programs. QRISs have two components:…
Descriptors: Early Childhood Education, Educational Quality, Educational Improvement, Educational Practices
Peer reviewed; PDF full text on ERIC
What Works Clearinghouse, 2016
This document provides step-by-step instructions on how to complete the Study Review Guide (SRG, Version S3, V2) for single-case designs (SCDs). Reviewers will complete an SRG for every What Works Clearinghouse (WWC) review. A completed SRG should be a reviewer's independent assessment of the study, relative to the criteria specified in the review…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Peer reviewed; PDF full text on ERIC
Institute of Education Sciences, 2013
In January 2011, a Joint Committee of representatives from the U.S. Department of Education (ED) and the U.S. National Science Foundation (NSF) began work to establish cross-agency guidelines for improving the quality, coherence, and pace of knowledge development in science, technology, engineering and mathematics (STEM) education. Although the…
Descriptors: STEM Education, Research and Development, Intervention, Educational Improvement
Cicchinelli, Louis F.; Barley, Zoe – Mid-continent Research for Education and Learning (McREL), 2010
This guide is designed to help districts and schools evaluate reform efforts related to the American Recovery and Reinvestment Act (ARRA), including School Improvement Grants. It includes practical, how-to tips for conducting program evaluations, whether one takes a "do-it-yourself" approach or seeks outside assistance to evaluate reform…
Descriptors: Administrator Guides, Program Evaluation, Evaluation Methods, Educational Change
Rosenthal, James A. – Springer, 2011
Written by a social worker for social work students, this is a nuts and bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes numerous examples, data sets, and issues that students will encounter in social work practice. The first section introduces basic concepts and terms to…
Descriptors: Statistics, Data Interpretation, Social Work, Social Science Research
Coalition for Evidence-Based Policy, 2007
The purpose of this Guide is to advise researchers, policymakers, and others on when it is possible to conduct a high-quality randomized controlled trial in education at reduced cost. Well-designed randomized controlled trials are recognized as the gold standard for evaluating the effectiveness of an intervention (i.e., program or practice) in…
Descriptors: Costs, Scores, Data, Research Design
National Forum on Early Childhood Program Evaluation, 2007
Increasing demands for evidence-based early childhood services and the need by policymakers to know whether a program is effective or whether it warrants a significant investment of public and/or private funds--coupled with the often-politicized debate around these topics--make it imperative for policymakers and civic leaders to have independent…
Descriptors: Evaluation Research, Program Evaluation, Young Children, Evaluation Methods
Jordan, Debra J. – Camping Magazine, 1994
Conducting research about camp programs begins with asking a question and determining the nature of the question--whether it requires a needs assessment, an evaluation process, original data, or a cost/benefit analysis. Discusses factors to consider in deciding whether to allow existing staff members to conduct a study, or to contract with an…
Descriptors: Camping, Evaluation Needs, Program Evaluation, Program Improvement
Peer reviewed; Direct link
Cohen, Stephen L. – Performance Improvement, 2005
Although the need for evaluation of training programs administered in organizations is recognized and widely discussed, program evaluation is not easy, and there exists a gap between discussion and practice. Cohen notes an absence of research efforts demonstrating the application of evaluation methods, particularly within control groups. This…
Descriptors: Research Design, Program Evaluation, Control Groups, Evaluation Methods
US Department of Education, 2008
This guide is designed as a resource for leaders and evaluators of K-12 online learning programs. In this guide, the term "online learning" is used to refer to a range of education programs and resources in the K-12 arena, including distance learning courses offered by universities, private providers, or teachers at other schools;…
Descriptors: Elementary Secondary Education, Distance Education, Online Courses, Web Sites
Bergstrom, Joan M.; Reis, Janet – 1979
This paper provides guidelines for formally evaluating extended-day programs. Extended-day programs are defined as those attended before and after school by children between the ages of 5 and 14. A seven-step evaluation process, in which the practitioner responsible for program administration plays a key role, is outlined and discussed. (Author/RH)
Descriptors: Administrator Responsibility, After School Day Care, Extended School Day, Program Evaluation
Peer reviewed
Chen, Huey-Tsyh; Rossi, Peter H. – Evaluation Review, 1983
The use of theoretical models in impact assessment can heighten the power of experimental designs and compensate for some deficiencies of quasi-experimental designs. The authors examine theoretical models of implementation processes, arguing that these processes are a major obstacle to fully effective programs. (Author/CM)
Descriptors: Evaluation Criteria, Evaluation Methods, Models, Program Evaluation
Peer reviewed
Ross, John A. – Evaluation and Program Planning: An International Journal, 1981
Professional evaluators are often called upon to analyze data produced by a catastrophically inadequate evaluation design. A remedial strategy involving diagnosis of error, application of a corrective procedure, and sensitization of program personnel to the need for a more sophisticated stance is proposed and illustrated with a case study.…
Descriptors: Data Analysis, Meta Evaluation, Models, Program Evaluation