Showing 1 to 15 of 33 results
Peer reviewed
What Works Clearinghouse, 2017
The What Works Clearinghouse (WWC) evaluates research studies that look at the effectiveness of education programs, products, practices, and policies, which the WWC calls "interventions." Many studies of education interventions make claims about impacts on students' outcomes. Some studies have designs that enable readers to make causal…
Descriptors: Program Design, Program Development, Program Effectiveness, Program Evaluation
Peer reviewed
Harris, John – Language Teaching Research, 2009
The teaching and learning of Irish in primary school is both an important educational issue and central to the national language revitalization effort. The findings of Irish-language programme evaluations, therefore, are invariably scrutinized very closely by different sectors. This paper examines how the later stages of a major evaluation took…
Descriptors: Language Planning, Evaluators, Official Languages, Language of Instruction
Peer reviewed
Cooksy, Leslie J. – American Journal of Evaluation, 2007
All evaluators face the challenge of striving to adhere to the highest possible standards of ethical conduct. Translating the AEA's Guiding Principles and the Joint Committee's Program Evaluation Standards into everyday practice, however, can be a complex, uncertain, and frustrating endeavor. Moreover, acting in an ethical fashion can require…
Descriptors: Program Evaluation, Evaluators, Ethics, Evaluation Methods
Peer reviewed
Lawrenz, Frances; Gullickson, Arlen; Toal, Stacie – American Journal of Evaluation, 2007
Use of evaluation findings is a valued outcome for most evaluators. However, to optimize use, the findings need to be disseminated to potential users in formats that facilitate use of the information. This reflective case narrative uses a national evaluation of a multisite National Science Foundation (NSF) program as the setting for describing the…
Descriptors: Evaluators, Audiences, Strategic Planning, Information Dissemination
Aas, Gro Hanne; Askling, Berit; Dittrich, Karl; Froestad, Wenche; Haug, Peder; Lycke, Kirsten Hofgaard; Moitus, Sirpa; Pyykko, Riitta; Sorskar, Anne Karine – ENQA (European Association for Quality Assurance in Higher Education), 2009
This report is a product of a European Association for Quality Assurance in Higher Education (ENQA) Workshop, "Assessing educational quality: Knowledge production and the role of experts," hosted by the Norwegian Agency for Quality Assurance in Education (NOKUT) in Oslo in February 2008. The workshop gathered representatives from higher…
Descriptors: Higher Education, Educational Quality, Quality Control, Workshops
Coalition for Evidence-Based Policy, 2007
The purpose of this Guide is to advise researchers, policymakers, and others on when it is possible to conduct a high-quality randomized controlled trial in education at reduced cost. Well-designed randomized controlled trials are recognized as the gold standard for evaluating the effectiveness of an intervention (i.e., program or practice) in…
Descriptors: Costs, Scores, Data, Research Design
Patton, Michael Quinn – 1985
This paper reviews what has been learned about evaluation utilization during the past 20 years. Evaluation utilization is discussed in terms of what is used, who uses evaluation, when evaluation is used, how evaluation is used, where evaluation is used, and why evaluation is used. It is suggested that the personal factor - the interests and…
Descriptors: Evaluation, Evaluation Methods, Evaluation Needs, Evaluation Utilization
Fillos, Rita M.; Manger, Katherine M. – 1984
Brief case studies of three projects are presented to illustrate the steps which open "Wonderful Programs" to evaluation research. The evaluator is provided with the perspective needed to believe the evaluation is worth doing. The project staff is then redirected to the need for a new type of evidence. An ongoing review of the commitment…
Descriptors: Communication Problems, Evaluation Methods, Evaluation Needs, Evaluators
Sherwood-Fabre, Liese – 1986
This paper examines the concepts of program monitoring and program evaluation in the literature, and offers working definitions based on two dimensions of measurement: focus (what questions are addressed) and timing (how often the measures are taken). Focus can be on inputs to the program or outcomes from it; timing can be one-shot or continuous.…
Descriptors: Evaluation Methods, Evaluators, Formative Evaluation, Program Administration
Peer reviewed
Lincoln, Yvonna Sessions – Evaluation Practice, 1991
The various arts and sciences that comprise the field of program evaluation are discussed. It is argued that emphasis on rigor and expressive content has left other aspects of evaluation unexplored. Educational evaluators need to consider what programs mean and how they contribute to understanding. (SLD)
Descriptors: Evaluation Methods, Evaluators, Program Effectiveness, Program Evaluation
Thompson, Bruce; Miller, Leslie A. – 1983
There has been much literature on the effectiveness of program evaluation. Many feel evaluation has lost its justification; others are more optimistic and feel evaluation can make a difference if the use levels are viewed realistically. Administrators must not regard program evaluation as merely complying with various agency requirements or…
Descriptors: Administrator Attitudes, Evaluation Utilization, Evaluators, Factor Analysis
Hoffman, Lee McGraw; And Others – 1984
Examples from the evaluation of a program in which data collection systems were developed jointly by the program's staff and evaluators are described. The Louisiana SPUR (Special Plan Upgrading Reading) Project was evaluated by the Louisiana Department of Education Bureau of Evaluation. SPUR involves 63 of the state's 66 public school systems and…
Descriptors: Data Collection, Databases, Elementary Education, Evaluation Methods
Gray, Wayne D. – 1984
This paper presents a framework for monitoring implementation which defines a process for implementing new programs into organizations. The process requires a team of monitors who examine the adequacy of implementation plans and look at the effect of plan execution upon the organization, individual, and new program. Immediate feedback is provided…
Descriptors: Data Interpretation, Evaluation Methods, Evaluators, Feedback
Peer reviewed
What Works Clearinghouse, 2005
The purpose of this guide is to provide clear, practical advice on reporting the results of an evaluation of an educational program or practice ("intervention"). Specifically, this is a guide for researchers, and those who sponsor and use research, to reporting the results of "impact" studies--that is, studies that evaluate the…
Descriptors: Evaluation Methods, Program Evaluation, Intervention, Research Reports
St. John, Mark – 1984
The Service Delivery Assessment (SDA) model is a human services evaluation tool which offers a viable alternative to more traditional approaches and is appropriate for small local evaluations as well as large national studies. There are five phases to an SDA study: assignment, pre-assessment, design, analysis, and communication of the findings.…
Descriptors: Delivery Systems, Evaluation Methods, Evaluators, Human Services