Ferraro, Paul J. – New Directions for Evaluation, 2009
Impact evaluations assess the degree to which changes in outcomes can be attributed to an intervention rather than to other factors. Such attribution requires knowing what outcomes would have looked like in the absence of the intervention. This counterfactual world can be inferred only indirectly through evaluation designs that control for…
Descriptors: Intervention, Program Evaluation, Policy, Conservation (Environment)

Compton, Donald W. – New Directions for Evaluation, 2009
Donald W. Compton, the first director of evaluation services at the National Home Office (Atlanta) of the American Cancer Society, tells the story of building the unit under conditions of high demand and a limited budget. Along the way, evaluation was brought to regional divisions and to local offices in part as a response to United Way and to his…
Descriptors: Evaluators, Evaluation Methods, Program Evaluation, Evaluation Research

Greene, Jennifer C.; Lipsey, Mark W.; Schwandt, Thomas A.; Smith, Nick L.; Tharp, Roland G. – New Directions for Evaluation, 2007
Productive dialogue is best informed by multiple and diverse voices. Five seasoned evaluators, representing a range of evaluation perspectives, offer their views in two- to three-page discussant contributions. These individuals were asked to reflect and comment on the previous chapters in the spirit of critical review as a key source of evidence…
Descriptors: Evaluators, Evaluation Methods, Research Design, Inquiry

Caracelli, Valerie J.; Greene, Jennifer C. – New Directions for Evaluation, 1997
Two broad classes of mixed-method designs, component and integrated, that have the potential to combine elements of different inquiry traditions are described. The conceptual ideas advanced in the first chapter are illustrated through selected examples of mixed-method integrated models. (Author/SLD)
Descriptors: Case Studies, Evaluation Methods, Models, Program Evaluation

Datta, Lois-ellin – New Directions for Evaluation, 1997
A pragmatic framework for making decisions about mixed-method designs is proposed and then applied to illustrative evaluation case studies to help identify the strengths and limitations of making practical, contextual, and consequential considerations a primary basis for evaluation design decisions. (Author)
Descriptors: Case Studies, Evaluation Methods, Models, Program Evaluation

Chen, Huey-tsyh – New Directions for Evaluation, 1997
Illustrative case studies support a contingency approach to mixed-method evaluation in which the evaluation team bases its selection of methods on the information to be provided, the availability of data, and the degree to which the program environment is an open or closed system. (SLD)
Descriptors: Case Studies, Context Effect, Evaluation Methods, Models

Riggin, Leslie J. C. – New Directions for Evaluation, 1997
Case examples from the articles in this volume are analyzed within the mixed-method conceptual frameworks presented in the early chapters. The richer understanding derived from these approaches supports the call to advance mixed-method inquiry beyond models and toward inquiry attributes that can be combined. (Author/SLD)
Descriptors: Case Studies, Evaluation Methods, Integrated Activities, Models

Julnes, George; Rog, Debra J. – New Directions for Evaluation, 2007
This final chapter summarizes the areas of consensus in the debate on method choice, including considering the nature of the primary evaluation questions, the nature of the phenomenon being evaluated, the constraints on the evaluation, and ethical issues. Pragmatic suggestions based on these areas as well as areas still in contention are offered.…
Descriptors: Questioning Techniques, Ethics, Pragmatics, Research Methodology

Mark, Melvin M.; Feller, Irwin; Button, Scott B. – New Directions for Evaluation, 1997
A review of qualitative methods used in a predominantly quantitative evaluation indicates a variety of roles for such mixing of methods, including framing and revising research questions, assessing the validity of measures and adaptations to program implementation, and gauging the degree of uncertainty and generalizability of conclusions…
Descriptors: Case Studies, Integrated Activities, Models, Program Evaluation

Newcomer, Kathryn E., Ed. – New Directions for Evaluation, 1997
The seven chapters of this volume review the current design and use of performance measurement in public and nonprofit programs. The contexts surrounding the design and implementation of performance measurement systems and best practices are discussed, and examples of the use of performance measurement in government and the nonprofit sector are presented…
Descriptors: Evaluation Methods, Government (Administrative Body), Nonprofit Organizations, Performance Based Assessment

Donaldson, Stewart I. – New Directions for Evaluation, 2005
Program theory-driven evaluation science uses substantive knowledge, as opposed to method proclivities, to guide program evaluations. It aspires to update, clarify, simplify, and make more accessible the evolving theory of evaluation practice commonly referred to as theory-driven or theory-based evaluation. The evaluator in this chapter provides a…
Descriptors: Evaluators, Program Evaluation, Evaluation Methods, Evaluation Criteria

Braverman, Marc T., Ed.; Slater, Jana Kay, Ed. – New Directions for Evaluation, 1996
The seven articles of this special issue focus on theory and research related to survey methods and on whether evaluators need information that is absent from the larger survey literature. These articles reflect the view that the considerations relating to good survey research also apply to the use of surveys in evaluation. (SLD)
Descriptors: Data Collection, Evaluation Methods, Evaluators, Needs Assessment

King, Jean A. – New Directions for Evaluation, 2005
The author describes potential evaluation capacity-building activities in contrast to the specifics of an evaluation design. Her response to the case of the Bunche-Da Vinci Learning Partnership Academy is developed in three parts: (1) an initial framing of the Bunche-Da Vinci situation; (2) what should be done before signing a contract; and (3)…
Descriptors: Evaluators, Program Evaluation, Action Research, Program Content

Greene, Jennifer C. – New Directions for Evaluation, 2005
In 2001, the Bunche Academy was chosen by its district to join in partnership with the Da Vinci Learning Corporation to embark on an ambitious whole-school reform initiative, especially designed by the corporation for low-performing schools. In this chapter, the author describes how, as illustrated in the Bunche-Da Vinci Learning Academy context,…
Descriptors: School Restructuring, Educational Change, Evaluation Methods, Evaluation Research