Descriptor
Evaluation Methods (9)
Program Evaluation (7)
Case Studies (3)
Research Design (3)
Data Analysis (2)
Democracy (2)
Evaluation Criteria (2)
Evaluation Problems (2)
Evaluation Utilization (2)
Methods Research (2)
Models (2)
Source
New Directions for Evaluation (4)
Educational Evaluation and Policy Analysis (2)
Evaluation and Program Planning (2)
Evaluation Review (1)
Author
Greene, Jennifer C. (9)
Caracelli, Valerie J. (3)
Mathie, Alison (1)
Publication Type
Journal Articles (9)
Reports - Evaluative (9)
Speeches/Meeting Papers (3)
Education Level
Elementary Education (1)

Greene, Jennifer C.; Caracelli, Valerie J. – New Directions for Evaluation, 1997
Three primary stances on the wisdom of mixing evaluation models while mixing evaluation methods frame the challenges to defensible mixed-method evaluative inquiry. These challenges are addressed by shifting the mixed-method controversy from models toward other critical features of disparate traditions of inquiry. (Author/SLD)
Descriptors: Definitions, Evaluation Methods, Models, Program Evaluation

Greene, Jennifer C. – New Directions for Evaluation, 2000
Reflects on an evaluation that aspired to be inclusive but generally failed, as a backdrop for a discussion of inclusive evaluation. Identifies as key issues the absence of significant stakeholders, the making of values by method, and the limited authority of the evaluation. Shows how easily deliberative intentions are distorted. (SLD)
Descriptors: Democracy, Evaluation Methods, Evaluation Problems, Program Evaluation

Caracelli, Valerie J.; Greene, Jennifer C. – New Directions for Evaluation, 1997
Two broad classes of mixed-method designs--component and integrated--that have the potential to combine elements of different inquiry traditions are described. The conceptual ideas advanced in the first chapter are illustrated through selected examples of several mixed-method integrated models. (Author/SLD)
Descriptors: Case Studies, Evaluation Methods, Models, Program Evaluation

Caracelli, Valerie J.; Greene, Jennifer C. – Educational Evaluation and Policy Analysis, 1993
The following four integrative data analysis strategies for mixed-method evaluation designs are derived from and illustrated by empirical practice: (1) data transformation; (2) typology development; (3) extreme case analysis; and (4) data consolidation and merging. Use of these methods to realize the full potential of mixed-method approaches is…
Descriptors: Classification, Data Analysis, Evaluation Methods, Program Design

Greene, Jennifer C.; And Others – Evaluation Review, 1988
Field experiences with external (qualitative) evaluation audits based on the work of E. G. Guba and Y. S. Lincoln (1981, 1985) are detailed, including evaluation contexts and purposes, procedures, and findings. Such audits represent a viable meta-evaluative tool for assessing the quality of naturalistic evaluation results. (SLD)
Descriptors: Audits (Verification), Evaluation Methods, Evaluation Utilization, Field Studies

Greene, Jennifer C.; And Others – Educational Evaluation and Policy Analysis, 1989
A mixed-method conceptual framework was developed from the literature and refined through an analysis of 57 empirical mixed-method evaluations. Five purposes for mixed-method evaluation are identified (triangulation, complementarity, development, initiation, and expansion) and analyzed using the framework of 7 design characteristics. (TJH)
Descriptors: Data Analysis, Evaluation Methods, Methods Research, Multitrait Multimethod Techniques

Mathie, Alison; Greene, Jennifer C. – Evaluation and Program Planning, 1997
The grounded experiences of two participatory evaluation case studies indicate that, when action is the desired outcome of an evaluation, somewhat less, rather than more, diversity of stakeholder participation is actually what is wanted. A narrowing of diversity does not necessarily violate democratic participatory aims. (SLD)
Descriptors: Case Studies, Cultural Pluralism, Democracy, Evaluation Methods

Greene, Jennifer C. – Evaluation and Program Planning, 1990
Use-oriented approaches to evaluation have been challenged by contentions that the responsiveness of such evaluations undermines their technical quality. This discussion analyzes purported conflicts between these evaluation criteria of utility and accuracy by examining underlying assumptions related to evaluation's purpose and politics. (TJH)
Descriptors: Educational Assessment, Evaluation Criteria, Evaluation Methods, Evaluation Problems

Greene, Jennifer C. – New Directions for Evaluation, 2005
In 2001, the Bunche Academy was chosen by its district to join in partnership with the Da Vinci Learning Corporation to embark on an ambitious whole-school reform initiative, especially designed by the corporation for low-performing schools. In this chapter, the author describes how, as illustrated in the Bunche-Da Vinci Learning Academy context,…
Descriptors: School Restructuring, Educational Change, Evaluation Methods, Evaluation Research