Showing all 9 results
Peer reviewed
Greene, Jennifer C.; Caracelli, Valerie J. – New Directions for Evaluation, 1997
Three primary stances on the wisdom of mixing evaluation models while mixing evaluation methods frame the challenges to defensible mixed-method evaluative inquiry. These challenges are addressed by shifting the mixed-method controversy from models toward other critical features of disparate traditions of inquiry. (Author/SLD)
Descriptors: Definitions, Evaluation Methods, Models, Program Evaluation
Peer reviewed
Greene, Jennifer C. – New Directions for Evaluation, 2000
Reflects on an evaluation that aspired to be inclusive but generally fell short, as a backdrop for a discussion of inclusive evaluation. Identifies issues of the absence of significant stakeholders, the masking of values by method, and the limited authority of the evaluation. Shows how easily deliberative intentions can be distorted. (SLD)
Descriptors: Democracy, Evaluation Methods, Evaluation Problems, Program Evaluation
Peer reviewed
Caracelli, Valerie J.; Greene, Jennifer C. – New Directions for Evaluation, 1997
Two broad classes of mixed-method designs--component and integrated--that have the potential to combine elements of different inquiry traditions are described. The conceptual ideas advanced in the first chapter are illustrated through selected examples of several mixed-method integrated models. (Author/SLD)
Descriptors: Case Studies, Evaluation Methods, Models, Program Evaluation
Peer reviewed
Caracelli, Valerie J.; Greene, Jennifer C. – Educational Evaluation and Policy Analysis, 1993
The following four integrative data analysis strategies for mixed-method evaluation designs are derived from and illustrated by empirical practice: (1) data transformation; (2) typology development; (3) extreme case analysis; and (4) data consolidation and merging. Use of these methods to realize the full potential of mixed-method approaches is…
Descriptors: Classification, Data Analysis, Evaluation Methods, Program Design
Peer reviewed
Greene, Jennifer C.; And Others – Evaluation Review, 1988
Field experiences with external (qualitative) evaluation audits based on the work of E. G. Guba and Y. S. Lincoln (1981, 1985) are detailed, including evaluation contexts and purposes, procedures, and findings. Such audits represent a viable meta-evaluative tool for assessing the quality of naturalistic evaluation results. (SLD)
Descriptors: Audits (Verification), Evaluation Methods, Evaluation Utilization, Field Studies
Peer reviewed
Greene, Jennifer C.; And Others – Educational Evaluation and Policy Analysis, 1989
A mixed-method conceptual framework was developed from the literature and refined through an analysis of 57 empirical mixed-method evaluations. Five purposes for mixed-method evaluation are identified (triangulation, complementarity, development, initiation, and expansion) and analyzed using the framework of 7 design characteristics. (TJH)
Descriptors: Data Analysis, Evaluation Methods, Methods Research, Multitrait Multimethod Techniques
Peer reviewed
Mathie, Alison; Greene, Jennifer C. – Evaluation and Program Planning, 1997
The grounded experiences of two participatory evaluation case studies indicate that, when action is the desired outcome of an evaluation, somewhat less, rather than more, diversity of stakeholder participation may be what is needed. A narrowing of diversity does not necessarily violate democratic participatory aims. (SLD)
Descriptors: Case Studies, Cultural Pluralism, Democracy, Evaluation Methods
Peer reviewed
Greene, Jennifer C. – Evaluation and Program Planning, 1990
Use-oriented approaches to evaluation have been challenged by contentions that the responsiveness of such evaluations undermines their technical quality. This discussion analyzes purported conflicts between these evaluation criteria of utility and accuracy by examining underlying assumptions related to evaluation's purpose and politics. (TJH)
Descriptors: Educational Assessment, Evaluation Criteria, Evaluation Methods, Evaluation Problems
Peer reviewed
Greene, Jennifer C. – New Directions for Evaluation, 2005
In 2001, the Bunche Academy was chosen by its district to join in partnership with the Da Vinci Learning Corporation to embark on an ambitious whole-school reform initiative, especially designed by the corporation for low-performing schools. In this chapter, the author describes how, as illustrated in the Bunche-Da Vinci Learning Academy context,…
Descriptors: School Restructuring, Educational Change, Evaluation Methods, Evaluation Research