Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 0 |
Since 2016 (last 10 years) | 0 |
Since 2006 (last 20 years) | 4 |
Descriptor
Models | 22 |
Program Evaluation | 22 |
Evaluation Methods | 18 |
Case Studies | 7 |
Research Design | 5 |
Program Effectiveness | 4 |
Context Effect | 3 |
Logical Thinking | 3 |
Theories | 3 |
Academic Achievement | 2 |
Educational Change | 2 |
Source
New Directions for Evaluation | 22 |
Author
Caracelli, Valerie J. | 2 |
Greene, Jennifer C. | 2 |
Baizerman, Michael | 1 |
Boyd, Heather H. | 1 |
Button, Scott B. | 1 |
Chen, Huey-tsyh | 1 |
Compton, Donald W. | 1 |
Datta, Lois-ellin | 1 |
Engle, Molly | 1 |
Feller, Irwin | 1 |
Hacsi, Timothy A. | 1 |
Publication Type
Journal Articles | 22 |
Reports - Evaluative | 13 |
Reports - Descriptive | 7 |
Collected Works - Serials | 3 |
Information Analyses | 2 |
Book/Product Reviews | 1 |
Reports - Research | 1 |
Education Level
Higher Education | 2 |
Postsecondary Education | 2 |
Adult Education | 1 |
Elementary Education | 1 |
Elementary Secondary Education | 1 |
Laws, Policies, & Programs
Government Performance and… | 1 |
No Child Left Behind Act 2001 | 1 |
Zantal-Wiener, Kathy; Horwood, Thomas J. – New Directions for Evaluation, 2010
The authors propose a comprehensive evaluation framework to prepare for evaluating school emergency management programs. This framework involves a logic model that incorporates Government Performance and Results Act (GPRA) measures as a foundation for comprehensive evaluation that complements performance monitoring used by the U.S. Department of…
Descriptors: Models, Logical Thinking, Emergency Programs, Grants

Preskill, Hallie – New Directions for Evaluation, 2009
The chapters in this volume set a rich context for understanding the challenges that environmental evaluators face in their everyday work. In particular, the authors highlight the need for responsive, contextual, flexible, adaptive, multidisciplinary, and mixed-methods evaluation approaches. In this chapter, I reinforce their call and further…
Descriptors: Environment, Program Evaluation, Systems Approach, Change

Rennekamp, Roger A.; Engle, Molly – New Directions for Evaluation, 2008
This chapter examines how factors both internal and external to Cooperative Extension have influenced its commitment and capability to assess the quality and impact of its programs. The authors begin by documenting how the nature of Extension programming has changed dramatically in response to societal needs over the course of the organization's…
Descriptors: Extension Education, Program Evaluation, Program Effectiveness, Organizational Change

Greene, Jennifer C.; Caracelli, Valerie J. – New Directions for Evaluation, 1997
Three primary stances on the wisdom of mixing evaluation models while mixing evaluation methods frame the challenges to defensible mixed-method evaluative inquiry. These challenges are addressed by shifting the mixed-method controversy from models toward other critical features of disparate traditions of inquiry. (Author/SLD)
Descriptors: Definitions, Evaluation Methods, Models, Program Evaluation

Rogers, Patricia J.; Petrosino, Anthony; Huebner, Tracy A.; Hacsi, Timothy A. – New Directions for Evaluation, 2000
Describes the development of program theory evaluation and the various versions of program theory evaluation currently in use. Discusses some of the differences between theory and evaluation practice. Program theory evaluation consists of an explicit theory or model of how a program causes outcomes and an evaluation that is at least partially…
Descriptors: Evaluation Methods, Models, Program Evaluation, Theories

Weiss, Carol Hirschon – New Directions for Evaluation, 2000
Provides guidance about which links in program theory to study and which theories to use. In choosing the links to study, the first criterion is to choose the links that are most critical to the success of the program. The second criterion is the degree of uncertainty about the linkage. (SLD)
Descriptors: Evaluation Methods, Models, Program Evaluation, Theories

Baizerman, Michael; Compton, Donald W.; Stockdill, Stacey Hueftle – New Directions for Evaluation, 2002
Summarizes and analyzes the four case studies of this theme issue. The analyses consider the three structural elements of evaluation capacity building (ECB): overall ECB process; actual ECB practices; and ECB occupational orientation and practitioner role. Provides a practical checklist for assessing ECB at an organization and a framework for…
Descriptors: Case Studies, Evaluation Methods, Models, Program Evaluation

Taylor-Powell, Ellen; Boyd, Heather H. – New Directions for Evaluation, 2008
Evaluation capacity building, or ECB, is an area of great interest within the field of evaluation as well as in Extension evaluation. Internal Extension evaluators have long offered training and technical assistance to help Extension educators conduct evaluation. Today ECB in Extension encompasses myriad activities and processes to advance…
Descriptors: Evaluators, Models, Organizational Change, Logical Thinking

House, Ernest R.; Howe, Kenneth R. – New Directions for Evaluation, 2000
Presents a framework for judging evaluations on the basis of their potential for democratic deliberation that includes the interrelated requirements of inclusion, dialogue, and deliberation. Operationalizes these requirements in 10 questions to guide evaluation and meta-evaluation from a democratic viewpoint. (SLD)
Descriptors: Democracy, Evaluation Methods, Meta Analysis, Models

Stufflebeam, Daniel L. – New Directions for Evaluation, 2001
This monograph identifies, analyzes, and judges 22 evaluation approaches used in program evaluation. Two approaches, labeled pseudoevaluations, are politically oriented and often used to misrepresent a program's value. The remaining 20, judged legitimate, are categorized by their orientations, and rated for their value. (SLD)
Descriptors: Classification, Evaluation Methods, Models, Political Influences

Caracelli, Valerie J.; Greene, Jennifer C. – New Directions for Evaluation, 1997
Two broad classes of mixed-method designs--component and integrated--that have the potential to combine elements of different inquiry traditions are described. The conceptual ideas advanced in the first chapter are illustrated through selected examples of several mixed-method integrated models. (Author/SLD)
Descriptors: Case Studies, Evaluation Methods, Models, Program Evaluation

Datta, Lois-ellin – New Directions for Evaluation, 1997
A pragmatic framework for making decisions about mixed-method designs is proposed and then applied to illustrative evaluation case studies to help identify the strengths and limitations of making practical, contextual, and consequential considerations a primary basis for evaluation design decisions. (Author)
Descriptors: Case Studies, Evaluation Methods, Models, Program Evaluation

Smith, Mary Lee – New Directions for Evaluation, 1997
Crude mental models should take priority over designs in guiding evaluative inquiry. Three illustrative mental models are shown to have implications for how method mixing is carried out in practice. One of the models, a complex and contextually contingent model, is applied to the Arizona Student Assessment Program. (SLD)
Descriptors: Case Studies, Context Effect, Evaluation Methods, Models

Wandersman, Abraham – New Directions for Evaluation, 1999
Proposes a framework for the evaluation of health and human services programs in community settings. Applies this framework to the discussions in this special issue to see how well the chapters advance community-based program evaluation. Concludes that the chapters are valuable for those interested in improving the evaluation of community-based…
Descriptors: Community Programs, Evaluation Methods, Health Programs, Human Services

Chen, Huey-tsyh – New Directions for Evaluation, 1997
Illustrative case studies support a contingency approach to mixed-method evaluation in which the evaluation team bases its selection of methods on the information to be provided, the availability of data, and the degree to which the program environment is an open or closed system. (SLD)
Descriptors: Case Studies, Context Effect, Evaluation Methods, Models