Showing 1 to 15 of 24 results
Peer reviewed
Zantal-Wiener, Kathy; Horwood, Thomas J. – New Directions for Evaluation, 2010
The authors propose a comprehensive evaluation framework to prepare for evaluating school emergency management programs. This framework involves a logic model that incorporates Government Performance and Results Act (GPRA) measures as a foundation for comprehensive evaluation that complements performance monitoring used by the U.S. Department of…
Descriptors: Models, Logical Thinking, Emergency Programs, Grants
Peer reviewed
Shadish, William R.; Rindskopf, David M. – New Directions for Evaluation, 2007
Good quantitative evidence does not require large, aggregate group designs. The authors describe ground-breaking work in managing the conceptual and practical demands in developing meta-analytic strategies for single subject designs in an effort to add to evidence-based practice. (Contains 2 figures.)
Descriptors: Theory Practice Relationship, Evaluation Methods, Meta Analysis, Models
Peer reviewed
Greene, Jennifer C.; Caracelli, Valerie J. – New Directions for Evaluation, 1997
Three primary stances on the wisdom of mixing evaluation models while mixing evaluation methods frame the challenges to defensible mixed-method evaluative inquiry. These challenges are addressed by shifting the mixed-method controversy from models toward other critical features of disparate traditions of inquiry. (Author/SLD)
Descriptors: Definitions, Evaluation Methods, Models, Program Evaluation
Peer reviewed
Kirkhart, Karen E. – New Directions for Evaluation, 2000
Presents an integrated theory of evaluation influence that is broader than previous constructs of use and includes the dimensions of source, intention, and time. The resulting framework addresses process-based use and results-based use; intended and unintended use; and episodic and instrumental use. (SLD)
Descriptors: Evaluation Methods, Evaluation Utilization, Models, Theories
Peer reviewed
Rogers, Patricia J.; Petrosino, Anthony; Huebner, Tracy A.; Hacsi, Timothy A. – New Directions for Evaluation, 2000
Describes the development of program theory evaluation and the various versions of program theory evaluation currently in use. Discusses some of the differences between theory and evaluation practice. Program theory evaluation consists of an explicit theory or model of how a program causes outcomes and an evaluation that is at least partially…
Descriptors: Evaluation Methods, Models, Program Evaluation, Theories
Peer reviewed
Weiss, Carol Hirschon – New Directions for Evaluation, 2000
Provides guidance about which links in program theory to study and which theories to use. In choosing the links to study, the first criterion is to choose the links that are most critical to the success of the program. The second criterion is the degree of uncertainty about the linkage. (SLD)
Descriptors: Evaluation Methods, Models, Program Evaluation, Theories
Peer reviewed
Gay, Geri; Bennington, Tammy L. – New Directions for Evaluation, 1999
Describes activity theory, which provides a conceptual framework for understanding the relationships among evaluation inquiry, technology, and the social contexts of technologically mediated evaluation practice. In activity theory, the activity provides the fundamental unit of analyses. The focus is on the use of tools by a subject to achieve an…
Descriptors: Context Effect, Evaluation Methods, Models, Technology
Peer reviewed
Baizerman, Michael; Compton, Donald W.; Stockdill, Stacey Hueftle – New Directions for Evaluation, 2002
Summarizes and analyzes the four case studies of this theme issue. The analyses consider the three structural elements of evaluation capacity building (ECB): overall ECB process; actual ECB practices; and ECB occupational orientation and practitioner role. Provides a practical checklist for assessing ECB at an organization and a framework for…
Descriptors: Case Studies, Evaluation Methods, Models, Program Evaluation
Peer reviewed
Taylor-Powell, Ellen; Boyd, Heather H. – New Directions for Evaluation, 2008
Evaluation capacity building, or ECB, is an area of great interest within the field of evaluation as well as in Extension evaluation. Internal Extension evaluators have long offered training and technical assistance to help Extension educators conduct evaluation. Today ECB in Extension encompasses myriad activities and processes to advance…
Descriptors: Evaluators, Models, Organizational Change, Logical Thinking
Peer reviewed
House, Ernest R.; Howe, Kenneth R. – New Directions for Evaluation, 2000
Presents a framework for judging evaluations on the basis of their potential for democratic deliberation that includes the interrelated requirements of inclusion, dialogue, and deliberation. Operationalizes these requirements in 10 questions to guide evaluation and meta-evaluation from a democratic viewpoint. (SLD)
Descriptors: Democracy, Evaluation Methods, Meta Analysis, Models
Peer reviewed
Stufflebeam, Daniel L. – New Directions for Evaluation, 2001
This monograph identifies, analyzes, and judges 22 evaluation approaches used in program evaluation. Two approaches, labeled pseudoevaluations, are politically oriented and often used to misrepresent a program's value. The remaining 20, judged legitimate, are categorized by their orientations and rated for their value. (SLD)
Descriptors: Classification, Evaluation Methods, Models, Political Influences
Peer reviewed
Caracelli, Valerie J.; Greene, Jennifer C. – New Directions for Evaluation, 1997
Two broad classes of mixed-method designs--component and integrated--that have the potential to combine elements of different inquiry traditions are described. The conceptual ideas advanced in the first chapter are illustrated through selected examples of several mixed-method integrated models. (Author/SLD)
Descriptors: Case Studies, Evaluation Methods, Models, Program Evaluation
Peer reviewed
Datta, Lois-ellin – New Directions for Evaluation, 1997
A pragmatic framework for making decisions about mixed-method designs is proposed and then applied to illustrative evaluation case studies to help identify the strengths and limitations of making practical, contextual, and consequential considerations a primary basis for evaluation design decisions. (Author)
Descriptors: Case Studies, Evaluation Methods, Models, Program Evaluation
Peer reviewed
Smith, Mary Lee – New Directions for Evaluation, 1997
Crude mental models should take priority over designs in guiding evaluative inquiry. Three illustrative mental models are shown to have implications for how method mixing is carried out in practice. One of the models, a complex and contextually contingent model, is applied to the Arizona Student Assessment Program. (SLD)
Descriptors: Case Studies, Context Effect, Evaluation Methods, Models
Peer reviewed
Wandersman, Abraham – New Directions for Evaluation, 1999
Proposes a framework for the evaluation of health and human services programs in community settings. Applies this framework to the discussions in this special issue to see how well the chapters advance community-based program evaluation. Concludes that the chapters are valuable for those interested in improving the evaluation of community-based…
Descriptors: Community Programs, Evaluation Methods, Health Programs, Human Services