Hastings, Lindsay J. – New Directions for Student Leadership, 2022
The purpose of this article is to identify proper uses for mixed methods in leadership research and assessment. This article will highlight research and assessment questions that are best served by mixed methods and will offer practitioner-friendly guides and examples for integrating quantitative and qualitative data to help eat the whole whale…
Descriptors: Mixed Methods Research, Leadership, Research Design, Outcome Measures
Inkelas, Karen Kurotsuchi – New Directions for Institutional Research, 2017
This chapter describes a process for assessing programmatic initiatives with broad-ranging goals with the use of a mixed-methods design. Using an example of a day-long teaching development conference, this chapter provides practitioners step-by-step guidance on how to implement this assessment process.
Descriptors: Mixed Methods Research, Educational Objectives, Program Evaluation, Faculty Development
Mark, Melvin M.; Caracelli, Valerie; McNall, Miles A.; Miller, Robin Lin – American Journal of Evaluation, 2018
Since 2003, the Oral History Project Team has conducted interviews with individuals who have made particularly noteworthy contributions to the theory and practice of evaluation. In 2013, Mel Mark, Valerie Caracelli, and Miles McNall sat with Thomas Cook in Washington, D.C., during the American Evaluation Association (AEA) annual conference. The…
Descriptors: Biographies, Oral History, College Faculty, Faculty Development
Mott, Rebecca – Journal of Extension, 2018
With today's technology, Extension professionals have a variety of tools available for program evaluation. This article describes an innovative platform called VoiceThread that has been used in many classrooms but also is useful for conducting virtual focus group research. I explain how this tool can be used to collect qualitative participant…
Descriptors: Program Evaluation, Evaluation Methods, Extension Education, Qualitative Research
Teacher Incentive Fund, US Department of Education, 2016
The U.S. Department of Education (ED) expects all Teacher Incentive Fund (TIF) grantees to conduct an evaluation of their programs. Experience with earlier rounds of TIF grants has shown that evaluations can provide valuable information for managing and improving TIF-supported activities, as well as evidence that these activities have had a…
Descriptors: Program Evaluation, Questioning Techniques, Qualitative Research, Statistical Analysis
Torrance, Harry – Journal of Mixed Methods Research, 2012
Over the past 10 years or so the "Field" of "Mixed Methods Research" (MMR) has increasingly been exerting itself as something separate, novel, and significant, with some advocates claiming paradigmatic status. Triangulation is an important component of mixed methods designs. Triangulation has its origins in attempts to validate research findings…
Descriptors: Mixed Methods Research, Validity, Qualitative Research, Participation
Newton, Andrew R.; Maher, Michelle A.; Smith, Douglas A. – Research & Practice in Assessment, 2015
Assessment has assumed an increasingly prominent place in academic and student affairs practice. Yet, in smaller student affairs departments with limited staffing and resources, how might a department identify the resources or time to thoroughly assess student learning outcomes? This Notes in Brief details the partnership between the University of…
Descriptors: Partnerships in Education, Masters Programs, College Freshmen, Transfer Students
Viesca, Kara Mitchell; Reagan, Emilie Mitescu; Enterline, Sarah; Gleeson, Ann Marie – Teacher Educator, 2013
Our intention in this article is to present one institution's efforts to take on program assessment and respond to calls for accountability. To do so, the teacher education program simultaneously sought to address the narrowly defined measures called for by policy makers and politicians, while at the same time broadening and expanding…
Descriptors: Teacher Education, Urban Universities, Program Evaluation, Accountability
Braverman, Marc T. – American Journal of Evaluation, 2013
Sound evaluation planning requires numerous decisions about how constructs in a program theory will be translated into measures and instruments that produce evaluation data. This article, the first in a dialogue exchange, examines how decisions about measurement are (and should be) made, especially in the context of small-scale local program…
Descriptors: Evaluation Methods, Methods Research, Research Methodology, Research Design
Gray, Colin; Smyth, Keith – Electronic Journal of e-Learning, 2012
This paper describes the design, implementation, evaluation and further refinement of an ELGG-based social networking site to support professional development activity, project group and special interest groups, and the discussion and sharing of educational experiences and resources across Edinburgh Napier University in the United Kingdom.…
Descriptors: Foreign Countries, Professional Development, Social Networks, Learning Activities
Dunlap, Laurie A. – MathAMATYC Educator, 2012
This article describes how to design program assessment for mathematics departments, in two-year and four-year colleges across the Midwest, based on a set of components that was generated from a Delphi survey. An example is provided to illustrate how this was done at a small four-year college. There is an alignment between these components and a…
Descriptors: Mathematics Instruction, Program Evaluation, Program Design, Research Design
Baudouin, Robert; Bezanson, Lynne; Borgen, Bill; Goyer, Liette; Hiebert, Bryan; Lalande, Vivian; Magnusson, Kris; Michaud, Guylaine; Renald, Celine; Turcotte, Michel – Canadian Journal of Counselling, 2007
The findings from recent Canadian research indicate that while agency managers and front-line workers agree that evaluation is important, they seldom evaluate their work with clients. With the current emphasis on evidence-based practice and outcome-focused intervention, it is important to be able to demonstrate the value of career services in a…
Descriptors: Career Development, Delivery Systems, Formative Evaluation, Evaluation Methods
LaFrance, Joan; Nichols, Richard – Canadian Journal of Program Evaluation, 2008
The American Indian Higher Education Consortium (AIHEC), comprising 34 American Indian tribally controlled colleges and universities, has undertaken a comprehensive effort to develop an "Indigenous Framework for Evaluation" that synthesizes Indigenous ways of knowing and Western evaluation practice. To ground the framework, AIHEC engaged…
Descriptors: Expertise, Evaluators, American Indians, Focus Groups
Adams, Pam; Townsend, David – International Electronic Journal for Leadership in Learning, 2006
An invitation to explore innovative practices in program and system evaluation in a medium-sized school jurisdiction of approximately 11,000 students has resulted in a model with generative characteristics. This paper will describe the process, and several of the outcomes, when a generative evaluation approach is used to assess the effectiveness…
Descriptors: Systems Analysis, Program Evaluation, School Effectiveness, Evaluation Methods
Stevenson, Robert – Journal of Experiential Education, 1985
Demonstrates how case study evaluation concentrates on a single situation to present a holistic view of an experiential learning program and reveals unique and unanticipated features. Outlines steps of planning, gathering, analyzing, synthesizing, and reporting data and considers the advantages and disadvantages of the case study approach. (LFL)
Descriptors: Case Studies, Data Analysis, Educational Assessment, Evaluation Methods