Showing 346 to 360 of 1,584 results
Peer reviewed
Culp, Ken, III; Nall, Martha A. – Journal of Volunteer Administration, 2001
Evaluating the impact of volunteer programs should measure both program outcomes and volunteer growth and development. The Targeting Outcomes of Programs Model is a structured way to collect data on several levels. (SK)
Descriptors: Accountability, Data Collection, Program Evaluation, Staff Development
Puma, Michael; Bell, Stephen; Cook, Ronna; Heid, Camilla; Shapiro, Gary; Broene, Pam; Jenkins, Frank; Fletcher, Philip; Quinn, Liz; Friedman, Janet; Ciarico, Janet; Rohacek, Monica; Adams, Gina; Spier, Elizabeth – Administration for Children & Families, 2010
This Technical Report is designed to provide technical detail to support the analysis and findings presented in the "Head Start Impact Study Final Report" (U.S. Department of Health and Human Services, January 2010). Chapter 1 provides an overview of the Head Start Impact Study and its findings. Chapter 2 provides technical information on the…
Descriptors: Preschool Children, Disadvantaged Youth, Low Income Groups, Kindergarten
Peer reviewed
Phillips, Henry L., IV; Foster, T. Chris – Performance Improvement, 2008
Naval aviation needs a unified standard for job-task analyses and data collection. Such a standard would facilitate consolidation of data across aviation platforms and permit evaluation of training content across phases of the training continuum. It would also make possible the construction of a training transfer evaluation system. The Navy cannot…
Descriptors: Job Training, Armed Forces, Aviation Education, Job Skills
Muller, Eve; Whaley, Kathy; Rous, Beth – Project Forum, 2009
In March 2008, the National Early Childhood Transition Initiative released a document developed collaboratively over several months titled "Designing and Implementing Effective Early Childhood Transition Processes". The document was created as a resource for improving state and local performance on the State Performance Plans (SPP) and…
Descriptors: Transitional Programs, Young Children, Program Effectiveness, Federal Legislation
Schubert, Jane G. – 1982
Studies were conducted by the American Institute for Research (AIR) under contract with the Department of Education (ED). An evaluability assessment determines the extent to which a program is ready for evaluation, the changes needed to make the program more manageable and accountable, and toward what questions a more extensive evaluation might…
Descriptors: Data Collection, Evaluation Methods, Evaluation Needs, Evaluators
Benedict, Larry G.
Research paradigms are not the proper channel for educational evaluation. Evaluation and research differ in many areas, including purpose, methods, goals, groups, and desired outcomes. Research is strictly controlled and has the purpose of gathering information and making generalizations about completed studies or events. Evaluation is a process…
Descriptors: Comparative Analysis, Data Collection, Evaluation, Objectives
Peer reviewed
Scheyer, Patricia T.; Stake, Robert E. – Studies in Educational Evaluation, 1976
The portfolio approach to self-evaluation is possible with limited staff time or funds. Evaluators collect a variety of documents and records that describe the goals, perceptions and values involved in the program. The portfolio items should represent key issues and precipitate useful discussions. (GDC)
Descriptors: Data Collection, Evaluation Methods, Program Evaluation, Self Evaluation
Brandenburg, Dale C. – Training and Development Journal, 1982
Evidence shows that evaluation practice is limited to direct support within training: little action takes place with other organizational units. Evaluation must have a significant role in the overall human resource function of an organization and must be a part of the planning function. (JOW)
Descriptors: Business, Data Collection, Evaluation Methods, Industry
Peer reviewed
Christensen, Laurene; Nielsen, Julie E.; Rogers, Christopher M.; Volkov, Boris – New Directions for Evaluation, 2005
Certain challenges are inherent in collecting data to evaluate nonformal education programs and settings. One challenge relates to the complexity of the educational system in which nonformal learning takes place. Separating the learning effects of a nonformal education program on participants from the effects of other essential elements of their…
Descriptors: Nonformal Education, Data Collection, Program Evaluation, Evaluation Methods
Forum for Youth Investment, 2008
This commentary highlights the work of the Collaborative for Building After-School Systems (CBASS), a collaborative of mature, city and county-wide nonprofit OST intermediaries, to develop and adopt common youth-, program- and system-level measures that are easy and cost-effective for local systems to implement. By agreeing to adopt and publicly…
Descriptors: After School Programs, Program Evaluation, Evaluation Methods, Productivity
Peer reviewed
Thorndyke, Luanne E.; Gusic, Maryellen E.; Milner, Robert J. – Journal of Continuing Education in the Health Professions, 2008
Introduction: Mentoring is a central component of professional development. Evaluation of "successful" mentoring programs, however, has been limited and mainly focused on measures of satisfaction with the relationship. In today's environment, mentoring programs must produce tangible outcomes to demonstrate success. To address this issue,…
Descriptors: Medical Education, Mentors, Faculty Development, Program Effectiveness
Schwartz, Terry Ann; Kaplan, Michael H. – 1981
The benefits accrued through the use of triangulation as both a design strategy and an analytic tool cannot be overstated. Triangulation allows for the clustering and organizing of disparate yet related data. Finding out what the data have in common and how the data are different allow the researcher to eliminate (or reduce) the number of…
Descriptors: Case Studies, Community Education, Data Collection, Evaluation Methods
Alkin, Marvin C.; And Others – 1969
This project, located in a large metropolitan school district, concerns the operation and evaluation of three Mathematics Demonstration Centers. In the first phase, directions are given about the necessary information to be gathered by the evaluating organization, (e.g. the school district administrative organization, size of school district,…
Descriptors: Data Collection, Demonstration Centers, Evaluation Methods, Mathematics
Harlandale Independent School District, San Antonio, TX. Career Education Center. – 1973
The Career Education Center of the Harlandale Independent School District, San Antonio, Texas, has developed a K-12 series of career oriented curriculum guides (CE 001 005-16 and CE 001 075-84). This document outlines the proposed occupational followup study for students in grades 8-12 and for five years after leaving school. Personal data will be…
Descriptors: Career Education, Data Collection, Followup Studies, Program Evaluation
Peer reviewed
Chelimsky, Eleanor – Society, 1985
Examines budgetary cutbacks in federal data systems. Concludes that (1) justifications for cutbacks are unconvincing; (2) cutbacks have affected data availability and quality severely; (3) the feasibility of the idea that major data users should collect their own information is dubious; and (4) special evaluation studies cannot substitute for…
Descriptors: Data Collection, Federal Programs, Government Role, Information Needs