Publication Date
| Date range | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 9 |
| Since 2022 (last 5 years) | 35 |
| Since 2017 (last 10 years) | 139 |
| Since 2007 (last 20 years) | 383 |
Descriptor
| Descriptor | Count |
| --- | --- |
| Data Collection | 1584 |
| Program Evaluation | 1584 |
| Evaluation Methods | 648 |
| Data Analysis | 383 |
| Program Effectiveness | 306 |
| Elementary Secondary Education | 265 |
| Research Methodology | 212 |
| Models | 202 |
| Program Development | 195 |
| Evaluation Criteria | 184 |
| Vocational Education | 166 |
Author
| Author | Count |
| --- | --- |
| Ligon, Glynn | 5 |
| Horst, Donald P. | 4 |
| Alvir, Howard P. | 3 |
| Churchman, David | 3 |
| Coker, Dana Rosenberg | 3 |
| Condelli, Larry | 3 |
| Fiene, Richard | 3 |
| Fink, Arlene | 3 |
| Friedel, Janice Nahra | 3 |
| Granville, Arthur C. | 3 |
| Johnson, Terry | 3 |
Audience
| Audience | Count |
| --- | --- |
| Practitioners | 125 |
| Administrators | 66 |
| Researchers | 55 |
| Policymakers | 50 |
| Teachers | 30 |
| Community | 5 |
| Counselors | 4 |
| Parents | 2 |
| Support Staff | 2 |
| Media Staff | 1 |
| Students | 1 |
Location
| Location | Count |
| --- | --- |
| California | 29 |
| Michigan | 25 |
| Australia | 19 |
| Canada | 19 |
| Florida | 19 |
| New York | 19 |
| Illinois | 17 |
| Minnesota | 17 |
| North Carolina | 17 |
| Pennsylvania | 17 |
| Texas | 17 |
What Works Clearinghouse Rating
| Rating | Count |
| --- | --- |
| Meets WWC Standards without Reservations | 1 |
| Meets WWC Standards with or without Reservations | 1 |
Peer reviewed: Edberg, Mark C.; Wong, Frank Y.; Woo, Violet; Doong, Tuei – Evaluation and Program Planning, 2003
Describes the development of a pilot uniform data set (UDS) intended to serve as the primary data collection mechanism for all grants and standard cooperative agreements funded by the U.S. Office of Minority Health. The UDS is the product of a formative research project with implications for other program evaluations. (SLD)
Descriptors: Data Collection, Evaluation Methods, Grants, Minority Groups
Peer reviewed: Bush, Connee; And Others – Journal of Extension, 1995
"Black box" evaluation considers program input and outcomes (what) without considering process (how and why). A broader approach to evaluation based on implementation process theory explores the social system context of programs and provides information useful for improving them. (SK)
Descriptors: Data Analysis, Data Collection, Extension Education, Program Development
Peer reviewed: Moore, Donald E., Jr. – Journal of Continuing Education in the Health Professions, 1998
A discrepancy model defines what successful continuing medical education is and what it should be. Then data are collected and analyzed to describe the nature and scope of the discrepancy. This approach to educational needs assessment uses an outcomes focus to plan more effective continuing-education programs. (SK)
Descriptors: Data Collection, Medical Education, Needs Assessment, Professional Continuing Education
Peer reviewed: Mason, Greg – Canadian Journal of Program Evaluation/La Revue canadienne d'evaluation de programme, 1996
This article reviews issues in questionnaire design beyond the creation of standardized questions and basic data collection rules. The use of magnitude scales, the application of decision theory, and the use of a questionnaire framework that replicates how choices are made by respondents are among the new design approaches discussed. (SLD)
Descriptors: Data Collection, Models, Program Evaluation, Questionnaires
Peer reviewed: Kemis, Mari; Walker, David A. – Journal of College Student Development, 2000
Describes the A-E-I-O-U program evaluation approach, a framework for organizing key evaluation questions that accommodates many models of evaluation and methods of data collection. Evaluations by users of the approach have indicated that it provides a practical way to organize evaluation questions and collect appropriate data. (GCP)
Descriptors: Data Collection, Data Processing, Evaluation Methods, Evaluation Research
Simpson, Walter – Association for the Advancement of Sustainability in Higher Education, 2009
While a number of excellent campus climate planning resources are available, this guide attempts to fill a gap by providing an abundance of "how-to" information. The goal of the guide is to assist climate action planning teams at schools that are well versed in campus climate issues and well along in the CAP (Climate Action Planning) process as…
Descriptors: Conservation (Environment), Sustainability, Sustainable Development, Colleges
Lee, Mickey M.; McLean, James E. – 1978
Content analysis of critical-events methodology is demonstrated in an evaluation of the Educational Telecommunications for Alaska (ETA) Project. This methodology was used to collect and interpret data gathered from ETA staff representing five components of the project; the data were then analyzed using content analysis procedures. The…
Descriptors: Content Analysis, Cost Effectiveness, Data Analysis, Data Collection
Templin, Patricia A. – 1981
This handbook is intended to help educational evaluators use still photography in designing, conducting, and reporting evaluations of educational programs. It describes techniques for using a visual documentary approach to program evaluation that features data collected with a camera. The emphasis is on the aspects of educational evaluation…
Descriptors: Data Collection, Elementary Secondary Education, Evaluation Methods, Photography
Evans, Ellis – 1982
Intended for practitioners in early childhood special education, the document offers guidelines for conducting a program evaluation. Information is organized around seven questions: What is the purpose of the evaluation? What information is needed, and from what sources can it be obtained? When and under what conditions will information be…
Descriptors: Data Analysis, Data Collection, Disabilities, Early Childhood Education
Alvir, Howard P.
The objective of this research is to provide an outline of evaluation by objectives for a conference entitled Project Evaluation. In terms of process, the evaluation will produce a session-by-session evaluation form, along with transparencies that accompany the form and reinforce the main points under…
Descriptors: Data Analysis, Data Collection, Evaluation, Evaluation Methods
Beverly Hills Unified School District, CA. – 1972
A questionnaire designed to determine community opinion of its schools is presented. The questionnaire contains five sections that relate to: number of years lived in the community; number of children in Beverly Hills District schools and grades in which enrolled; best sources of information concerning the schools; newspapers read regularly, and…
Descriptors: Community Attitudes, Data Collection, Educational Objectives, Program Evaluation
Guba, Egon G.; Stufflebeam, Daniel L. – 1970
Part 1 of this monograph discusses the status of educational evaluation and describes several problems in carrying out such evaluation: (1) defining the educational setting, (2) defining decision types, (3) designing educational evaluation, (4) designing evaluation systems, and (5) defining criteria for judging evaluation. Part 2 proposes an…
Descriptors: Data Collection, Decision Making, Evaluation Criteria, Evaluation Methods
Stake, Robert E. – 1972
The definition, structures, utilities, stimulus-response differences, and portrayals of responsive evaluation are presented. An educational evaluation is said to be a "responsive evaluation" if it orients more directly to program activities than to program intents, if it responds to audience requirements for information, and if the different…
Descriptors: Data Collection, Evaluation Criteria, Evaluation Methods, Information Needs
Peer reviewed: Lee, Arthur M. – Educational Researcher, 1977
The most important impact of the Baseline study on vocational education has been on the Congress. Recommendations in the Baseline reports have been written into the vocational education provisions of various pieces of federal educational legislation. (Author/AM)
Descriptors: Data, Data Collection, Historiography, History
Rusnell, Dale – New Directions for Continuing Education, 1979
The costs of evaluation are affected by program characteristics and the extent to which it is important to be confident about proof and causality. They are also affected by decisions about issues, resources, evidence, data gathering, data analysis, and reporting. (CT)
Descriptors: Cost Effectiveness, Data Analysis, Data Collection, Program Costs