| Publication Date | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 9 |
| Since 2022 (last 5 years) | 35 |
| Since 2017 (last 10 years) | 139 |
| Since 2007 (last 20 years) | 383 |
| Descriptor | Count |
| --- | --- |
| Data Collection | 1584 |
| Program Evaluation | 1584 |
| Evaluation Methods | 648 |
| Data Analysis | 383 |
| Program Effectiveness | 306 |
| Elementary Secondary Education | 265 |
| Research Methodology | 212 |
| Models | 202 |
| Program Development | 195 |
| Evaluation Criteria | 184 |
| Vocational Education | 166 |
| Author | Count |
| --- | --- |
| Ligon, Glynn | 5 |
| Horst, Donald P. | 4 |
| Alvir, Howard P. | 3 |
| Churchman, David | 3 |
| Coker, Dana Rosenberg | 3 |
| Condelli, Larry | 3 |
| Fiene, Richard | 3 |
| Fink, Arlene | 3 |
| Friedel, Janice Nahra | 3 |
| Granville, Arthur C. | 3 |
| Johnson, Terry | 3 |
| Audience | Count |
| --- | --- |
| Practitioners | 125 |
| Administrators | 66 |
| Researchers | 55 |
| Policymakers | 50 |
| Teachers | 30 |
| Community | 5 |
| Counselors | 4 |
| Parents | 2 |
| Support Staff | 2 |
| Media Staff | 1 |
| Students | 1 |
| Location | Count |
| --- | --- |
| California | 29 |
| Michigan | 25 |
| Australia | 19 |
| Canada | 19 |
| Florida | 19 |
| New York | 19 |
| Illinois | 17 |
| Minnesota | 17 |
| North Carolina | 17 |
| Pennsylvania | 17 |
| Texas | 17 |
Laws, Policies, & Programs
Assessments and Surveys
| What Works Clearinghouse Rating | Count |
| --- | --- |
| Meets WWC Standards without Reservations | 1 |
| Meets WWC Standards with or without Reservations | 1 |
Earl, Lorna; Watson, Nancy; Levin, Ben; Leithwood, Ken; Fullan, Michael; Torrance, Nancy – 2003
The National Literacy and Numeracy Strategies (NLS and NNS) represent a major government initiative to improve classroom practice and student learning in literacy and mathematics in elementary schools across England. National targets were intended to increase the percentage of 11-year-olds reaching the "expected level"--Level 4--in…
Descriptors: British National Curriculum, Data Collection, Educational Change, Elementary Education
Little, Priscilla; DuPree, Sharon; Deich, Sharon – 2002
A collaborative publication between Harvard Family Research Project and The Finance Project, this brief offers guidance in documenting progress and demonstrating results in local out-of-school-time programs. Following introductory remarks providing a rationale for program evaluation, discussing principles of program evaluation, and clarifying key…
Descriptors: After School Education, After School Programs, Data Collection, Documentation
Hecht, Kathryn A. – Teachers College Record, 1973 (Peer reviewed)
Considers the evaluation provision of the Elementary and Secondary Education Act of 1965. (RK)
Descriptors: Annual Reports, Data Collection, Educational Change, Educational Legislation
Pratt, Linda K.; Reichard, Donald R. – New Directions for Institutional Research, 1983 (Peer reviewed)
Many institutions develop institutional goals, but few continue to the next step--assessing the extent to which the institution has achieved its goals. Some methods for assessing progress toward achieving goals are described. (MLW)
Descriptors: College Administration, College Role, Data Collection, Higher Education
Ryan, Alan; Randhawa, Bikkar – Assessment and Evaluation in Higher Education, 1982 (Peer reviewed)
An evaluation of an elementary teacher education program using the responsive evaluation method is described. Responsive evaluation uses observer judgment and flexible data collection methods. The problems involved in each of the strategy's 12 steps are discussed and the influence of the university setting on the program is considered. (Author/MSE)
Descriptors: Data Collection, Elementary Education, Evaluation Methods, Foreign Countries
Billingsley, Felix F.; Liberty, Kathleen A. – Journal of the Association for the Severely Handicapped (JASH), 1982
The article suggests that accuracy data alone provide insufficient information upon which to base instructional decisions, particularly for severely handicapped students. It discusses the merits of data that relate behavior to a time base. Characteristics of various time-based measures are described and data collection examples are presented.…
Descriptors: Data Collection, Elementary Secondary Education, Evaluation Methods, Program Evaluation
Danziger, Sandra K. – Journal of Human Resources, 1981 (Peer reviewed)
This paper reports a qualitative study of what difference it makes to the participants to have completed the Supported Work program. For the majority, Supported Work brought steadier jobs, higher wages and fringe benefits, increased self-confidence, and independence. (Author/CT)
Descriptors: Data Collection, Economic Development, Employment Patterns, Females
Lawrence, Ben – New Directions for Higher Education, 1979
The rising demand for accountability has had a great influence on the design and use of management information systems. The purposes of accountability, needs assessment, and program evaluation would be well served if a set of indicators could include information about the outcomes of higher education. (MLW)
Descriptors: Accountability, Data Collection, Higher Education, Information Utilization
David, Martin – American Statistician, 1976 (Peer reviewed)
Advocates that the Current Population Survey (CPS) be redesigned to permit valid longitudinal studies, that the CPS include questions to evaluate poverty programs, that a common sampling framework be designed for the CPS and operating agencies serving the poor to assure that links between CPS and other data can be established, and that samples of…
Descriptors: Administrative Agencies, Census Figures, Data Collection, Economically Disadvantaged
Scheirer, Mary Ann – New Directions for Evaluation, 1996 (Peer reviewed)
The eight chapters of this special issue introduce the use of a template to specify program components as a new tool for program evaluation. The template articulates the components needed for program delivery and provides a systematic method for assessing program implementation. (SLD)
Descriptors: Data Analysis, Data Collection, Evaluation Methods, Formative Evaluation
Popham, W. James – NASSP Bulletin, 1997 (Peer reviewed)
Many participant-satisfaction evaluation forms used in staff development programs are ritualistic contrivances that are poorly designed and administered. This article describes five rules for generating evaluation forms based on clarity, a decision focus, brevity, anonymity, and respondents' opportunity to make additional suggestions. Careful…
Descriptors: Data Analysis, Data Collection, Elementary Secondary Education, Evaluation Methods
McGrevin, Carol Z.; Paull, Robert C. – Journal of CAPEA (California Assn. of Professors of Educational Administration), 1996
Shares the program-evaluation strategy of Pepperdine University's restructured Educational Leadership Academy for entry-level school administrators and other school leaders. Describes the objectives, theoretical basis, and methodology of data collection used for the evaluation. Describes the democratic/collaborative process used in developing the…
Descriptors: Administrator Education, Data Collection, Educational Administration, Elementary Secondary Education
Silky, William; Readling, John – Roeper Review, 1992 (Peer reviewed)
The REDSIL model for ongoing evaluation of gifted education programs involves interviewing stakeholders, isolating and categorizing critical study issues, designing a data collection plan that uses several forms of qualitative methods, implementing the plan, analyzing data, searching literature relative to each issue, analyzing data again, and…
Descriptors: Data Analysis, Data Collection, Elementary Secondary Education, Formative Evaluation
Fitzpatrick, Jody – American Journal of Evaluation, 1999 (Peer reviewed)
Discusses the methodology and content of the evaluation of the Homeless Families Program. Focuses on data collection and the determination of program outcomes. Stresses the importance of stakeholder involvement and the flexibility of the evaluators as the evaluation unfolded. (SLD)
Descriptors: Data Collection, Delivery Systems, Evaluation Methods, Health Services
Bamberger, Michael; Rugh, Jim; Church, Mary; Fort, Lucia – American Journal of Evaluation, 2004
The paper discusses two common scenarios in which evaluators must conduct impact evaluations when working under budget, time, or data constraints. Under the first scenario the evaluator is not called in until the project is already well advanced, and there is a tight deadline for completing the evaluation, frequently combined with a limited budget…
Descriptors: Foreign Countries, Program Effectiveness, Evaluators, Control Groups

