Hughes, Katherine L.; Miller, Trey; Reese, Kelly – Grantee Submission, 2021
This report from the Career and Technical Education (CTE) Research Network Lead team provides final results from an evaluability assessment identifying CTE programs that could feasibly be evaluated using a rigorous experimental design. Evaluability assessments (also called feasibility studies) are used in education and other fields, such as international…
Descriptors: Program Evaluation, Vocational Education, Evaluation Methods, Educational Research
Braverman, Marc T. – American Journal of Evaluation, 2013
Sound evaluation planning requires numerous decisions about how constructs in a program theory will be translated into measures and instruments that produce evaluation data. This article, the first in a dialogue exchange, examines how decisions about measurement are (and should be) made, especially in the context of small-scale local program…
Descriptors: Evaluation Methods, Methods Research, Research Methodology, Research Design
Spybrook, Jessaca; Lininger, Monica; Cullen, Anne – Society for Research on Educational Effectiveness, 2011
The purpose of this study is to extend the work of Spybrook and Raudenbush (2009) and examine how research designs and sample sizes changed from the planning phase to the implementation phase in the first wave of studies funded by IES. The authors examine the impact of these changes in terms of the resulting changes in the precision of the study from the…
Descriptors: Evaluation Criteria, Sampling, Research Design, Planning
Bustos, Antonio; Arostegui, Jose Luis – Quality of Higher Education, 2012
Universities in Europe have been playing an increasingly important role in the institutional evaluation of political and social systems for the last thirty years. Their major contribution to those processes of accountability has been to add methods and safeguards of evaluative research. In this paper we report an illustration of how evaluative…
Descriptors: Research Administration, Evaluation Criteria, Evaluation Methods, Social Services
Sanders, James R.; Nafziger, Dean N. – Journal of MultiDisciplinary Evaluation, 2011
The purpose of this paper is to provide a basis for judging the adequacy of evaluation plans or, as they are commonly called, evaluation designs. The authors assume that using the procedures suggested in this paper to determine the adequacy of evaluation designs in advance of actually conducting evaluations will lead to better evaluation designs,…
Descriptors: Check Lists, Program Evaluation, Research Design, Evaluation Methods
Ross, Margaret E.; Narayanan, N. Hari; Hendrix, Theron Dean; Myneni, Lakshman Sundeep – Journal of MultiDisciplinary Evaluation, 2011
Background: The philosophical underpinnings of evaluation guidelines set forth by a funding agency can sometimes seem inconsistent with that of the intervention. Purpose: Our purpose is to introduce questions pertaining to the contrast between the instructional program's underlying philosophical beliefs and assumptions and those underlying our…
Descriptors: Philanthropic Foundations, Grants, Financial Support, Computer Science Education
Stufflebeam, Daniel L. – Journal of MultiDisciplinary Evaluation, 2011
Good evaluation requires that evaluation efforts themselves be evaluated. Many things can and often do go wrong in evaluation work. Accordingly, it is necessary to check evaluations for problems such as bias, technical error, administrative difficulties, and misuse. Such checks are needed both to improve ongoing evaluation activities and to assess…
Descriptors: Program Evaluation, Evaluation Criteria, Evaluation Methods, Definitions
Lee, Eunjung; Mishna, Faye; Brennenstuhl, Sarah – Research on Social Work Practice, 2010
The purpose of this article is to develop guidelines to assist practitioners and researchers in evaluating and developing rigorous case studies. The main concern in evaluating a case study is to accurately assess its quality and ultimately to offer clients social work interventions informed by the best available evidence. To assess the quality of…
Descriptors: Research Design, Program Evaluation, Guidelines, Data Analysis
Stockard, Jean – Current Issues in Education, 2010
A large body of literature documents the central importance of fidelity of program implementation in creating an internally valid research design and considering such fidelity in judgments of research quality. The What Works Clearinghouse (WWC) provides web-based summary ratings of educational innovations and is the only rating group that is…
Descriptors: Research Design, Educational Innovation, Program Implementation, Program Effectiveness
Mathieu, Robert D.; Pfund, Christine; Gillian-Daniel, Don – Change: The Magazine of Higher Learning, 2009
In an attempt to move, if not balance, the scales of activity toward increasing scientific capability across a diverse national population, U.S. federal funding agencies are purposefully linking research funding to broad national impact. Among United States federal agencies, the National Science Foundation (NSF) has led the way in the integration…
Descriptors: Research Universities, Educational Change, Public Agencies, Alignment (Education)

Nielsen, Warren R.; Kimberly, John R. – Human Resource Management, 1976
Argues that systematic empirical assessment of organizational development (OD) interventions is important and can have benefits both for subject organizations and organizational researchers. Discusses suggested criteria for effective assessment of OD interventions. (For availability see EA 507 198.) (JG)
Descriptors: Evaluation Criteria, Intervention, Organizational Change, Organizational Development

Duckworth, Eleanor – Canadian Journal of Education, 1977
Combines information gathered from questionnaires, field visits, teacher interviews, documentation, and attendance at Foundation conferences in assessing the Canada Studies Foundation. (Author/RK)
Descriptors: Educational Research, Evaluation Criteria, Evaluation Methods, National Surveys

Pettigrew, Thomas F. – Journal of Educational Statistics, 1978
This commentary on the evaluation of the Emergency School Aid Act, and on an article detailing that evaluation (TM 503 672), focuses on the desirability of evaluating the total effect of the Act versus the specific programs within it. (JKS)
Descriptors: Compensatory Education, Elementary Education, Evaluation Criteria, Evaluation Methods

Chen, Huey-Tsyh; Rossi, Peter H. – Evaluation Review, 1983
The use of theoretical models in impact assessment can heighten the power of experimental designs and compensate for some deficiencies of quasi-experimental designs. The authors examine theoretical models of implementation processes, arguing that these processes are a major obstacle to fully effective programs. (Author/CM)
Descriptors: Evaluation Criteria, Evaluation Methods, Models, Program Evaluation
Cordray, David S.; Sonnefeld, L. Joseph – New Directions for Testing and Measurement, 1985
There are numerous micro-level methods decisions associated with planning an impact evaluation. Quantitative synthesis methods can be used to construct an actuarial data base for establishing the likelihood of achieving desired sample sizes, statistical power, and measurement characteristics. (Author/BS)
Descriptors: Effect Size, Evaluation Criteria, Evaluation Methods, Meta Analysis