Smith, Nick L. – New Directions for Program Evaluation, 1982
The principles of method development and assessment that have resulted from a long-range project at the Northwest Regional Educational Laboratory to create new methods for program evaluation are summarized. (Author/AL)
Descriptors: Evaluation Methods, Innovation, Program Evaluation

Reichardt, Charles S.; Gollob, Harry F. – New Directions for Program Evaluation, 1987
Four principles for taking statistical uncertainty into account when estimating and reporting effects are discussed. Means of implementing the principles, ways in which the principles are violated in practice, implications for the use of multiple methods, effect size, estimation techniques, and random and nonrandom uncertainty are described. (TJH)
Descriptors: Effect Size, Estimation (Mathematics), Program Evaluation, Statistical Analysis

Hunter, John E. – New Directions for Program Evaluation, 1987
Many purposes can be served by using multiple dependent variables (MDVs) in program evaluations, and the application of path analysis to multiple measures can increase both conceptual and statistical power. Alternative types of MDV designs are described and evaluated. (Author/TJH)
Descriptors: Multivariate Analysis, Path Analysis, Power (Statistics), Program Evaluation

Anderson, Scarvia B. – New Directions for Program Evaluation, 1982
In this account of the drafting of the Evaluation Research Society (ERS) Standards, the author notes that the committee was concerned not only with a broad range of evaluation applications, but also with different forms of evaluation. The absence of specificity in the standards reflects the committee's steering between opposing views. (Author/LC)
Descriptors: Committees, Evaluation Methods, Policy Formation, Program Evaluation

Ball, Samuel – New Directions for Program Evaluation, 1981
The variety of contexts, the politics of each situation, and the availability of resources all dictate variations in assessing outcomes. Seven major principles of program evaluation are considered, along with a cautionary note to the ambitious who expect large program effects. (Author)
Descriptors: Administrative Principles, Evaluation Methods, Program Evaluation, Summative Evaluation

Lincoln, Yvonna S.; Guba, Egon G. – New Directions for Program Evaluation, 1986
The emergence of a new, naturalistic, paradigm of inquiry has led to a demand for rigorous criteria that meet traditional standards of inquiry. Two sets are suggested, one of which, the "trustworthiness" criteria, parallels conventional criteria, while the second, "authenticity" criteria, is implied directly by new paradigm…
Descriptors: Evaluation Criteria, Models, Observation, Program Evaluation

House, Ernest R. – New Directions for Program Evaluation, 1983
Our scientific conceptions of evaluation are strongly influenced by underlying, deep-seated metaphors. This analysis draws upon recent philosophic work concerning the role of metaphors in all thought processes. (Author/PN)
Descriptors: Cognitive Processes, Evaluation Methods, Language Patterns, Metaphors

Wortman, Paul M.; And Others – New Directions for Program Evaluation, 1980
What a realistic training program should require, given the broad scope of research methodology, is discussed. The diversity of methodologies at the core of evaluation research training raises issues about what should be taught and how students should be involved in actual projects. (Author/GK)
Descriptors: Curriculum, Evaluators, Graduate Study, Models

Mark, Melvin M. – New Directions for Program Evaluation, 1986
Several validity typologies are reviewed and then integrated. The integrated framework is used to contrast the positions of Campbell and Cronbach. The current practice and logic of quasi-experimentation are critiqued, and expansions beyond the primary focus of dominant validity typologies are suggested. (BS)
Descriptors: Evaluative Thinking, Generalization, Program Evaluation, Quasiexperimental Design

Cronbach, Lee J. – New Directions for Program Evaluation, 1982
Standardization is inimical to innovation. Although the function of a standard is to standardize, the Evaluation Research Society (ERS) Standards avoid a standardizing effect. Their contribution lies mostly in the symbolism of their existence: to issue a set of standards is to proclaim maturity of thought for the field. (Author/LC)
Descriptors: Evaluation Methods, Improvement, Innovation, Program Evaluation

Sechrest, Lee; Yeaton, William E. – New Directions for Program Evaluation, 1981
The assessment of the outcomes of social programs should always include estimates of the size of the effects produced. Various approaches to this problem are discussed. (Author)
Descriptors: Evaluation Methods, Program Effectiveness, Program Evaluation, Social Indicators

Sieber, Joan E. – New Directions for Program Evaluation, 1980
Evaluators experience conflicts between the roles of scientist, administrator, and advocate; between the need for data and the need for privacy; and between pressures from clients and from broader constituencies. To reduce conflict they must plan, negotiate ethical work agreements, communicate effectively, and observe the ethical codes of beneficence, respect, and justice. (CP)
Descriptors: Codes of Ethics, Ethics, Evaluators, Program Evaluation

Perloff, Evelyn; Perloff, Judith K. – New Directions for Program Evaluation, 1980
Twenty-three journal articles on program evaluations were surveyed for incidence of four unethical practices: withholding purpose or mere existence of an evaluation from participants; exposure to stress; invasion of privacy; and denial of program benefits to control groups. Few violations were apparent. (CP)
Descriptors: Control Groups, Ethics, Experimental Groups, Privacy

Brown, Elizabeth D. – New Directions for Program Evaluation, 1980
Considered fundamentally a scientific enterprise, evaluation research also requires artistic skills. The latter are essential to in-house evaluators, more so than to extrainstitutional evaluators. A training model is suggested. (Author/GK)
Descriptors: Evaluators, Interprofessional Relationship, Models, Professional Training

Braskamp, Larry A.; And Others – New Directions for Program Evaluation, 1987
Evaluation client characteristics are outlined, based on experiences of the Office of Instructional and Management Services, University of Illinois at Urbana-Champaign, and contract evaluators for industry and other higher education institutions. Objectivity, expertise, sensitivity to client needs, standardized data presentation,…
Descriptors: Contracts, Educational Assessment, Higher Education, Institutional Evaluation