Showing 1 to 15 of 25 results
Peer reviewed
Lincoln, Yvonna S.; Guba, Egon G. – New Directions for Program Evaluation, 1986
The emergence of a new, naturalistic paradigm of inquiry has led to a demand for rigorous criteria that meet traditional standards of inquiry. Two sets are suggested, one of which, the "trustworthiness" criteria, parallels conventional criteria, while the second, the "authenticity" criteria, is implied directly by new paradigm…
Descriptors: Evaluation Criteria, Models, Observation, Program Evaluation
Peer reviewed
Mark, Melvin M. – New Directions for Program Evaluation, 1986
Several validity typologies are reviewed and then integrated. The integrated framework is used to contrast the positions of Campbell and Cronbach. The current practice and logic of quasi-experimentation are critiqued, and expansions beyond the primary focus of dominant validity typologies are suggested. (BS)
Descriptors: Evaluative Thinking, Generalization, Program Evaluation, Quasiexperimental Design
Peer reviewed
Brown, Elizabeth D. – New Directions for Program Evaluation, 1980
Although considered fundamentally a scientific enterprise, evaluation research also requires artistic skills. These skills are essential to in-house evaluators, even more so than to extrainstitutional evaluators. A training model is suggested. (Author/GK)
Descriptors: Evaluators, Interprofessional Relationship, Models, Professional Training
Peer reviewed
Cordray, David S. – New Directions for Program Evaluation, 1986
The role of human judgment in the development and synthesis of evidence has not been adequately developed or acknowledged within quasi-experimental analysis. Corrective solutions need to confront the fact that causal analysis within complex environments will require a more active assessment that entails reasoning and statistical modeling…
Descriptors: Evaluative Thinking, Models, Program Effectiveness, Program Evaluation
Peer reviewed
Boruch, Robert F.; Reis, Janet – New Directions for Program Evaluation, 1980
Secondary analysis is one situation that argues for explicit training in methodology: evaluation researchers are often required to extend analyses conducted by a program's original evaluators because those analyses are unsatisfactory or incomplete. (Author/GK)
Descriptors: Higher Education, Professional Training, Program Evaluation, Research Methodology
Peer reviewed
Yeaton, William; Sechrest, Lee – New Directions for Program Evaluation, 1987
In no-difference research, no differences are found among groups or conditions. This article summarizes the existing commentary on such research. The characteristics of no-difference research, its acceptance by the research community, strategies for conducting such studies, and its centrality within the experimental and nonexperimental paradigms…
Descriptors: Evaluation Methods, Literature Reviews, Models, Program Evaluation
Peer reviewed
Shadish, William R., Jr.; And Others – New Directions for Program Evaluation, 1986
Because, in most cases, no defensible option for performing a task within quasi-experimentation is unbiased, it is desirable to select several options that reflect biases in different directions. The benefits of applying a critical multiplism approach to causal hypotheses, group nonequivalence, and units of analysis in quasi-experimentation are discussed…
Descriptors: Bias, Matched Groups, Program Evaluation, Quasiexperimental Design
Peer reviewed
Cline, Hugh F.; Sinnott, Loraine T. – New Directions for Program Evaluation, 1981
Methods useful in identifying and assessing outcomes of relatively simple programs may not be practical or appropriate for complex programs and systems. Comparative case study methods are recommended for dealing with such global outcomes of planned organizational change. (Author)
Descriptors: Case Studies, Comparative Analysis, Organizational Change, Program Evaluation
Peer reviewed
Sudman, Seymour; Bradburn, Norman – New Directions for Program Evaluation, 1984
Situations in which mailed questionnaires are most appropriate are identified. Population variables, characteristics of questionnaires, and social desirability variables are examined in depth. (Author)
Descriptors: Attitude Measures, Evaluation Methods, Program Evaluation, Research Methodology
Peer reviewed
Berk, Richard A. – New Directions for Program Evaluation, 1982
The dilemma inherent in formal standards results from the tension between maintaining quality control and constraining creativity. This dilemma is nowhere more apparent than in matters of method: measurement, sampling, research design, and statistical analysis. The author offers suggestions for revision of the standards to allow for specific…
Descriptors: Evaluation Methods, Improvement, Innovation, Program Evaluation
Peer reviewed
Levine, Victor – New Directions for Program Evaluation, 1981
A program can be evaluated, and a cost-benefit ratio calculated to five decimal places, without ever directly examining outcomes. An economist provides evaluators with a useful economics vocabulary and an understanding of the concepts involved. (Author/RL)
Descriptors: Cost Effectiveness, Decision Making, Human Services, Program Evaluation
Peer reviewed
Light, Richard J. – New Directions for Program Evaluation, 1984
Using examples from government program evaluation studies, six areas where research synthesis is more effective than individual studies are presented. Research synthesis can: (1) help match treatment type with recipient type; (2) explain which treatment features matter; (3) explain conflicting results; (4) determine critical outcomes; (5) assess…
Descriptors: Aptitude Treatment Interaction, Evaluation Methods, Federal Programs, Meta Analysis
Peer reviewed
Sexton, Thomas R.; And Others – New Directions for Program Evaluation, 1986
Recent methodological advances are described that enable the analyst to extract additional information from the data envelopment analysis (DEA) methodology, including goal programming to develop cross-efficiencies, cluster analysis, analysis of variance, and pooled cross section time-series analysis. Some shortcomings of DEA are discussed. (LMO)
Descriptors: Efficiency, Error of Measurement, Evaluation Methods, Evaluation Problems
Peer reviewed
Wortman, Paul M.; Yeaton, William H. – New Directions for Program Evaluation, 1986
The authors argue that evaluation in medical settings is best taught using experimental and quasi-experimental paradigms as a basic framework. Three types of health evaluation courses at the master's, doctoral, and professional doctoral levels at the University of Michigan that stress the crucial connection between evaluation and policy are…
Descriptors: College Curriculum, Course Content, Graduate Study, Higher Education
Peer reviewed
Stevens, Carla J., Ed.; Dial, Micah, Ed. – New Directions for Program Evaluation, 1994
This theme issue discusses the misuse of evaluation. The eight articles all discuss instances in which the evaluation process or evaluation findings were misused. Misuse is usually not the result of methodological issues but rather is an issue of human relations and sometimes of political pressure. (SLD)
Descriptors: Educational Research, Evaluation Methods, Evaluation Utilization, Evaluators