Peer reviewed
Mark, Melvin M. – New Directions for Program Evaluation, 1986
Several validity typologies are reviewed and then integrated. The integrated framework is used to contrast the positions of Campbell and Cronbach. The current practice and logic of quasi-experimentation are critiqued, and expansions beyond the primary focus of the dominant validity typologies are suggested. (BS)
Descriptors: Evaluative Thinking, Generalization, Program Evaluation, Quasiexperimental Design
Peer reviewed
Cordray, David S. – New Directions for Program Evaluation, 1986
The role of human judgment in the development and synthesis of evidence has not been adequately developed or acknowledged within quasi-experimental analysis. Corrective solutions need to confront the fact that causal analysis within complex environments will require a more active assessment that entails reasoning and statistical modeling.…
Descriptors: Evaluative Thinking, Models, Program Effectiveness, Program Evaluation
Peer reviewed
Yeaton, William; Sechrest, Lee – New Directions for Program Evaluation, 1987
In no-difference research, no differences are found among groups or conditions. This article summarizes the existing commentary on such research. The characteristics of no-difference research, its acceptance by the research community, strategies for conducting such studies, and its centrality within the experimental and nonexperimental paradigms…
Descriptors: Evaluation Methods, Literature Reviews, Models, Program Evaluation
Peer reviewed
Shadish, William R., Jr.; And Others – New Directions for Program Evaluation, 1986
Since no defensible option for performing a task within quasi-experimentation is usually unbiased, it is desirable to select several options that reflect biases in different directions. The benefits of applying a critical multiplism approach to causal hypotheses, group nonequivalence, and units of analysis in quasi-experimentation are discussed.…
Descriptors: Bias, Matched Groups, Program Evaluation, Quasiexperimental Design
Peer reviewed
Yeaton, William H.; Wortman, Paul M. – New Directions for Program Evaluation, 1984
Solutions are presented to methodological problems in research synthesis of medical technologies relating to temporal change, persons receiving treatment, research design, analytic procedures, and threats to validity. These solutions should help with the planning and methodology of research synthesis in other areas. (BS)
Descriptors: Medical Research, Meta Analysis, Patients, Research Design
Peer reviewed
Light, Richard J. – New Directions for Program Evaluation, 1984
Using examples from government program evaluation studies, six areas where research synthesis is more effective than individual studies are presented. Research synthesis can: (1) help match treatment type with recipient type; (2) explain which treatment features matter; (3) explain conflicting results; (4) determine critical outcomes; (5) assess…
Descriptors: Aptitude Treatment Interaction, Evaluation Methods, Federal Programs, Meta Analysis
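As a purely illustrative aside on the research synthesis the preceding abstract describes, a minimal sketch of fixed-effect, inverse-variance pooling of effect sizes (a common meta-analytic weighting scheme; the study values below are hypothetical, not taken from the article) might look like:

```python
# Illustrative only: fixed-effect (inverse-variance) pooling of study effect sizes.
# The effect sizes and standard errors are hypothetical placeholders.
import math

effects = [0.30, 0.12, 0.45]      # hypothetical standardized mean differences
std_errors = [0.10, 0.08, 0.15]   # hypothetical standard errors

weights = [1 / se**2 for se in std_errors]                    # inverse-variance weights
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"pooled effect = {pooled:.3f}, 95% CI half-width = {1.96 * pooled_se:.3f}")
```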
Peer reviewed
Lipsey, Mark W. – New Directions for Program Evaluation, 1993
Explores the role of theory in strengthening causal interpretations in nonexperimental research. Evaluators must conduct theory-driven research, concentrating on "small theory," which focuses on explaining the processes specific to the program being evaluated. Theory-guided treatment research must be programmatic and…
Descriptors: Causal Models, Effect Size, Evaluators, Generalization
Peer reviewed
Bryant, Fred B.; Wortman, Paul M. – New Directions for Program Evaluation, 1984
Methods for selecting relevant and appropriate quasi-experimental studies for inclusion in research synthesis using the threats-to-validity approach are presented. Effects of including and excluding studies are evaluated. (BS)
Descriptors: Evaluation Criteria, Meta Analysis, Quasiexperimental Design, Research Methodology
Peer reviewed
Fortune, Jim C.; McBee, Janice K. – New Directions for Program Evaluation, 1984
Twenty-nine steps necessary for preparing data files for secondary analysis are discussed. The complexity of the preparation varies with database characteristics and planned use. Required techniques (file verification, sample verification, file merger, data aggregation, file modification, and variable controls) and seven associated pitfalls are defined…
Descriptors: Computer Storage Devices, Data Analysis, Data Collection, Data Processing
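To give a concrete flavor of two of the techniques named in the preceding abstract (file verification, file merger, and data aggregation), here is a minimal sketch assuming pandas and hypothetical file and column names, not anything prescribed by the article:

```python
# Illustrative sketch of file verification, file merger, and data aggregation
# for secondary analysis. File names and columns ("site_id", "score") are hypothetical.
import pandas as pd

students = pd.read_csv("students.csv")   # unit-level records
sites = pd.read_csv("sites.csv")         # site-level descriptors

# File verification: confirm the merge key exists and is unique on the site file.
assert "site_id" in students.columns and "site_id" in sites.columns
assert sites["site_id"].is_unique

# File merger: attach site descriptors to each student record.
merged = students.merge(sites, on="site_id", how="left", validate="many_to_one")

# Data aggregation: roll unit-level scores up to the site level.
site_means = merged.groupby("site_id")["score"].mean().reset_index(name="mean_score")
print(site_means.head())
```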
Peer reviewed
Andrew, Loyd D. – New Directions for Program Evaluation, 1984
A detailed examination of the Higher Education General Information Survey (HEGIS) data base--its content, development, and past and potential uses for researchers--is presented. Because knowledge of the data elements is crucial for successful secondary analysis, procedures for collecting needed information on the data in HEGIS are suggested. (BS)
Descriptors: Data Analysis, Data Collection, Databases, Educational Research
Peer reviewed
McCleary, Richard; Riggs, James E. – New Directions for Program Evaluation, 1982
Time series analysis is applied to assess the temporary and permanent impact of the 1975 Australian Family Law Act on the number of divorces. The application and construct validity of the model are examined. (Author/PN)
Descriptors: Court Litigation, Demography, Divorce, Evaluation Methods
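The final entry applies intervention time-series analysis to a legal change. A minimal interrupted-time-series sketch, using a simple segmented OLS regression with simulated monthly counts (not the Australian divorce data, and not the ARIMA intervention model the authors used), could look like:

```python
# Illustrative interrupted time series: regression with a level-shift term at the
# intervention point. The series is simulated, not the Australian divorce data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, break_point = 120, 60                        # 120 monthly observations, intervention at t = 60
t = np.arange(n)
post = (t >= break_point).astype(float)         # 0 before the intervention, 1 after
y = 100 + 0.2 * t + 15 * post + rng.normal(0, 5, n)   # simulated outcome with a level shift

X = sm.add_constant(np.column_stack([t, post])) # intercept, trend, level-shift dummy
fit = sm.OLS(y, X).fit()
print(fit.params)                               # baseline, trend, and post-intervention level change
```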