Descriptor
Research Methodology | 11 |
Research Design | 7 |
Program Evaluation | 6 |
Quasiexperimental Design | 4 |
Validity | 4 |
Evaluation Methods | 3 |
Meta Analysis | 3 |
Models | 3 |
Research Problems | 3 |
Data Analysis | 2 |
Data Collection | 2 |
Source
New Directions for Program Evaluation | 11 |
Author
Wortman, Paul M. | 2 |
Andrew, Loyd D. | 1 |
Bryant, Fred B. | 1 |
Cordray, David S. | 1 |
Fortune, Jim C. | 1 |
Light, Richard J. | 1 |
Lipsey, Mark W. | 1 |
Mark, Melvin M. | 1 |
McBee, Janice K. | 1 |
McCleary, Richard | 1 |
Riggs, James E. | 1 |
Publication Type
Journal Articles | 11 |
Reports - Evaluative | 5 |
Guides - Non-Classroom | 3 |
Reports - Research | 3 |
Opinion Papers | 2 |
Information Analyses | 1 |
Reports - Descriptive | 1 |
Location
Australia | 1 |

Mark, Melvin M. – New Directions for Program Evaluation, 1986
Several validity typologies are reviewed and then integrated. The integrated framework is used to contrast the positions of Campbell and Cronbach. The current practice and logic of quasi-experimentation are critiqued, and expansions beyond the primary focus of the dominant validity typologies are suggested. (BS)
Descriptors: Evaluative Thinking, Generalization, Program Evaluation, Quasiexperimental Design

Cordray, David S. – New Directions for Program Evaluation, 1986
The role of human judgment in the development and synthesis of evidence has not been adequately developed or acknowledged within quasi-experimental analysis. Corrective solutions need to confront the fact that causal analysis within complex environments will require a more active assessment that entails reasoning and statistical modeling.…
Descriptors: Evaluative Thinking, Models, Program Effectiveness, Program Evaluation

Yeaton, William; Sechrest, Lee – New Directions for Program Evaluation, 1987
In no-difference research, no differences are found among groups or conditions. This article summarizes the existing commentary on such research. The characteristics of no-difference research, its acceptance by the research community, strategies for conducting such studies, and its centrality within the experimental and nonexperimental paradigms…
Descriptors: Evaluation Methods, Literature Reviews, Models, Program Evaluation
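The commentary summarized here is conceptual, but one generic tool for supporting a "no difference" claim is an equivalence test built from two one-sided tests (TOST). The sketch below is not from the article; it uses simulated data and an assumed equivalence margin.

```python
# Generic TOST equivalence-test sketch; simulated data, assumed margin of 5 points.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
treatment = rng.normal(100, 15, size=80)   # simulated outcome scores
control = rng.normal(100, 15, size=80)
margin = 5.0                               # assumed smallest difference worth caring about

diff = treatment.mean() - control.mean()
df = treatment.size + control.size - 2
sp2 = ((treatment.size - 1) * treatment.var(ddof=1)
       + (control.size - 1) * control.var(ddof=1)) / df      # pooled variance
se = np.sqrt(sp2 * (1 / treatment.size + 1 / control.size))

# Two one-sided tests: the true difference exceeds -margin AND falls below +margin.
p_lower = 1 - stats.t.cdf((diff + margin) / se, df)
p_upper = stats.t.cdf((diff - margin) / se, df)
p_tost = max(p_lower, p_upper)
print(f"difference = {diff:.2f}, TOST p = {p_tost:.4f}")
```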

Shadish, William R., Jr.; And Others – New Directions for Program Evaluation, 1986
Because no single defensible option for performing a given task in quasi-experimentation is usually unbiased, it is desirable to select several options whose biases run in different directions. The benefits of applying a critical multiplism approach to causal hypotheses, group nonequivalence, and units of analysis in quasi-experimentation are discussed…
Descriptors: Bias, Matched Groups, Program Evaluation, Quasiexperimental Design

Yeaton, William H.; Wortman, Paul M. – New Directions for Program Evaluation, 1984
Solutions to methodological problems in the research synthesis of medical technologies, relating to temporal change, persons receiving treatment, research design, analytic procedures, and threats to validity, are presented. These solutions should help with the planning and methodology of research synthesis in other areas. (BS)
Descriptors: Medical Research, Meta Analysis, Patients, Research Design
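As a generic illustration of the synthesis step such methodological choices feed into, and not code from the article, the sketch below computes standardized mean differences from hypothetical study summaries and pools them with inverse-variance weights.

```python
# Inverse-variance (fixed-effect) pooling of standardized mean differences.
# All study summaries below are hypothetical.
import numpy as np

# (mean_treatment, mean_control, pooled_sd, n_treatment, n_control)
studies = [
    (12.0, 10.0, 4.0, 40, 40),
    (11.5, 10.5, 5.0, 60, 55),
    (13.0, 10.0, 6.0, 25, 30),
]

effects, variances = [], []
for mt, mc, sd, nt, nc in studies:
    d = (mt - mc) / sd                                   # Cohen's d
    effects.append(d)
    variances.append((nt + nc) / (nt * nc) + d**2 / (2 * (nt + nc)))  # large-sample variance of d

effects = np.array(effects)
weights = 1.0 / np.array(variances)
pooled = np.sum(weights * effects) / np.sum(weights)     # fixed-effect pooled estimate
se = np.sqrt(1.0 / np.sum(weights))
print(f"pooled d = {pooled:.3f} (95% CI {pooled - 1.96 * se:.3f} to {pooled + 1.96 * se:.3f})")
```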

Light, Richard J. – New Directions for Program Evaluation, 1984
Examples from government program evaluation studies illustrate six areas in which research synthesis is more effective than individual studies. Research synthesis can: (1) help match treatment type with recipient type; (2) explain which treatment features matter; (3) explain conflicting results; (4) determine critical outcomes; (5) assess…
Descriptors: Aptitude Treatment Interaction, Evaluation Methods, Federal Programs, Meta Analysis

Lipsey, Mark W. – New Directions for Program Evaluation, 1993
Explores the role of theory in strengthening causal interpretations in nonexperimental research. Evaluators must conduct theory-driven research, concentrating on "small theory" that focuses on explaining the processes specific to the program being evaluated. Theory-guided treatment research must be programmatic and…
Descriptors: Causal Models, Effect Size, Evaluators, Generalization

Bryant, Fred B.; Wortman, Paul M. – New Directions for Program Evaluation, 1984
Methods for selecting relevant and appropriate quasi-experimental studies for inclusion in research synthesis using the threats-to-validity approach are presented. Effects of including and excluding studies are evaluated. (BS)
Descriptors: Evaluation Criteria, Meta Analysis, Quasiexperimental Design, Research Methodology
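A minimal sketch, with hypothetical studies and an assumed inclusion rule, of how screening on coded validity threats can change a pooled estimate; the coding scheme and threshold are illustrative, not the authors' procedure.

```python
# Hypothetical studies coded for number of serious validity threats, with an
# assumed inclusion rule; compares the pooled effect with and without exclusions.
import numpy as np

studies = [
    # (effect size d, variance of d, number of serious validity threats)
    (0.45, 0.04, 0),
    (0.30, 0.05, 1),
    (0.90, 0.06, 3),   # heavily threatened study
    (0.20, 0.03, 0),
]

def pooled(rows):
    d = np.array([r[0] for r in rows])
    w = 1.0 / np.array([r[1] for r in rows])   # inverse-variance weights
    return np.sum(w * d) / np.sum(w)

screened = [r for r in studies if r[2] <= 1]   # assumed rule: at most one serious threat

print(f"pooled effect, all studies: {pooled(studies):.3f}")
print(f"pooled effect, screened:    {pooled(screened):.3f}")
```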

Fortune, Jim C.; McBee, Janice K. – New Directions for Program Evaluation, 1984
Twenty-nine steps necessary for preparing data files for secondary analysis are discussed. The complexity of the preparation varies with database characteristics and planned use. Required techniques (file verification, sample verification, file merger, data aggregation, file modification, and variable controls) and seven associated pitfalls are defined…
Descriptors: Computer Storage Devices, Data Analysis, Data Collection, Data Processing
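A minimal pandas sketch of three of the tasks named above (file verification, file merger, and data aggregation); the file names, key columns, and variables are hypothetical, not drawn from the article.

```python
# Hypothetical secondary-analysis file preparation: verify, merge, aggregate.
import pandas as pd

students = pd.read_csv("students.csv")   # hypothetical unit-level file
schools = pd.read_csv("schools.csv")     # hypothetical site-level file

# File and sample verification: confirm key uniqueness before merging.
assert students["student_id"].is_unique, "duplicate student records"
assert schools["school_id"].is_unique, "duplicate school records"

# File merger: attach site-level variables to unit-level records and flag non-matches.
merged = students.merge(schools, on="school_id", how="left", indicator=True)
print((merged["_merge"] == "left_only").sum(), "student records have no matching school")

# Data aggregation: roll unit-level outcomes up to the site level.
site_level = merged.groupby("school_id").agg(
    n_students=("student_id", "size"),
    mean_score=("test_score", "mean"),
)
print(site_level.head())
```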

Andrew, Loyd D. – New Directions for Program Evaluation, 1984
A detailed examination of the Higher Education General Information Survey (HEGIS) data base--its content, development, and past and potential uses for researchers--is presented. Because knowledge of the data elements is crucial for successful secondary analysis, procedures for collecting needed information on the data in HEGIS are suggested. (BS)
Descriptors: Data Analysis, Data Collection, Databases, Educational Research

McCleary, Richard; Riggs, James E. – New Directions for Program Evaluation, 1982
Time series analysis is applied to assess the temporary and permanent impact of the 1975 Australian Family Law Act on the number of divorces. The application and construct validity of the model are examined. (Author/PN)
Descriptors: Court Litigation, Demography, Divorce, Evaluation Methods
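The article's own intervention model is not reproduced here; as a generic stand-in, the sketch below fits a segmented regression with a secular trend and a permanent level-shift term to a simulated monthly series (all values, including the intervention point, are hypothetical).

```python
# Segmented-regression sketch for a policy intervention in a monthly series.
# Simulated data; the intervention point and effect sizes are made up.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
t = np.arange(120)                       # ten years of monthly observations
step = (t >= 60).astype(float)           # 1.0 after the hypothetical law change
y = 50 + 0.1 * t + 15 * step + rng.normal(0, 5, size=t.size)

# Design matrix: intercept, secular trend, and permanent level shift.
X = sm.add_constant(np.column_stack([t, step]))
fit = sm.OLS(y, X).fit()
print(fit.params)                        # [intercept, trend, estimated permanent impact]
```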