Hedges, Larry V. – New Directions for Program Evaluation, 1984
The adequacy of traditional effect size measures for research synthesis is challenged. Analogues to analysis of variance and multiple regression analysis for effect sizes are presented. The importance of tests for the consistency of effect sizes in interpreting results, and problems in obtaining well-specified models for meta-analysis are…
Descriptors: Analysis of Variance, Effect Size, Mathematical Models, Meta Analysis
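
The Hedges entry above centers on tests for the consistency of effect sizes across studies. A minimal sketch of that idea, assuming a fixed-effect model with invented effect sizes and variances (none of the numbers come from the article), is the inverse-variance pooled estimate and the homogeneity Q statistic:

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical standardized mean differences and their sampling variances
d = np.array([0.30, 0.45, 0.10, 0.55])      # effect sizes from k = 4 studies
v = np.array([0.020, 0.030, 0.025, 0.040])  # estimated sampling variances

w = 1.0 / v                            # inverse-variance weights
d_bar = np.sum(w * d) / np.sum(w)      # fixed-effect pooled effect size

# Consistency check: under homogeneity, Q follows a chi-square with k - 1 df
Q = np.sum(w * (d - d_bar) ** 2)
p_value = chi2.sf(Q, df=len(d) - 1)

print(f"pooled d = {d_bar:.3f}, Q = {Q:.2f}, p = {p_value:.3f}")
```

A small p-value here would signal that a single common effect size is a poorly specified model for these studies, which is the kind of specification problem the abstract raises.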

Sechrest, Lee, Ed. – New Directions for Program Evaluation, 1993
Two chapters of this issue consider critical multiplism as a research strategy with links to meta analysis and generalizability theory. The unifying perspective it can provide for quantitative and qualitative evaluation is discussed. The third chapter explores meta analysis as a way to improve causal inferences in nonexperimental data. (SLD)
Descriptors: Causal Models, Evaluation Methods, Generalizability Theory, Inferences

Fortune, Jim C.; McBee, Janice K. – New Directions for Program Evaluation, 1984
Twenty-nine steps necessary for data file preparation for secondary analysis are discussed. Database characteristics and planned use determine the complexity of the preparation. Required techniques (file verification, sample verification, file merger, data aggregation, file modification, and variable controls) and seven associated pitfalls are defined…
Descriptors: Computer Storage Devices, Data Analysis, Data Collection, Data Processing
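
Among the techniques the Fortune and McBee entry lists, file merger and data aggregation are the most mechanical. A minimal pandas sketch of those two steps, with hypothetical file names, keys, and columns that are not taken from the article, might look like this:

```python
import pandas as pd

# Hypothetical extracts from two source files keyed on a common school ID
students = pd.read_csv("students.csv")  # columns: student_id, school_id, score
schools = pd.read_csv("schools.csv")    # columns: school_id, district, enrollment

# File merger: attach school-level fields to each student record,
# insisting that every student match at most one school
merged = students.merge(schools, on="school_id", how="left", validate="m:1")

# File verification: flag records that failed to match before analysis
unmatched = merged[merged["district"].isna()]
print(f"{len(unmatched)} student records lack a matching school")

# Data aggregation: collapse to one row per district for secondary analysis
by_district = (
    merged.groupby("district")
          .agg(mean_score=("score", "mean"), n_students=("student_id", "count"))
          .reset_index()
)
print(by_district.head())
```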

McCleary, Richard; Riggs, James E. – New Directions for Program Evaluation, 1982
Time series analysis is applied to an assessment of the temporary and permanent impact of the 1975 Australian Family Law Act and its effect on the number of divorces. The application and construct validity of the model are examined. (Author/PN)
Descriptors: Court Litigation, Demography, Divorce, Evaluation Methods
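
The McCleary and Riggs entry describes an intervention analysis of a single policy change. As a rough illustration only (a segmented ordinary least squares regression on simulated monthly counts, not the authors' ARIMA-style specification or their data), the permanent level shift can be estimated like this:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated monthly divorce counts: 48 months before and 48 after a law change
n_pre, n_post = 48, 48
t = np.arange(n_pre + n_post)
step = (t >= n_pre).astype(float)                 # 1 after the intervention
y = 100 + 0.2 * t + 30 * step + rng.normal(0, 5, t.size)

# Segmented regression: common linear trend plus a post-intervention level shift
X = sm.add_constant(np.column_stack([t, step]))
fit = sm.OLS(y, X).fit()

print(fit.params)   # [intercept, trend, estimated permanent level shift]
```

A temporary (decaying) impact would instead be modeled with a pulse or transfer-function term, which this sketch omits.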

Reichardt, Charles S., Ed.; Rallis, Sharon F., Ed. – New Directions for Program Evaluation, 1994
The eight articles of this issue examine the nature of differences that arise between qualitative and quantitative researchers in program evaluation in terms of goals and epistemologies. The origins of these differences and their consequences are explored. Authors represent both perspectives but do not defend their ideological turf. (SLD)
Descriptors: Conflict, Epistemology, Evaluation Methods, Ideology

Bowering, David J. – New Directions for Program Evaluation, 1984
This case study describes how path analysis and causal modeling were used to assess the impact of federal research and development spending on Ph.D. production in science and engineering at leading research universities. The nature of the existing data, integrated into a single database from seven surveys, influenced the research methodology. (BS)
Descriptors: College Science, Databases, Doctoral Degrees, Engineering Education
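
The Bowering entry turns on path analysis. A minimal sketch of how path coefficients can be read off as standardized regression weights in a simple recursive model is given below; the three variables, their causal ordering, and the simulated data are illustrative assumptions and do not reproduce the study's model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200

# Hypothetical recursive model:
# funding -> faculty_size -> phd_output, plus a direct funding -> phd_output path
funding = rng.normal(size=n)
faculty_size = 0.6 * funding + rng.normal(scale=0.8, size=n)
phd_output = 0.3 * funding + 0.5 * faculty_size + rng.normal(scale=0.7, size=n)

def standardize(x):
    return (x - x.mean()) / x.std()

z_fund, z_fac, z_phd = map(standardize, (funding, faculty_size, phd_output))

# Each structural equation is an OLS regression; the standardized slopes
# are the path coefficients
eq1 = sm.OLS(z_fac, sm.add_constant(z_fund)).fit()
eq2 = sm.OLS(z_phd, sm.add_constant(np.column_stack([z_fund, z_fac]))).fit()

print("funding -> faculty (path):", round(eq1.params[1], 3))
print("funding -> phd (direct path):", round(eq2.params[1], 3))
print("faculty -> phd (path):", round(eq2.params[2], 3))
```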