Source
New Directions for Program Evaluation | 6 |
Author
Gollob, Harry | 1 |
Hedges, Larry V. | 1 |
McCleary, Richard | 1 |
Rallis, Sharon F., Ed. | 1 |
Reichardt, Charles | 1 |
Reichardt, Charles S., Ed. | 1 |
Riggs, James E. | 1 |
Rindskopf, David | 1 |
Sechrest, Lee, Ed. | 1 |
Publication Type
Journal Articles | 6 |
Reports - Evaluative | 4 |
Opinion Papers | 3 |
Collected Works - Serials | 2 |
Guides - Non-Classroom | 1 |
Reports - Descriptive | 1 |
Reports - Research | 1 |
Location
Australia | 1 |

Rindskopf, David – New Directions for Program Evaluation, 1986
Modeling the process by which participants are selected into groups, rather than adjusting for preexisting group differences, provides the basis for several new approaches to the analysis of data from nonrandomized studies. Econometric approaches, the propensity scores approach, and the relative assignment variable approach to the modeling of…
Descriptors: Effect Size, Experimental Groups, Intelligence Quotient, Mathematical Models
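The propensity score approach named in this abstract lends itself to a brief illustration. The Python sketch below is a minimal rendering of the general idea on simulated data: model selection into groups from observed covariates, then reweight the groups by the fitted selection probabilities. The data, variable names, and the inverse-probability-weighting estimator are illustrative assumptions, not material from the article.

    # Model the selection process (the propensity score) and reweight the groups.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 500
    x = rng.normal(size=(n, 2))                       # observed covariates
    p_true = 1 / (1 + np.exp(-(x[:, 0] - 0.5 * x[:, 1])))
    treated = rng.binomial(1, p_true)                 # nonrandom selection into groups
    y = 2.0 * treated + x[:, 0] + rng.normal(size=n)  # outcome; true effect is 2.0

    # Step 1: estimate each participant's probability of selection.
    ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]

    # Step 2: inverse-probability weights give an adjusted group contrast.
    w = np.where(treated == 1, 1 / ps, 1 / (1 - ps))
    effect = (np.average(y[treated == 1], weights=w[treated == 1])
              - np.average(y[treated == 0], weights=w[treated == 0]))
    print(f"IPW estimate of the treatment effect: {effect:.2f}")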

Reichardt, Charles; Gollob, Harry – New Directions for Program Evaluation, 1986
Causal models often omit variables that should be included, use variables that are measured fallibly, and ignore time lags. Such practices can lead to severely biased estimates of effects. The discussion explains these biases and shows how to take them into account. (Author)
Descriptors: Effect Size, Error of Measurement, High Schools, Mathematical Models
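One of the biases the authors discuss, attenuation due to fallible measurement, has a standard form in the single-predictor case that is worth restating as background (this is the textbook errors-in-variables result, not a derivation taken from the article). If the observed predictor is X = T + E, with measurement error E independent of the true score T, the estimated slope is shrunk toward zero by the reliability ratio:

    \hat{\beta}_{X} = \lambda \, \beta_{T}, \qquad \lambda = \frac{\sigma^2_T}{\sigma^2_T + \sigma^2_E},

so that, for example, a reliability of \lambda = .70 understates a true effect by 30 percent.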

Hedges, Larry V. – New Directions for Program Evaluation, 1984
The adequacy of traditional effect size measures for research synthesis is challenged. Analogues to analysis of variance and multiple regression analysis for effect sizes are presented. The importance of tests for the consistency of effect sizes in interpreting results, and problems in obtaining well-specified models for meta-analysis are…
Descriptors: Analysis of Variance, Effect Size, Mathematical Models, Meta Analysis
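The consistency test the abstract alludes to is usually written as a weighted sum of squared deviations of the study effect sizes about their weighted mean; it is restated here in its standard large-sample form as context rather than as a quotation from the article:

    Q = \sum_{i=1}^{k} w_i \,(d_i - \bar{d})^2, \qquad
    w_i = \frac{1}{v_i}, \qquad
    \bar{d} = \frac{\sum_i w_i d_i}{\sum_i w_i},

where d_i is the effect size from study i with sampling variance v_i. Under the hypothesis that all k studies estimate a single common effect, Q is referred to a chi-square distribution with k - 1 degrees of freedom.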

Sechrest, Lee, Ed. – New Directions for Program Evaluation, 1993
Two chapters of this issue consider critical multiplism as a research strategy with links to meta-analysis and generalizability theory, and discuss the unifying perspective it can provide for quantitative and qualitative evaluation. The third chapter explores meta-analysis as a way to improve causal inferences from nonexperimental data. (SLD)
Descriptors: Causal Models, Evaluation Methods, Generalizability Theory, Inferences

McCleary, Richard; Riggs, James E. – New Directions for Program Evaluation, 1982
Time series analysis is applied to assess the temporary and permanent impact of the 1975 Australian Family Law Act on the number of divorces. The application and the construct validity of the model are examined. (Author/PN)
Descriptors: Court Litigation, Demography, Divorce, Evaluation Methods
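The design described here is an interrupted time series: a model is fit to the divorce series with a term marking when the law took effect, and the permanent component of the impact is read from that term's coefficient. The Python sketch below shows the general technique on simulated monthly counts; the ARIMA order, the simulated data, and the use of a pure step (rather than pulse) intervention term are assumptions for illustration, not the article's actual model.

    # Interrupted time series: ARIMA errors plus a step dummy at the intervention.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(1)
    n_pre, n_post = 60, 60
    level = np.r_[np.full(n_pre, 100.0), np.full(n_post, 130.0)]  # permanent shift of +30
    y = level + rng.normal(0, 5, n_pre + n_post)

    step = np.r_[np.zeros(n_pre), np.ones(n_post)]  # 0 before the law, 1 after
    fit = ARIMA(y, exog=step, order=(1, 0, 0)).fit()
    print(fit.summary())  # the exogenous (step) coefficient estimates the permanent shift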

Reichardt, Charles S., Ed.; Rallis, Sharon F., Ed. – New Directions for Program Evaluation, 1994
The eight articles in this issue examine the differences in goals and epistemology that arise between qualitative and quantitative researchers in program evaluation. The origins of these differences and their consequences are explored. The authors represent both perspectives but do not simply defend their own ideological turf. (SLD)
Descriptors: Conflict, Epistemology, Evaluation Methods, Ideology