Showing 3,991 to 4,005 of 10,160 results
Peer reviewed
Williams, John D. – Multiple Linear Regression Viewpoints, 1980
Multiple comparisons involve the examination of which group or groups are actually different from other group(s) in analysis of variance results. Such comparisons usually involve one-way analysis of variance. This monograph discusses designs more complex than one-way designs. (JKS)
Descriptors: Analysis of Variance, Data Analysis, Hypothesis Testing, Research Design
Peer reviewed
Edgington, Eugene S. – Journal of Educational Statistics, 1980
Valid statistical tests for one-subject experiments are necessary to justify statistical inferences and to ensure the acceptability of research reports to a wide range of journals and readers. The validity of randomization tests for one-subject experiments is examined. (See TM 505 800-801). (Author/JKS)
Descriptors: Experimental Groups, Hypothesis Testing, Research Design, Statistical Data
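The randomization tests discussed in the two Edgington entries can be illustrated with a short permutation computation. The sketch below is not Edgington's procedure; the scores, the alternating treatment schedule, and the mean-difference test statistic are all invented for illustration, and his articles discuss the conditions under which such a test is valid.

```python
# A minimal randomization (permutation) test for a single-subject experiment
# with randomly alternated treatments A and B. All data are illustrative.
import itertools
import numpy as np

# Scores from one subject over 10 sessions; treatment A or B was randomly
# assigned to each session (5 sessions per treatment).
scores = np.array([12, 15, 11, 18, 14, 17, 10, 16, 13, 19], dtype=float)
assignment = np.array(list("ABABBABAAB"))  # the schedule actually used

def mean_difference(labels):
    """Test statistic: mean of B sessions minus mean of A sessions."""
    return scores[labels == "B"].mean() - scores[labels == "A"].mean()

observed = mean_difference(assignment)

# Reference distribution: every schedule the randomization procedure could
# have produced, i.e. every way of assigning 5 of the 10 sessions to B.
count_extreme = 0
n_arrangements = 0
for b_sessions in itertools.combinations(range(len(scores)), 5):
    labels = np.full(len(scores), "A")
    labels[list(b_sessions)] = "B"
    if abs(mean_difference(labels)) >= abs(observed):
        count_extreme += 1
    n_arrangements += 1

p_value = count_extreme / n_arrangements  # two-sided randomization p-value
print(f"observed difference = {observed:.2f}, p = {p_value:.3f}")
```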
Peer reviewed
Edgington, Eugene S. – Journal of Educational Statistics, 1980
Two types of problems supposedly associated with the use of randomization tests for single-subject experiments have been discussed: the random introduction of treatments and the repeated alternation of treatments. Ways to reduce the adverse effects associated with these problems are presented. (See TM 505 799-800). (Author/JKS)
Descriptors: Experimental Groups, Hypothesis Testing, Research Design, Statistical Data
Peer reviewed
Rush, Jean C.; Kratochwill, Thomas R. – Studies in Art Education, 1981
The authors describe some applications of time-series designs, their advantages in applied work, and methodological issues of concern to researchers. (Author/SJL)
Descriptors: Art Education, Behavior Change, Research Design, Trend Analysis
Fuqua, Dale R. – Improving Human Performance Quarterly, 1979
Provides a condensed description of the criteria and choices for selecting a particular assessment, evaluation, or research design. Strengths and weaknesses of each design are presented. (Author/JEG)
Descriptors: Consultants, Evaluation Methods, Organizational Development, Pretests Posttests
Peer reviewed
Strain, Phillip S.; Shores, Richard E. – American Journal of Mental Deficiency, 1979
A number of measurement and design issues that are critical to the use of multiple baseline procedures in evaluating instructional interventions with mentally retarded persons are highlighted. (For related information, see EC 122 157.) (Author/CL)
Descriptors: Exceptional Child Research, Mental Retardation, Research Design, Research Methodology
Peer reviewed
Darom, Efraim; Rimon, Jonathan – Educational and Psychological Measurement, 1979
A computer analysis for a split-plot design with unequal numbers of subjects in the treatment groups is described. The technique presented is based on the Statistical Package for the Social Sciences (SPSS). An example is worked out, and full documentation is presented. (Author/JKS)
Descriptors: Analysis of Variance, Computer Programs, Program Descriptions, Research Design
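The Darom and Rimon entry describes an SPSS-based analysis of an unbalanced split-plot design. As a rough modern analogue, and not a reproduction of their procedure, a mixed-effects model accommodates unequal group sizes directly; the dataset, variable names, and effect sizes below are invented for illustration.

```python
# A sketch of a split-plot (between-groups x repeated-measures) analysis with
# unequal group sizes, fit as a mixed-effects model. Illustrative data only;
# this is an analogue of, not the SPSS procedure described in, the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

rows = []
subject_id = 0
# Unequal numbers of subjects per between-subjects treatment group.
for treatment, n_subjects in [("control", 8), ("experimental", 12)]:
    for _ in range(n_subjects):
        subject_effect = rng.normal(0, 2)  # subjects are the whole plots
        for occasion in ["pre", "post"]:
            bump = 5 if (treatment == "experimental" and occasion == "post") else 0
            score = 50 + bump + subject_effect + rng.normal(0, 3)
            rows.append({"subject": subject_id, "treatment": treatment,
                         "occasion": occasion, "score": score})
        subject_id += 1

data = pd.DataFrame(rows)

# Occasion is the within-subject (split-plot) factor; a random intercept per
# subject captures the whole-plot error term.
model = smf.mixedlm("score ~ treatment * occasion", data, groups=data["subject"])
result = model.fit()
print(result.summary())
```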
Peer reviewed
Meyer, Lennart; And Others – Educational and Psychological Measurement, 1979
G analysis is a data analytic method based on the G index of agreement. A generalized technique for the rotation of factors in G analysis in which marker variables are brought into alignment with their adjoining reference axis is presented. (Author/JKS)
Descriptors: Correlation, Discriminant Analysis, Oblique Rotation, Research Design
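The Meyer et al. entry builds on the G index of agreement. A minimal computation of that index for two dichotomous variables is sketched below; the response vectors are invented for illustration, and the factor-rotation technique the article presents is not reproduced here.

```python
# The G index of agreement for two dichotomous variables:
# G = (agreements - disagreements) / n, ranging from -1 to +1.
# Data are illustrative only.
import numpy as np

x = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 1])  # variable (or rater) 1
y = np.array([1, 0, 1, 0, 0, 1, 1, 0, 1, 1])  # variable (or rater) 2

agreements = np.sum(x == y)
disagreements = np.sum(x != y)
g_index = (agreements - disagreements) / len(x)
print(f"G index of agreement = {g_index:.2f}")  # 0.60 for these data
```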
Peer reviewed
Hummel, Thomas J.; Johnston, Charles B. – Journal of Educational Statistics, 1979
Stochastic approximation is suggested as a useful technique in areas where individuals have a goal firmly in mind, but lack sufficient knowledge to design an efficient, more traditional experiment. One potential area of application for stochastic approximation is that of formative evaluation. (CTM)
Descriptors: Monte Carlo Methods, Research Design, Statistical Analysis, Technical Reports
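Stochastic approximation, as suggested in the Hummel and Johnston entry, can be illustrated with the classic Robbins-Monro recursion for locating the input level at which a noisy outcome reaches a target. The simulated response function, target value, and step-size constant below are invented for illustration and are not taken from the article.

```python
# A sketch of Robbins-Monro stochastic approximation: iteratively adjust an
# input x toward the level at which a noisy outcome reaches a target value,
# without modeling the outcome function. Illustrative values only.
import numpy as np

rng = np.random.default_rng(42)

def noisy_outcome(x):
    """Response unknown to the experimenter: rises with x, observed with noise."""
    return 0.8 * x + rng.normal(0, 1.0)

target = 10.0          # desired mean outcome level
x = 0.0                # starting input (e.g., amount of instruction)
for n in range(1, 201):
    y = noisy_outcome(x)
    step = 2.0 / n     # step sizes shrink: sum diverges, sum of squares converges
    x = x + step * (target - y)

print(f"estimated input level: {x:.2f} (true root = {target / 0.8:.2f})")
```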
Peer reviewed
Braver, Sanford L.; Smith, Melanie C. – Evaluation and Program Planning, 1996
The Combined Modified Design is presented to overcome problems of internal and external validity in the conduct of true experiments in longitudinal studies. It combines a modified version of the Randomized Invitation Design with the lottery longitudinal experimental field trial situation. (SLD)
Descriptors: Experiments, Field Studies, Longitudinal Studies, Research Design
Peer reviewed
Caracelli, Valerie J.; Greene, Jennifer C. – New Directions for Evaluation, 1997
Two broad classes of mixed-method designs--component and integrated--that have the potential to combine elements of different inquiry traditions are described. The conceptual ideas advanced in the first chapter are illustrated through selected examples of several mixed-method integrated models. (Author/SLD)
Descriptors: Case Studies, Evaluation Methods, Models, Program Evaluation
Peer reviewed
Datta, Lois-ellin – New Directions for Evaluation, 1997
A pragmatic framework for making decisions about mixed-method designs is proposed and then applied to illustrative evaluation case studies to help identify the strengths and limitations of making practical, contextual, and consequential considerations a primary basis for evaluation design decisions. (Author)
Descriptors: Case Studies, Evaluation Methods, Models, Program Evaluation
Peer reviewed
Anderson, Paul V. – College Composition and Communication, 1998
Discusses ethical issues involved in person-based research; the ethical discourse embodied in the "Nuremberg Code," federal regulations, and the "Belmont Report"; and several specific issues in research ethics that illustrate how this discourse provides new ways of thinking about what must be done to treat…
Descriptors: Ethics, Higher Education, Research Design, Research Methodology
Peer reviewed
Burke, Lisa A. – Human Resource Development Quarterly, 1996
Steps for conducting field research include generating the idea, finding sites, selling the idea, identifying longitudinal concerns, meeting human subject guidelines, motivating participants, obtaining good data, exerting control, creating ownership, handling conflict, and presenting results. (SK)
Descriptors: Field Studies, Human Resources, Research Design, Research Methodology
Peer reviewed
Watkins, Ryan; Schlosser, Charles – Quarterly Review of Distance Education, 2003
Suggests a starting place for formal inquiry into distance education. Topics covered include: background on educational research and research on distance education; research paradigms that are applicable to distance education; subsystems of a distance education program; and a matrix for conceptualizing distance education research. (MES)
Descriptors: Distance Education, Educational Research, Models, Research Design