Showing 1 to 15 of 37 results
Peer reviewed
Stephen Gorard – Review of Education, 2024
This paper describes, and argues for, a procedure to help groups of reviewers judge the quality of prior research reports. It explains why such a procedure is needed and how existing approaches are relevant only to some kinds of research, meaning that a review or synthesis cannot successfully combine quality…
Descriptors: Credibility, Research Reports, Evaluation Methods, Research Design
Peer reviewed
Ahn, Soyeon; Ames, Allison J.; Myers, Nicholas D. – Review of Educational Research, 2012
This review addresses the validity of published meta-analyses in education, which determines the credibility and generalizability of study findings, drawing on a total of 56 meta-analyses published in the 2000s. Our objectives were to evaluate current meta-analytic practices in education, identify methodological strengths and…
Descriptors: Inferences, Meta Analysis, Educational Practices, Research Methodology
Peer reviewed
Lane, Kathleen; Wolery, Mark; Reichow, Brian; Rogers, Leslie – Journal of Behavioral Education, 2007
Experimental research involves comparing two or more conditions. One of those conditions in single subject research is often the baseline. Adequate description of the baseline conditions is necessary for evaluating the effects of independent variables, drawing generalizations from studies, and providing information for subsequent replication…
Descriptors: Research Methodology, Evaluation Methods, Generalization, Research Design
Peer reviewed
Renger, Ralph; Cimetta, Adriana; Pettygrove, Sydney; Rogan, Seumas – American Journal of Evaluation, 2002
Describes how Geographic Information Systems (GIS) can be used to help evaluators convey complex information simply through a spatial representation. Demonstrates how GIS can be used to plot change over time, including impact and outcome data gathered by primary data collection. (SLD)
Descriptors: Change, Data Collection, Evaluation Methods, Evaluation Utilization
Stake, Robert E. – 1972
The definition, structures, utilities, stimulus-response differences, and portrayals of responsive evaluation are presented. An educational evaluation is said to be a "responsive evaluation" if it orients more directly to program activities than to program intents, if it responds to audience requirements for information, and if the different…
Descriptors: Data Collection, Evaluation Criteria, Evaluation Methods, Information Needs
California State Legislature, Sacramento. Joint Committee on Educational Goals and Evaluation. – 1970
This report of the Joint Committee on Educational Goals and Evaluation to the California Legislature discusses the need for educational goals and evaluation, the committee investigation, and the collection of information. As a result of the investigation, it was concluded that: (1) it is essential that the goal-setting process include the…
Descriptors: Data Collection, Educational Objectives, Educational Research, Evaluation Methods
Peer reviewed
Stake, Robert – Canadian Journal of Program Evaluation/La Revue canadienne d'evaluation de programme, 1996
The presentation of case study research in this book is based on a disciplined, qualitative inquiry into a single case, rather than quantitative methodologies or case studies for instructional purposes. Practical suggestions are given for gathering and organizing data and for data interpretation and reporting. (SLD)
Descriptors: Case Studies, Data Collection, Evaluation Methods, Evaluation Research
Hall, Melvin E. – 1979
Program portrayal is one way of addressing the need for increased descriptive capability in evaluation research. Portrayal supplements traditional reporting by utilizing subjective, anecdotal, or impressionistic information, in an appropriately communicable form, to enrich the description of program transactions, settings, and outcomes. It is…
Descriptors: Data Collection, Evaluation Methods, Evaluators, Program Descriptions
King, Jean A.; Morris, Lynn Lyons; Fitz-Gibbon, Carol Taylor – 1987
The "CSE Program Evaluation Kit" is a series of nine books intended to assist people conducting program evaluations. This volume, fifth in the kit, discusses the role and importance of implementation evaluation and presents methods for designing, using, and reporting the results using assessment instruments to describe accurately how a program…
Descriptors: Data Collection, Evaluation Methods, Information Dissemination, Planning
Veale, James R.; Morley, Raymond E.; Erickson, Cynthia L. – 2002
This book is an effort to document ideas, processes, and outcomes based on input from many people involved in accountability of interagency collaborative services programs. The goals, processes, instruments, and reporting systems presented have been developed through a team process over a 10-year period of involvement with Iowa's School-Based…
Descriptors: Agency Cooperation, Data Collection, Delivery Systems, Elementary Secondary Education
Thorpe, Jo Anne L. – 1986
Practicality is the hallmark of this physical education research manual. It focuses on applications with which investigators in this field typically are concerned; numerous examples clarify these points. All phases of research are covered, beginning with writing mechanics and style and a blueprint of the chapters that comprise the creditable…
Descriptors: Computers, Data Collection, Evaluation Methods, Measurement Techniques
Peer reviewed
Cooper, Harris M. – Review of Educational Research, 1982
Research review is conceptualized as a scientific inquiry involving five stages that parallel those of primary research. The functions, sources of variance, and potential threats to validity associated with each stage are described. (Author/PN)
Descriptors: Data Collection, Evaluation Criteria, Evaluation Methods, Guidelines
Peer reviewed
Mathison, Sandra – Educational Researcher, 1988
A triangulation strategy results in evidence characterized by the following: (1) convergence; (2) inconsistency; and (3) contradiction. To render the data sensible, the researcher or evaluator must report data collection procedures, as well as the three levels of information from which explanations of social phenomena are constructed. (BJV)
Descriptors: Data Collection, Data Interpretation, Evaluation Methods, Evaluators
Central New York Regional Planning and Development Board, Syracuse. – 1971
Procedures for evaluating day care centers are described, formulated specifically with regard to the information available through, and planning requirements of, the Syracuse Model City Agency. The first chapter discusses some problems involved in providing adequate day care services on the national level, the city level, and within the Syracuse…
Descriptors: Child Welfare, Data Collection, Day Care, Day Care Centers
Holcomb, Edie L. – 1999
This book outlines a process for showing how well a school or district meets its goal of sustained student learning. The first section provides a knowledge base for collecting and reporting educational data. The next section reflects the components of educational change and answers "how to" questions about data gathering, analysis, and reporting…
Descriptors: Communication (Thought Transfer), Data Collection, Educational Improvement, Educational Research