Showing 1 to 15 of 50 results
Zid Mancenido – Annenberg Institute for School Reform at Brown University, 2022
Many teacher education researchers have expressed concerns with the lack of rigorous impact evaluations of teacher preparation practices. I summarize these various concerns as they relate to issues of internal validity, external validity, and measurement. I then assess the prevalence of these issues by reviewing 166 impact evaluations of teacher…
Descriptors: Teacher Education, Educational Research, Program Evaluation, Validity
Blase, Karen; Fixsen, Dean – US Department of Health and Human Services, 2013
This brief is part of a series that explores key implementation considerations. It focuses on the importance of identifying, operationalizing, and implementing the "core components" of evidence-based and evidence-informed interventions that likely are critical to producing positive outcomes. The brief offers a definition of "core components",…
Descriptors: Program Implementation, Program Evaluation, Evidence, Research Design
Peer reviewed
Cheung, Alan; Slavin, Robert – Society for Research on Educational Effectiveness, 2016
As evidence-based reform becomes increasingly important in educational policy, it is becoming essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. The purpose of this study was to examine how methodological features such as types of publication, sample sizes, and…
Descriptors: Effect Size, Evidence Based Practice, Educational Change, Educational Policy
Stufflebeam, Daniel L.; Shinkfield, Anthony J. – Jossey-Bass, An Imprint of Wiley, 2007
"Evaluation Theory, Models, and Applications" is designed for evaluators and students who need to develop a commanding knowledge of the evaluation field: its history, theory and standards, models and approaches, procedures, and inclusion of personnel as well as program evaluation. This important book shows how to choose from a growing…
Descriptors: Program Evaluation, Evaluation Methods, Standards, Glossaries
Coalition for Evidence-Based Policy, 2007
The purpose of this Guide is to advise researchers, policymakers, and others on when it is possible to conduct a high-quality randomized controlled trial in education at reduced cost. Well-designed randomized controlled trials are recognized as the gold standard for evaluating the effectiveness of an intervention (i.e., program or practice) in…
Descriptors: Costs, Scores, Data, Research Design
Greene, Jennifer C. – 1985
The naturalistic research perspective assumes that reality is multiplistic, phenomenological, and context-dependent. This perspective legitimizes the subjective insights of the investigator by acknowledging the interdependence of facts and values as well as of the investigator and the object of investigation. Although discrepancies between…
Descriptors: Case Studies, Data Analysis, Guidelines, Program Evaluation
Peer reviewed
Luker, William A.; And Others – Journal of Economic Education, 1984
Based on an independent analysis of the data used to evaluate the Developmental Economic Education Program, questions are raised about the methodology and conclusions reached by Walstad and Soper in an article published in the Winter 1982 issue of the Journal. The original study is also defended in two replies. (Author/RM)
Descriptors: Economics Education, Program Evaluation, Research Design, Research Methodology
Peer reviewed
Edmonds, M. Leslie – Library Trends, 1987
Improvement in the research climate in children's librarianship is needed. There must be a determined demand for research, financial and job support for researchers, improvement in the research techniques for studying children, and implementation of actual research projects. Particular concerns include research on reading, school/public library…
Descriptors: Accountability, Cooperative Programs, Library Research, Program Evaluation
Peer reviewed
Hewitt, J. E.; And Others – Environmental Monitoring and Assessment, 1993
Discusses a technique for estimating sample size that does not require an a priori definition of desired precision. Presents five modifications that make the method easier to use and reduce the probability of estimating a larger sample size than necessary. The technique can guide the design of environmental sampling programs. (Contains 31…
Descriptors: Environmental Education, Environmental Research, Evaluation Methods, Higher Education
Peer reviewed
Gilbert, John K.; Swift, David J. – Science Education, 1985
Lakatos's methodology of scientific research programs is summarized and discussed for Piagetian schools and alternative conceptions movement. Commonalities/differences between these two rival programs are presented along with fundamental assumptions, auxiliary hypotheses, and research policy. Suggests that research findings should not be merely…
Descriptors: Elementary School Science, Elementary Secondary Education, Models, Program Evaluation
Keesling, J. Ward; Smith, Allen G. – 1980
Future research in Follow Through should be eclectic and comprehensive. In particular, the setting for Strand Two of the National Institute of Education's (NIE) research program for Follow Through should provide opportunities for sponsors of innovations to implement programs targeted at many outcomes. There should be a commitment to measuring…
Descriptors: Agency Role, Early Childhood Education, Program Evaluation, Research Design
Coalition for Evidence-Based Policy, 2005
This is a checklist of key items to get right when conducting a randomized controlled trial to evaluate an educational program or practice ("intervention"). It is intended as a practical resource for researchers and sponsors of research, describing items that are often critical to the success of a randomized controlled trial. A significant…
Descriptors: Educational Research, Program Evaluation, Intervention, Scientific Methodology
Hser, Y.; And Others – 1987
A repeated measures design was used to assess the effects of a methadone maintenance program in several California locations. Subjects were 720 Chicano and Anglo men and women participating in rehabilitation programs for heroin addicts. The subjects were interviewed between 1978-1981, and the average follow-up time after initial treatment was 4-6…
Descriptors: Adults, Behavior Change, Drug Addiction, Drug Rehabilitation
Dangel, Timothy R. – 1986
This paper reports the process by which Anne Arundel County Public Schools evaluated its mathematics program. The steps are outlined by which the evaluation was initiated, designed, implemented, and reported. The procedures developed to address and implement the evaluation findings and recommendations are described. The evaluation was initiated…
Descriptors: Consultants, Curriculum Evaluation, Elementary Secondary Education, Evaluation Criteria
Miller, Michael K. – 1985
This report for researchers in rural sociology develops a conceptual and analytic framework for evaluating policies or programs through interrelated models. Three models--structural analysis, control programming, and simulation--allow researchers to assess policy/program impact, determine why expected impacts did or did not occur, and modify…
Descriptors: Evaluation Methods, Factor Analysis, Models, Path Analysis