Showing 1 to 15 of 38 results
Peer reviewed
See, Beng Huat – Research in Education, 2018
With the push for evidence-informed policy and practice, schools and policy makers are now increasingly encouraged and supported to use and engage with research evidence. This means that consumers of research will now need to be discerning in judging the quality of research evidence that will inform their decisions. This paper evaluates the…
Descriptors: Evaluation Research, Evidence, Literature Reviews, Research Methodology
Peer reviewed
Ahlin, Eileen M. – American Journal of Evaluation, 2015
Evaluation research conducted in agencies that sanction law violators is often challenging, and due process may preclude evaluators from using experimental methods in traditional criminal justice agencies such as police, courts, and corrections. However, administrative agencies often deal with the same population but are not bound by due process…
Descriptors: Research Methodology, Evaluation Research, Criminals, Correctional Institutions
Peer reviewed
Patton, Michael Quinn – American Journal of Evaluation, 2015
Our understanding of programs is enhanced when trained, skilled, and observant evaluators go "into the field"--the real world where programs are conducted--paying attention to what's going on, systematically documenting what they see, and reporting what they learn. The article opens by presenting and illustrating twelve reasons for…
Descriptors: Program Evaluation, Evaluation Methods, Design Requirements, Field Studies
Peer reviewed
Supovitz, Jonathan – Yearbook of the National Society for the Study of Education, 2013
Design-based implementation research offers the opportunity to rethink the relationships between intervention, research, and situation to better attune research and evaluation to the program development process. Using a heuristic called the intervention development curve, I describe the rough trajectory that programs typically follow as they…
Descriptors: Research Methodology, Instructional Design, Educational Research, Intervention
Peer reviewed
PDF on ERIC
Bloom, Howard S.; Porter, Kristin E.; Weiss, Michael J.; Raudenbush, Stephen – Society for Research on Educational Effectiveness, 2013
To date, evaluation research and policy analysis have focused mainly on average program impacts and paid little systematic attention to their variation. Recently, the growing number of multi-site randomized trials that are being planned and conducted make it increasingly feasible to study "cross-site" variation in impacts. Important…
Descriptors: Research Methodology, Policy, Evaluation Research, Randomized Controlled Trials
Spybrook, Jessaca; Lininger, Monica; Cullen, Anne – Society for Research on Educational Effectiveness, 2011
The purpose of this study is to extend the work of Spybrook and Raudenbush (2009) and examine how the research designs and sample sizes changed from the planning phase to the implementation phase in the first wave of studies funded by IES. The authors examine the impact of the changes in terms of the changes in the precision of the study from the…
Descriptors: Evaluation Criteria, Sampling, Research Design, Planning
Peer reviewed
Rogers, Paul; Whitney, Anne Elrod; Bright, Alison; Cabe, Rosemary; Dewar, Tim; Null, Suzie Y. – English Education, 2011
In this article, a group of inservice providers and beginning researchers describe their experiences in learning to conduct evaluation research on a long-term school-university partnership program. We offer the practical lessons we learned in how to undertake such a study, and we share the immediate and powerful effects that the process of…
Descriptors: Evaluation Research, Program Evaluation, Teacher Researchers, College School Cooperation
Kimball, Steven M.; Lander, Rachel; Thorn, Christopher A. – Wisconsin Center for Education Research (NJ1), 2010
Beginning in 2002, The Chicago Community Trust embarked on an ambitious grant-making strategy to improve education outcomes primarily in the city of Chicago and Cook County, Illinois. Known as The Education Initiative, this effort focused on three priority grant areas: literacy, professional development, and alternative models of schools. In…
Descriptors: Educational Change, Urban Schools, Philanthropic Foundations, Grants
Peer reviewed
Coryn, Chris L. S.; Noakes, Lindsay A.; Westine, Carl D.; Schroter, Daniela C. – American Journal of Evaluation, 2011
Although the general conceptual basis appeared far earlier, theory-driven evaluation came to prominence only a few decades ago with the appearance of Chen's 1990 book "Theory-Driven Evaluations." Since that time, the approach has attracted many supporters as well as detractors. In this paper, 45 cases of theory-driven evaluations, published over a…
Descriptors: Evidence, Program Evaluation, Educational Practices, Literature Reviews
Peer reviewed
Sanders, James R.; Nafziger, Dean N. – Journal of MultiDisciplinary Evaluation, 2011
The purpose of this paper is to provide a basis for judging the adequacy of evaluation plans or, as they are commonly called, evaluation designs. The authors assume that using the procedures suggested in this paper to determine the adequacy of evaluation designs in advance of actually conducting evaluations will lead to better evaluation designs,…
Descriptors: Check Lists, Program Evaluation, Research Design, Evaluation Methods
Peer reviewed
Stufflebeam, Daniel L. – Journal of MultiDisciplinary Evaluation, 2011
Good evaluation requires that evaluation efforts themselves be evaluated. Many things can and often do go wrong in evaluation work. Accordingly, it is necessary to check evaluations for problems such as bias, technical error, administrative difficulties, and misuse. Such checks are needed both to improve ongoing evaluation activities and to assess…
Descriptors: Program Evaluation, Evaluation Criteria, Evaluation Methods, Definitions
Peer reviewed
Lee, Eunjung; Mishna, Faye; Brennenstuhl, Sarah – Research on Social Work Practice, 2010
The purpose of this article is to develop guidelines to assist practitioners and researchers in evaluating and developing rigorous case studies. The main concern in evaluating a case study is to accurately assess its quality and ultimately to offer clients social work interventions informed by the best available evidence. To assess the quality of…
Descriptors: Research Design, Program Evaluation, Guidelines, Data Analysis
Peer reviewed
Stockard, Jean – Current Issues in Education, 2010
A large body of literature documents the central importance of fidelity of program implementation in creating an internally valid research design and considering such fidelity in judgments of research quality. The What Works Clearinghouse (WWC) provides web-based summary ratings of educational innovations and is the only rating group that is…
Descriptors: Research Design, Educational Innovation, Program Implementation, Program Effectiveness
Peer reviewed
Tourmen, Claire – American Journal of Evaluation, 2009
How do evaluation practitioners make choices when they evaluate a program? What function do evaluation theories play in practice? In this article, I report on an exploratory study that examined evaluation practices in France. The research began with observations of practitioners' activities, with a particular focus on the phases of evaluation…
Descriptors: Evaluators, Foreign Countries, Evaluation Methods, Theory Practice Relationship
Peer reviewed
Compton, Donald W. – New Directions for Evaluation, 2009
Donald W. Compton, the first director of evaluation services at the National Home Office (Atlanta) of the American Cancer Society, tells the story of building the unit in conditions of high demand and a limited budget. Along the way, evaluation was brought to regional divisions and to local offices in part as a response to United Way and to his…
Descriptors: Evaluators, Evaluation Methods, Program Evaluation, Evaluation Research