Publication Date
  In 2025: 0
  Since 2024: 0
  Since 2021 (last 5 years): 0
  Since 2016 (last 10 years): 4
  Since 2006 (last 20 years): 22
Descriptor
  Evaluation Research: 29
  Intervention: 29
  Program Evaluation: 29
  Program Effectiveness: 17
  Evaluation Methods: 14
  Educational Research: 9
  Research Methodology: 8
  Research Design: 7
  Effect Size: 4
  Evaluation Problems: 4
  Foreign Countries: 4
Publication Type
  Journal Articles: 23
  Reports - Evaluative: 13
  Reports - Descriptive: 7
  Reports - Research: 6
  Information Analyses: 2
  Opinion Papers: 2
Education Level
  Elementary Secondary Education: 5
  Adult Education: 4
  Early Childhood Education: 1
  Elementary Education: 1
  Higher Education: 1
  Junior High Schools: 1
  Middle Schools: 1
  Preschool Education: 1
Audience
  Researchers: 2
  Policymakers: 1
  Practitioners: 1
Location
  United Kingdom: 3
  Maryland: 1
  Slovenia: 1
  United States: 1
Laws, Policies, & Programs
  No Child Left Behind Act 2001: 1
What Works Clearinghouse, 2017
Many studies of education interventions make claims about impacts on students' outcomes. Some studies have designs that enable readers to make causal inferences about the effects of an intervention, but others have designs that do not permit these types of conclusions. To help policymakers, practitioners, and others make sense of study results, the…
Descriptors: Educational Research, Intervention, Program Evaluation, Program Effectiveness
Bloom, Howard S.; Spybrook, Jessaca – Journal of Research on Educational Effectiveness, 2017
Multisite trials, which are being used with increasing frequency in education and evaluation research, provide an exciting opportunity for learning about how the effects of interventions or programs are distributed across sites. In particular, these studies can produce rigorous estimates of a cross-site mean effect of program assignment…
Descriptors: Program Effectiveness, Program Evaluation, Sample Size, Evaluation Research
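To make concrete what a multisite trial can estimate, here is a minimal sketch on simulated data (all names, sample sizes, and effect values are illustrative assumptions, not from the Bloom and Spybrook article): a mixed model with a site-level random treatment slope recovers both a cross-site mean effect and the cross-site variation in effects that the abstract refers to.

```python
# Illustrative only: simulated multisite trial, not data from the cited study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_sites, n_per_site = 30, 50
true_site_effects = rng.normal(0.20, 0.10, n_sites)  # effects vary across sites

frames = []
for j in range(n_sites):
    treat = rng.integers(0, 2, n_per_site)           # random assignment within site
    y = rng.normal(size=n_per_site) + true_site_effects[j] * treat
    frames.append(pd.DataFrame({"site": j, "treat": treat, "y": y}))
data = pd.concat(frames, ignore_index=True)

# Random intercept and random treatment slope by site: the fixed 'treat'
# coefficient estimates the cross-site mean effect, while the slope variance
# component summarizes how effects are distributed across sites.
fit = smf.mixedlm("y ~ treat", data, groups=data["site"], re_formula="~treat").fit()
print(fit.summary())
```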
What Works Clearinghouse, 2017
The What Works Clearinghouse (WWC) evaluates research studies that look at the effectiveness of education programs, products, practices, and policies, which the WWC calls "interventions." Many studies of education interventions make claims about impacts on students' outcomes. Some studies have designs that enable readers to make causal…
Descriptors: Program Design, Program Development, Program Effectiveness, Program Evaluation
See, Beng Huat – Research in Education, 2018
With the push for evidence-informed policy and practice, schools and policymakers are now increasingly encouraged and supported to use and engage with research evidence. This means that consumers of research will now need to be discerning in judging the quality of research evidence that will inform their decisions. This paper evaluates the…
Descriptors: Evaluation Research, Evidence, Literature Reviews, Research Methodology
Ahlin, Eileen M. – American Journal of Evaluation, 2015
Evaluation research conducted in agencies that sanction law violators is often challenging, and due process may preclude evaluators from using experimental methods in traditional criminal justice agencies such as police, courts, and corrections. However, administrative agencies often deal with the same population but are not bound by due process…
Descriptors: Research Methodology, Evaluation Research, Criminals, Correctional Institutions
Supovitz, Jonathan – Yearbook of the National Society for the Study of Education, 2013
Design-based implementation research offers the opportunity to rethink the relationships between intervention, research, and situation to better attune research and evaluation to the program development process. Using a heuristic called the intervention development curve, I describe the rough trajectory that programs typically follow as they…
Descriptors: Research Methodology, Instructional Design, Educational Research, Intervention
Lendrum, Ann; Humphrey, Neil – Oxford Review of Education, 2012
Implementation refers to the process by which an intervention is put into practice. Research studies across multiple disciplines, including education, have consistently demonstrated that interventions are rarely implemented as designed and, crucially, that variability in implementation is related to variability in the achievement of expected…
Descriptors: Intervention, Foreign Countries, Fidelity, Evaluation Research
Darrow, Catherine L. – Society for Research on Educational Effectiveness, 2010
This paper examines measures used by studies associated with the Preschool Curriculum Evaluation Research Initiative (PCER), funded by the Institute of Education Sciences. Analysis of 17 measures of fidelity used by 13 curriculum interventions highlights how research in preschools is and is not measuring fidelity of implementation. The following…
Descriptors: Preschool Curriculum, Evaluation Research, Curriculum Evaluation, Program Implementation
Gulikers, Judith T. M.; Baartman, Liesbeth K. J.; Biemans, Harm J. A. – Evaluation and Program Planning, 2010
Schools are increasingly held responsible for evaluating, quality-assuring, and improving their student assessments. Teachers' lack of understanding of new, competence-based assessments, as well as the lack of key stakeholders' involvement, hampers effective and efficient self-evaluations by teachers of innovative, competence-based assessments (CBAs). While…
Descriptors: Vocational Education, Stakeholders, Educational Assessment, Student Evaluation
Schochet, Peter Z.; Puma, Mike; Deke, John – National Center for Education Evaluation and Regional Assistance, 2014
This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed…
Descriptors: Statistical Analysis, Evaluation Methods, Educational Research, Intervention
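One of the simplest methods in the family of techniques this report covers is a treatment-by-subgroup interaction in a regression model. A hedged sketch on simulated data follows; the subgroup variable and effect sizes are assumptions for illustration, not values drawn from the report.

```python
# Illustrative only: does the impact differ across a student subgroup?
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
treat = rng.integers(0, 2, n)        # random assignment
subgroup = rng.integers(0, 2, n)     # e.g. an indicator for a student subgroup
# assumed impacts: 0.10 SD overall, plus 0.15 SD extra for the subgroup
y = rng.normal(size=n) + 0.10 * treat + 0.15 * treat * subgroup
data = pd.DataFrame({"y": y, "treat": treat, "subgroup": subgroup})

# The treat:subgroup coefficient is the differential impact; its test is a
# basic check for heterogeneous effects across the two groups.
fit = smf.ols("y ~ treat * subgroup", data).fit()
print(fit.params)
print(fit.pvalues)
```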
Stockard, Jean – Current Issues in Education, 2010
A large body of literature documents the central importance of fidelity of program implementation in creating an internally valid research design and of considering such fidelity in judgments of research quality. The What Works Clearinghouse (WWC) provides web-based summary ratings of educational innovations and is the only rating group that is…
Descriptors: Research Design, Educational Innovation, Program Implementation, Program Effectiveness
Spybrook, Jessaca; Raudenbush, Stephen W. – Educational Evaluation and Policy Analysis, 2009
This article examines the power analyses for the first wave of group-randomized trials funded by the Institute of Education Sciences. Specifically, it assesses the precision and technical accuracy of the studies. The authors identified the appropriate experimental design and estimated the minimum detectable standardized effect size (MDES) for each…
Descriptors: Research Design, Research Methodology, Effect Size, Correlation
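The MDES the authors estimate has a standard closed form for a simple two-level cluster-randomized design. Here is a minimal sketch of that calculation using the multiplier approach; the function name, default alpha and power, and example inputs are illustrative assumptions, not values from the article, and the formula omits covariate adjustment.

```python
# Illustrative only: closed-form MDES for a two-level cluster-randomized
# trial (multiplier approach), with no covariates.
import math
from scipy.stats import t

def mdes_cluster_randomized(n_clusters, cluster_size, icc,
                            p_treated=0.5, alpha=0.05, power=0.80):
    df = n_clusters - 2                              # cluster-level degrees of freedom
    multiplier = t.ppf(1 - alpha / 2, df) + t.ppf(power, df)  # two-tailed test
    variance = (icc / (p_treated * (1 - p_treated) * n_clusters)
                + (1 - icc) / (p_treated * (1 - p_treated)
                               * n_clusters * cluster_size))
    return multiplier * math.sqrt(variance)

# e.g. 40 schools, 60 students per school, ICC = 0.15, half the schools treated
print(round(mdes_cluster_randomized(40, 60, 0.15), 3))
```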
Sentocnik, Sonja; Rupar, Brigita – European Education, 2009
Current educational literature suggests that distributing leadership in schools can facilitate individual and organizational development. While many state agencies in the United States and Europe are encouraging schools to reshape their leadership practice to distribute responsibilities for leadership tasks across roles, empirical research on how…
Descriptors: Evaluation Research, Foreign Countries, Organizational Development, Instructional Leadership
Peck, Laura R.; Gorzalski, Lindsey M. – Journal of MultiDisciplinary Evaluation, 2009
Background: Research on evaluation use focuses on putting evaluation recommendations into practice. Prior theoretical research proposes varied frameworks for understanding the use (or lack thereof) of program evaluation results. Purpose: Our purpose is to create and test a single, integrated framework for understanding evaluation use. This article relies…
Descriptors: Evaluation Research, Intervention, Program Evaluation, Content Analysis
Wallace, Tanner LeBaron – Studies in Educational Evaluation, 2008
This article describes an effectiveness evaluation of an intensive case management intervention coordinated by a non-profit organization in a midsize Midwestern city. As an effectiveness evaluation, the primary evaluation question was causal in nature; the key task of the evaluative study was to establish and probe connections between the…
Descriptors: Intervention, Youth, Program Effectiveness, Evaluation Methods