Showing 1 to 15 of 48 results
Peer reviewed
Direct link
Phuntsho Choden; Fiona Cram – American Journal of Evaluation, 2024
Bhutan's overarching development paradigm of Gross National Happiness (GNH) promotes a harmonious balance between material and non-material dimensions. But Bhutan's evaluation practice has not yet adopted the principles of GNH, preventing evaluation findings and recommendations from aligning with the priorities of GNH. This article makes the case…
Descriptors: Foreign Countries, Cultural Influences, Cultural Relevance, Evaluation Methods
Peer reviewed
Direct link
Suzanne Nobrega; Kasper Edwards; Mazen El Ghaziri; Lauren Giacobbe; Serena Rice; Laura Punnett – American Journal of Evaluation, 2024
Program evaluations that lack experimental design often fail to produce evidence of impact because there is no available control group. Theory-based evaluations can generate evidence of a program's causal effects if evaluators collect evidence along the theorized causal chain and identify possible competing causes. However, few methods are…
Descriptors: STEM Education, Gender Differences, Intervention, Program Evaluation
Peer reviewed
Direct link
Rocha, Ana Cristina; Silva, Márcia; Duarte, Cidália – Sex Education: Sexuality, Society and Learning, 2022
Despite growing interest in the effectiveness of sexuality education, only a single systematic review regarding its evaluation has been conducted. This review aimed to systematically analyse studies that include the evaluation of sexuality education for adolescents using the Context, Input, Process, Product (CIPP) model, and summarise evidence…
Descriptors: Sex Education, Program Evaluation, Adolescents, Cultural Differences
Peer reviewed
Direct link
Michele A. Parker; Tamara M. Walser; Satlaj Dighe – Discover Education, 2024
This perspective paper describes the strategic development of evaluation and organizational learning programs focused on building core knowledge and skills at a university with high research activity. New program development allows academic leaders to introduce relevant and cutting-edge knowledge to students. Nevertheless, the program development…
Descriptors: Cooperation, Context Effect, Program Development, Doctoral Programs
Peer reviewed
Direct link
Merchie, Emmelien; Tuytens, Melissa; Devos, Geert; Vanderlinde, Ruben – Research Papers in Education, 2018
Evaluating teachers' professional development initiatives (PDI) is one of the main challenges for the teacher professionalisation field. Although different studies have focused on the effectiveness of PDI, the obtained effects and evaluative methods have been found to be widely divergent. By means of a narrative review, this study provides an…
Descriptors: Program Evaluation, Program Effectiveness, Faculty Development, Teacher Education Programs
Peer reviewed
Direct link
Bennett, Anna – Educational Research and Evaluation, 2018
This article draws on findings from a national review of the evaluation of access and equity initiatives across Australian higher education to argue that utilising responsive mixed methods focused on the values of participants enables crucial understanding of what matters to the people involved. Based on the evidence collected, a "what…
Descriptors: Foreign Countries, Access to Education, Equal Education, Program Effectiveness
Peer reviewed
Direct link
Bamber, Veronica; Stefani, Lorraine – International Journal for Academic Development, 2016
Measurable targets, key performance indicators, value for money--whatever we may think of the "impact agenda," it looks like it is here to stay. Are we trapped in a positivist, new managerialist spiral of demonstrating the value of our work, or can we take the lead in reframing the discourse on how educational development proves its…
Descriptors: Theory Practice Relationship, Educational Development, Evidence, Value Judgment
Peer reviewed
Direct link
Vanderkruik, Rachel; McPherson, Marianne E. – American Journal of Evaluation, 2017
Evaluating initiatives implemented across multiple settings can elucidate how various contextual factors may influence both implementation and outcomes. Understanding context is especially critical when the same program has varying levels of success across settings. We present a framework for evaluating contextual factors affecting an initiative…
Descriptors: Public Health, Sustainability, Program Implementation, Context Effect
Peer reviewed
Direct link
Chouinard, Jill Anne – American Journal of Evaluation, 2013
Evaluation occurs within a specific context and is influenced by the economic, political, historical, and social forces that shape that context. The culture of evaluation is thus very much embedded in the culture of accountability that currently prevails in public sector institutions, policies, and programs. As such, our understanding of the…
Descriptors: Accountability, Public Sector, Participatory Research, Context Effect
Peer reviewed
PDF on ERIC: Download full text
Schochet, Peter Z.; Puma, Mike; Deke, John – National Center for Education Evaluation and Regional Assistance, 2014
This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed…
Descriptors: Statistical Analysis, Evaluation Methods, Educational Research, Intervention
Peer reviewed
Direct link
Sloane, Finbarr C.; Oloff-Lewis, Jennifer; Kim, Seong Hee – Irish Educational Studies, 2013
The government of Ireland, like many European countries, is currently under severe pressure from external forces to grow the economy. One possible way to maintain and grow its economy is through the production of a highly educated and globally competitive workforce. In an effort to develop such a workforce, the government, through the Department…
Descriptors: School Effectiveness, Accountability, Foreign Countries, Models
Peer reviewed
Direct link
Hendricks, Michael; Bamberger, Michael – American Journal of Evaluation, 2010
Each year a great many evaluations are conducted of international development efforts around the world. These development evaluations study projects, programs, country-wide portfolios, policy reform efforts, and other topics of interest to funders, governments, program managers, and other involved stakeholders. Although some of these evaluations…
Descriptors: Program Development, Economic Development, Program Evaluation, Evaluation Criteria
Peer reviewed
PDF on ERIC: Download full text
Zhang, Guili; Zeller, Nancy; Griffith, Robin; Metcalf, Debbie; Williams, Jennifer; Shea, Christine; Misulis, Katherine – Journal of Higher Education Outreach and Engagement, 2011
Planning, implementing, and assessing a service-learning project can be a complex task because service-learning projects often involve multiple constituencies and aim to meet both the needs of service providers and community partners. In this article, Stufflebeam's Context, Input, Process, and Product (CIPP) evaluation model is recommended as a…
Descriptors: Service Learning, Higher Education, College Faculty, Teacher Role
Peer reviewed
Direct link
Braverman, Marc T.; Arnold, Mary E. – New Directions for Evaluation, 2008
Methodological rigor consists of a series of elements that, in combination, determine the confidence with which conclusions can be drawn from the evaluation results. These elements include evaluation design, conceptualization of constructs, measurement strategies, time frames, program integrity, and others. The authors examine the factors that…
Descriptors: Information Needs, Evaluators, Program Evaluation, Evaluation Methods
Petscher, Yaacov; Foorman, Barbara – Society for Research on Educational Effectiveness, 2009
The current study will examine possible contextual effects relative to differences in reading comprehension performance in the state of Florida. While the Reading First (RF) Impact study examined such differences using a regression discontinuity design, the authors are primarily interested in other analytic methods that might answer different…
Descriptors: Reading Comprehension, Criterion Referenced Tests, Comparative Analysis, Reading Programs