Showing 1 to 15 of 233 results
Peer reviewed
Bergeron, Dave A.; Gaboury, Isabelle – International Journal of Social Research Methodology, 2020
Realist evaluation (RE) is a research design, increasingly used in program evaluation, that aims to explore and understand the influence of context and underlying mechanisms on intervention or program outcomes. Several methodological challenges, however, are associated with this approach. This article summarizes RE's key principles and examines some…
Descriptors: Research Design, Program Evaluation, Context Effect, Research Problems
Peer reviewed
Jiang Li; Chen Zhu; Mark Goh – Research Evaluation, 2025
Data Envelopment Analysis (DEA) is a widely adopted non-parametric technique for evaluating R&D performance. However, traditional DEA models often struggle to provide reliable solutions in the presence of data uncertainty. To address this limitation, this study develops a novel robust super-efficiency DEA approach to evaluate R&D…
Descriptors: Foreign Countries, Research and Development, COVID-19, Pandemics
Peer reviewed
Bamattre, R.; Schowengerdt, B.; Nikoi, A.; DeJaeghere, J. – International Journal of Social Research Methodology, 2019
While social programs are often assessed using short-term impact studies, longitudinal designs allow evaluators to capture change over time, identify longer-term outcomes, adapt instruments, and better understand participants in transition. A mixed methods design can be critical in understanding these dynamics; yet there is a lack of literature…
Descriptors: Mixed Methods Research, Longitudinal Studies, Program Evaluation, Research Problems
Peer reviewed
Little, Todd D.; Chang, Rong; Gorrall, Britt K.; Waggenspack, Luke; Fukuda, Eriko; Allen, Patricia J.; Noam, Gil G. – International Journal of Behavioral Development, 2020
We revisit the merits of the retrospective pretest-posttest (RPP) design for repeated-measures research. The underutilized RPP method asks respondents to rate survey items twice during the same posttest measurement occasion from two specific frames of reference: "now" and "then." Individuals first report their current attitudes…
Descriptors: Pretesting, Alternative Assessment, Program Evaluation, Evaluation Methods
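As a purely illustrative sketch (not code from the article), the snippet below scores hypothetical "then" and "now" ratings collected at a single posttest occasion, which is the core of the RPP design summarized above; all data and variable names are invented.

```python
# Hedged sketch of retrospective pretest-posttest (RPP) scoring.
# Respondents rate the same item twice at posttest: "then" (recalled
# pre-program state) and "now" (current state). Data are hypothetical.
import statistics

responses = [
    {"then": 2, "now": 4},
    {"then": 3, "now": 4},
    {"then": 1, "now": 3},
    {"then": 2, "now": 2},
]

# Retrospective change score: "now" minus "then" for each respondent.
changes = [r["now"] - r["then"] for r in responses]

print("Mean retrospective change:", statistics.mean(changes))
print("SD of change scores:", statistics.stdev(changes))
```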
Peer reviewed
Lortie-Forgues, Hugues; Inglis, Matthew – Educational Researcher, 2019
In this response, we first show that Simpson's proposed analysis answers a different and less interesting question than ours. We then justify the choice of prior for our Bayes factors calculations, but we also demonstrate that the substantive conclusions of our article are not substantially affected by varying this choice.
Descriptors: Randomized Controlled Trials, Bayesian Statistics, Educational Research, Program Evaluation
Peer reviewed
Norwich, Brahm; Koutsouris, George – International Journal of Research & Method in Education, 2020
This paper describes the context, processes and issues experienced over 5 years in which an RCT was carried out to evaluate a programme for children aged 7-8 who were struggling with their reading. Its specific aim is to illuminate questions about the design of complex teaching approaches and their evaluation using an RCT. This covers the early…
Descriptors: Randomized Controlled Trials, Program Evaluation, Reading Programs, Educational Research
Peer reviewed
Simpson, Adrian – Educational Researcher, 2019
A recent paper uses Bayes factors to argue a large minority of rigorous, large-scale education RCTs are "uninformative." The definition of "uninformative" depends on the authors' hypothesis choices for calculating Bayes factors. These arguably overadjust for effect size inflation and involve a fixed prior distribution,…
Descriptors: Randomized Controlled Trials, Bayesian Statistics, Educational Research, Program Evaluation
Peer reviewed
Barrett, Paula M.; Cooper, Marita; Stallard, Paul; Zeggio, Larissa; Gallegos-Guajardo, Julia – Education and Treatment of Children, 2017
This response aims to critically evaluate the methodology and aims of the meta-analytic review written by Maggin and Johnson (2014). The present authors systematically provide responses for each of the original criticisms and highlight concerns regarding Maggin and Johnson's methodology, while objectively describing the current state of evidence…
Descriptors: Anxiety, Prevention, Program Effectiveness, Program Evaluation
Peer reviewed
Makel, Matthew C.; Wai, Jonathan – Journal of Advanced Academics, 2016
In their National Bureau of Economic Research (NBER) working paper, "Does Gifted Education Work? For Which Students?" Card and Giuliano (C & G) made an enormous splash not just in gifted education but also in the wider world (e.g., "The Washington Post," "The Atlantic," "FiveThirtyEight"). In this commentary, we highlight…
Descriptors: Special Education, Gifted, Economic Research, Educational Research
Peer reviewed
Wing, Coady; Bello-Gomez, Ricardo A. – American Journal of Evaluation, 2018
Treatment effect estimates from a "regression discontinuity design" (RDD) have high internal validity. However, the arguments that support the design apply to a subpopulation that is narrower than, and usually different from, the population of substantive interest in evaluation research. The disconnect between the RDD population and the…
Descriptors: Regression (Statistics), Research Design, Validity, Evaluation Methods
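The following sketch is only an assumed illustration of the abstract's point, not code from the article: in a sharp regression discontinuity, the effect is estimated from observations close to the cutoff, so the estimate speaks to that narrow subpopulation rather than the full population of interest. The data, cutoff, and bandwidth below are invented.

```python
# Minimal sharp regression discontinuity sketch; all values are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n, cutoff, bandwidth = 5000, 0.0, 0.25

running = rng.uniform(-1, 1, n)                # assignment (running) variable
treated = (running >= cutoff).astype(float)    # treatment assigned at the cutoff
outcome = running + 0.4 * treated + rng.normal(0, 0.3, n)  # true local effect = 0.4

# Keep only units near the cutoff: the estimate applies to this subpopulation.
near = np.abs(running - cutoff) <= bandwidth
X = np.column_stack([
    np.ones(near.sum()),
    treated[near],
    running[near] - cutoff,
    treated[near] * (running[near] - cutoff),
])
coef, *_ = np.linalg.lstsq(X, outcome[near], rcond=None)
print("Estimated effect at the cutoff:", round(coef[1], 3))
```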
Peer reviewed
Springett, Jane – Educational Action Research, 2017
Participatory Health Research is a collective term adopted globally for participatory action research in a health context. As an approach to research, it challenges the ways currently used within the health sciences to measure research impact, as research, learning and action are integrated throughout the research process and dependent on context and…
Descriptors: Participatory Research, Health Education, Action Research, Program Effectiveness
Peer reviewed
Dawson, Anneka; Yeomans, Emily; Brown, Elena Rosa – Educational Research, 2018
Background: The Education Endowment Foundation (EEF) is an independent charity that was established in 2011 with the explicit aim of breaking the link between family income and educational achievement in England. Over the seven years since its inception, EEF has contributed to the existing evidence base by funding over one hundred randomised…
Descriptors: Foreign Countries, Educational Research, Randomized Controlled Trials, Research Problems
Peer reviewed
Warne, Russell T. – Journal of Advanced Academics, 2016
Card and Giuliano conducted a regression discontinuity study in a large Florida school district to investigate the magnitude of academic benefits of the district's gifted program. They found that for children identified as gifted through an intelligence test, the program provided few or no benefits. But children who were admitted to the gifted…
Descriptors: Special Education, Gifted, Program Evaluation, Educational Research
Peer reviewed
Piccone, Jason E. – Journal of Correctional Education, 2015
The effective evaluation of correctional programs is critically important. However, research in corrections rarely allows for the randomization of offenders to conditions of the study. This limitation compromises internal validity, and thus, causal conclusions can rarely be drawn. Increasingly, researchers are employing propensity score matching…
Descriptors: Correctional Education, Program Evaluation, Probability, Scores
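As a hedged, self-contained illustration of propensity score matching in a setting where randomization is not possible: the data, covariates, and 1-to-1 nearest-neighbour matching rule below are invented for the sketch, not taken from the article.

```python
# Hedged sketch of propensity score matching; everything here is hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 400
X = rng.normal(size=(n, 3))                                # covariates
treat = (X[:, 0] + rng.normal(0, 1, n) > 0).astype(int)    # non-random program participation
y = 0.5 * treat + X[:, 1] + rng.normal(0, 1, n)            # outcome, true effect = 0.5

# 1) Estimate the propensity score: probability of participation given covariates.
ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]

# 2) Match each treated unit to the control with the closest propensity score.
t_idx = np.where(treat == 1)[0]
c_idx = np.where(treat == 0)[0]
matches = c_idx[np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(axis=1)]

# 3) Compare outcomes within matched pairs (effect on the treated).
print("Matched estimate of the treatment effect:",
      round(float(np.mean(y[t_idx] - y[matches])), 3))
```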
Peer reviewed
Adelson, Jill L.; Kelcey, Benjamin – Journal of Advanced Academics, 2016
In this commentary on "Evaluating the Gifted Program of an Urban School District Using a Modified Regression Discontinuity Design" by Davis, Engberg, Epple, Sieg, and Zimmer, we examine the background of the study, critique the methods used, and discuss the results and implications. The study used a fuzzy regression discontinuity design…
Descriptors: Special Education, Gifted, Program Evaluation, Regression (Statistics)