Showing 1 to 15 of 285 results
Peer reviewed
PDF on ERIC: Download full text
What Works Clearinghouse, 2022
Education decisionmakers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Alyson S. Busse – ProQuest LLC, 2024
The purpose of this study was to assess and describe the status of program evaluation within the National Science Foundation's (NSF) Research Experiences for Undergraduates (REU) sites. The National Science Foundation's REU program supports thousands of undergraduates annually in hands-on, immersive research experiences under the mentorship of…
Descriptors: Undergraduate Study, Educational Research, Evaluation Methods, Program Evaluation
Peer reviewed
Direct link
Bárbara Mariana Gutiérrez-Pérez; María Teresa Silva-Fernández; Sara Serrate González – Educational Media International, 2024
The aim of this paper is to describe and analyze the process of co-creation and evaluation of an educational video game application called "Natur-Kingdom," developed within the framework of the NaturTEC-Kids Living Lab with the active participation of children and adolescents. This article aims to demonstrate how the integration of end…
Descriptors: Video Games, Educational Technology, Learner Engagement, Children
Peer reviewed
Direct link
Kheirandish, Shadi; Funk, Mathias; Wensveen, Stephan; Verkerk, Maarten; Rauterberg, Matthias – International Journal of Technology and Design Education, 2020
Human values play an integral role in any design that aims to improve the quality of human life. However, only a few design approaches concentrate on human values, and there is little agreement among them in identifying those values. Considering this, we created a design tool based on a comprehensive value framework to…
Descriptors: Values, Design, Quality of Life, Evaluation Methods
Peer reviewed
Direct link
Begley, Thelma; Daly, Louise; Downes, Carmel; De Vries, Jan; Sharek, Danika; Higgins, Agnes – Sex Education: Sexuality, Society and Learning, 2022
This paper reports on findings from an integrative literature review of evaluation studies undertaken into sexual health promotion preparation programmes aimed at professionals with a sexual health promotion remit. Using a pre-defined search strategy, inclusion criteria and PRISMA guidelines, databases were systematically searched. Studies were…
Descriptors: Training, Sex Education, Health Promotion, Knowledge Level
Peer reviewed
Direct link
Steffen Steinert; Lars Krupp; Karina E. Avila; Anke S. Janssen; Verena Ruf; David Dzsotjan; Christian De Schryver; Jakob Karolus; Stefan Ruzika; Karen Joisten; Paul Lukowicz; Jochen Kuhn; Norbert Wehn; Stefan Küchemann – Education and Information Technologies, 2025
As distance learning becomes increasingly important and artificial intelligence tools continue to advance, automated systems for individual learning have attracted significant attention. However, the scarcity of open-source online tools that are capable of providing personalized feedback has restricted the widespread implementation of…
Descriptors: Higher Education, Open Educational Resources, STEM Education, Feedback (Response)
Wong, Vivian C.; Steiner, Peter M.; Anglin, Kylie L. – Grantee Submission, 2018
Given the widespread use of non-experimental (NE) methods for assessing program impacts, there is a strong need to know whether NE approaches yield causally valid results in field settings. In within-study comparison (WSC) designs, the researcher compares treatment effects from an NE with those obtained from a randomized experiment that shares the…
Descriptors: Evaluation Methods, Program Evaluation, Program Effectiveness, Comparative Analysis
Peer reviewed
PDF on ERIC: Download full text
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Peer reviewed
PDF on ERIC: Download full text
Kautz, Tim; Schochet, Peter Z.; Tilley, Charles – National Center for Education Evaluation and Regional Assistance, 2017
A new design-based theory has recently been developed to estimate impacts for randomized controlled trials (RCTs) and basic quasi-experimental designs (QEDs) for a wide range of designs used in social policy research (Imbens & Rubin, 2015; Schochet, 2016). These methods use the potential outcomes framework and known features of study designs…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Peer reviewed
PDF on ERIC: Download full text
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies (Imbens and Rubin, 2015; Schochet, 2015, 2016). The estimators are derived using the building blocks of experimental designs with minimal assumptions, and are unbiased and normally distributed in large samples…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Peer reviewed
PDF on ERIC: Download full text
Weinstock, Phyllis; Gulemetova, Michaela; Sanchez, Raquel; Silver, David; Barach, Ilana – National Center for Education Evaluation and Regional Assistance, 2019
This evaluation, which focuses on the capacity-building mission of the Comprehensive Centers, was designed to add to the findings of the previous national evaluation of the Comprehensive Centers Program. That evaluation, completed in 2011, used survey data, structured interviews, and rating scales to analyze the extent to which State Education…
Descriptors: Technical Assistance, Federal Programs, Program Design, Program Implementation
Zimmerman, Kathleen N.; Ledford, Jennifer R.; Severini, Katherine E.; Pustejovsky, James E.; Barton, Erin E.; Lloyd, Blair P. – Grantee Submission, 2018
Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional…
Descriptors: Research Design, Evaluation Methods, Synthesis, Validity
Peer reviewed
PDF on ERIC: Download full text
Maria Blanton; Angela Murphy Gardiner; Ana Stephens; Rena Stroud; Eric Knuth; Despina Stylianou – Grantee Submission, 2023
We describe here lessons learned in designing an early algebra curriculum to measure early algebra's impact on children's algebra readiness for middle grades. The curriculum was developed to supplement regular mathematics instruction in Grades K-5. Lessons learned centered around the importance of several key factors, including using conceptual…
Descriptors: Mathematics Curriculum, Curriculum Design, Mathematics Instruction, Kindergarten
Zimmerman, Kathleen N.; Pustejovsky, James E.; Ledford, Jennifer R.; Barton, Erin E.; Severini, Katherine E.; Lloyd, Blair P. – Grantee Submission, 2018
Varying methods for evaluating the outcomes of single case research designs (SCD) are currently used in reviews and meta-analyses of interventions. Quantitative effect size measures are often presented alongside visual analysis conclusions. Six measures across two classes--overlap measures (percentage non-overlapping data, improvement rate…
Descriptors: Research Design, Evaluation Methods, Synthesis, Intervention
Peer reviewed
Direct link
Girginov, Vassil – European Physical Education Review, 2016
The organisers of the 2012 London Olympics have endeavoured explicitly to use the Games to inspire a generation. This is nothing short of putting the main claim of Olympism to the test, but surprisingly the Inspire project has received virtually no scholarly scrutiny. Using an educationally-informed view of inspiration, this paper interrogates the…
Descriptors: Athletics, Evidence, Foreign Countries, Research Design