Showing 1 to 15 of 198 results
Peer reviewed
Bell, Stephen H.; Stapleton, David C.; Wood, Michelle; Gubits, Daniel – American Journal of Evaluation, 2023
A randomized experiment that measures the impact of a social policy in a sample of the population reveals whether the policy will work on average with universal application. An experiment that includes only the subset of the population that volunteers for the intervention generates narrower "proof-of-concept" evidence of whether the…
Descriptors: Public Policy, Policy Formation, Federal Programs, Social Services
Peer reviewed
What Works Clearinghouse, 2018
Underlying all What Works Clearinghouse (WWC) products are WWC Study Review Guides, which are intended for use by WWC certified reviewers to assess studies against the WWC evidence standards. As part of an ongoing effort to increase transparency, promote collaboration, and encourage widespread use of the WWC standards, the Institute of Education…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Tang, Yun – ProQuest LLC, 2018
Propensity and prognostic score methods are two statistical techniques used to correct for the selection bias in nonexperimental studies. Recently, the joint use of propensity and prognostic scores (i.e., two-score methods) has been proposed to improve the performance of adjustments using propensity or prognostic scores alone for bias reduction.…
Descriptors: Statistical Analysis, Probability, Bias, Program Evaluation
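For readers unfamiliar with the two scores named in the abstract above, the following is a minimal sketch, assuming a pandas DataFrame with hypothetical "treated" and "outcome" columns; it illustrates how each score is commonly estimated and is not the dissertation's own procedure.

```python
# Minimal sketch: propensity and prognostic scores for bias adjustment in a
# nonexperimental comparison. Column names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression, LinearRegression

def estimate_scores(df: pd.DataFrame, covariates: list) -> pd.DataFrame:
    # Propensity score: estimated probability of treatment given covariates,
    # fit on the full sample.
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
    df = df.assign(propensity=ps_model.predict_proba(df[covariates])[:, 1])

    # Prognostic score: predicted outcome under control, fit on the
    # comparison group only and then applied to all units.
    controls = df[df["treated"] == 0]
    prog_model = LinearRegression().fit(controls[covariates], controls["outcome"])
    return df.assign(prognostic=prog_model.predict(df[covariates]))
```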
Peer reviewed
Barnow, Burt S.; Greenberg, David H. – American Journal of Evaluation, 2020
This paper reviews the use of multiple trials, defined as multiple sites or multiple arms in a single evaluation and replications, in evaluating social programs. After defining key terms, the paper discusses the rationales for conducting multiple trials, which include increasing sample size to increase statistical power; identifying the most…
Descriptors: Evaluation, Randomized Controlled Trials, Experiments, Replication (Evaluation)
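As a concrete illustration of the statistical-power rationale mentioned in the abstract above (a sketch with assumed values, not figures from the paper):

```python
# Minimal sketch of the sample-size / power trade-off: larger samples are
# needed to detect smaller effects. Effect size, alpha, and power are assumed.
from statsmodels.stats.power import TTestIndPower

# Per-group sample size to detect a standardized effect of 0.20
# with 80% power at a two-sided alpha of 0.05.
n_per_group = TTestIndPower().solve_power(effect_size=0.20, alpha=0.05, power=0.80)
print(round(n_per_group))  # roughly 393 per group
```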
Peer reviewed
Geuke, Gemma G. M.; Maric, Marija; Miocevic, Milica; Wolters, Lidewij H.; de Haan, Else – New Directions for Child and Adolescent Development, 2019
The major aim of this manuscript is to bring together two important topics that have recently received much attention in child and adolescent research, albeit separately from each other: single-case experimental designs and statistical mediation analysis. Single-case experimental designs (SCEDs) are increasingly recognized as a valuable…
Descriptors: Children, Adolescents, Research, Case Studies
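The following is a minimal sketch of the product-of-coefficients indirect effect that statistical mediation analysis estimates, assuming hypothetical columns "X" (treatment), "M" (mediator), and "Y" (outcome); it is not the SCED-specific approach the article develops.

```python
# Minimal sketch: indirect effect as the product of path a (X -> M) and
# path b (M -> Y controlling for X). Column names are assumptions.
import pandas as pd
import statsmodels.api as sm

def indirect_effect(df: pd.DataFrame) -> float:
    # Path a: effect of X on the mediator M.
    a = sm.OLS(df["M"], sm.add_constant(df[["X"]])).fit().params["X"]
    # Path b: effect of M on the outcome Y, holding X constant.
    b = sm.OLS(df["Y"], sm.add_constant(df[["X", "M"]])).fit().params["M"]
    # The mediated (indirect) effect is the product of the two paths.
    return a * b
```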
Peer reviewed
What Works Clearinghouse, 2016
This document provides step-by-step instructions on how to complete the Study Review Guide (SRG, Version S3, V2) for single-case designs (SCDs). Reviewers will complete an SRG for every What Works Clearinghouse (WWC) review. A completed SRG should be a reviewer's independent assessment of the study, relative to the criteria specified in the review…
Descriptors: Guides, Research Design, Research Methodology, Program Evaluation
Peer reviewed
Hurney, Carol A.; Nash, Carole; Hartman, Christie-Joy B.; Brantmeier, Edward J. – International Journal of Sustainability in Higher Education, 2016
Purpose: Key elements of a curriculum are presented for a faculty development program that integrated sustainability content with effective course design methodology across a variety of disciplines. The study aims to present self-reported impacts for a small number of faculty participants and their courses. Design/methodology/approach: A yearlong…
Descriptors: Sustainability, Course Content, Faculty Development, Instructional Design
Peer reviewed
Company, Pedro; Contero, Manuel; Otey, Jeffrey; Camba, Jorge D.; Agost, María-Jesús; Pérez-López, David – Educational Technology & Society, 2017
This paper describes the implementation and testing of our concept of adaptable rubrics, defined as analytical rubrics that arrange assessment criteria at multiple levels that can be expanded on demand. Because of their adaptable nature, these rubrics cannot be implemented in paper formats, nor are they supported by current Learning Management…
Descriptors: Scoring Rubrics, Case Studies, Evaluation Methods, Formative Evaluation
Yomtov, Dani; Plunkett, Scott W.; Efrat, Rafi; Marin, Adriana Garcia – Journal of College Student Retention: Research, Theory & Practice, 2017
The effectiveness of a peer-mentoring program was examined at a university in California. Previous studies suggest university peer mentoring might increase students' feelings of engagement, which can contribute to their retention. Pretest and posttest data were collected from 304 freshmen (mentored and nonmentored) during the fall of 2012 in a…
Descriptors: College Freshmen, Mentors, Peer Teaching, Pretests Posttests
Peer reviewed
Idawati; Mahmud, Alimuddin; Dirawan, Gufran Darma – International Education Studies, 2016
The purpose of this research was to determine the effectiveness of a community-based training model for capacity building in women's entrepreneurship. The research followed a Research and Development approach, referring to the development research model of Romiszowki (1996) combined with the development model of Sugiono (2011); it was…
Descriptors: Females, Entrepreneurship, Empowerment, Training
Peer reviewed
Fominykh, Mikhail; Prasolova-Førland, Ekaterina; Stiles, Tore C.; Krogh, Anne Berit; Linde, Mattias – Journal of Interactive Learning Research, 2018
This paper presents a concept for designing low-cost therapeutic training with biofeedback and virtual reality. We completed the first evaluation of a prototype--a mobile learning application for relaxation training, primarily for adolescents suffering from tension-type headaches. The system delivers the visual experience on a head-mounted display. A…
Descriptors: Therapy, Relaxation Training, Biofeedback, Computer Simulation
Peer reviewed
Cheung, Alan; Slavin, Robert – Society for Research on Educational Effectiveness, 2016
As evidence-based reform becomes increasingly important in educational policy, it is becoming essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. The purpose of this study was to examine how methodological features such as types of publication, sample sizes, and…
Descriptors: Effect Size, Evidence Based Practice, Educational Change, Educational Policy
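A minimal sketch of the standardized mean difference effect size whose variation across design features the paper above examines (illustrative only, not the study's code):

```python
# Minimal sketch: Cohen's d as the mean difference divided by the pooled SD.
import numpy as np

def cohens_d(treatment: np.ndarray, control: np.ndarray) -> float:
    n_t, n_c = len(treatment), len(control)
    # Pooled standard deviation across treatment and control groups.
    pooled_sd = np.sqrt(((n_t - 1) * treatment.var(ddof=1) +
                         (n_c - 1) * control.var(ddof=1)) / (n_t + n_c - 2))
    return (treatment.mean() - control.mean()) / pooled_sd
```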
Peer reviewed
Guerra-Martín, María Dolores; Lima-Serrano, Marta; Lima-Rodríguez, Joaquín Salvador – Journal of New Approaches in Educational Research, 2017
In response to the increase of Higher Education support provided to tutoring programs, this paper presents the design, implementation and evaluation of a tutoring program to improve the academic performance of at-risk students enrolled in the last year of a nursing degree characterized by academic failure (failed courses). A controlled…
Descriptors: Academic Achievement, Questionnaires, College Faculty, Intervention
Peer reviewed
Vela, Adriana; Jones, Don; Mundy, Marie-Anne; Isaacson, Carrie – Research in Higher Education Journal, 2017
This ex-post-facto, quasi-experimental study was conducted by selecting a convenience sample of approximately 2,000 3rd grade ELLs who took the regular reading and math English STAAR test during the 2014-15 school year in an urban southern Texas school district. This study was conducted using a quantitative research method of data…
Descriptors: Bilingual Education Programs, Quasiexperimental Design, Reading Tests, Mathematics Tests
Peer reviewed
Bloom, Howard S.; Raudenbush, Stephen W.; Weiss, Michael J.; Porter, Kristin – Journal of Research on Educational Effectiveness, 2017
The present article considers a fundamental question in evaluation research: "By how much do program effects vary across sites?" The article first presents a theoretical model of cross-site impact variation and a related estimation model with a random treatment coefficient and fixed site-specific intercepts. This approach eliminates…
Descriptors: Evaluation Research, Program Evaluation, Welfare Services, Employment
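A minimal sketch of the kind of model the abstract above describes, with fixed site-specific intercepts and a treatment effect that varies randomly across sites; the column names are assumptions, and this is not the authors' exact estimator.

```python
# Minimal sketch: mixed model with fixed site intercepts and a random
# treatment slope across sites. Assumes long-format data with hypothetical
# columns "y", "treat", and "site".
import statsmodels.formula.api as smf

def fit_cross_site_model(df):
    # C(site) supplies fixed site-specific intercepts; re_formula="0 + treat"
    # adds a random slope for treatment (no random intercept), so the fitted
    # variance component describes cross-site impact variation.
    model = smf.mixedlm("y ~ treat + C(site)", data=df,
                        groups=df["site"], re_formula="0 + treat")
    return model.fit()
```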