Showing 1 to 15 of 44 results
K. L. Anglin; A. Krishnamachari; V. Wong – Grantee Submission, 2020
This article reviews important statistical methods for estimating the impact of interventions on outcomes in education settings, particularly programs that are implemented in field, rather than laboratory, settings. We begin by describing the causal inference challenge for evaluating program effects. Then four research designs are discussed that…
Descriptors: Causal Models, Statistical Inference, Intervention, Program Evaluation
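The causal inference challenge this article describes can be illustrated with a minimal sketch (all numbers here are made up for illustration): each unit has two potential outcomes, but only the one matching its assigned condition is observed, and random assignment makes a simple difference in means unbiased for the average treatment effect.

```python
import random

random.seed(1)

# Hypothetical potential-outcomes sketch: true program effect is 2.0.
n = 10_000
y0 = [random.gauss(0, 1) for _ in range(n)]    # outcome without the program
y1 = [v + 2.0 for v in y0]                     # outcome with the program
t = [random.random() < 0.5 for _ in range(n)]  # random assignment

# Only one potential outcome is observed per unit.
treated = [y1[i] for i in range(n) if t[i]]
control = [y0[i] for i in range(n) if not t[i]]
ate_hat = sum(treated) / len(treated) - sum(control) / len(control)
```

Without randomization, the same difference in means would also pick up any systematic differences between the groups, which is what the research designs the article surveys are built to address.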
Peer reviewed
What Works Clearinghouse, 2022
Education decisionmakers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Peer reviewed
Wing, Coady; Bello-Gomez, Ricardo A. – American Journal of Evaluation, 2018
Treatment effect estimates from a "regression discontinuity design" (RDD) have high internal validity. However, the arguments that support the design apply to a subpopulation that is narrower and usually different from the population of substantive interest in evaluation research. The disconnect between RDD population and the…
Descriptors: Regression (Statistics), Research Design, Validity, Evaluation Methods
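The logic of a regression discontinuity design can be sketched with simulated data (a hypothetical illustration, not taken from the article): units at or above a cutoff on a running variable receive the treatment, and a local linear fit on each side, evaluated at the cutoff, estimates the jump. The estimate applies only to units near the cutoff, which is the narrow subpopulation the abstract refers to.

```python
import random

random.seed(7)

# Hypothetical RDD: true jump at the cutoff is 2.0.
n, bandwidth, cutoff = 4000, 0.5, 0.0
r = [random.uniform(-1, 1) for _ in range(n)]
y = [1.0 + 0.5 * ri + (2.0 if ri >= cutoff else 0.0) + random.gauss(0, 0.3)
     for ri in r]

def fit_at_cutoff(points):
    """OLS line through (running var, outcome) pairs, evaluated at r = 0."""
    xs, ys = zip(*points)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((a - mx) * (b - my) for a, b in points)
             / sum((a - mx) ** 2 for a in xs))
    return my - slope * mx  # intercept = predicted outcome at the cutoff

left = [(ri, yi) for ri, yi in zip(r, y) if -bandwidth <= ri < cutoff]
right = [(ri, yi) for ri, yi in zip(r, y) if cutoff <= ri <= bandwidth]
rdd_effect = fit_at_cutoff(right) - fit_at_cutoff(left)
```

Narrowing the bandwidth trades bias for variance, which is one reason the RDD subpopulation can differ from the population of substantive interest.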
Peer reviewed
Chen, Ruey-Shin; Liu, I-Fan – Journal of Educational Computing Research, 2017
Currently, e-learning systems are being widely used in all stages of education. However, it is difficult for school administrators to accurately assess the actual usage performance of a new system, especially when an organization wishes to update the system for users from different backgrounds using new devices such as smartphones. To allow school…
Descriptors: Educational Technology, Technology Uses in Education, College Students, Student Attitudes
Middlemist, George Edward – ProQuest LLC, 2017
During the 2004 legislative session, the Colorado General Assembly enacted Senate Bill 189 (SB189), which established the first system of college vouchers in the United States. The supporters of SB189 hoped that the voucher system, called the College Opportunity Fund (COF), would: 1) stabilize the flow of state funding to higher education; 2)…
Descriptors: Higher Education, Funding Formulas, Educational Finance, Public Colleges
Peer reviewed
Vilaplana Prieto, Cristina – Research-publishing.net, 2016
In 2009, some Spanish regions implemented the Program School 2.0 with the purpose of introducing digital methodologies at schools. The aim of this paper is to analyse which part of the variation in reading scores is due to this program. For this purpose, we use data from the Program for International Student Assessment (PISA 2009 and 2012) for…
Descriptors: Foreign Countries, Reading Achievement, Program Evaluation, International Assessment
Peer reviewed
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
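A minimal sketch of the design-based estimator family this report covers (the small dataset below is invented): the impact estimate is the difference in group means, and the Neyman variance estimate is built from the randomization itself rather than from a regression model.

```python
# Hypothetical design-based (Neyman) analysis of a tiny randomized trial.
treated = [5.0, 7.0, 6.0, 8.0]
control = [3.0, 4.0, 2.0, 5.0]

def mean(v):
    return sum(v) / len(v)

def sample_var(v):
    m = mean(v)
    return sum((x - m) ** 2 for x in v) / (len(v) - 1)

impact = mean(treated) - mean(control)
# Conservative variance estimate: sum of within-group sampling variances.
var_hat = sample_var(treated) / len(treated) + sample_var(control) / len(control)
se = var_hat ** 0.5
```

This variance estimate is conservative because the unobservable covariance between the two potential outcomes is dropped, one of the "minimal assumptions" properties the abstract mentions.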
Peer reviewed
Kautz, Tim; Schochet, Peter Z.; Tilley, Charles – National Center for Education Evaluation and Regional Assistance, 2017
A new design-based theory has recently been developed to estimate impacts for randomized controlled trials (RCTs) and basic quasi-experimental designs (QEDs) for a wide range of designs used in social policy research (Imbens & Rubin, 2015; Schochet, 2016). These methods use the potential outcomes framework and known features of study designs…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Peer reviewed
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies (Imbens and Rubin, 2015; Schochet, 2015, 2016). The estimators are derived using the building blocks of experimental designs with minimal assumptions, and are unbiased and normally distributed in large samples…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Peer reviewed
Powell, Robert B.; Vezeau, Susan Lynn; Stern, Marc J.; Moore, DeWayne D.; Wright, Brett A. – Environmental Education Research, 2018
Understanding the development of pro-environmental behavioral intentions and behaviors remains one of the greatest challenges for environmental educators worldwide. Using the Elaboration Likelihood Model as a theoretical foundation, we developed surveys to evaluate the influence of the Great Smoky Mountains National Park Junior Ranger program on…
Descriptors: Environmental Education, Parks, Youth, Residential Programs
Peer reviewed
Schochet, Peter Z.; Puma, Mike; Deke, John – National Center for Education Evaluation and Regional Assistance, 2014
This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed…
Descriptors: Statistical Analysis, Evaluation Methods, Educational Research, Intervention
Peer reviewed
Baird, Thomas J.; Clark, Linda E. – Professional Development in Education, 2018
Professional development is an important part of creating a culture of continuous school improvement to maximize student learning. Rarely do we measure the impact of professional development on teacher learning, implementation and student outcomes. This study describes a professional development strategy for elementary teachers to promote…
Descriptors: Faculty Development, Models, Curriculum Implementation, Educational Strategies
Peer reviewed
García, Sandra; Saavedra, Juan E. – Review of Educational Research, 2017
We meta-analyze for impact and cost-effectiveness 94 studies from 47 conditional cash transfer programs in low- and middle-income countries worldwide, focusing on educational outcomes that include enrollment, attendance, dropout, and school completion. To conceptually guide and interpret the empirical findings of our meta-analysis, we present a…
Descriptors: Developing Nations, Cost Effectiveness, Meta Analysis, Educational Benefits
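The basic pooling machinery behind a meta-analysis like this one can be sketched with inverse-variance weights (the effect sizes and standard errors below are made up, not drawn from the 94 studies): each study's estimate is weighted by its precision, and the pooled standard error shrinks as precision accumulates.

```python
import math

# Hypothetical fixed-effect meta-analysis of three study-level effects.
effects = [0.20, 0.50, 0.10]   # per-study effect sizes (illustrative)
ses = [0.10, 0.20, 0.05]       # per-study standard errors (illustrative)

weights = [1 / s ** 2 for s in ses]  # precision (inverse-variance) weights
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = 1 / math.sqrt(sum(weights))
```

A random-effects variant would widen the weights to absorb between-study heterogeneity, which matters when pooling programs as varied as 47 conditional cash transfer schemes.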
Peer reviewed
Adedokun, Omolola A.; Childress, Amy L.; Burgess, Wilella D. – American Journal of Evaluation, 2011
A theory-driven approach to evaluation (TDE) emphasizes the development and empirical testing of conceptual models to understand the processes and mechanisms through which programs achieve their intended goals. However, most reported applications of TDE are limited to large-scale experimental/quasi-experimental program evaluation designs. Very few…
Descriptors: Feedback (Response), Program Evaluation, Structural Equation Models, Testing
Peer reviewed
Thomas, Ally S.; Bonner, Sarah M.; Everson, Howard T. – Society for Research on Educational Effectiveness, 2014
Recently, the authors have been exploring the use of propensity score methods for developing evidence of program impact. Specifically, they have been developing evidence (after one year of implementation) of the effects of the Math Science Partnership in New York City ("MSPinNYC2") on high school students' achievement--both in terms of…
Descriptors: Program Evaluation, Probability, Scores, Scoring
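The general propensity-score idea named in this abstract can be sketched as follows (a hypothetical simulation; nothing here is from the MSPinNYC2 study): treatment take-up depends on a covariate, so a naive comparison is confounded; a logistic propensity model is fit and observations are reweighted by inverse probability of their observed treatment.

```python
import math
import random

random.seed(42)

# Hypothetical data: covariate x drives both treatment and outcome;
# the true treatment effect is 3.0.
n = 5000
x = [random.random() for _ in range(n)]
t = [1 if random.random() < 1 / (1 + math.exp(-(2 * xi - 1))) else 0
     for xi in x]
y = [1 + 3 * ti + 2 * xi + random.gauss(0, 0.5) for ti, xi in zip(t, x)]

# Fit logit P(T=1|x) = sigmoid(a + b*x) by gradient ascent (concave).
a = b = 0.0
for _ in range(400):
    ga = gb = 0.0
    for xi, ti in zip(x, t):
        r = ti - 1 / (1 + math.exp(-(a + b * xi)))
        ga += r
        gb += r * xi
    a += ga / n
    b += gb / n

# Inverse-probability-weighted estimate of the average treatment effect.
e = [1 / (1 + math.exp(-(a + b * xi))) for xi in x]
ate = (sum(ti * yi / ei for ti, yi, ei in zip(t, y, e)) / n
       - sum((1 - ti) * yi / (1 - ei) for ti, yi, ei in zip(t, y, e)) / n)
```

Matching or stratifying on the estimated propensity score are common alternatives to weighting; all rely on the assumption that the covariates in the model capture the selection into treatment.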