Publication Date
In 2025 | 0
Since 2024 | 0
Since 2021 (last 5 years) | 1
Since 2016 (last 10 years) | 3
Since 2006 (last 20 years) | 29
Descriptor
Comparative Analysis | 123
Educational Research | 123
Program Evaluation | 123
Program Effectiveness | 41
Research Methodology | 30
Evaluation Methods | 24
Academic Achievement | 21
Statistical Analysis | 20
Foreign Countries | 19
Student Attitudes | 19
Vocational Education | 18
Author
Brown, Alan | 2
Cort, H. Russell, Jr. | 2
Evans, Karen | 2
Hallberg, Kelly | 2
Johnson, Terry | 2
McConnell, Sheena | 2
Peskowitz, Nancy | 2
Swanlund, Andrew | 2
Williams, Ryan | 2
Akers, Lauren | 1
Alexander, Thomas | 1
Location
California | 3
Minnesota | 3
New York | 2
Ohio | 2
South Africa | 2
United Kingdom | 2
United Kingdom (England) | 2
United Kingdom (Great Britain) | 2
Wisconsin | 2
Africa | 1
Asia | 1
Laws, Policies, & Programs
Comprehensive Employment and… | 1
Education Consolidation… | 1
Education Professions… | 1
Emergency School Aid Act 1972 | 1
No Child Left Behind Act 2001 | 1
Vocational Education… | 1
Workforce Investment Act 1998 | 1
Assessments and Surveys
General Educational… | 1
Metropolitan Achievement Tests | 1
Preschool Inventory | 1
What Works Clearinghouse Rating
Does not meet standards | 1
Hansford, Nathaniel; Schechter, Rachel L. – International Journal of Modern Education Studies, 2023
Meta-analyses are systematic summaries of research that use quantitative methods to find the mean effect size (standardized mean difference) for interventions. Critics of meta-analysis point out that such analyses can conflate the results of low- and high-quality studies, make improper comparisons and result in statistical noise. All these…
Descriptors: Meta Analysis, Best Practices, Randomized Controlled Trials, Criticism
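As a hedged illustration of the quantity named in the abstract above, the sketch below computes a standardized mean difference for each of three hypothetical studies and combines them into an inverse-variance weighted mean effect size. The study numbers and the fixed-effect weighting scheme are assumptions for the example, not anything reported by the cited authors.

    # Standardized mean difference (Cohen's d) and a fixed-effect
    # (inverse-variance weighted) mean effect size. All inputs are
    # hypothetical; this is not code from the cited paper.
    import math

    def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
        """Standardized mean difference using the pooled standard deviation."""
        pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
        return (mean_t - mean_c) / pooled_sd

    def d_variance(d, n_t, n_c):
        """Approximate large-sample variance of d."""
        return (n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c))

    # Hypothetical studies: (mean_t, mean_c, sd_t, sd_c, n_t, n_c)
    studies = [(105, 100, 15, 15, 60, 60),
               (0.62, 0.50, 0.30, 0.28, 40, 45),
               (78, 74, 10, 11, 120, 118)]
    effects = [cohens_d(*s) for s in studies]
    weights = [1 / d_variance(d, s[4], s[5]) for d, s in zip(effects, studies)]
    print(sum(w * d for w, d in zip(weights, effects)) / sum(weights))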
Hallberg, Kelly; Williams, Ryan; Swanlund, Andrew – Journal of Research on Educational Effectiveness, 2020
More aggregate data on school performance is available than ever before, opening up new possibilities for applied researchers interested in assessing the effectiveness of school-level interventions quickly and at a relatively low cost by implementing comparative interrupted time series (CITS) designs. We examine the extent to which effect…
Descriptors: Data Use, Research Methodology, Program Effectiveness, Design
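The design discussed in the entry above is typically estimated with a regression that contrasts the post-intervention deviation from the pre-intervention trend in treated schools against the same deviation in comparison schools. The sketch below is one common way to write such a model; the school-by-year file, its column names (score, year, treated, post, school_id), and the specification itself are assumptions for illustration, not the analysis from the cited study.

    # Illustrative CITS specification on a hypothetical school-by-year panel.
    # Assumed columns: score (aggregate outcome), year (centered at the
    # intervention year), treated (1 = intervention school), post (1 = post-
    # intervention year), school_id (for clustered standard errors).
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("school_panel.csv")  # hypothetical file
    model = smf.ols(
        "score ~ year + treated + post + year:treated + year:post"
        " + treated:post + year:treated:post",
        data=df,
    ).fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
    # treated:post captures the treated schools' extra deviation from trend.
    print(model.params["treated:post"])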
Hallberg, Kelly; Williams, Ryan; Swanlund, Andrew; Eno, Jared – Educational Researcher, 2018
Short comparative interrupted time series (CITS) designs are increasingly being used in education research to assess the effectiveness of school-level interventions. These designs can be implemented relatively inexpensively, often drawing on publicly available data on aggregate school performance. However, the validity of this approach hinges on…
Descriptors: Educational Research, Research Methodology, Comparative Analysis, Time
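Because the validity question raised above turns on whether the comparison schools track the treated schools' counterfactual trend, analysts often probe it with a placebo estimate confined to pre-intervention years. The sketch below reuses the hypothetical panel from the previous example and is likewise only an illustration of the general idea, not a procedure taken from the cited article.

    # Placebo check for a short CITS design: pretend the intervention happened
    # two years early, re-estimate on pre-intervention data only, and expect an
    # estimate near zero. Columns are the same hypothetical ones as above.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("school_panel.csv")            # hypothetical file
    pre = df[df["year"] < 0].copy()                 # pre-intervention years only
    pre["placebo_post"] = (pre["year"] >= -2).astype(int)  # fake cutoff
    placebo = smf.ols(
        "score ~ year + treated + placebo_post + treated:placebo_post",
        data=pre,
    ).fit(cov_type="cluster", cov_kwds={"groups": pre["school_id"]})
    # A large placebo "effect" would flag a poor comparison group.
    print(placebo.params["treated:placebo_post"])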
Dong, Nianbo; Lipsey, Mark – Society for Research on Educational Effectiveness, 2014
When randomized control trials (RCT) are not feasible, researchers seek other methods to make causal inference, e.g., propensity score methods. One of the underlying assumptions for the propensity score methods to obtain unbiased treatment effect estimates is the ignorability assumption, that is, conditional on the propensity score, treatment…
Descriptors: Educational Research, Benchmarking, Statistical Analysis, Computation
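The ignorability assumption referenced in the truncated sentence above is conventionally written in potential-outcomes notation as below; this is the textbook Rosenbaum and Rubin formulation rather than notation taken from the cited paper.

    % Strong ignorability, with propensity score e(X) = Pr(T = 1 | X):
    (Y(1), Y(0)) \perp T \mid X, \qquad 0 < e(X) < 1
    % Under these conditions, (Y(1), Y(0)) \perp T \mid e(X) as well, so
    % conditioning on the propensity score alone is enough to remove
    % confounding due to the observed covariates X.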
Ferris, Helena A.; Collins, Mary E. – International Journal of Higher Education, 2015
The landscape of medical education is continuously evolving, as are the needs of the learner. The appropriate use of research and evaluation is key when assessing the need for change and instituting one's innovative endeavours. This paper demonstrates how research seeks to generate new knowledge, whereas evaluation uses information acquired from…
Descriptors: Medical Education, Evidence, Evaluation Methods, Models
Akers, Lauren; Resch, Alexandra; Berk, Jillian – National Center for Education Evaluation and Regional Assistance, 2014
This guide for district and school leaders shows how to recognize opportunities to embed randomized controlled trials (RCTs) into planned policies or programs. Opportunistic RCTs can generate strong evidence for informing education decisions--with minimal added cost and disruption. The guide also outlines the key steps to conduct RCTs and responds…
Descriptors: School Districts, Educational Research, Guides, Program Evaluation
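As a generic illustration of what an embedded, lottery-style RCT buys a district (not an excerpt from the guide itself), the sketch below randomly assigns hypothetical schools to a program slot and reports the intent-to-treat effect as a simple difference in group means.

    # Generic illustration: random assignment plus a difference in means.
    # Units, outcomes, and the effect built into the simulation are all
    # hypothetical; in practice outcomes would come from administrative data.
    import random
    import statistics

    random.seed(0)
    units = [f"school_{i}" for i in range(40)]
    random.shuffle(units)
    treatment, control = units[:20], units[20:]        # the "lottery"

    outcomes = {u: random.gauss(70 + (5 if u in treatment else 0), 10) for u in units}
    itt = (statistics.mean(outcomes[u] for u in treatment)
           - statistics.mean(outcomes[u] for u in control))
    print(round(itt, 2))  # intent-to-treat estimate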
Lane, Forrest C.; To, Yen M.; Shelley, Kyna; Henson, Robin K. – Career and Technical Education Research, 2012
Researchers may be interested in examining the impact of programs that prepare youth and adults for successful careers but unable to implement experimental designs with true randomization of participants. As a result, these studies can be compromised by underlying factors that impact group selection and thus lead to potentially biased results.…
Descriptors: Vocational Education, Educational Research, Research Methodology, Research Design
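One routine diagnostic for the group-selection problem described above is the standardized difference of each baseline covariate between program participants and non-participants. The sketch below shows that calculation on a hypothetical career and technical education dataset; the file name, columns, and enrollment flag are all assumptions for the example.

    # Standardized differences as a selection-bias diagnostic. Values far from
    # zero indicate the covariate imbalance that can bias non-randomized
    # comparisons. Data and column names are hypothetical.
    import math
    import pandas as pd

    df = pd.read_csv("cte_participants.csv")            # hypothetical file
    covariates = ["prior_gpa", "age", "hours_worked"]    # hypothetical columns
    for cov in covariates:
        t = df.loc[df["enrolled"] == 1, cov]
        c = df.loc[df["enrolled"] == 0, cov]
        pooled_sd = math.sqrt((t.var(ddof=1) + c.var(ddof=1)) / 2)
        print(cov, round((t.mean() - c.mean()) / pooled_sd, 2))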
Milesi, Carolina; Brown, Kevin L.; Hawkley, Louise; Dropkin, Eric; Schneider, Barbara L. – Educational Researcher, 2014
Impact evaluation plays a critical role in determining whether federally funded research programs in science, technology, engineering, and mathematics are wise investments. This paper develops quantitative methods for program evaluation and applies this approach to a flagship National Science Foundation-funded education research program, Research…
Descriptors: Federal Aid, Bibliometrics, Financial Support, STEM Education
Fetsch, Robert J.; MacPhee, David; Boyer, Luann K. – Journal of Extension, 2012
Extension agents and specialists have experienced increased pressure for greater program effectiveness and accountability and especially for evidence-based programs. This article builds on previously published evidence-based programming articles. It provides ideas that address three problems that Extension staff face with EBPs and that Extension…
Descriptors: Extension Education, Program Development, Program Evaluation, Program Effectiveness
What Works Clearinghouse, 2012
This document provides guidance about how to describe studies and report their findings in a way that is clear, complete, and transparent. This document does not include information about how studies are judged against What Works Clearinghouse evidence standards. For information about What Works Clearinghouse evidence standards, please refer to…
Descriptors: Intervention, Research Reports, Educational Research, Guides
Kizil, Ruhan Circi; Briggs, Derek; Seidel, Kent; Green, Kathy – Society for Research on Educational Effectiveness, 2014
The evidence that teacher preparation programs have an impact on teacher quality is often limited. Progress in research on this topic will remain rather limited in its influence on practice until more proximal measures of teacher education outcomes can be established. The dearth of variables to measure the impact of teacher preparation programs on…
Descriptors: Teacher Competencies, Teacher Education Programs, Educational Quality, Educational Research
Barber, Larissa K.; Bailey, Sarah F.; Bagsby, Patricia G. – Teaching of Psychology, 2015
The undergraduate psychology curriculum often does not address guidelines for acceptable participant behavior. This two-part study tested the efficacy of a recently developed online learning module on ethical perceptions, knowledge, and behavior. In the preliminary quasi-experiment, students who viewed the module did not have higher…
Descriptors: Ethics, Learning Modules, Online Courses, Educational Research
Kelcey, Ben; Spybrook, Jessaca; Zhang, Jiaqi; Phelps, Geoffrey; Jones, Nathan – Society for Research on Educational Effectiveness, 2015
With research indicating substantial differences among teachers in terms of their effectiveness (Nye, Konstantopoulos, & Hedges, 2004), a major focus of recent research in education has been on improving teacher quality through professional development (Desimone, 2009; Institute of Education Sciences [IES], 2012; Measures of Effective…
Descriptors: Teacher Effectiveness, Faculty Development, Program Design, Educational Research
Parylo, Oksana – Studies in Educational Evaluation, 2012
This sequential mixed methods study analyzed how program evaluation was used to assess educational administration and examined thematic trends in educational evaluation published over 10 years (2001-2010). First, qualitative content analysis examined the articles in eight peer-reviewed evaluation journals. This analysis revealed that numerous…
Descriptors: Educational Administration, Program Evaluation, Mixed Methods Research, Educational Research
What Works Clearinghouse, 2014
This "What Works Clearinghouse Procedures and Standards Handbook (Version 3.0)" provides a detailed description of the standards and procedures of the What Works Clearinghouse (WWC). The remaining chapters of this Handbook are organized to take the reader through the basic steps that the WWC uses to develop a review protocol, identify…
Descriptors: Educational Research, Guides, Intervention, Classification