Showing 1 to 15 of 120 results
Peer reviewed
What Works Clearinghouse, 2022
Education decisionmakers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. It can be difficult, time-consuming, and costly to access and draw conclusions from relevant studies about the effectiveness of interventions. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Wong, Vivian C.; Steiner, Peter M.; Anglin, Kylie L. – Grantee Submission, 2018
Given the widespread use of non-experimental (NE) methods for assessing program impacts, there is a strong need to know whether NE approaches yield causally valid results in field settings. In within-study comparison (WSC) designs, the researcher compares treatment effects from an NE with those obtained from a randomized experiment that shares the…
Descriptors: Evaluation Methods, Program Evaluation, Program Effectiveness, Comparative Analysis
Zimmerman, Kathleen N.; Ledford, Jennifer R.; Severini, Katherine E.; Pustejovsky, James E.; Barton, Erin E.; Lloyd, Blair P. – Grantee Submission, 2018
Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional…
Descriptors: Research Design, Evaluation Methods, Synthesis, Validity
Zimmerman, Kathleen N.; Pustejovsky, James E.; Ledford, Jennifer R.; Barton, Erin E.; Severini, Katherine E.; Lloyd, Blair P. – Grantee Submission, 2018
Varying methods for evaluating the outcomes of single case research designs (SCD) are currently used in reviews and meta-analyses of interventions. Quantitative effect size measures are often presented alongside visual analysis conclusions. Six measures across two classes--overlap measures (percentage non-overlapping data, improvement rate…
Descriptors: Research Design, Evaluation Methods, Synthesis, Intervention
Peer reviewed
Girginov, Vassil – European Physical Education Review, 2016
The organisers of the 2012 London Olympics have endeavoured explicitly to use the Games to inspire a generation. This is nothing short of putting the main claim of Olympism to the test, but surprisingly the Inspire project has received virtually no scholarly scrutiny. Using an educationally-informed view of inspiration, this paper interrogates the…
Descriptors: Athletics, Evidence, Foreign Countries, Research Design
Sheryl MacMath; Barbara Salingré – Sage Research Methods Cases, 2016
Our research examines the effectiveness of intake variables used by a Canadian post-degree teacher education program over a period of 3 years to select candidates for entry into the program. Using a mixed-methods approach, we compared intake variables (grade point average, written response, work experience, reference letters, academic…
Descriptors: Teacher Education, Mixed Methods Research, Research Methodology, Foreign Countries
Peer reviewed
Solmeyer, Anna R.; Constance, Nicole – American Journal of Evaluation, 2015
Traditionally, evaluation has primarily tried to answer the question "Does a program, service, or policy work?" Recently, more attention has been given to questions about variation in program effects and the mechanisms through which program effects occur. Addressing these kinds of questions requires moving beyond assessing average program…
Descriptors: Program Effectiveness, Program Evaluation, Program Content, Measurement Techniques
Peer reviewed
Higgins, Julian P. T.; Ramsay, Craig; Reeves, Barnaby C.; Deeks, Jonathan J.; Shea, Beverley; Valentine, Jeffrey C.; Tugwell, Peter; Wells, George – Research Synthesis Methods, 2013
Non-randomized studies may provide valuable evidence on the effects of interventions. They are the main source of evidence on the intended effects of some types of interventions and often provide the only evidence about the effects of interventions on long-term outcomes, rare events or adverse effects. Therefore, systematic reviews on the effects…
Descriptors: Research Methodology, Intervention, Program Effectiveness, Program Evaluation
Jennifer Scoles; Mark Huxham; Jan McArthur – Sage Research Methods Cases, 2014
Researchers who are interested in asking a number of questions about a particular phenomenon are increasingly looking to mixed methods as a useful research approach. In this case study, we discuss how mixed-methods research is situated in current debates about quantitative and qualitative methods. Then, we provide some reasons as to why we, and…
Descriptors: Mixed Methods Research, Research Methodology, Statistical Analysis, Qualitative Research
Peer reviewed
Marcus, Sue M.; Stuart, Elizabeth A.; Wang, Pei; Shadish, William R.; Steiner, Peter M. – Psychological Methods, 2012
Although randomized studies have high internal validity, generalizability of the estimated causal effect from randomized clinical trials to real-world clinical or educational practice may be limited. We consider the implication of randomized assignment to treatment, as compared with choice of preferred treatment as it occurs in real-world…
Descriptors: Educational Practices, Program Effectiveness, Validity, Causal Models
Peer reviewed
Jamaludin, Khairul Azhar; Alias, Norlidah; DeWitt, Dorothy – Turkish Online Journal of Educational Technology - TOJET, 2015
The practice of homeschooling still receives contrasting responses regarding its relevance and effectiveness. The current study aims to map the trends in eleven selected studies from various educational journals. The analysis focuses on mapping the trends in: a) research settings, b) target sample, c) method or instrument used, d) common focus or…
Descriptors: Journal Articles, Home Schooling, Educational Practices, Educational Research
Peer reviewed
Harvill, Eleanor L.; Peck, Laura R.; Bell, Stephen H. – American Journal of Evaluation, 2013
Using exogenous characteristics to identify endogenous subgroups, the approach discussed in this method note creates symmetric subsets within treatment and control groups, allowing the analysis to take advantage of an experimental design. In order to maintain treatment--control symmetry, however, prior work has posited that it is necessary to use…
Descriptors: Experimental Groups, Control Groups, Research Design, Sampling
Peer reviewed
Losinski, Mickey; Maag, John W.; Katsiyannis, Antonis; Ennis, Robin Parks – Exceptional Children, 2014
Interventions based on the results of functional behavioral assessment (FBA) have been the topic of extensive research and, in certain cases, mandated for students with disabilities under the Individuals With Disabilities Education Act. There exist a wide variety of methods for conducting such assessments, with little consensus in the field. The…
Descriptors: Intervention, Predictor Variables, Program Effectiveness, Educational Quality
Peer reviewed
Bell, Stephen H.; Puma, Michael J.; Cook, Ronna J.; Heid, Camilla A. – Society for Research on Educational Effectiveness, 2013
Access to Head Start has been shown to improve children's preschool experiences and school readiness on selected factors through the end of 1st grade. Two more years of follow-up, through the end of 3rd grade, can now be examined to determine whether these effects continue into the middle elementary grades. The statistical design and impact…
Descriptors: Evaluation Methods, Data Analysis, Randomized Controlled Trials, Sampling
Monahan, Shannon; Kratochwill, Thomas; Lipscomb, Stephen – Society for Research on Educational Effectiveness, 2011
The What Works Clearinghouse (WWC) seeks to provide educators, policymakers, researchers, and the public with a central and trusted source of scientific evidence for what works in education. The WWC was established in 2002 by the U.S. Department of Education's Institute of Education Sciences (IES). It serves as a decision-making resource by…
Descriptors: Expertise, Evidence, Educational Research, Clearinghouses