Showing 1 to 15 of 218 results
Peer reviewed
Little, Todd D.; Chang, Rong; Gorrall, Britt K.; Waggenspack, Luke; Fukuda, Eriko; Allen, Patricia J.; Noam, Gil G. – International Journal of Behavioral Development, 2020
We revisit the merits of the retrospective pretest-posttest (RPP) design for repeated-measures research. The underutilized RPP method asks respondents to rate survey items twice during the same posttest measurement occasion from two specific frames of reference: "now" and "then." Individuals first report their current attitudes…
Descriptors: Pretesting, Alternative Assessment, Program Evaluation, Evaluation Methods
Peer reviewed
Lortie-Forgues, Hugues; Inglis, Matthew – Educational Researcher, 2019
In this response, we first show that Simpson's proposed analysis answers a different and less interesting question than ours. We then justify the choice of prior for our Bayes factors calculations, but we also demonstrate that the substantive conclusions of our article are not substantially affected by varying this choice.
Descriptors: Randomized Controlled Trials, Bayesian Statistics, Educational Research, Program Evaluation
Hughes, Katherine L.; Miller, Trey; Reese, Kelly – Grantee Submission, 2021
This report from the Career and Technical Education (CTE) Research Network Lead team provides final results from an evaluability assessment of CTE programs that feasibly could be evaluated using a rigorous experimental design. Evaluability assessments (also called feasibility studies) are used in education and other fields, such as international…
Descriptors: Program Evaluation, Vocational Education, Evaluation Methods, Educational Research
Peer reviewed
Simpson, Adrian – Educational Researcher, 2019
A recent paper uses Bayes factors to argue a large minority of rigorous, large-scale education RCTs are "uninformative." The definition of "uninformative" depends on the authors' hypothesis choices for calculating Bayes factors. These arguably overadjust for effect size inflation and involve a fixed prior distribution,…
Descriptors: Randomized Controlled Trials, Bayesian Statistics, Educational Research, Program Evaluation
Peer reviewed
Makel, Matthew C.; Wai, Jonathan – Journal of Advanced Academics, 2016
In their National Bureau of Economic Research (NBER) working paper, "Does Gifted Education Work? For Which Students?" Card and Giuliano (C & G) made an enormous splash not just in gifted education but also in the wider world (e.g., "The Washington Post," "The Atlantic," FiveThirtyEight). In this commentary, we highlight…
Descriptors: Special Education, Gifted, Economic Research, Educational Research
Peer reviewed
Wing, Coady; Bello-Gomez, Ricardo A. – American Journal of Evaluation, 2018
Treatment effect estimates from a "regression discontinuity design" (RDD) have high internal validity. However, the arguments that support the design apply to a subpopulation that is narrower and usually different from the population of substantive interest in evaluation research. The disconnect between RDD population and the…
Descriptors: Regression (Statistics), Research Design, Validity, Evaluation Methods
Peer reviewed
Springett, Jane – Educational Action Research, 2017
Participatory Health Research is a collective term adopted globally for participatory action research in a health context. As an approach to research, it challenges the ways currently used within the health sciences to measure research impact, since research, learning, and action are integrated throughout the research process and are dependent on context and…
Descriptors: Participatory Research, Health Education, Action Research, Program Effectiveness
Peer reviewed
PDF full text available on ERIC
What Works Clearinghouse, 2014
This "What Works Clearinghouse Procedures and Standards Handbook (Version 3.0)" provides a detailed description of the standards and procedures of the What Works Clearinghouse (WWC). The remaining chapters of this Handbook are organized to take the reader through the basic steps that the WWC uses to develop a review protocol, identify…
Descriptors: Educational Research, Guides, Intervention, Classification
Rymer, Les – Group of Eight (NJ1), 2011
Current economic conditions and the increasing competition for government funding are leading to an increased focus on the impact of research. Measuring the impact of research is difficult because not all impacts are direct and some can be negative or result from the identification of problems that require a non-research response. The time between…
Descriptors: Foreign Countries, Program Effectiveness, Research Utilization, Measurement
Peer reviewed
Hung, Hsin-Ling; Altschuld, James W.; Lee, Yi-Fang – Evaluation and Program Planning, 2008
Although the Delphi is widely used, research on certain methodological issues is somewhat limited. After a brief introduction to the strengths, limitations, and methodological challenges of the technique, we share our experiences (as well as problems encountered) with an electronic Delphi of educational program evaluation (EPE) in the Asia-Pacific…
Descriptors: Delphi Technique, Program Evaluation, Research Methodology, Foreign Countries
Cheung, Alan C. K.; Slavin, Robert E. – Center for Research and Reform in Education, 2011
The use of educational technology in K-12 classrooms has been gaining tremendous momentum across the country since the 1990s. Many school districts have been investing heavily in various types of technology, such as computers, mobile devices, internet access, and interactive whiteboards. Almost all public schools have access to the internet and…
Descriptors: Evidence, Elementary Secondary Education, Mathematics Achievement, Program Effectiveness
Cheung, Alan C. K.; Slavin, Robert E. – Center for Research and Reform in Education, 2011
The present review examines research on the effects of technology use on reading achievement in K-12 classrooms. Unlike previous reviews, this review applies consistent inclusion standards to focus on studies that met high methodological standards. In addition, methodological and substantive features of the studies are investigated to examine the…
Descriptors: Foreign Countries, Evidence, Elementary Secondary Education, Reading Achievement
Carnoy, Martin – Education and the Public Interest Center, 2009
The third-year evaluation of the federally funded Washington, D.C. voucher program shows that low-income students offered vouchers in the first two years of the program had modestly higher reading scores after three years but showed no significant difference in mathematics. Students were randomly assigned to treatment and control groups, and the…
Descriptors: Control Groups, Private Schools, Program Effectiveness, Scoring
Peer reviewed
Hagermoser Sanetti, Lisa M.; Kratochwill, Thomas R. – School Psychology Review, 2009
Treatment integrity (also referred to as "treatment fidelity," "intervention integrity," and "procedural reliability") is an important methodological concern in both research and practice because treatment integrity data are essential to drawing valid conclusions regarding treatment outcomes. Despite its relationship to validity, treatment…
Descriptors: Intervention, Research Methodology, Models, Validity
Peer reviewed
McLeod, Bryce D.; Southam-Gerow, Michael A.; Weisz, John R. – School Psychology Review, 2009
This special series focused on treatment integrity in the child mental health and education field is timely. The articles do a laudable job of reviewing (a) the current status of treatment integrity research and measurement, (b) existing conceptual models of treatment integrity, and (c) the limitations of prior research. Overall, this thoughtful…
Descriptors: Evaluation Research, Children, Intervention, Research Methodology