Hill, Carolyn J.; Scher, Lauren; Haimson, Joshua; Granito, Kelly – National Center for Education Evaluation and Regional Assistance, 2023
Implementation analyses conducted as part of impact studies can help educators know whether a tested intervention is likely to be a good fit for their own settings. This guide can help researchers design and conduct these kinds of analyses. The guide provides steps and recommendations about ways to specify implementation research questions, assess…
Descriptors: Program Implementation, Intervention, Educational Research, Context Effect
What Works Clearinghouse, 2022
Education decision makers need access to the best evidence about the effectiveness of education interventions, including practices, products, programs, and policies. Accessing relevant studies and drawing conclusions from them about the effectiveness of interventions can be difficult, time-consuming, and costly. The What Works Clearinghouse (WWC)…
Descriptors: Program Evaluation, Program Effectiveness, Standards, Educational Research
Steve Klein; Cherise Moore – Career and Technical Education Research Network, 2021
This is the sixth in a series of six practitioner training modules developed as part of the Career & Technical Education (CTE) Research Network Lead. Designed for CTE practitioners and state agency staff, these modules aim to strengthen the capacity to access, understand, and use CTE data and research, as well as to conduct one's own…
Descriptors: Vocational Education, Educational Research, Research Utilization, Data Use
Lohr, Sharon; Schochet, Peter Z.; Sanders, Elizabeth – National Center for Education Research, 2014
Suppose an education researcher wants to test the impact of a high school drop-out prevention intervention in which at-risk students attend classes to receive intensive summer school instruction. The district will allow the researcher to randomly assign students to the treatment classes or to the control group. Half of the students (the treatment…
Descriptors: Educational Research, Research Design, Data Analysis, Intervention
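The design described in the Lohr, Schochet, and Sanders abstract begins with randomly assigning half of the students to treatment classes and half to the control group. As an illustration only (the function name, seeding, and student IDs are hypothetical, not taken from the report), such an assignment might be sketched as:

```python
import random

def assign_half(students, seed=0):
    """Randomly assign half of a list of student IDs to 'treatment'
    and the rest to 'control'. Seeding makes the draw reproducible.
    A minimal sketch, not the report's actual procedure."""
    rng = random.Random(seed)
    shuffled = list(students)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    treatment = set(shuffled[:half])
    # Map every student ID to its assigned study arm.
    return {s: ("treatment" if s in treatment else "control")
            for s in students}
```

In practice such an assignment would typically be stratified (e.g., within schools or risk categories); this sketch shows only the simplest coin-flip design the abstract alludes to.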
Porter, Kristin E. – Journal of Research on Educational Effectiveness, 2018
Researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple testing procedures (MTPs) are statistical…
Descriptors: Statistical Analysis, Program Effectiveness, Intervention, Hypothesis Testing
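Porter's abstract refers to multiple testing procedures (MTPs) in general. One widely used example is the Benjamini-Hochberg step-up procedure, which controls the false discovery rate across a family of hypothesis tests. A minimal sketch (this particular procedure is offered as an illustration of the MTP idea, not necessarily one the article evaluates):

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure.

    Given a list of p-values, return a parallel list of booleans
    indicating which hypotheses to reject while controlling the
    false discovery rate at level alpha."""
    m = len(pvals)
    # Indices of the p-values sorted from smallest to largest.
    order = sorted(range(m), key=lambda i: pvals[i])
    # Find the largest rank k with p_(k) <= (k / m) * alpha.
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k_max = rank
    # Reject the k_max hypotheses with the smallest p-values.
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            reject[i] = True
    return reject
```

Because it is a step-up procedure, a large p-value that still clears its own threshold can rescue smaller p-values that failed theirs, which is exactly the multiplicity behavior the abstract's "spurious findings" concern is about.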
Teacher Incentive Fund, US Department of Education, 2016
The U.S. Department of Education (ED) expects all Teacher Incentive Fund (TIF) grantees to conduct an evaluation of their programs. Experience with earlier rounds of TIF grants has shown that evaluations can provide valuable information for managing and improving TIF-supported activities, as well as evidence that these activities have had a…
Descriptors: Program Evaluation, Questioning Techniques, Qualitative Research, Statistical Analysis
Porter, Kristin E. – Grantee Submission, 2017
Researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple testing procedures (MTPs) are statistical…
Descriptors: Statistical Analysis, Program Effectiveness, Intervention, Hypothesis Testing
Kekahio, Wendy; Baker, Myriam – Regional Educational Laboratory Pacific, 2013
Using data strategically to guide decisions and actions can have a positive effect on education practices and processes. This facilitation guide shows education data teams how to move beyond simply reporting data to applying data to direct strategic action. Using guiding questions, suggested activities, and activity forms, this guide provides…
Descriptors: Research Utilization, Data Analysis, Strategic Planning, Decision Making
Institute of Education Sciences, 2013
In January 2011, a Joint Committee of representatives from the U.S. Department of Education (ED) and the U.S. National Science Foundation (NSF) began work to establish cross-agency guidelines for improving the quality, coherence, and pace of knowledge development in science, technology, engineering and mathematics (STEM) education. Although the…
Descriptors: STEM Education, Research and Development, Intervention, Educational Improvement
Faria, Ann-Marie; Hawkinson, Laura; Metzger, Ivan; Bouacha, Nora; Cantave, Michelle – Regional Educational Laboratory Midwest, 2017
A quality rating and improvement system (QRIS) is a voluntary state assessment system that uses multidimensional data on early childhood education programs to rate program quality, support quality improvement efforts, and provide information to families about the quality of available early childhood education programs. QRISs have two components:…
Descriptors: Early Childhood Education, Educational Quality, Educational Improvement, Educational Practices
Porter, Kristin E. – MDRC, 2016
In education research and in many other fields, researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple…
Descriptors: Statistical Analysis, Program Effectiveness, Intervention, Hypothesis Testing
Akers, Lauren; Resch, Alexandra; Berk, Jillian – National Center for Education Evaluation and Regional Assistance, 2014
This guide for district and school leaders shows how to recognize opportunities to embed randomized controlled trials (RCTs) into planned policies or programs. Opportunistic RCTs can generate strong evidence for informing education decisions--with minimal added cost and disruption. The guide also outlines the key steps to conduct RCTs and responds…
Descriptors: School Districts, Educational Research, Guides, Program Evaluation
What Works Clearinghouse, 2014
This "What Works Clearinghouse Procedures and Standards Handbook (Version 3.0)" provides a detailed description of the standards and procedures of the What Works Clearinghouse (WWC). The remaining chapters of this Handbook are organized to take the reader through the basic steps that the WWC uses to develop a review protocol, identify…
Descriptors: Educational Research, Guides, Intervention, Classification
Center for Research and Reform in Education, 2017
Educational technology products offer potentially effective means of supporting teaching and learning in K-12 classrooms. But for any given instructional need, there are likely to be numerous product options available for purchasing. What can school districts do to help ensure that good selections are made? In a recent, comprehensive study of…
Descriptors: Educational Technology, Technology Uses in Education, School Districts, Purchasing
Zhu, Pei; Jacob, Robin; Bloom, Howard; Xu, Zeyu – MDRC, 2011
This paper provides practical guidance for researchers who are designing and analyzing studies that randomize schools--which comprise three levels of clustering (students in classrooms in schools)--to measure intervention effects on student academic outcomes when information on the middle level (classrooms) is missing. This situation arises…
Descriptors: Intervention, Academic Achievement, Research Methodology, Research Design