Showing all 3 results
Tipton, Elizabeth; Olsen, Robert B. – Educational Researcher, 2018
School-based evaluations of interventions are increasingly common in education research. Ideally, the results of these evaluations are used to make evidence-based policy decisions for students. However, it is difficult to make generalizations from these evaluations because the types of schools included in the studies are typically not selected…
Descriptors: Intervention, Educational Research, Decision Making, Evidence Based Practice
Peer reviewed
Tipton, Elizabeth; Fellers, Lauren; Caverly, Sarah; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Ruiz de Castillo, Veronica – Society for Research on Educational Effectiveness, 2015
Randomized experiments are commonly used to evaluate if particular interventions improve student achievement. While these experiments can establish that a treatment actually "causes" changes, typically the participants are not randomly selected from a well-defined population and therefore the results do not readily generalize. Three…
Descriptors: Site Selection, Randomized Controlled Trials, Educational Experiments, Research Methodology
Peer reviewed
Taylor, Joseph A.; Roth, Kathleen; Wilson, Christopher D.; Stuhlsatz, Molly A. M.; Tipton, Elizabeth – Journal of Research on Educational Effectiveness, 2017
This article describes the effects of an analysis-of-practice professional development (PD) program on elementary school students' (Grades 4-6) science outcomes. The study design was a cluster-randomized trial with an analysis sample of 77 schools, 144 teachers, and 2,823 students. Forty-two schools were randomly assigned to treatment (88.5 hours)…
Descriptors: Faculty Development, Elementary School Science, Science Achievement, Elementary School Students