Stephen Gorard – Review of Education, 2024
This paper describes, and lays out an argument for, the use of a procedure to help groups of reviewers to judge the quality of prior research reports. It argues why such a procedure is needed, and how other existing approaches are only relevant to some kinds of research, meaning that a review or synthesis cannot successfully combine quality…
Descriptors: Credibility, Research Reports, Evaluation Methods, Research Design
Matthew J. Mayhew; Christa E. Winkler – Journal of Postsecondary Student Success, 2024
Higher education professionals often are tasked with providing evidence to stakeholders that programs, services, and practices implemented on their campuses contribute to student success. Furthermore, in the absence of a solid base of evidence related to effective practices, higher education researchers and practitioners are left questioning what…
Descriptors: Higher Education, Educational Practices, Evidence Based Practice, Program Evaluation
Christina Weiland; Rebecca Unterman; Susan Dynarski; Rachel Abenavoli; Howard Bloom; Breno Braga; Anne-Marie Faria; Erica Greenberg; Brian A. Jacob; Jane Arnold Lincove; Karen Manship; Meghan McCormick; Luke Miratrix; Tomás E. Monarrez; Pamela Morris-Perez; Anna Shapiro; Jon Valant; Lindsay Weixler – Grantee Submission, 2024
Lottery-based identification strategies offer potential for generating the next generation of evidence on U.S. early education programs. The authors' collaborative network of five research teams applying this design in early education settings and methods experts has identified six challenges that need to be carefully considered in this next…
Descriptors: Early Childhood Education, Program Evaluation, Evaluation Methods, Admission (School)
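The lottery-based identification strategy described in the entry above exploits random admission offers: because winners and losers of an oversubscribed lottery are assigned at random, a simple difference in mean outcomes gives an unbiased intent-to-treat estimate. A minimal sketch of that logic on simulated data (all numbers and variable names here are illustrative, not from the paper):

```python
# Illustrative sketch of an intent-to-treat (ITT) estimate from an admissions
# lottery. The data are simulated; the 0.25 "true effect" is made up.
import random
import statistics

random.seed(0)

applicants = []
for _ in range(2000):
    won = random.random() < 0.5          # random lottery assignment
    baseline = random.gauss(0, 1)        # unobserved ability, balanced by design
    outcome = baseline + (0.25 if won else 0.0) + random.gauss(0, 1)
    applicants.append((won, outcome))

winners = [y for w, y in applicants if w]
losers = [y for w, y in applicants if not w]

# Random assignment makes the raw difference in means an unbiased ITT effect.
itt = statistics.mean(winners) - statistics.mean(losers)
print(round(itt, 2))
```

In practice the challenges the authors flag (waitlists, non-compliance, attrition) mean the analysis is rarely this simple; with imperfect take-up, the lottery offer is typically used as an instrument for enrollment rather than compared directly.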
Christina Weiland; Rebecca Unterman; Susan Dynarski; Rachel Abenavoli; Howard Bloom; Breno Braga; Anne-Marie Faria; Erica Greenberg; Brian A. Jacob; Jane Arnold Lincove; Karen Manship; Meghan McCormick; Luke Miratrix; Tomás E. Monarrez; Pamela Morris-Perez; Anna Shapiro; Jon Valant; Lindsay Weixler – AERA Open, 2024
Lottery-based identification strategies offer potential for generating the next generation of evidence on U.S. early education programs. The authors' collaborative network of five research teams applying this design in early education settings and methods experts has identified six challenges that need to be carefully considered in this next…
Descriptors: Early Childhood Education, Program Evaluation, Evaluation Methods, Admission (School)
Manolov, Rumen; Tanious, René; Fernández-Castilla, Belén – Journal of Applied Behavior Analysis, 2022
In science in general and in the context of single-case experimental designs, replication of the effects of the intervention within and/or across participants or experiments is crucial for establishing causality and for assessing the generality of the intervention effect. Specific developments and proposals for assessing whether an effect has been…
Descriptors: Intervention, Behavioral Science Research, Replication (Evaluation), Research Design
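One family of effect metrics used when judging replication across single-case experiments is nonoverlap statistics; a common one is Nonoverlap of All Pairs (NAP). The sketch below (my illustration, not the authors' specific proposal) computes NAP for two hypothetical participants — similar values across cases are one informal signal of a replicated effect:

```python
# Illustrative sketch: Nonoverlap of All Pairs (NAP) for single-case AB data.
def nap(baseline, treatment):
    """Proportion of all (baseline, treatment) observation pairs in which
    the treatment value exceeds the baseline value; ties count as half."""
    pairs = [(a, b) for a in baseline for b in treatment]
    wins = sum(1.0 if b > a else 0.5 if b == a else 0.0 for a, b in pairs)
    return wins / len(pairs)

# Two participants with complete nonoverlap (every treatment point above
# every baseline point) both score 1.0.
print(nap([2, 3, 3, 4], [5, 6, 6, 7]))   # participant 1
print(nap([1, 2, 2, 3], [4, 5, 5, 6]))   # participant 2
```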
Bower, Kyle L. – American Journal of Evaluation, 2022
The purpose of this paper is to introduce the Five-Level Qualitative Data Analysis (5LQDA) method for ATLAS.ti as a way to intentionally design methodological approaches applicable to the field of evaluation. To demonstrate my analytical process using ATLAS.ti, I use examples from an existing evaluation of a STEM Peer Learning Assistant program.…
Descriptors: Qualitative Research, Data Analysis, Program Evaluation, Evaluation Methods
Angela Johnson; Elizabeth Barker; Marcos Viveros Cespedes – Educational Measurement: Issues and Practice, 2024
Educators and researchers strive to build policies and practices on data and evidence, especially on academic achievement scores. When assessment scores are inaccurate for specific student populations or when scores are inappropriately used, even data-driven decisions will be misinformed. To maximize the impact of the research-practice-policy…
Descriptors: Equal Education, Inclusion, Evaluation Methods, Error of Measurement
Elizabeth Talbott; Andres De Los Reyes; Devin M. Kearns; Jeannette Mancilla-Martinez; Mo Wang – Exceptional Children, 2023
Evidence-based assessment (EBA) requires that investigators employ scientific theories and research findings to guide decisions about what domains to measure, how and when to measure them, and how to make decisions and interpret results. To implement EBA, investigators need high-quality assessment tools along with evidence-based processes. We…
Descriptors: Evidence Based Practice, Evaluation Methods, Special Education, Educational Research
Martínez, Alejandro José Gallard; Pitts, Wesley B.; Brkich, Katie Milton; de Robles, S. Lizette Ramos – Cultural Studies of Science Education, 2020
In this paper we introduce the concept of contextual mitigating factors (CMFs) as an analytical tool for interrogating the contextual landscapes that situate one's research. While we understand that much work has been done on the importance of identifying context in research programs, we continue to stress the importance of developing contextually…
Descriptors: Context Effect, Research Design, Researchers, Evaluation Methods
Weidlich, Joshua; Gašević, Dragan; Drachsler, Hendrik – Journal of Learning Analytics, 2022
As a research field geared toward understanding and improving learning, Learning Analytics (LA) must be able to provide empirical support for causal claims. However, as a highly applied field, tightly controlled randomized experiments are not always feasible nor desirable. Instead, researchers often rely on observational data, based on which they…
Descriptors: Causal Models, Inferences, Learning Analytics, Comparative Analysis
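The core problem the entry above raises — supporting causal claims from observational data — is usually attacked by adjusting for confounders, e.g. via matching, stratification, or propensity scores. A minimal simulated sketch of stratification on a single confounder (all names and numbers are illustrative, not from the paper):

```python
# Illustrative sketch: a naive observational comparison is confounded when
# stronger students self-select into the "treatment"; stratifying on the
# confounder recovers something close to the true effect (0.5 here).
import random
import statistics

random.seed(1)

rows = []
for _ in range(5000):
    prior = random.random()                  # confounder: prior achievement
    treated = random.random() < prior        # stronger students opt in more
    score = 2.0 * prior + (0.5 if treated else 0.0) + random.gauss(0, 0.5)
    rows.append((prior, treated, score))

# Naive comparison: badly inflated by selection on prior achievement.
naive = (statistics.mean(s for _, t, s in rows if t)
         - statistics.mean(s for _, t, s in rows if not t))

# Stratify on the confounder, then average within-stratum differences.
diffs = []
for lo in [i / 10 for i in range(10)]:
    stratum = [(t, s) for p, t, s in rows if lo <= p < lo + 0.1]
    treated_s = [s for t, s in stratum if t]
    control_s = [s for t, s in stratum if not t]
    if treated_s and control_s:
        diffs.append(statistics.mean(treated_s) - statistics.mean(control_s))
stratified = statistics.mean(diffs)
print(round(naive, 2), round(stratified, 2))
```

The stratified estimate lands near the simulated effect while the naive one does not; real learning-analytics data would of course involve many confounders, some unobserved, which is exactly why the authors' caution about causal inference is warranted.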
Radhakrishna, Rama; Chaudhary, Anil Kumar; Tobin, Daniel – Journal of Extension, 2019
We present a framework to help those working in Extension connect program designs with appropriate evaluation designs to improve evaluation. The framework links four distinct Extension program domains--service, facilitation, content transformation, and transformative education--with three types of evaluation design--preexperimental,…
Descriptors: Extension Education, Program Design, Evaluation Methods, Research Design
Christina Weiland; Rebecca Unterman; Susan Dynarski; Rachel Abenavoli; Howard Bloom; Breno Braga; Anne-Marie Faria; Erica Greenberg; Brian Jacob; Jane Arnold Lincove; Karen Manship; Meghan McCormick; Luke Miratrix; Tomás E. Monarrez; Pamela Morris-Perez; Anna Shapiro; Jon Valant; Lindsay Weixler – Annenberg Institute for School Reform at Brown University, 2023
Lottery-based identification strategies offer potential for generating the next generation of evidence on U.S. early education programs. Our collaborative network of five research teams applying this design in early education and methods experts has identified six challenges that need to be carefully considered in this next context: (1) available…
Descriptors: Early Childhood Education, Program Evaluation, Evaluation Methods, Admission (School)
Laura P. Naumann; Samantha N. Jewell; Erin L. Rider – Journal of Faculty Development, 2024
Prior studies indicate that faculty often struggle with Scholarship of Teaching and Learning (SoTL) research due to limited knowledge of relevant research methodologies and data analysis techniques (Boshier, 2009; Kim et al., 2021; McKinney, 2006). Faculty developers at a teaching-intensive institution created an innovative, scaffolded model to…
Descriptors: Scaffolding (Teaching Technique), Faculty Development, Fellowships, Novices
What Works Clearinghouse, 2020
This supplement concerns Appendix E of the "What Works Clearinghouse (WWC) Procedures Handbook, Version 4.1." The supplement extends the range of designs and analyses that can generate effect size and standard error estimates for the WWC. This supplement presents several new standard error formulas for cluster-level assignment studies,…
Descriptors: Educational Research, Evaluation Methods, Effect Size, Research Design
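A key reason cluster-level assignment needs its own standard error formulas, as in the WWC supplement above, is that students within a cluster are correlated. The standard design-effect correction (my illustration of the general idea, not the WWC's exact formulas) inflates a simple-random-assignment standard error by the intraclass correlation:

```python
# Illustrative sketch (not the WWC's exact formulas): the classic design-effect
# correction 1 + (m - 1) * rho for cluster-assigned designs.
import math

def cluster_adjusted_se(se_simple: float, cluster_size: float, icc: float) -> float:
    """Inflate a simple-random-assignment SE by sqrt(1 + (m - 1) * rho),
    where m is average cluster size and rho the intraclass correlation."""
    return se_simple * math.sqrt(1 + (cluster_size - 1) * icc)

# Even a modest ICC more than doubles the SE at typical classroom sizes,
# so ignoring clustering badly understates uncertainty.
print(round(cluster_adjusted_se(0.05, cluster_size=25, icc=0.2), 3))
```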
Hong, Quan Nha; Fàbregues, Sergi; Bartlett, Gillian; Boardman, Felicity; Cargo, Margaret; Dagenais, Pierre; Gagnon, Marie-Pierre; Griffiths, Frances; Nicolau, Belinda; O'Cathain, Alicia; Rousseau, Marie-Claude; Vedel, Isabelle; Pluye, Pierre – Education for Information, 2018
Introduction: Appraising the quality of studies included in systematic reviews combining qualitative and quantitative evidence is challenging. To address this challenge, a critical appraisal tool was developed: the Mixed Methods Appraisal Tool (MMAT). The aim of this paper is to present the enhancements made to the MMAT. Development: The MMAT was…
Descriptors: Mixed Methods Research, Evaluation Methods, Research Design, Evaluation Criteria