Spybrook, Jessaca; Zhang, Qi; Kelcey, Ben; Dong, Nianbo – Educational Evaluation and Policy Analysis, 2020
Over the past 15 years, we have seen an increase in the use of cluster randomized trials (CRTs) to test the efficacy of educational interventions. These studies are often designed with the goal of determining whether a program works, or answering the what works question. Recently, the goals of these studies expanded to include for whom and under…
Descriptors: Randomized Controlled Trials, Educational Research, Program Effectiveness, Intervention

Kelcey, Ben; Phelps, Geoffrey – Educational Evaluation and Policy Analysis, 2013
Despite recent shifts in research emphasizing the value of carefully designed experiments, the number of studies of teacher professional development with rigorous designs has lagged behind its student outcome counterparts. We outline a framework for the design of group randomized trials (GRTs) with teachers' knowledge as the outcome and…
Descriptors: Research Design, Faculty Development, Educational Research, Reading

Song, Mengli; Herman, Rebecca – Educational Evaluation and Policy Analysis, 2010
Drawing on our five years of experience developing WWC evidence standards and reviewing studies against those standards as well as current literature on the design of impact studies, we highlight in this paper some of the most critical issues and common pitfalls in designing and conducting impact studies in education, and provide practical…
Descriptors: Clearinghouses, Program Evaluation, Program Effectiveness, Research Methodology

Zhu, Pei; Jacob, Robin; Bloom, Howard; Xu, Zeyu – Educational Evaluation and Policy Analysis, 2012
This paper provides practical guidance for researchers who are designing and analyzing studies that randomize schools--which comprise three levels of clustering (students in classrooms in schools)--to measure intervention effects on student academic outcomes when information on the middle level (classrooms) is missing. This situation arises…
Descriptors: Educational Research, Educational Researchers, Research Methodology, Multivariate Analysis

Lehne, Richard – Educational Evaluation and Policy Analysis, 1983
Institutions, individuals, policies, and interactions are identified as the four research perspectives that have guided most legislative studies and that could help to identify promising areas for future investigation of education issues. The discussion of each perspective includes a review of landmark research and comments on its contributions…
Descriptors: Educational Policy, Educational Research, Legislators, Policy Formation

Fetterman, David M. – Educational Evaluation and Policy Analysis, 1982
The design and conduct of a national evaluation study is discussed, demonstrating that a control group may not provide the no-cause baseline information expected. Resolution of this problem requires reexamination of paradigms, research practices, and policies, as well as the underlying real world constraints and views that generate them. (PN)
Descriptors: Dropout Research, Educational Research, Ethics, Ethnography

Desimone, Laura M.; Le Floch, Kerstin Carlson – Educational Evaluation and Policy Analysis, 2004
Improving the validity and reliability of surveys is a critical part of the response to the call for improved rigor of education research, policy analysis and evaluation. Too often we create inquiry tools without validating our measures against how respondents interpret our questions, and therefore collect data of questionable quality. The purpose…
Descriptors: Cognitive Processes, Evaluation Methods, Educational Research, Validity