Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 1 |
Since 2016 (last 10 years) | 5 |
Since 2006 (last 20 years) | 9 |
Descriptor
Experiments | 18 |
Program Evaluation | 18 |
Research Methodology | 18 |
Educational Research | 8 |
Evaluation Methods | 8 |
Program Effectiveness | 5 |
Computation | 4 |
Intervention | 4 |
Research Design | 4 |
Data Collection | 3 |
Design | 3 |
Author
Schochet, Peter Z. | 3 |
Borman, Geoffrey D. | 1 |
Burtless, Gary | 1 |
Curran, F. Chris | 1 |
Dennis, Michael L. | 1 |
Gaus, Hansjoerg | 1 |
Gladkowski, Gerald | 1 |
Glazerman, Steven | 1 |
Greig, Jeffrey | 1 |
Hamilton, Gayle | 1 |
Hansen, Ben | 1 |
Publication Type
Journal Articles | 10 |
Reports - Research | 7 |
Reports - Descriptive | 5 |
Reports - Evaluative | 4 |
Speeches/Meeting Papers | 2 |
Numerical/Quantitative Data | 1 |
Tests/Questionnaires | 1 |
Education Level
Early Childhood Education | 2 |
Elementary Education | 2 |
Elementary Secondary Education | 1 |
Higher Education | 1 |
Kindergarten | 1 |
Postsecondary Education | 1 |
Location
Connecticut | 1 |
Germany | 1 |
Tennessee | 1 |
United Kingdom | 1 |
Assessments and Surveys
Early Childhood Longitudinal… | 1 |
Shen, Zuchao; Curran, F. Chris; You, You; Splett, Joni Williams; Zhang, Huibin – Educational Evaluation and Policy Analysis, 2023
Programs that improve teaching effectiveness represent a core strategy to improve student educational outcomes and close student achievement gaps. This article compiles empirical values of intraclass correlations for designing effective and efficient experimental studies evaluating the effects of these programs. The Early Childhood Longitudinal…
Descriptors: Children, Longitudinal Studies, Surveys, Teacher Empowerment
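As a rough illustration of how empirical intraclass correlations feed into the design of experiments like those the article discusses, here is a minimal Python sketch of a minimum detectable effect size (MDES) calculation for a two-arm cluster-randomized trial. The formula follows the standard Bloom-style approximation (the 2.8 multiplier corresponds to a two-tailed 5% test at 80% power); the function name and parameter choices are illustrative, not taken from the article:

```python
import math

def mdes_cluster_rct(n_clusters, cluster_size, icc,
                     alpha_z=1.96, power_z=0.84, p_treat=0.5):
    """Approximate minimum detectable effect size (in SD units) for a
    two-arm cluster-randomized trial.

    n_clusters   -- total number of clusters J (e.g., schools)
    cluster_size -- individuals per cluster n (e.g., students)
    icc          -- intraclass correlation rho
    p_treat      -- proportion of clusters assigned to treatment
    """
    m = alpha_z + power_z  # ~2.8 for two-tailed alpha=.05, power=.80
    pq = p_treat * (1 - p_treat)
    variance = (icc / (pq * n_clusters)
                + (1 - icc) / (pq * n_clusters * cluster_size))
    return m * math.sqrt(variance)

# Example: 40 schools of 25 students each, rho = 0.15
print(round(mdes_cluster_rct(40, 25, 0.15), 3))
```

Larger ICCs inflate the between-cluster variance term, so the same number of schools can detect only larger effects, which is why compiled empirical ICC values matter for planning.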
White, Mark C.; Rowan, Brian; Hansen, Ben; Lycurgus, Timothy – Journal of Research on Educational Effectiveness, 2019
There is growing pressure to make efficacy experiments more useful. This requires attending to the twin goals of generalizing experimental results to those schools that will use the results and testing the intervention's theory of action. We show how electronic records, created naturally during the daily operation of technology-based…
Descriptors: Program Evaluation, Generalization, Experiments, Records (Forms)
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Kautz, Tim; Schochet, Peter Z.; Tilley, Charles – National Center for Education Evaluation and Regional Assistance, 2017
A new design-based theory has recently been developed to estimate impacts for randomized controlled trials (RCTs) and basic quasi-experimental designs (QEDs) for a wide range of designs used in social policy research (Imbens & Rubin, 2015; Schochet, 2016). These methods use the potential outcomes framework and known features of study designs…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Schochet, Peter Z. – National Center for Education Evaluation and Regional Assistance, 2017
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies (Imbens and Rubin, 2015; Schochet, 2015, 2016). The estimators are derived using the building blocks of experimental designs with minimal assumptions, and are unbiased and normally distributed in large samples…
Descriptors: Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology
Mueller, Christoph Emanuel; Gaus, Hansjoerg – American Journal of Evaluation, 2015
In this article, we test an alternative approach to creating a counterfactual basis for estimating individual and average treatment effects. Instead of using control/comparison groups or before-measures, the so-called Counterfactual as Self-Estimated by Program Participants (CSEPP) relies on program participants' self-estimations of their own…
Descriptors: Intervention, Research Design, Research Methodology, Program Evaluation
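The mechanics of the CSEPP idea are simple enough to sketch: each participant's treatment effect is their observed post-program outcome minus their own self-estimated counterfactual outcome, and the average of these differences estimates the average treatment effect. The sketch below is a hypothetical illustration of that arithmetic only; it does not capture the validity checks the article tests:

```python
def csepp_effects(observed, self_estimated_counterfactual):
    """Counterfactual as Self-Estimated by Program Participants (CSEPP):
    individual effect = observed outcome minus the participant's own
    estimate of their no-program outcome; returns the per-person
    effects and their mean (the estimated average treatment effect)."""
    effects = [obs - cf for obs, cf
               in zip(observed, self_estimated_counterfactual)]
    return effects, sum(effects) / len(effects)

effects, ate = csepp_effects([10, 8, 6], [7, 8, 5])
print(effects, round(ate, 3))
```

The approach trades the cost of a control group for reliance on participants' introspective accuracy, which is exactly the assumption the article puts to the test.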
Jaciw, Andrew; Newman, Denis – Society for Research on Educational Effectiveness, 2011
The purpose of the current work is to apply several main principles of the causal explanatory approach for establishing external validity to the experimental arena. By spanning the paradigm of the experimental approach and the school of program evaluation founded by Lee Cronbach and colleagues, the authors address the question of how research…
Descriptors: Validity, Experiments, Research Methodology, Generalization
Glazerman, Steven – Education Finance and Policy, 2012
Randomized trials are a common way to provide rigorous evidence on the impacts of education programs. This article discusses the trade-offs associated with study designs that involve random assignment of students within schools and describes the experience from one such study of Teach for America (TFA). The TFA experiment faced challenges with…
Descriptors: Educational Research, Research Design, Research Methodology, Experiments
Walker, Robert; Hoggart, Lesley; Hamilton, Gayle – American Journal of Evaluation, 2008
Although random assignment is generally the preferred methodology in impact evaluations, it raises numerous ethical concerns, some of which are addressed by securing participants' informed consent. However, there has been little investigation of how consent is obtained in social experiments and the amount of information that can be conveyed--and…
Descriptors: Employment Programs, Foreign Countries, Case Studies, Program Evaluation

Worthen, Blaine – Evaluation Practice, 1995
It is argued that reports of evaluation activities often leave much to be desired because they fail to portray the real trial-and-error processes of the research study and do not report on the logic in use at the time of the inquiry. (SLD)
Descriptors: Educational Research, Evaluation Methods, Experiments, Logic

Burtless, Gary; Orr, Larry L. – Journal of Human Resources, 1986
This paper examines the major methodological advantages of random assignment for the purpose of estimating the effectiveness of current manpower policy. It also reviews the claimed methodological and ethical objections to experiments. The authors argue that the offsetting gain from experimentation is the inherent reliability of experimental…
Descriptors: Cost Effectiveness, Data Analysis, Data Collection, Experiments

Peck, Laura R. – American Journal of Evaluation, 2003
Proposes a methodology for analyzing the impacts of social programs on previously unexamined subgroups. The approach estimates the impact of programs on subgroups identified by a posttreatment choice while maintaining the integrity of the experimental research design. (SLD)
Descriptors: Evaluation Methods, Experiments, Measurement Techniques, Outcomes of Treatment

Dennis, Michael L. – Evaluation Review, 1990
Six potential problems with the use of randomized experiments to evaluate programs in the field are addressed. Problems include treatment dilution, treatment contamination or confounding, inaccurate case flow and power estimates, violations of the random assignment processes, changes in the environmental context, and changes in the treatment…
Descriptors: Drug Rehabilitation, Evaluation Problems, Experiments, Field Studies

Borman, Geoffrey D. – Peabody Journal of Education, 2002
Asserts that experimental designs are the gold standard for research, addressing common criticisms of the application of experimental designs to education research, discussing why they have not been applied in education as frequently as they have been in other fields, examining circumstances under which experiments are appropriate, and offering…
Descriptors: Educational Improvement, Educational Research, Elementary Secondary Education, Ethics
Levitan, Sar A. – 1992
This paper explores the impact that the evaluation industry has had on the development and implementation of social policy and programs, primarily as carried out by the U.S. Departments of Labor and Health and Human Services. In addition, major tools evaluators have developed and used, and the institutional arrangements through which they have…
Descriptors: Decision Making, Evaluation Methods, Evaluation Utilization, Experiments