Publication Date
In 2025 | 0
Since 2024 | 1
Since 2021 (last 5 years) | 1
Since 2016 (last 10 years) | 1
Since 2006 (last 20 years) | 9
Descriptor
Computation | 10
Program Evaluation | 10
Research Design | 10
Research Methodology | 7
Intervention | 6
Effect Size | 5
Statistical Analysis | 5
Regression (Statistics) | 4
Comparative Analysis | 3
Correlation | 3
Educational Research | 3
Source
American Journal of Evaluation | 2
Mathematica Policy Research, Inc. | 2
Educational Evaluation and Policy Analysis | 1
Educational Researcher | 1
Evaluation Review | 1
Psychology in the Schools | 1
Society for Research on Educational Effectiveness | 1
Springer | 1
Author
Gaus, Hansjoerg | 2
Mueller, Christoph Emanuel | 2
Bloom, Howard S. | 1
Burghardt, John | 1
Deke, John | 1
Dragoset, Lisa | 1
Lei, Qingli | 1
Liu, Di | 1
Lory, Catharine | 1
Porter, Kristin E. | 1
Raudenbush, Stephen | 1
Publication Type
Journal Articles | 6
Reports - Research | 5
Reports - Descriptive | 2
Books | 1
Guides - Non-Classroom | 1
Information Analyses | 1
Reports - Evaluative | 1
Tests/Questionnaires | 1
Education Level
Higher Education | 2
Postsecondary Education | 2
Elementary Education | 1
Elementary Secondary Education | 1
Grade 1 | 1
Grade 2 | 1
Grade 3 | 1
Grade 4 | 1
Grade 5 | 1
Grade 6 | 1
Grade 7 | 1
Location
Germany | 2
Di Liu; Yiwen Mao; Catharine Lory; Qingli Lei; Yingying Zeng – Psychology in the Schools, 2024
Computation is foundational to learning many mathematics concepts and is a functional skill in everyday life. Yet students with autism spectrum disorder (ASD) often face challenges in learning computation skills. The current study aimed to provide quantitative and descriptive analyses of single-case experimental studies on computation…
Descriptors: Computation, Intervention, Students with Disabilities, Autism Spectrum Disorders
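The entry above summarizes quantitative analyses of single-case studies without naming a specific effect size metric. As a hedged illustration only, the sketch below computes nonoverlap of all pairs (NAP), one common summary for single-case data; the phase scores are invented.

```python
from itertools import product

def nap(baseline, intervention):
    """Nonoverlap of All Pairs (NAP): share of (baseline, intervention)
    pairs in which the intervention point exceeds the baseline point,
    counting ties as half. Ranges from 0 to 1; 0.5 means no effect."""
    pairs = list(product(baseline, intervention))
    wins = sum(1.0 for a, b in pairs if b > a)
    ties = sum(0.5 for a, b in pairs if b == a)
    return (wins + ties) / len(pairs)

# Hypothetical digits-correct-per-minute scores for one student
baseline = [4, 6, 5, 7]          # phase A
intervention = [7, 9, 6, 10, 8]  # phase B
print(f"NAP = {nap(baseline, intervention):.2f}")  # -> NAP = 0.90
```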
Mueller, Christoph Emanuel; Gaus, Hansjoerg – American Journal of Evaluation, 2015
In this article, we test an alternative approach to creating a counterfactual basis for estimating individual and average treatment effects. Instead of using control/comparison groups or before-measures, the so-called Counterfactual as Self-Estimated by Program Participants (CSEPP) relies on program participants' self-estimations of their own…
Descriptors: Intervention, Research Design, Research Methodology, Program Evaluation
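A minimal sketch of the CSEPP logic described above, under the assumption that each participant reports a self-estimated no-program outcome: the individual effect is the observed outcome minus that self-estimate, and the average effect is their mean. All data below are hypothetical.

```python
# Observed post-program outcomes and participants' self-estimated
# counterfactual outcomes (what they believe they would have scored
# without the program). Hypothetical data.
observed = [72, 65, 80, 58, 77]
self_estimated_counterfactual = [60, 66, 70, 50, 71]

# Individual treatment effects: observed minus self-estimated counterfactual
individual_effects = [y1 - y0 for y1, y0 in
                      zip(observed, self_estimated_counterfactual)]

# Average treatment effect over participants
ate = sum(individual_effects) / len(individual_effects)
print(individual_effects, round(ate, 2))
```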
Bloom, Howard S.; Porter, Kristin E.; Weiss, Michael J.; Raudenbush, Stephen – Society for Research on Educational Effectiveness, 2013
To date, evaluation research and policy analysis have focused mainly on average program impacts and paid little systematic attention to their variation. Recently, the growing number of multi-site randomized trials being planned and conducted has made it increasingly feasible to study "cross-site" variation in impacts. Important…
Descriptors: Research Methodology, Policy, Evaluation Research, Randomized Controlled Trials
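The abstract does not give the authors' estimator, but one simple, commonly used way to quantify cross-site impact variation is a method-of-moments calculation: the variance of site-level impact estimates minus their average sampling variance. The numbers below are illustrative.

```python
import statistics

# Hypothetical site-level impact estimates and their squared standard errors
site_impacts = [0.12, 0.25, -0.05, 0.18, 0.30, 0.08]
sampling_vars = [0.010, 0.012, 0.015, 0.011, 0.014, 0.013]

# The observed variance of the site estimates equals the true cross-site
# variance plus average sampling variance, so subtract the latter
# (truncating at zero).
observed_var = statistics.variance(site_impacts)
mean_sampling_var = statistics.fmean(sampling_vars)
cross_site_var = max(observed_var - mean_sampling_var, 0.0)
print(f"estimated cross-site SD of impacts: {cross_site_var ** 0.5:.3f}")
```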
Deke, John; Dragoset, Lisa – Mathematica Policy Research, Inc., 2012
The regression discontinuity design (RDD) has the potential to yield findings with causal validity approaching that of the randomized controlled trial (RCT). However, Schochet (2008a) estimated that, on average, an RDD study of an education intervention would need to include three to four times as many schools or students as an RCT to produce…
Descriptors: Research Design, Elementary Secondary Education, Regression (Statistics), Educational Research
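A rough sketch of where the "three to four times" figure comes from: under a linear RDD model, the variance of the impact estimate is inflated by roughly 1/(1 - rho^2) relative to an RCT of the same size, where rho is the correlation between treatment status and the assignment score. The correlations below are illustrative, not Schochet's exact values.

```python
def rdd_design_effect(rho):
    """Variance inflation of an RDD impact estimate relative to an RCT
    of the same size, under a linear model: 1 / (1 - rho**2), where rho
    is the correlation between treatment status and the assignment score."""
    return 1.0 / (1.0 - rho ** 2)

# Illustrative correlations; design effects of roughly 3-4 match the
# entry's "three to four times as many schools or students".
for rho in (0.80, 0.85, 0.87):
    print(f"rho = {rho:.2f} -> sample multiple = {rdd_design_effect(rho):.1f}")
```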
Mueller, Christoph Emanuel; Gaus, Hansjoerg; Rech, Joerg – American Journal of Evaluation, 2014
This article proposes an innovative approach to estimating the counterfactual without requiring information from either a control group or a before-measure. Building on the idea that program participants are capable of estimating the hypothetical state they would be in had they not participated, the basics of the Roy-Rubin model…
Descriptors: Research Design, Program Evaluation, Research Methodology, Models
Spybrook, Jessaca; Raudenbush, Stephen W. – Educational Evaluation and Policy Analysis, 2009
This article examines the power analyses for the first wave of group-randomized trials funded by the Institute of Education Sciences. Specifically, it assesses the precision and technical accuracy of the studies. The authors identified the appropriate experimental design and estimated the minimum detectable standardized effect size (MDES) for each…
Descriptors: Research Design, Research Methodology, Effect Size, Correlation
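For readers unfamiliar with the MDES calculations assessed above, this sketch applies the standard formula for a two-level cluster-randomized trial with no covariate adjustment; the design parameters are illustrative, not drawn from the reviewed studies.

```python
from scipy.stats import t

def mdes_cluster_rct(J, n, icc, alpha=0.05, power=0.80, P=0.5):
    """Minimum detectable effect size (in SD units) for a two-level
    cluster-randomized trial: J clusters of size n, a proportion P of
    clusters assigned to treatment, intraclass correlation icc.
    Standard formula, two-tailed test, no covariate adjustment."""
    df = J - 2
    multiplier = t.ppf(1 - alpha / 2, df) + t.ppf(power, df)
    var_term = icc / (P * (1 - P) * J) + (1 - icc) / (P * (1 - P) * J * n)
    return multiplier * var_term ** 0.5

# Illustrative: 40 schools of 60 students each, ICC = 0.15
print(f"MDES = {mdes_cluster_rct(J=40, n=60, icc=0.15):.3f}")
```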
Rosenthal, James A. – Springer, 2011
Written by a social worker for social work students, this is a nuts-and-bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes numerous examples, data sets, and issues that students will encounter in social work practice. The first section introduces basic concepts and terms to…
Descriptors: Statistics, Data Interpretation, Social Work, Social Science Research
Schochet, Peter; Burghardt, John – Evaluation Review, 2007
This article discusses the use of propensity scoring in experimental program evaluations to estimate impacts for subgroups defined by program features and participants' program experiences. The authors discuss estimation issues and provide specification tests. They also discuss the use of an overlooked data collection design--obtaining predictions…
Descriptors: Program Effectiveness, Scoring, Experimental Programs, Control Groups
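A generic sketch of the propensity-scoring mechanics mentioned above (not the authors' specific subgroup estimator): fit a logistic regression of treatment status on baseline covariates, then stratify on the predicted probabilities. All data are simulated.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical baseline covariates X and a treatment indicator T whose
# probability depends on the covariates
X = rng.normal(size=(500, 3))
T = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * X[:, 0] - 0.3 * X[:, 1]))))

# Propensity score: estimated probability of treatment given covariates
model = LogisticRegression().fit(X, T)
pscores = model.predict_proba(X)[:, 1]

# A subgroup analysis might then compare treated and comparison members
# within bands of the propensity score (here, simple quintile strata).
strata = np.digitize(pscores, np.quantile(pscores, [0.2, 0.4, 0.6, 0.8]))
print(np.bincount(strata))
```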
Stuart, Elizabeth A. – Educational Researcher, 2007
Education researchers, practitioners, and policymakers alike are committed to identifying interventions that teach students more effectively. Increased emphasis on evaluation and accountability has heightened the demand for sound evaluations of these interventions, and at the same time, school-level data have become increasingly available. This article…
Descriptors: Research Methodology, Computation, Causal Models, Intervention
Schochet, Peter Z. – Mathematica Policy Research, Inc., 2005
This paper examines issues related to the statistical power of impact estimates for experimental evaluations of education programs. The focus is on "group-based" experimental designs, because many studies of education programs involve random assignment at the group level (for example, at the school or classroom level) rather than at the student…
Descriptors: Statistical Analysis, Evaluation Methods, Program Evaluation, Research Design
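A quick illustration of why group-level random assignment reduces statistical power, using the standard design-effect formula DEFF = 1 + (n - 1) * ICC; the cluster sizes and ICC below are hypothetical.

```python
def design_effect(n_per_cluster, icc):
    """Variance inflation from assigning treatment at the group level
    instead of the student level: DEFF = 1 + (n - 1) * icc."""
    return 1 + (n_per_cluster - 1) * icc

def effective_sample_size(n_clusters, n_per_cluster, icc):
    """Number of independently assigned students that would give the
    same precision as this clustered design."""
    total = n_clusters * n_per_cluster
    return total / design_effect(n_per_cluster, icc)

# Illustrative: 2,400 students in 40 schools with ICC = 0.15 carry the
# information of only about 240 independently randomized students.
print(f"{effective_sample_size(40, 60, 0.15):.0f}")
```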