Showing all 6 results
Peer reviewed
Baek, Eunkyeng; Beretvas, S. Natasha; Van den Noortgate, Wim; Ferron, John M. – Journal of Experimental Education, 2020
Recently, researchers have used multilevel models for estimating intervention effects in single-case experiments that include replications across participants (e.g., multiple baseline designs) or for combining results across multiple single-case studies. Researchers estimating these multilevel models have primarily relied on restricted maximum…
Descriptors: Bayesian Statistics, Intervention, Case Studies, Monte Carlo Methods
Peer reviewed
Baek, Eunkyeng; Luo, Wen; Henri, Maria – Journal of Experimental Education, 2022
It is common to include multiple dependent variables (DVs) in single-case experimental design (SCED) meta-analyses. However, statistical issues associated with multiple DVs in the multilevel modeling approach (i.e., possible dependency of error, heterogeneous treatment effects, and heterogeneous error structures) have not been fully investigated.…
Descriptors: Meta Analysis, Hierarchical Linear Modeling, Comparative Analysis, Statistical Inference
Peer reviewed
Xiao, ZhiMin; Higgins, Steve; Kasim, Adetayo – Journal of Experimental Education, 2019
Lord's Paradox occurs when a continuous covariate is statistically controlled for and the relationship between a continuous outcome and group status indicator changes in both magnitude and direction. This phenomenon poses a challenge to the notion of evidence-based policy, where data are supposed to be self-evident. We examined 50 effect size…
Descriptors: Statistical Analysis, Decision Making, Research Methodology, Scores
Peer reviewed
Dong, Nianbo; Kelcey, Benjamin; Spybrook, Jessaca – Journal of Experimental Education, 2018
Researchers are often interested in whether the effects of an intervention differ conditional on individual- or group-moderator variables such as children's characteristics (e.g., gender), teacher's background (e.g., years of teaching), and school's characteristics (e.g., urbanity); that is, the researchers seek to examine for whom and under what…
Descriptors: Statistical Analysis, Randomized Controlled Trials, Intervention, Effect Size
Peer reviewed
Heyvaert, Mieke; Moeyaert, Mariola; Verkempynck, Paul; Van den Noortgate, Wim; Vervloet, Marlies; Ugille, Maaike; Onghena, Patrick – Journal of Experimental Education, 2017
This article reports on a Monte Carlo simulation study, evaluating two approaches for testing the intervention effect in replicated randomized AB designs: two-level hierarchical linear modeling (HLM) and using the additive method to combine randomization test "p" values (RTcombiP). Four factors were manipulated: mean intervention effect,…
Descriptors: Monte Carlo Methods, Simulation, Intervention, Replication (Evaluation)
Peer reviewed
Hembry, Ian; Bunuan, Rommel; Beretvas, S. Natasha; Ferron, John M.; Van den Noortgate, Wim – Journal of Experimental Education, 2015
A multilevel logistic model for estimating a nonlinear trajectory in a multiple-baseline design is introduced. The model is applied to data from a real multiple-baseline design study to demonstrate interpretation of relevant parameters. A simple change-in-levels (Δ"Levels") model and a model involving a quadratic function…
Descriptors: Computation, Research Design, Data, Intervention