Showing all 3 results
Peer reviewed
Baek, Eunkyeng; Beretvas, S. Natasha; Van den Noortgate, Wim; Ferron, John M. – Journal of Experimental Education, 2020
Recently, researchers have used multilevel models for estimating intervention effects in single-case experiments that include replications across participants (e.g., multiple baseline designs) or for combining results across multiple single-case studies. Researchers estimating these multilevel models have primarily relied on restricted maximum…
Descriptors: Bayesian Statistics, Intervention, Case Studies, Monte Carlo Methods
Peer reviewed
Moeyaert, Mariola; Ugille, Maaike; Ferron, John M.; Beretvas, S. Natasha; Van den Noortgate, Wim – Journal of Experimental Education, 2016
The impact of misspecifying covariance matrices at the second and third levels of the three-level model is evaluated. Results indicate that ignoring existing covariance has no effect on the treatment effect estimate. In addition, the between-case variance estimates are unbiased when covariance is either modeled or ignored. If the research interest…
Descriptors: Hierarchical Linear Modeling, Monte Carlo Methods, Computation, Statistical Bias
Peer reviewed
Smith, Lindsey J. Wolff; Beretvas, S. Natasha – Journal of Experimental Education, 2017
Conventional multilevel modeling works well with purely hierarchical data; however, pure hierarchies rarely exist in real datasets. Applied researchers employ ad hoc procedures to create purely hierarchical data. For example, applied educational researchers either delete mobile participants' data from the analysis or identify the student only with…
Descriptors: Student Mobility, Academic Achievement, Simulation, Influences