Showing 1 to 15 of 18 results
Peer reviewed
Milica Miocevic; Fayette Klaassen; Mariola Moeyaert; Gemma G. M. Geuke – Journal of Experimental Education, 2025
Mediation analysis in Single Case Experimental Designs (SCEDs) evaluates intervention mechanisms for individuals. Despite recent methodological developments, no clear guidelines exist for maximizing power to detect the indirect effect in SCEDs. This study compares frequentist and Bayesian methods, determining (1) minimum required sample size to…
Descriptors: Research Design, Mediation Theory, Statistical Analysis, Simulation
Peer reviewed
Huibin Zhang; Zuchao Shen; Walter L. Leite – Journal of Experimental Education, 2025
Cluster-randomized trials have been widely used to evaluate the treatment effects of interventions on student outcomes. When interventions are implemented by teachers, researchers need to account for the nested structure in schools (i.e., students are nested within teachers nested within schools). Schools usually have a very limited number of…
Descriptors: Sample Size, Multivariate Analysis, Randomized Controlled Trials, Correlation
Peer reviewed
Kyle Cox; Ben Kelcey; Hannah Luce – Journal of Experimental Education, 2024
Comprehensive evaluation of treatment effects is aided by considerations for moderated effects. In educational research, the combination of natural hierarchical structures and prevalence of group-administered or shared facilitator treatments often produces three-level partially nested data structures. Literature details planning strategies for a…
Descriptors: Randomized Controlled Trials, Monte Carlo Methods, Hierarchical Linear Modeling, Educational Research
Peer reviewed
Finch, W. Holmes; Finch, Maria Hernández – Journal of Experimental Education, 2018
Single subject (SS) designs are popular in educational and psychological research. There exist several statistical techniques designed to analyze such data and to address the question of whether an intervention has the desired impact. Recently, researchers have suggested that generalized additive models (GAMs) might be useful for modeling…
Descriptors: Educational Research, Longitudinal Studies, Simulation, Models
Peer reviewed
Heyvaert, Mieke; Moeyaert, Mariola; Verkempynck, Paul; Van den Noortgate, Wim; Vervloet, Marlies; Ugille, Maaike; Onghena, Patrick – Journal of Experimental Education, 2017
This article reports on a Monte Carlo simulation study, evaluating two approaches for testing the intervention effect in replicated randomized AB designs: two-level hierarchical linear modeling (HLM) and using the additive method to combine randomization test "p" values (RTcombiP). Four factors were manipulated: mean intervention effect,…
Descriptors: Monte Carlo Methods, Simulation, Intervention, Replication (Evaluation)
Peer reviewed
Ugille, Maaike; Moeyaert, Mariola; Beretvas, S. Natasha; Ferron, John M.; Van den Noortgate, Wim – Journal of Experimental Education, 2014
A multilevel meta-analysis can combine the results of several single-subject experimental design studies. However, the estimated effects are biased if the effect sizes are standardized and the number of measurement occasions is small. In this study, the authors investigated 4 approaches to correct for this bias. First, the standardized effect…
Descriptors: Effect Size, Statistical Bias, Sample Size, Regression (Statistics)
Peer reviewed
Manolov, Rumen; Solanas, Antonio; Bulte, Isis; Onghena, Patrick – Journal of Experimental Education, 2010
This study deals with the statistical properties of a randomization test applied to an ABAB design in cases where the desirable random assignment of the points of change in phase is not possible. To obtain information about each possible data division, the authors carried out a conditional Monte Carlo simulation with 100,000 samples for each…
Descriptors: Monte Carlo Methods, Effect Size, Simulation, Evaluation Methods
Peer reviewed
Luh, Wei-Ming; Guo, Jiin-Huarng – Journal of Experimental Education, 2009
Sample size determination is an important issue in planning research. However, limitations in size have seldom been discussed in the literature. Thus, how to allocate participants into different treatment groups to achieve the desired power is a practical issue that still needs to be addressed when one group size is fixed. The authors focused…
Descriptors: Sample Size, Research Methodology, Evaluation Methods, Simulation
Peer reviewed
Rheinheimer, David C.; Penfield, Douglas A. – Journal of Experimental Education, 2001
Studied, through Monte Carlo simulation, the conditions for which analysis of covariance (ANCOVA) does not maintain adequate Type I error rates and power and evaluated some alternative tests. Discusses differences in ANCOVA robustness for balanced and unbalanced designs. (SLD)
Descriptors: Analysis of Covariance, Monte Carlo Methods, Power (Statistics), Research Design
Peer reviewed
Klockars, Alan J.; Beretvas, S. Natasha – Journal of Experimental Education, 2001
Compared the Type I error rate and the power to detect differences in slopes and additive treatment effects of analysis of covariance (ANCOVA) and randomized block designs through a Monte Carlo simulation. Results show that the more powerful option in almost all simulations for tests of both slope and means was ANCOVA. (SLD)
Descriptors: Analysis of Covariance, Monte Carlo Methods, Power (Statistics), Research Design
Peer reviewed
Hsu, Tse-Chi; Sebatane, E. Molapi – Journal of Experimental Education, 1979
A Monte Carlo technique was used to investigate the effect of the differences in covariate means among treatment groups on the significance level and the power of the F-test of the analysis of covariance. (Author/GDC)
Descriptors: Analysis of Covariance, Correlation, Research Design, Research Problems
Peer reviewed
Sawilowsky, Shlomo; And Others – Journal of Experimental Education, 1994
A Monte Carlo study considers the use of meta analysis with the Solomon four-group design. Experiment-wise Type I error properties and the relative power properties of Stouffer's Z in the Solomon four-group design are explored. Obstacles to conducting meta analysis in the Solomon design are discussed. (SLD)
Descriptors: Meta Analysis, Monte Carlo Methods, Power (Statistics), Research Design
Peer reviewed
Wang, Zhongmiao; Thompson, Bruce – Journal of Experimental Education, 2007
In this study the authors investigated the use of 5 (i.e., Claudy, Ezekiel, Olkin-Pratt, Pratt, and Smith) R² correction formulas with the Pearson r². The authors estimated adjustment bias and precision under 6 x 3 x 6 conditions (i.e., population ρ values of 0.0, 0.1, 0.3, 0.5, 0.7, and 0.9; population shapes normal, skewness…
Descriptors: Effect Size, Correlation, Mathematical Formulas, Monte Carlo Methods
Peer reviewed
Marsh, Herbert W. – Journal of Experimental Education, 1998
Eight variations of a general matching design, matching program participants and a control group, were studied through simulation, for their effectiveness in evaluating programs for the gifted and talented. A regression-discontinuity design provided the best approach, with unbiased estimates of program effects. (SLD)
Descriptors: Control Groups, Gifted, Matched Groups, Program Evaluation
Peer reviewed
Ferron, John; Onghena, Patrick – Journal of Experimental Education, 1996
Monte Carlo methods were used to estimate the power of randomization tests used with single-case designs involving random assignment of treatments to phases. Simulations of two treatments and six phases showed an adequate level of power when effect sizes were large, phase lengths exceeded five, and autocorrelation was not negative. (SLD)
Descriptors: Case Studies, Correlation, Educational Research, Effect Size