Showing all 7 results
Peer reviewed
Direct link
Green, Samuel B.; Redell, Nickalus; Thompson, Marilyn S.; Levy, Roy – Educational and Psychological Measurement, 2016
Parallel analysis (PA) is a useful empirical tool for assessing the number of factors in exploratory factor analysis. On conceptual and empirical grounds, we argue for a revision to PA that makes it more consistent with hypothesis testing. Using Monte Carlo methods, we evaluated the relative accuracy of the revised PA (R-PA) and traditional PA…
Descriptors: Accuracy, Factor Analysis, Hypothesis Testing, Monte Carlo Methods
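For readers unfamiliar with the baseline procedure being revised here, a minimal sketch of traditional parallel analysis follows. It assumes standard-normal comparison data, the common 95th-percentile criterion, and a sequential stopping rule; the function name and defaults are illustrative, not the authors' implementation.

    import numpy as np

    def parallel_analysis(data, n_sims=1000, percentile=95, seed=0):
        """Traditional parallel analysis: retain leading factors whose
        sample eigenvalues exceed the chosen percentile of eigenvalues
        from randomly generated data of the same shape."""
        rng = np.random.default_rng(seed)
        n, p = data.shape
        sample_eigs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
        random_eigs = np.empty((n_sims, p))
        for s in range(n_sims):
            noise = rng.standard_normal((n, p))
            random_eigs[s] = np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False))[::-1]
        thresholds = np.percentile(random_eigs, percentile, axis=0)
        # Count leading sample eigenvalues that beat their thresholds,
        # stopping at the first failure (the sequential rule).
        k = 0
        for lam, thr in zip(sample_eigs, thresholds):
            if lam > thr:
                k += 1
            else:
                break
        return k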
Peer reviewed
Direct link
Green, Samuel B.; Thompson, Marilyn S.; Levy, Roy; Lo, Wen-Juo – Educational and Psychological Measurement, 2015
Traditional parallel analysis (T-PA) estimates the number of factors by sequentially comparing sample eigenvalues with eigenvalues for randomly generated data. Revised parallel analysis (R-PA) sequentially compares the "k"th eigenvalue for sample data to the "k"th eigenvalue for generated data sets, conditioned on "k"-…
Descriptors: Factor Analysis, Error of Measurement, Accuracy, Hypothesis Testing
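In rough notation (ours, not the authors'), both procedures apply the same sequential rule to the ordered sample eigenvalues and differ only in the null model that generates the reference eigenvalues:

    % Retain factor k while the sample eigenvalue exceeds the 95th
    % percentile of the k-th reference eigenvalue:
    \lambda_k \;>\; q_{.95}\big(\tilde{\lambda}_k\big), \qquad k = 1, 2, \ldots
    % T-PA: \tilde{\lambda}_k comes from purely random (zero-factor) data.
    % R-PA: \tilde{\lambda}_k comes from data generated under a fitted
    %       (k-1)-factor model, i.e., conditioned on k-1 factors.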
Peer reviewed
Direct link
Jin, Ying; Myers, Nicholas D.; Ahn, Soyeon; Penfield, Randall D. – Educational and Psychological Measurement, 2013
The Rasch model, a member of a larger group of models within item response theory, is widely used in empirical studies. Detection of uniform differential item functioning (DIF) within the Rasch model typically employs null hypothesis testing with a concomitant consideration of effect size (e.g., signed area [SA]). Parametric equivalence between…
Descriptors: Test Bias, Effect Size, Item Response Theory, Comparative Analysis
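To make the signed-area (SA) effect size concrete: for the Rasch model, the signed area between the reference and focal item characteristic curves reduces to the difference in item difficulties, a standard result for the one-parameter case. The sketch below verifies this numerically; the difficulty values are hypothetical.

    import numpy as np

    def rasch_icc(theta, b):
        """Rasch item characteristic curve with difficulty b."""
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    # Hypothetical difficulties for the reference and focal groups.
    b_ref, b_foc = 0.2, 0.7

    # Signed area between the two ICCs, integrated over a wide ability
    # range (the integrand vanishes in the tails).
    theta = np.linspace(-10, 10, 20001)
    dtheta = theta[1] - theta[0]
    sa_numeric = np.sum(rasch_icc(theta, b_ref) - rasch_icc(theta, b_foc)) * dtheta

    # Closed form for the Rasch case: SA = b_foc - b_ref.
    print(sa_numeric, b_foc - b_ref)  # both approximately 0.5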
Peer reviewed
Direct link
Froelich, Amy G.; Habing, Brian – Applied Psychological Measurement, 2008
DIMTEST is a nonparametric hypothesis-testing procedure designed to test the assumptions of a unidimensional and locally independent item response theory model. Several previous Monte Carlo studies have found that using linear factor analysis to select the assessment subtest for DIMTEST results in a moderate to severe loss of power when the exam…
Descriptors: Test Items, Monte Carlo Methods, Form Classes (Languages), Program Effectiveness
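The factor-analytic subtest selection under study can be sketched roughly as follows: items loading most strongly on a second factor are taken as the assessment subtest (AT), the candidates for a second dimension. This simplification uses Pearson correlations and a principal-component approximation where actual DIMTEST implementations use tetrachoric correlations and linear factor analysis; the function name and subtest size are illustrative only.

    import numpy as np

    def select_at_subtest(responses, at_size=6):
        """Pick an assessment subtest (AT) for DIMTEST: the items with
        the largest absolute loadings on the second factor of the
        inter-item correlation matrix. Simplified sketch only."""
        corr = np.corrcoef(responses, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(corr)          # ascending order
        loadings2 = eigvecs[:, -2] * np.sqrt(eigvals[-2])  # factor-2 loadings
        return np.argsort(-np.abs(loadings2))[:at_size]    # AT item indices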
Peer reviewed
Wilson, Gale A.; Martin, Samuel A. – Educational and Psychological Measurement, 1983
Either Bartlett's chi-square test of sphericity or Steiger's chi-square test can be used to test the significance of a correlation matrix to determine the appropriateness of factor analysis. They were evaluated using computer-generated correlation matrices. Steiger's test is recommended due to its increased power and computational simplicity.…
Descriptors: Comparative Analysis, Correlation, Factor Analysis, Hypothesis Testing
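Both statistics test whether the population correlation matrix is an identity matrix, in which case factor analysis would be pointless. The sketch below implements Bartlett's chi-square and one common formulation of Steiger's (1980) statistic based on Fisher-z transformed correlations; function names are ours.

    import numpy as np
    from scipy import stats

    def bartlett_sphericity(data):
        """Bartlett's chi-square test that the correlation matrix is identity."""
        n, p = data.shape
        R = np.corrcoef(data, rowvar=False)
        chi2 = -((n - 1) - (2 * p + 5) / 6.0) * np.log(np.linalg.det(R))
        df = p * (p - 1) // 2
        return chi2, stats.chi2.sf(chi2, df)

    def steiger_sphericity(data):
        """Steiger-style test: (n - 3) times the sum of squared
        Fisher-z transformed off-diagonal correlations."""
        n, p = data.shape
        R = np.corrcoef(data, rowvar=False)
        z = np.arctanh(R[np.triu_indices(p, k=1)])
        chi2 = (n - 3) * np.sum(z ** 2)
        df = p * (p - 1) // 2
        return chi2, stats.chi2.sf(chi2, df)

Steiger's statistic needs only the correlations themselves, not a determinant, which is the computational simplicity the abstract cites.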
Peer reviewed
Direct link
Finch, W. Holmes; French, Brian F. – Educational and Psychological Measurement, 2007
Differential item functioning (DIF) continues to receive attention both in applied and methodological studies. Because DIF can be an indicator of irrelevant variance that can influence test scores, continuing to evaluate and improve the accuracy of detection methods is an essential step in gathering score validity evidence. Methods for detecting…
Descriptors: Item Response Theory, Factor Analysis, Test Bias, Comparative Analysis
Pohlmann, John T. – 1972
The Monte Carlo method was used, and the factors considered were (1) level of main effects in the population; (2) level of interaction effects in the population; (3) alpha level used in determining whether to pool; and (4) number of degrees of freedom. The results indicated that when the ratio of degrees of freedom (a×b) to degrees of freedom (within)…
Descriptors: Analysis of Variance, Computer Programs, Factor Analysis, Hypothesis Testing
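The "sometimes pool" procedure being evaluated runs a preliminary test of the A×B interaction; if it is not significant at the pooling alpha, the interaction sum of squares and degrees of freedom are folded into the error term before the main effects are tested. A minimal sketch for a balanced two-way design follows; the function name, pooling alpha, and balanced-design restriction are ours.

    import numpy as np
    from scipy import stats

    def sometimes_pool_anova(y, a, b, pool_alpha=0.25):
        """Balanced two-way ANOVA with the 'sometimes pool' rule.
        y: responses; a, b: integer factor labels per observation."""
        y, a, b = map(np.asarray, (y, a, b))
        grand = y.mean()
        levels_a, levels_b = np.unique(a), np.unique(b)
        n_cell = len(y) / (len(levels_a) * len(levels_b))  # balanced design

        mean_a = {i: y[a == i].mean() for i in levels_a}
        mean_b = {j: y[b == j].mean() for j in levels_b}
        cell = {(i, j): y[(a == i) & (b == j)].mean()
                for i in levels_a for j in levels_b}

        ss_a = n_cell * len(levels_b) * sum((mean_a[i] - grand) ** 2 for i in levels_a)
        ss_b = n_cell * len(levels_a) * sum((mean_b[j] - grand) ** 2 for j in levels_b)
        ss_ab = n_cell * sum((cell[(i, j)] - mean_a[i] - mean_b[j] + grand) ** 2
                             for i in levels_a for j in levels_b)
        ss_w = sum(((y[(a == i) & (b == j)] - cell[(i, j)]) ** 2).sum()
                   for i in levels_a for j in levels_b)

        df_a, df_b = len(levels_a) - 1, len(levels_b) - 1
        df_ab = df_a * df_b
        df_w = len(y) - len(levels_a) * len(levels_b)

        # Preliminary test of the interaction decides whether to pool.
        f_ab = (ss_ab / df_ab) / (ss_w / df_w)
        p_ab = stats.f.sf(f_ab, df_ab, df_w)
        if p_ab > pool_alpha:  # pool: fold A x B into the error term
            ss_err, df_err = ss_w + ss_ab, df_w + df_ab
        else:
            ss_err, df_err = ss_w, df_w

        ms_err = ss_err / df_err
        return {eff: stats.f.sf((ss / df) / ms_err, df, df_err)
                for eff, ss, df in [("A", ss_a, df_a), ("B", ss_b, df_b)]}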