Showing 1 to 15 of 76 results
Peer reviewed
Chunhua Cao; Xinya Liang – Structural Equation Modeling: A Multidisciplinary Journal, 2024
Cross-loadings are common in multiple-factor confirmatory factor analysis (CFA) but often ignored in measurement invariance testing. This study examined the impact of ignoring cross-loadings on the sensitivity of fit measures (CFI, RMSEA, SRMR, SRMRu, AIC, BIC, SaBIC, LRT) to measurement noninvariance. The manipulated design factors included the…
Descriptors: Goodness of Fit, Error of Measurement, Sample Size, Factor Analysis
Peer reviewed
Philipp Sterner; Kim De Roover; David Goretzko – Structural Equation Modeling: A Multidisciplinary Journal, 2025
When comparing relations and means of latent variables, it is important to establish measurement invariance (MI). Most methods to assess MI are based on confirmatory factor analysis (CFA). Recently, new methods have been developed based on exploratory factor analysis (EFA); most notably, as extensions of multi-group EFA, researchers introduced…
Descriptors: Error of Measurement, Measurement Techniques, Factor Analysis, Structural Equation Models
Peer reviewed
Abdolvahab Khademi; Craig S. Wells; Maria Elena Oliveri; Ester Villalonga-Olives – SAGE Open, 2023
The most common effect sizes when using a multiple-group confirmatory factor analysis approach to measurement invariance are ΔCFI and ΔTLI with a cutoff value of 0.01. However, this recommended cutoff value may not be universally appropriate and may be of limited application for some tests (e.g., measures using dichotomous items or…
Descriptors: Factor Analysis, Factor Structure, Error of Measurement, Test Items
Peer reviewed
Hyunjung Lee; Heining Cham – Educational and Psychological Measurement, 2024
Determining the number of factors in exploratory factor analysis (EFA) is crucial because it affects the rest of the analysis and the conclusions of the study. Researchers have developed various methods for deciding how many factors to retain, but this remains one of the most difficult decisions in EFA. The purpose of this study is…
Descriptors: Factor Structure, Factor Analysis, Monte Carlo Methods, Goodness of Fit
Peer reviewed
PDF on ERIC (full text available)
Teck Kiang Tan – Practical Assessment, Research & Evaluation, 2024
Procedures for carrying out factorial invariance testing to validate a construct for use across groups in comparison and analysis are well developed, yet remain largely restricted to the frequentist approach. This motivates an update incorporating the growing Bayesian approach for carrying out the Bayesian…
Descriptors: Bayesian Statistics, Factor Analysis, Programming Languages, Reliability
Peer reviewed
Mohammad Mehdi Latifi; Dariush Tahmasebi Aghbelaghi; Sajad Khani Pordanjani – European Journal of Education, 2025
The present study sought to assess the psychometric properties of the Iranian adaptation of the Vietnam Teacher Resilience Scale for Asia (VITRS), referred to as the Iranian Teachers' Resilience Scale (ITRS), and to examine its measurement invariance across middle and high school teachers in Iran. In total, 700 participants completed the…
Descriptors: Resilience (Psychology), Error of Measurement, Factor Analysis, Teacher Attitudes
Peer reviewed
Cristian Zanon; Nan Zhao; Nursel Topkaya; Ertugrul Sahin; David L. Vogel; Melissa M. Ertl; Samineh Sanatkar; Hsin-Ya Liao; Mark Rubin; Makilim N. Baptista; Winnie W. S. Mak; Fatima Rashed Al-Darmaki; Georg Schomerus; Ying-Fen Wang; Dalia Nasvytiene – International Journal of Testing, 2025
Examinations of the internal structure of the Depression, Anxiety, and Stress Scale-21 (DASS-21) have yielded inconsistent conclusions within and across cultural contexts. This study examined the dimensionality and reliability of the DASS-21 across three theoretically plausible factor structures (i.e., unidimensional, oblique three-factor, and…
Descriptors: Anxiety, Depression (Psychology), Psychometrics, Cultural Context
Peer reviewed
Yanjing Cao; Chenchen Xu; Shan Lu; Qi Li; Jing Xiao – Psychology in the Schools, 2025
The Patient Health Questionnaire-9 (PHQ-9) is widely used to assess individuals' depression levels. Nevertheless, research regarding its factor structure and measurement invariance remains inadequate. The aim of this study was to examine the factor structure of the PHQ-9 and to further investigate its measurement invariance across gender…
Descriptors: Factor Structure, Error of Measurement, Factor Analysis, Age Differences
Peer reviewed
Lee, Bitna; Sohn, Wonsook – Educational and Psychological Measurement, 2022
A Monte Carlo study was conducted to compare the performance of a level-specific (LS) fit evaluation with that of a simultaneous (SI) fit evaluation in multilevel confirmatory factor analysis (MCFA) models. We extended previous studies by examining their performance under MCFA models with different factor structures across levels. In addition,…
Descriptors: Goodness of Fit, Factor Structure, Monte Carlo Methods, Factor Analysis
Peer reviewed
Karina Mostert; Clarisse van Rensburg; Reitumetse Machaba – Journal of Applied Research in Higher Education, 2024
Purpose: This study examined the psychometric properties of intention to drop out and study satisfaction measures for first-year South African students. The factorial validity, item bias, measurement invariance and reliability were tested. Design/methodology/approach: A cross-sectional design was used. For the study on intention to drop out, 1,820…
Descriptors: Intention, Potential Dropouts, Student Satisfaction, Test Items
Peer reviewed
Zhong Jian Chee; Anke M. Scheeren; Marieke de Vries – Autism: The International Journal of Research and Practice, 2024
Despite several psychometric advantages over the 50-item Autism Spectrum Quotient, an instrument used to measure autistic traits, the abridged AQ-28 and its cross-cultural validity have not been examined as extensively. Therefore, this study aimed to examine the factor structure and measurement invariance of the AQ-28 in 818 Dutch (M[subscript…
Descriptors: Autism Spectrum Disorders, Questionnaires, Factor Structure, Factor Analysis
Peer reviewed
Kathleen Lynne Lane; Wendy Peia Oakes; Mark Matthew Buckman; Nathan Allen Lane; Katie Scarlett Lane; Kandace Fleming; Rebecca E. Swinburne Romine; Rebecca L. Sherod; Chi-Ning Chang; Jamie Jones; Emily Dawn Cantwell; Meredith Crittenden – Remedial and Special Education, 2024
Given the need for a swift, systematic way to identify students with internalizing and externalizing behavior patterns to connect these students with appropriate supports, we present new findings of the Student Risk Screening Scale--Internalizing and Externalizing (SRSS-IE). In this article, we examined (a) factor structure of the SRSS-IE and (b)…
Descriptors: Screening Tests, At Risk Students, Psychometrics, Factor Structure
Peer reviewed
Montoya, Amanda K.; Edwards, Michael C. – Educational and Psychological Measurement, 2021
Model fit indices are being increasingly recommended and used to select the number of factors in an exploratory factor analysis. Growing evidence suggests that the recommended cutoff values for common model fit indices are not appropriate for use in an exploratory factor analysis context. A particularly prominent problem in scale evaluation is the…
Descriptors: Goodness of Fit, Factor Analysis, Cutting Scores, Correlation
Peer reviewed
John B. Buncher; Jayson M. Nissen; Ben Van Dusen; Robert M. Talbot – Physical Review Physics Education Research, 2025
Research-based assessments (RBAs) allow researchers and practitioners to compare student performance across different contexts and institutions. In recent years, research attention has focused on the student populations these RBAs were initially developed with because much of that research was done with "samples of convenience" that were…
Descriptors: Science Tests, Physics, Comparative Analysis, Gender Differences
Peer reviewed
Lehmann, Vicky; Hillen, Marij A.; Verdam, Mathilde G. E.; Pieterse, Arwen H.; Labrie, Nanon H. M.; Fruijtier, Agnetha D.; Oreel, Tom H.; Smets, Ellen M. A.; Visser, Leonie N. C. – International Journal of Social Research Methodology, 2023
The Video Engagement Scale (VES) is a quality indicator to assess engagement in experimental video-vignette studies, but its measurement properties warrant improvement. Data from previous studies were combined (N = 2676) and split into three subsamples for a stepped analytical approach. We tested construct validity, criterion validity,…
Descriptors: Likert Scales, Video Technology, Vignettes, Construct Validity