Showing 1 to 15 of 131 results
Peer reviewed
Yan Xia; Xinchang Zhou – Educational and Psychological Measurement, 2025
Parallel analysis has been considered one of the most accurate methods for determining the number of factors in factor analysis. One major advantage of parallel analysis over traditional factor retention methods (e.g., Kaiser's rule) is that it addresses the sampling variability of eigenvalues obtained from the identity matrix, representing the…
Descriptors: Factor Analysis, Statistical Analysis, Evaluation Methods, Sampling
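The core comparison behind parallel analysis, as described in the entry above, can be sketched in a few lines. This is a minimal illustration, not code from the article: it assumes a numeric data matrix X (rows = respondents, columns = items), and the 500 replications and 95th-percentile threshold are conventional but illustrative choices.

import numpy as np

def parallel_analysis(X, n_sims=500, percentile=95, seed=0):
    # Retain factors whose observed eigenvalues exceed the chosen percentile
    # of eigenvalues obtained from random, uncorrelated data of the same size
    # (i.e., data whose population correlation matrix is the identity).
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs_eigs = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
    sim_eigs = np.empty((n_sims, p))
    for s in range(n_sims):
        Z = rng.standard_normal((n, p))
        sim_eigs[s] = np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False))[::-1]
    threshold = np.percentile(sim_eigs, percentile, axis=0)
    return int(np.sum(obs_eigs > threshold))  # suggested number of factors

Comparing observed eigenvalues to a percentile of the simulated distribution, rather than to a fixed value such as 1.0, is what lets the method account for the sampling variability the abstract highlights.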
Peer reviewed
Timothy R. Konold; Elizabeth A. Sanders; Kelvin Afolabi – Structural Equation Modeling: A Multidisciplinary Journal, 2025
Measurement invariance (MI) is an essential part of validity evidence concerned with ensuring that tests function similarly across groups, contexts, and time. Most evaluations of MI involve multigroup confirmatory factor analyses (MGCFA) that assume simple structure. However, recent research has shown that constraining non-target indicators to…
Descriptors: Evaluation Methods, Error of Measurement, Validity, Monte Carlo Methods
Peer reviewed
Jingwen Wang; Xiaohong Yang; Dujuan Liu – International Journal of Web-Based Learning and Teaching Technologies, 2024
The large-scale expansion of online courses has given rise to a crisis of course quality. In this study, we first established an evaluation index system for online courses using factor analysis, encompassing three key constructs: course resource construction, course implementation, and teaching effectiveness. Subsequently, we employed factor…
Descriptors: Educational Quality, Online Courses, Course Evaluation, Models
Peer reviewed
Pere J. Ferrando; Ana Hernández-Dorado; Urbano Lorenzo-Seva – Structural Equation Modeling: A Multidisciplinary Journal, 2024
A frequent criticism of exploratory factor analysis (EFA) is that it does not allow correlated residuals to be modelled, while they can be routinely specified in the confirmatory (CFA) model. In this article, we propose an EFA approach in which both the common factor solution and the residual matrix are unrestricted (i.e., the correlated residuals…
Descriptors: Correlation, Factor Analysis, Models, Goodness of Fit
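In the usual factor-analytic notation (generic, not quoted from the article), the proposal amounts to fitting

$\mathbf{\Sigma} = \mathbf{\Lambda}\,\mathbf{\Phi}\,\mathbf{\Lambda}^{\top} + \mathbf{\Theta}$

with an unrestricted exploratory loading matrix $\mathbf{\Lambda}$, factor correlation matrix $\mathbf{\Phi}$, and a residual matrix $\mathbf{\Theta}$ that is not forced to be diagonal; nonzero off-diagonal elements of $\mathbf{\Theta}$ are the correlated residuals that standard EFA cannot accommodate.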
Peer reviewed
Guo, Wenjing; Choi, Youn-Jeng – Educational and Psychological Measurement, 2023
Determining the number of dimensions is extremely important in applying item response theory (IRT) models to data. Traditional and revised parallel analyses have been proposed within the factor analysis framework, and both have shown some promise in assessing dimensionality. However, their performance in the IRT framework has not been…
Descriptors: Item Response Theory, Evaluation Methods, Factor Analysis, Guidelines
Peer reviewed
Yuanfang Liu; Mark H. C. Lai; Ben Kelcey – Structural Equation Modeling: A Multidisciplinary Journal, 2024
Measurement invariance holds when a latent construct is measured in the same way across different levels of background variables (continuous or categorical) while controlling for the true value of that construct. Using Monte Carlo simulation, this paper compares the multiple indicators, multiple causes (MIMIC) model and MIMIC-interaction to a…
Descriptors: Classification, Accuracy, Error of Measurement, Correlation
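As background for the comparison above, a generic MIMIC specification (standard notation, not taken from the article) regresses both the latent construct and, when probing invariance, the individual items on the background variable $x$:

$\eta = \gamma x + \zeta, \qquad y_j = \nu_j + \lambda_j \eta + \beta_j x + \varepsilon_j$

A nonzero direct effect $\beta_j$ flags item $j$ as noninvariant in its intercept, while the MIMIC-interaction variant adds an $x \times \eta$ product term so that loading (nonuniform) noninvariance can also be detected.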
Peer reviewed
Pere J. Ferrando; David Navarro-González; Fabia Morales-Vives – Educational and Psychological Measurement, 2025
The problem of local item dependencies (LIDs) is very common in personality and attitude measures, particularly in those that measure narrow-bandwidth dimensions. At the structural level, these dependencies can be modeled by using extended factor analytic (FA) solutions that include correlated residuals. However, the effects that LIDs have on the…
Descriptors: Scores, Accuracy, Evaluation Methods, Factor Analysis
Peer reviewed
D'Urso, E. Damiano; Tijmstra, Jesper; Vermunt, Jeroen K.; De Roover, Kim – Educational and Psychological Measurement, 2023
Assessing the measurement model (MM) of self-report scales is crucial to obtain valid measurements of individuals' latent psychological constructs. This entails evaluating the number of measured constructs and determining which construct is measured by which item. Exploratory factor analysis (EFA) is the most-used method to evaluate these…
Descriptors: Factor Analysis, Measurement Techniques, Self Evaluation (Individuals), Psychological Patterns
Peer reviewed
Ha Thi Thu Bui; Quyen Thi Tu Bui; Thanh Thi Phuong Nguyen; Quang Huu Cao; Thuy Van Phung; Ha Thanh Nguyen – Quality Assurance in Education: An International Perspective, 2023
Purpose: Service quality has been widely recognized as a core value of any higher education institution (HEI), especially in the context of higher education reform in Vietnam. The paper aims to assess students' perceived service quality using the SERVPERF scale and to examine the relations between perceived service quality, satisfaction and…
Descriptors: Foreign Countries, Higher Education, Educational Experience, Educational Quality
Peer reviewed
Montoya, Amanda K.; Edwards, Michael C. – Educational and Psychological Measurement, 2021
Model fit indices are being increasingly recommended and used to select the number of factors in an exploratory factor analysis. Growing evidence suggests that the recommended cutoff values for common model fit indices are not appropriate for use in an exploratory factor analysis context. A particularly prominent problem in scale evaluation is the…
Descriptors: Goodness of Fit, Factor Analysis, Cutting Scores, Correlation
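A simple way to see how fit changes with the number of factors, in the spirit of the entry above, is to fit successive EFA solutions and track a residual-based summary. This sketch assumes the third-party factor_analyzer package and a numeric data matrix X; the RMSR computed here is a descriptive measure for comparing solutions, not one of the cutoff-based indices the article evaluates.

import numpy as np
from factor_analyzer import FactorAnalyzer  # third-party package (assumption)

def rmsr_by_factors(X, max_factors=6):
    # Root mean squared residual between the observed correlation matrix and the
    # correlation matrix implied by a k-factor EFA solution, for k = 1..max_factors.
    R = np.corrcoef(X, rowvar=False)
    p = R.shape[0]
    mask = ~np.eye(p, dtype=bool)  # off-diagonal elements only
    results = {}
    for k in range(1, max_factors + 1):
        fa = FactorAnalyzer(n_factors=k, rotation=None, method='minres')
        fa.fit(X)
        L = fa.loadings_
        implied = L @ L.T + np.diag(fa.get_uniquenesses())  # orthogonal-factor implied correlations
        results[k] = float(np.sqrt(np.mean((R - implied)[mask] ** 2)))
    return results

Because the abstract notes that conventional cutoff values may not transfer to the EFA context, a measure like this is better read by comparing candidate solutions against one another than by checking it against a fixed threshold.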
Peer reviewed
Azman Ong, Mohd Hanafi; Mohd Yasin, Norazlina; Ibrahim, Nur Syafikah – Asian Association of Open Universities Journal, 2022
Purpose: Measuring learners' internal response to online learning is seen as fundamental to absorptive capacity, which stimulates knowledge assimilation. However, validated instruments that can effectively measure online learning response behavior remain limited in both practice and research. Thus, in this study, a new instrument was designed…
Descriptors: Online Courses, Student Surveys, Student Attitudes, Factor Analysis
Peer reviewed
Aldosari, Mubarak; Heydarnejad, Tahereh; Hashemifardnia, Arash; Abdalgane, Mohammed – Language Testing in Asia, 2023
Self-assessment and reflective thinking (RT) can equip learners to monitor and evaluate their learning progress. Despite the long history of the core of self-assessment (CSA) and RT, little is known about how they may contribute to learner enjoyment (LE) and learner immunity (LI). Therefore, the current research attempted to propose a model to…
Descriptors: Self Evaluation (Individuals), Second Language Learning, Second Language Instruction, Reflection
Erica Harbatkin; Jason Burns; Samantha Cullum – Education Policy Innovation Collaborative, 2023
School climate is critical to school effectiveness, but there is limited large-scale data available to examine the magnitude and nature of the relationship between school climate and school improvement. Drawing on statewide administrative data linked with unique teacher survey data in Michigan, we examine whether school climate appeared to play a…
Descriptors: Educational Environment, School Turnaround, Trust (Psychology), Leadership Role
Peer reviewed
Chung, Seungwon; Houts, Carrie – Measurement: Interdisciplinary Research and Perspectives, 2020
Advanced modeling of item response data through the item response theory (IRT) or item factor analysis frameworks is becoming increasingly popular. In the social and behavioral sciences, the underlying structure of tests/assessments is often multidimensional (i.e., more than 1 latent variable/construct is represented in the items). This review…
Descriptors: Item Response Theory, Evaluation Methods, Models, Factor Analysis
Peer reviewed
Kogar, Hakan – Journal of Education and Learning, 2018
The aim of the present study was to compare findings from the nonparametric MSA, DIMTEST, and DETECT methods with parametric dimensionality-determination methods under various simulation conditions, using both exploratory and confirmatory approaches. For this purpose, the simulation conditions were established based on the number of dimensions,…
Descriptors: Evaluation Methods, Nonparametric Statistics, Statistical Analysis, Factor Analysis