Publication Date
  In 2025: 0
  Since 2024: 3
  Since 2021 (last 5 years): 6
Descriptor
  Research Problems: 6
  Sample Size: 3
  Simulation: 3
  Accuracy: 2
  Data: 2
  Data Analysis: 2
  Error of Measurement: 2
  Factor Analysis: 2
  Item Response Theory: 2
  Statistical Analysis: 2
  Ability: 1
Source
  Educational and Psychological Measurement: 6
Author
  Thomas Eckes: 1
  David Goretzko: 1
  Selim Havan: 1
  A. Corinne Huggins-Manley: 1
  Kuan-Yu Jin: 1
  Jessica A. R. Logan: 1
  George A. Marcoulides: 1
  M. David Miller: 1
  Tenko Raykov: 1
  Eric A. Wright: 1
  Menglin Xu: 1
Publication Type
  Journal Articles: 6
  Reports - Research: 5
  Reports - Evaluative: 1
Kuan-Yu Jin; Thomas Eckes – Educational and Psychological Measurement, 2024
Insufficient effort responding (IER) refers to a lack of effort when answering survey or questionnaire items. Such items typically offer more than two ordered response categories, with Likert-type scales as the most prominent example. The underlying assumption is that the successive categories reflect increasing levels of the latent variable…
Descriptors: Item Response Theory, Test Items, Test Wiseness, Surveys
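The excerpt describes the data problem rather than the authors' model. As a loose illustration only (a hypothetical simulation with a crude screening index, not the IRT approach developed in the article), the Python sketch below generates Likert-type responses, injects uniform random responding for a subset of simulees, and flags suspects by how poorly their response profile tracks the item means:

```python
import numpy as np

rng = np.random.default_rng(7)
n_persons, n_items, n_cats = 500, 10, 5

# Attentive responders: ordered-category responses driven by a latent trait and
# item locations (a rough stand-in for a polytomous IRT data-generating model).
theta = rng.normal(size=n_persons)
item_loc = rng.normal(scale=0.8, size=n_items)
latent = theta[:, None] - item_loc[None, :] + rng.normal(size=(n_persons, n_items))
cuts = np.quantile(latent, [0.2, 0.4, 0.6, 0.8])
responses = np.digitize(latent, cuts)  # categories 0..4

# Contaminate 10% of respondents with insufficient-effort (uniform random) responding.
ier_idx = rng.choice(n_persons, size=n_persons // 10, replace=False)
responses[ier_idx] = rng.integers(0, n_cats, size=(len(ier_idx), n_items))

# Crude screening index: correlate each person's profile with the item means.
# Attentive profiles track the item locations; random profiles do not.
item_means = responses.mean(axis=0)
consistency = np.array([
    np.corrcoef(r, item_means)[0, 1] if r.std() > 0 else 0.0
    for r in responses
])
flagged = np.where(consistency < 0.0)[0]
print(f"flagged {len(flagged)} respondents; "
      f"{np.isin(flagged, ier_idx).mean():.0%} are true IER simulees")
```

The zero cutoff is arbitrary and only meant to separate profiles that carry no signal; it is a first screen, not a model-based detection rule.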
Raykov, Tenko; Marcoulides, George A. – Educational and Psychological Measurement, 2021
The population discrepancy between unstandardized and standardized reliability of homogeneous multicomponent measuring instruments is examined. Within a latent variable modeling framework, it is shown that the standardized reliability coefficient for unidimensional scales can be markedly higher than the corresponding unstandardized reliability…
Descriptors: Test Reliability, Computation, Measures (Individuals), Research Problems
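The size of that discrepancy is easy to reproduce numerically. A minimal sketch, assuming a unidimensional scale with equal loadings but one very noisy, large-variance item; coefficient alpha stands in here for the reliability coefficients the article treats within a latent variable modeling framework:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5000, 6

# Unidimensional scale: equal (unstandardized) loadings, but one item is measured
# on a much noisier metric, so the items' variances differ wildly.
loadings = np.ones(k)
error_var = np.array([1.0, 1.0, 1.0, 1.0, 1.0, 100.0])
factor = rng.normal(size=n)
X = factor[:, None] * loadings + rng.normal(size=(n, k)) * np.sqrt(error_var)

def alpha(M):
    """Coefficient alpha from a k x k covariance or correlation matrix."""
    k = M.shape[0]
    return k / (k - 1) * (1.0 - np.trace(M) / M.sum())

print(f"unstandardized (covariance-based) alpha: {alpha(np.cov(X, rowvar=False)):.2f}")
print(f"standardized (correlation-based) alpha:  {alpha(np.corrcoef(X, rowvar=False)):.2f}")
```

With items on such different metrics, the covariance-based value is dragged down by the noisy item, while the correlation-based value weights all items equally, which is the direction of discrepancy the abstract describes.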
Menglin Xu; Jessica A. R. Logan – Educational and Psychological Measurement, 2024
Research designs that include planned missing data are gaining popularity in applied education research. These methods have traditionally relied on introducing missingness into data collections using the missing completely at random (MCAR) mechanism. This study assesses whether planned missingness can also be implemented when data are instead…
Descriptors: Research Design, Research Methodology, Monte Carlo Methods, Statistical Analysis
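For readers unfamiliar with planned missingness, the MCAR baseline the abstract starts from can be pictured with the classic three-form design, in which a random form assignment decides which item block each participant skips. A minimal, hypothetical illustration (block sizes and data are made up, not the study's design):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n_students, n_items = 300, 12

# Complete (hypothetical) item responses before any planned missingness.
data = pd.DataFrame(
    rng.normal(size=(n_students, n_items)),
    columns=[f"item{i+1:02d}" for i in range(n_items)],
)

# Three-form planned missing design: a common block X given to everyone, plus
# blocks A, B, C; each randomly assigned form omits one rotating block.
blocks = {
    "X": data.columns[0:3],
    "A": data.columns[3:6],
    "B": data.columns[6:9],
    "C": data.columns[9:12],
}
form_omits = {1: "A", 2: "B", 3: "C"}
forms = rng.integers(1, 4, size=n_students)

observed = data.copy()
for form, omitted_block in form_omits.items():
    observed.loc[forms == form, blocks[omitted_block]] = np.nan

print(observed.isna().mean().round(2))  # roughly 1/3 missing in each non-X block
```

Because the omitted block depends only on the random form assignment, the resulting missingness is MCAR by construction; the study's question is what happens when the mechanism is not MCAR.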
Yan Xia; Selim Havan – Educational and Psychological Measurement, 2024
Although parallel analysis has been found to be an accurate method for determining the number of factors in many conditions with complete data, its application under missing data is limited. The existing literature recommends that, after using an appropriate multiple imputation method, researchers either apply parallel analysis to every imputed…
Descriptors: Data Interpretation, Factor Analysis, Statistical Inference, Research Problems
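The complete-data procedure being extended is Horn's parallel analysis; how to combine it with multiple imputation is the article's contribution and is not reproduced here. A minimal complete-data sketch (mean-eigenvalue reference, toy two-factor data):

```python
import numpy as np

def parallel_analysis(X, n_reps=100, seed=0):
    """Horn's parallel analysis: count leading eigenvalues of the observed correlation
    matrix that exceed the mean eigenvalues from same-sized random normal data."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    observed = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
    reference = np.mean([
        np.linalg.eigvalsh(np.corrcoef(rng.normal(size=(n, p)), rowvar=False))[::-1]
        for _ in range(n_reps)
    ], axis=0)
    exceeds = observed > reference
    return len(exceeds) if exceeds.all() else int(np.argmin(exceeds))

# Toy complete-data check: two correlated item blocks should suggest two factors.
rng = np.random.default_rng(3)
n = 400
f = rng.normal(size=(n, 2))
loadings = np.array([[0.8, 0.0], [0.7, 0.0], [0.6, 0.0],
                     [0.0, 0.8], [0.0, 0.7], [0.0, 0.6]])
X = f @ loadings.T + rng.normal(scale=0.6, size=(n, 6))
print(parallel_analysis(X))  # expected: 2
```

Under missing data, the truncated abstract points to running this routine on every imputed data set as one of the recommended options the authors examine.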
Goretzko, David – Educational and Psychological Measurement, 2022
Determining the number of factors in exploratory factor analysis is arguably the most crucial decision a researcher faces when conducting the analysis. While several simulation studies exist that compare various so-called factor retention criteria under different data conditions, little is known about the impact of missing data on this process.…
Descriptors: Factor Analysis, Research Problems, Data, Prediction
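As a loose illustration of why missingness interferes with factor retention (this is not the study's simulation design), the eigenvalues that retention criteria operate on already depend on how incomplete cases are handled before any criterion is applied:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
n, p = 500, 8

# One-factor data, then 25% of cells deleted completely at random.
f = rng.normal(size=n)
X = pd.DataFrame(f[:, None] * 0.7 + rng.normal(scale=0.7, size=(n, p)))
mask = rng.random(size=(n, p)) < 0.25
X_miss = X.mask(mask)

# Two naive ways to obtain a correlation matrix for eigenvalue-based retention rules.
listwise = X_miss.dropna().corr()
pairwise = X_miss.corr()  # pandas uses pairwise-complete observations

print("complete cases left for listwise deletion:", len(X_miss.dropna()))
print("leading eigenvalues (listwise):",
      np.round(np.linalg.eigvalsh(listwise)[::-1][:3], 2))
print("leading eigenvalues (pairwise):",
      np.round(np.linalg.eigvalsh(pairwise)[::-1][:3], 2))
```

With eight items and 25% missingness per cell, listwise deletion keeps only about a tenth of the cases, so the two correlation matrices, and therefore any eigenvalue-based retention rule applied to them, can disagree.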
Ziying Li; A. Corinne Huggins-Manley; Walter L. Leite; M. David Miller; Eric A. Wright – Educational and Psychological Measurement, 2022
The unstructured multiple-attempt (MA) item response data in virtual learning environments (VLEs) are often from student-selected assessment data sets, which include missing data, single-attempt responses, multiple-attempt responses, and unknown growth ability across attempts, leading to a complex and complicated scenario for using this kind of…
Descriptors: Sequential Approach, Item Response Theory, Data, Simulation
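The data structure the abstract describes is easier to see than to say. A hypothetical pandas layout (made-up students, items, and attempts) showing how student-selected, multiple-attempt responses turn into unbalanced data with structural missingness:

```python
import pandas as pd

# Hypothetical long-format log of student-selected, multiple-attempt responses:
# students choose which items to take and how often to retry them.
log = pd.DataFrame({
    "student": ["s1", "s1", "s1", "s2", "s2", "s3"],
    "item":    ["i1", "i1", "i2", "i1", "i3", "i2"],
    "attempt": [1, 2, 1, 1, 1, 1],
    "correct": [0, 1, 1, 1, 0, 0],
})

# Wide layout: one column per (item, attempt); unattempted combinations become NaN,
# which is the unstructured sparsity the abstract describes.
wide = log.pivot_table(index="student", columns=["item", "attempt"], values="correct")
print(wide)

# Attempt behavior differs by student, so growth across attempts is observable
# only for the subset of student-item pairs with repeated attempts.
print(log.groupby(["student", "item"])["attempt"].max())
```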