Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 1
Since 2016 (last 10 years): 1
Since 2006 (last 20 years): 4

Descriptor
Difficulty Level: 6
Error of Measurement: 6
Factor Analysis: 6
Test Items: 6
Correlation: 4
Item Response Theory: 4
Item Analysis: 3
Monte Carlo Methods: 3
Comparative Analysis: 2
Computation: 2
Least Squares Statistics: 2
Author
Finch, Holmes: 2
Ahn, Soyeon: 1
Anwyll, Steve: 1
Benson, Jeri: 1
Glanville, Matthew: 1
He, Qingping: 1
Jones, Patricia B.: 1
Opposs, Dennis: 1
Park, Sung Eun: 1
Wilson, Michael: 1
Zopluoglu, Cengiz: 1
Publication Type
Reports - Research: 5
Journal Articles: 4
Speeches/Meeting Papers: 2
Reports - Descriptive: 1

Education Level
Elementary Education: 1

Audience
Researchers: 1

Location
United Kingdom (England): 1
Park, Sung Eun; Ahn, Soyeon; Zopluoglu, Cengiz – Educational and Psychological Measurement, 2021
This study presents a new approach to synthesizing differential item functioning (DIF) effect size: First, using correlation matrices from each study, we perform a multigroup confirmatory factor analysis (MGCFA) that examines measurement invariance of a test item between two subgroups (i.e., focal and reference groups). Then we synthesize, across…
Descriptors: Item Analysis, Effect Size, Difficulty Level, Monte Carlo Methods
He, Qingping; Anwyll, Steve; Glanville, Matthew; Opposs, Dennis – Research Papers in Education, 2014
Since 2010, the whole-cohort Key Stage 2 (KS2) National Curriculum test in science in England has been replaced with a sampling test, taken annually by pupils aged 11 in a nationally representative sample of schools. The study reported in this paper compares the performance of different subgroups of the samples (classified by…
Descriptors: National Curriculum, Sampling, Foreign Countries, Factor Analysis
Finch, Holmes – Applied Psychological Measurement, 2011
Estimation of multidimensional item response theory (MIRT) model parameters can be carried out using the normal-ogive model with unweighted least squares estimation in the Normal-Ogive Harmonic Analysis Robust Method (NOHARM) software. Previous simulation research has demonstrated that this approach does yield accurate and efficient estimates of item…
Descriptors: Item Response Theory, Computation, Test Items, Simulation
Finch, Holmes – Applied Psychological Measurement, 2010
The accuracy of item parameter estimates in the multidimensional item response theory (MIRT) context has not been researched in great detail. This study examines the ability of two confirmatory factor analysis models, specifically for dichotomous data, to properly estimate item parameters using common formulae for converting factor…
Descriptors: Item Response Theory, Computation, Factor Analysis, Models
Jones, Patricia B.; And Others – 1987
To determine the effectiveness of multidimensional scaling (MDS) in recovering the dimensionality of a set of dichotomously scored items, data were simulated in one, two, and three dimensions for a variety of correlations with the underlying latent trait. Similarity matrices were constructed from these data using three margin-sensitive…
Descriptors: Cluster Analysis, Correlation, Difficulty Level, Error of Measurement
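The dimensionality-recovery design sketched in the abstract above can be illustrated in a few lines. This is a minimal, hypothetical sketch, not the study's procedure: it simulates dichotomous responses under a known latent dimensionality, builds an inter-item dissimilarity matrix from phi (Pearson) correlations rather than the margin-sensitive measures the study used, and compares nonmetric MDS stress across candidate dimensionalities. All sample sizes and item parameters are placeholders.

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)

# Simulate dichotomous responses for 20 items under a 2-dimensional latent trait
n_persons, n_items, n_dims = 1000, 20, 2
theta = rng.normal(size=(n_persons, n_dims))            # latent abilities
loadings = rng.uniform(0.8, 1.5, size=(n_items, n_dims))
mask = np.zeros((n_items, n_dims))                      # simple structure:
mask[np.arange(n_items), np.arange(n_items) % n_dims] = 1  # one dimension per item
a = loadings * mask                                     # discrimination matrix
b = rng.normal(size=n_items)                            # item difficulties
p = 1.0 / (1.0 + np.exp(-(theta @ a.T - b)))            # 2PL-style response probabilities
X = (rng.uniform(size=p.shape) < p).astype(int)         # 0/1 item scores

# Dissimilarity between items from phi correlations (placeholder similarity measure)
R = np.corrcoef(X.T)
D = 1.0 - np.abs(R)                                     # symmetric, zero diagonal

# Fit nonmetric MDS in 1, 2, and 3 dimensions; lower stress suggests adequate fit
for k in (1, 2, 3):
    mds = MDS(n_components=k, metric=False,
              dissimilarity="precomputed", random_state=0)
    mds.fit(D)
    print(f"dims={k}  stress={mds.stress_:.4f}")
```

In this kind of simulation one looks for an "elbow": stress drops sharply up to the true dimensionality and flattens afterwards.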
A Comparison of Three Types of Test Development Procedures Using Classical and Latent Trait Methods.
Benson, Jeri; Wilson, Michael – 1979
Three methods of item selection were used to select sets of 38 items from a 50-item verbal analogies test and the resulting item sets were compared for internal consistency, standard errors of measurement, item difficulty, biserial item-test correlations, and relative efficiency. Three groups of 1,500 cases each were used for item selection. First…
Descriptors: Comparative Analysis, Difficulty Level, Efficiency, Error of Measurement