Publication Date
In 2025 (0)
Since 2024 (0)
Since 2021, last 5 years (1)
Since 2016, last 10 years (2)
Since 2006, last 20 years (10)
Descriptor
Computation (12)
Correlation (12)
Difficulty Level (12)
Test Items (10)
Item Response Theory (8)
Comparative Analysis (5)
Computer Software (5)
Sample Size (5)
Statistical Analysis (4)
Accuracy (3)
Models (3)
Source
ProQuest LLC (3)
Journal of Educational Measurement (2)
Structural Equation Modeling: A Multidisciplinary Journal (2)
Applied Psychological Measurement (1)
ETS Research Report Series (1)
Educational and Psychological Measurement (1)
Intelligence (1)
International Journal of Early Years Education (1)
Author
Carrie, Ho Ka Lee (1)
DeMars, Christine E. (1)
Dorans, Neil J. (1)
Finch, Holmes (1)
Fu, Qiong (1)
He, Wei (1)
Jiao, Hong (1)
Kim, Hyun Seok John (1)
Kumpei, Mizuno (1)
Livingston, Samuel A. (1)
MacDonald, George T. (1)
Publication Type
Journal Articles (9)
Reports - Research (8)
Dissertations/Theses -… (3)
Reports - Descriptive (1)
Education Level
Early Childhood Education (1)
Elementary Education (1)
Kindergarten (1)
Primary Education (1)
Marcruz, Ong Yew Lee; Carrie, Ho Ka Lee; Manabu, Kawata; Mayumi, Takahashi; Kumpei, Mizuno – International Journal of Early Years Education, 2022
It has become increasingly clear that the early use of decomposition for addition is associated with later mathematical achievement. This study examined how younger children execute a base-10 decomposition strategy to solve complex arithmetic (e.g., two-digit addition). Twenty-four addition problems in two modalities (WA: Written Arithmetic; OA: Oral…
Descriptors: Cross Cultural Studies, Arithmetic, Foreign Countries, Correlation
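For concreteness, a small worked illustration of a base-10 decomposition strategy for two-digit addition; the problem shown is hypothetical and is not one of the study's 24 items.

```latex
% Illustrative base-10 decomposition of a two-digit addition problem:
38 + 27 = (30 + 20) + (8 + 7) = 50 + 15 = 65
```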
Matlock, Ki Lynn; Turner, Ronna – Educational and Psychological Measurement, 2016
When constructing multiple test forms, the number of items and the total test difficulty are often equivalent. Not all test developers match the number of items and/or average item difficulty within subcontent areas. In this simulation study, six test forms were constructed having an equal number of items and average item difficulty overall.…
Descriptors: Item Response Theory, Computation, Test Items, Difficulty Level
Schroeders, Ulrich; Robitzsch, Alexander; Schipolowski, Stefan – Journal of Educational Measurement, 2014
C-tests are a specific variant of cloze tests that are considered time-efficient, valid indicators of general language proficiency. They are commonly analyzed with models of item response theory assuming local item independence. In this article we estimated local interdependencies for 12 C-tests and compared the changes in item difficulties,…
Descriptors: Comparative Analysis, Psychometrics, Cloze Procedure, Language Tests
MacDonald, George T. – ProQuest LLC, 2014
A simulation study was conducted to explore the performance of the linear logistic test model (LLTM) when the relationships between items and cognitive components were misspecified. Factors manipulated included percent of misspecification (0%, 1%, 5%, 10%, and 15%), form of misspecification (under-specification, balanced misspecification, and…
Descriptors: Simulation, Item Response Theory, Models, Test Items
Raykov, Tenko; Marcoulides, George A. – Structural Equation Modeling: A Multidisciplinary Journal, 2011
A directly applicable latent variable modeling procedure for classical item analysis is outlined. The method allows one to point and interval estimate item difficulty, item correlations, and item-total correlations for composites consisting of categorical items. The approach is readily employed in empirical research and as a by-product permits…
Descriptors: Item Analysis, Evaluation, Correlation, Test Items
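As background for the classical quantities named above, here is a minimal Python sketch that computes item difficulty (proportion correct) and corrected item-total correlations from a simulated 0/1 response matrix. The data are made up, and the sketch does not reproduce the authors' latent variable procedure or its interval estimates.

```python
# Minimal sketch of classical item analysis on a simulated 0/1 response matrix.
# Sample size, item count, and the random data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
responses = (rng.random((200, 10)) < 0.6).astype(int)  # 200 examinees x 10 items

# Classical item difficulty: proportion of examinees answering each item correctly.
difficulty = responses.mean(axis=0)

# Corrected item-total correlation: correlate each item with the total score
# computed from the remaining items.
total = responses.sum(axis=1)
item_total = np.empty(responses.shape[1])
for j in range(responses.shape[1]):
    rest = total - responses[:, j]
    item_total[j] = np.corrcoef(responses[:, j], rest)[0, 1]

print("difficulty:", np.round(difficulty, 2))
print("corrected item-total r:", np.round(item_total, 2))
```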
Jiao, Hong; Wang, Shudong; He, Wei – Journal of Educational Measurement, 2013
This study demonstrated the equivalence between the Rasch testlet model and the three-level one-parameter testlet model and explored the Markov Chain Monte Carlo (MCMC) method for model parameter estimation in WINBUGS. The estimation accuracy from the MCMC method was compared with those from the marginalized maximum likelihood estimation (MMLE)…
Descriptors: Computation, Item Response Theory, Models, Monte Carlo Methods
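For readers unfamiliar with the testlet formulation, the Rasch testlet model is commonly written as below, with person ability \theta_i, item difficulty b_j, and a person-specific effect \gamma_{i d(j)} for the testlet d(j) containing item j. This is standard notation assumed for illustration, not notation taken from the article.

```latex
P(X_{ij}=1 \mid \theta_i, \gamma_{i\,d(j)}) =
  \frac{\exp\bigl(\theta_i - b_j + \gamma_{i\,d(j)}\bigr)}
       {1 + \exp\bigl(\theta_i - b_j + \gamma_{i\,d(j)}\bigr)}
```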
DeMars, Christine E. – Structural Equation Modeling: A Multidisciplinary Journal, 2012
In structural equation modeling software, either limited-information (bivariate proportions) or full-information item parameter estimation routines could be used for the 2-parameter item response theory (IRT) model. Limited-information methods assume the continuous variable underlying an item response is normally distributed. For skewed and…
Descriptors: Item Response Theory, Structural Equation Models, Computation, Computer Software
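The 2-parameter IRT model in question is the standard 2PL, shown below with discrimination a_j and difficulty b_j; this is general background rather than notation taken from the article.

```latex
P(X_{ij}=1 \mid \theta_i) = \frac{1}{1 + \exp\bigl(-a_j(\theta_i - b_j)\bigr)}
```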
Kim, Hyun Seok John – ProQuest LLC, 2011
Cognitive diagnostic assessment (CDA) is a new theoretical framework for psychological and educational testing that is designed to provide detailed information about examinees' strengths and weaknesses in specific knowledge structures and processing skills. During the last three decades, more than a dozen psychometric models have been developed…
Descriptors: Cognitive Measurement, Diagnostic Tests, Bayesian Statistics, Statistical Inference
Fu, Qiong – ProQuest LLC, 2010
This research investigated how the accuracy of person ability and item difficulty parameter estimation varied across five IRT models with respect to the presence of guessing, targeting, and varied combinations of sample sizes and test lengths. The data were simulated with 50 replications under each of the 18 combined conditions. Five IRT models…
Descriptors: Item Response Theory, Guessing (Tests), Accuracy, Computation
Finch, Holmes – Applied Psychological Measurement, 2010
The accuracy of item parameter estimates in the multidimensional item response theory (MIRT) model context is one that has not been researched in great detail. This study examines the ability of two confirmatory factor analysis models specifically for dichotomous data to properly estimate item parameters using common formulae for converting factor…
Descriptors: Item Response Theory, Computation, Factor Analysis, Models
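The "common formulae" cut off in the abstract are presumably of the well-known form below, which maps a categorical CFA loading \lambda_j and threshold \tau_j to normal-ogive IRT discrimination and difficulty; the exact parameterization Finch examines is not confirmed by the snippet.

```latex
a_j = \frac{\lambda_j}{\sqrt{1 - \lambda_j^{2}}}, \qquad
b_j = \frac{\tau_j}{\lambda_j}
```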
Livingston, Samuel A.; Dorans, Neil J. – ETS Research Report Series, 2004
This paper describes an approach to item analysis that is based on the estimation of a set of response curves for each item. The response curves show, at a glance, the difficulty and the discriminating power of the item and the popularity of each distractor, at any level of the criterion variable (e.g., total score). The curves are estimated by…
Descriptors: Item Analysis, Computation, Difficulty Level, Test Items
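A rough Python sketch of the basic idea: for each item, estimate the proportion answering correctly at each level of the criterion (total) score. The simulated Rasch data and the crude binning into five score groups are assumptions for illustration; the report itself estimates smoothed response curves, including curves for individual distractors.

```python
# Hedged sketch: empirical item response curves via simple score-group binning.
import numpy as np

rng = np.random.default_rng(1)
n_people, n_items = 500, 20
theta = rng.normal(size=n_people)          # person abilities
b = rng.normal(size=n_items)               # item difficulties
prob = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
responses = (rng.random((n_people, n_items)) < prob).astype(int)

total = responses.sum(axis=1)              # criterion variable: total score
edges = np.quantile(total, np.linspace(0, 1, 6))
group = np.digitize(total, edges[1:-1])    # five roughly equal score groups

item = 0
for g in np.unique(group):
    mask = group == g
    print(f"score group {g}: P(correct on item {item}) = "
          f"{responses[mask, item].mean():.2f}")
```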

Spilsbury, Georgina – Intelligence, 1992
The hypothesis that a task that increases in complexity (thereby increasing its correlation with a central measure of intelligence) does so by increasing its dimensionality, that is, by tapping individual differences in another variable, was supported by findings from 46 adults aged 20-70 years performing a mental counting task. (SLD)
Descriptors: Adults, Age Differences, Computation, Correlation