Showing all 6 results
Peer reviewed
French, Brian F.; Finch, W. Holmes – Journal of Educational Measurement, 2010
The purpose of this study was to examine the performance of differential item functioning (DIF) assessment in the presence of a multilevel structure that often underlies data from large-scale testing programs. Analyses were conducted using logistic regression (LR), a popular, flexible, and effective tool for DIF detection. Data were simulated…
Descriptors: Test Bias, Testing Programs, Evaluation, Measurement
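The logistic-regression DIF procedure summarized in this abstract can be sketched briefly. This is a hedged illustration, not the authors' implementation: the simulated data, the plain gradient-descent fitter, and all parameter values are assumptions for demonstration. The standard LR test for uniform DIF regresses the 0/1 item score on an ability proxy (e.g., total score) and group membership; a nonuniform-DIF test would add an ability-by-group interaction term.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
theta = rng.normal(size=n)           # ability proxy (stand-in for total score)
group = rng.integers(0, 2, size=n)   # 0 = reference group, 1 = focal group

# Simulate uniform DIF: the item is harder for the focal group.
logit = -0.5 + 1.2 * theta - 0.6 * group
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

def fit_logistic(X, y, iters=3000, lr=0.5):
    """Minimal gradient-ascent logistic fit (avoids external dependencies)."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    return w

# Columns: intercept, ability proxy, group indicator.
X = np.column_stack([np.ones(n), theta, group])
w = fit_logistic(X, y)
# A clearly negative group coefficient flags uniform DIF against the focal group.
print(np.round(w, 2))
```

In practice the flagging decision uses a likelihood-ratio or Wald test on the group (and interaction) terms rather than the raw coefficient sign; the sketch shows only the model being compared.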
Peer reviewed
Knapp, Lila; Knapp, Robert R. – Journal of Educational Measurement, 1982
The present study was undertaken to determine whether a 14-cluster occupational structure similar to Knapp's (1976) would result from factor analysis based on sex-balanced interest items. Results supported the structure found previously based on non-sex-balanced items. (Author/GK)
Descriptors: Cluster Grouping, Interest Inventories, Secondary Education, Sex Bias
Peer reviewed
Roussos, Louis A.; Stout, William F.; Marden, John I. – Journal of Educational Measurement, 1998
Introduces a new approach for partitioning test items into dimensionally distinct item clusters. The core of this approach is a new item-pair conditional-covariance-based proximity measure that can be used with hierarchical cluster analysis. The procedure can correctly classify, on average, over 90% of the items for correlations as high as 0.9.…
Descriptors: Cluster Analysis, Cluster Grouping, Correlation, Multidimensional Scaling
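The core idea of the abstract above, an item-pair proximity based on conditional covariance, can be sketched as follows. This is an illustrative approximation under assumed details, not the authors' procedure: for items i and j, compute the covariance of their scores within strata of the rest score (total score excluding both items), then average across strata. Pairs sharing a secondary dimension show elevated conditional covariance, and the resulting proximity matrix can feed a hierarchical clustering.

```python
import numpy as np

rng = rng = np.random.default_rng(1)
n, k = 3000, 6
theta = rng.normal(size=n)       # dominant trait, drives all items
nuisance = rng.normal(size=n)    # secondary trait shared by items 4 and 5 only
load = np.zeros(k)
load[4:] = 1.0
logits = theta[:, None] + load[None, :] * nuisance[:, None]
X = rng.binomial(1, 1 / (1 + np.exp(-logits)))

def conditional_cov(X, i, j):
    """Average covariance of items i, j within strata of the rest score."""
    rest = X.sum(axis=1) - X[:, i] - X[:, j]
    covs, weights = [], []
    for s in np.unique(rest):
        m = rest == s
        if m.sum() > 1:
            covs.append(np.cov(X[m, i], X[m, j])[0, 1])
            weights.append(m.sum())
    return np.average(covs, weights=weights)

same_dim = conditional_cov(X, 4, 5)  # pair sharing the secondary trait
diff_dim = conditional_cov(X, 0, 1)  # pair driven by the dominant trait only
print(same_dim, diff_dim)            # same_dim should exceed diff_dim
```

The published procedure builds on a refined version of this quantity (with bias corrections and a specific conditioning score) before applying hierarchical cluster analysis; the sketch conveys only the underlying signal.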
Peer reviewed
Kim, Jee-Seon – Journal of Educational Measurement, 2006
Simulation and real data studies are used to investigate the value of modeling multiple-choice distractors on item response theory linking. Using the characteristic curve linking procedure for Bock's (1972) nominal response model presented by Kim and Hanson (2002), all-category linking (i.e., a linking based on all category characteristic curves…
Descriptors: Multiple Choice Tests, Test Items, Item Response Theory, Simulation
Peer reviewed
Gierl, Mark J.; Leighton, Jacqueline P.; Tan, Xuan – Journal of Educational Measurement, 2006
DETECT, the acronym for Dimensionality Evaluation To Enumerate Contributing Traits, is an innovative and relatively new nonparametric dimensionality assessment procedure used to identify mutually exclusive, dimensionally homogeneous clusters of items using a genetic algorithm (Zhang & Stout, 1999). Because the clusters of items are mutually…
Descriptors: Program Evaluation, Cluster Grouping, Evaluation Methods, Multivariate Analysis
Peer reviewed
Bejar, Isaac I. – Journal of Educational Measurement, 1980
Two procedures are presented for detecting violations of the unidimensionality assumption made by latent trait models without requiring factor analysis of inter-item correlation matrices. Both procedures require that departures from unidimensionality be hypothesized beforehand. This is usually possible in achievement tests where several content…
Descriptors: Achievement Tests, Bayesian Statistics, Cluster Grouping, Content Analysis