Showing all 13 results
Peer reviewed
Öztürk-Gübes, Nese; Kelecioglu, Hülya – Educational Sciences: Theory and Practice, 2016
The purpose of this study was to examine the impact of dimensionality, common-item set format, and different scale linking methods on preserving equity property with mixed-format test equating. Item response theory (IRT) true-score equating (TSE) and IRT observed-score equating (OSE) methods were used under common-item nonequivalent groups design.…
Descriptors: Test Format, Item Response Theory, True Scores, Equated Scores
Peer reviewed
Slonim-Nevo, Vered; Nevo, Isaac – Journal of Mixed Methods Research, 2009
Combining diverse methods in a single study raises a problem: What should be done when the findings of one method of investigation conflict with those of another? The authors illustrate this problem using an example in which three study phases--quantitative, qualitative, and intervention--are applied. The findings from the quantitative phase did…
Descriptors: Methods Research, Immigration, Statistical Analysis, Qualitative Research
Peer reviewed
Vegelius, Jan – Educational and Psychological Measurement, 1977
Generalizations of the G index as a measure of similarity between persons beyond the dichotomous situation are discussed. An attempt is made to present a generalization that does not require dichotomization of the items for cases where the number of response alternatives may differ. (Author/JKS)
Descriptors: Correlation, Item Analysis, Measurement Techniques, Multidimensional Scaling
Peer reviewed
Vegelius, Jan – Educational and Psychological Measurement, 1979
The computer program WEIGAN makes the weighted G analysis available for computer users. The input and output of the program are described. (Author/JKS)
Descriptors: Computer Programs, Correlation, Factor Analysis, Item Analysis
Peer reviewed
Lautenschlager, Gary J.; Park, Dong-Gun – Applied Psychological Measurement, 1988
The consequences of using item response theory (IRT) item bias detecting procedures with multidimensional IRT item data are examined. Limitations in procedures for detecting item bias are discussed. (SLD)
Descriptors: Item Analysis, Latent Trait Theory, Mathematical Models, Multidimensional Scaling
Peer reviewed
Walker, Cindy M.; Azen, Razia; Schmitt, Thomas – Educational and Psychological Measurement, 2006
It is believed by some that most tests are multidimensional, meaning that they measure more than one underlying construct. The primary objective of this study is to illustrate how variations in the secondary ability distribution affect the statistical detection of dimensionality and to demonstrate the difference between substantive and statistical…
Descriptors: Multidimensional Scaling, Item Response Theory, Comparative Testing, Statistical Analysis
Peer reviewed
Reynolds, Thomas J. – Educational and Psychological Measurement, 1981
Cliff's Index "c" derived from an item dominance matrix is utilized in a clustering approach, termed extracting Reliable Guttman Orders (ERGO), to isolate Guttman-type item hierarchies. A comparison of factor analysis to the ERGO is made on social distance data involving multiple ethnic groups. (Author/BW)
Descriptors: Cluster Analysis, Difficulty Level, Factor Analysis, Item Analysis
Korpi, Meg; Haertel, Edward – 1984
The purpose of this paper is to further the clarification of construct interpretations of tests by proposing that non-metric multidimensional scaling may be more useful than factor analysis or other latent structure models for investigating the internal structure of tests. It also suggests that typical problems associated with scaling…
Descriptors: Correlation, Factor Structure, Intermediate Grades, Item Analysis
Reynolds, Thomas J. – 1976
A method of factor extraction specific to a binary matrix, illustrated here as a person-by-item response matrix, is presented. The extraction procedure, termed ERGO, differs from the more commonly implemented dimensionalizing techniques, factor analysis and multidimensional scaling, by taking into consideration item difficulty. Utilized in the…
Descriptors: Discriminant Analysis, Factor Analysis, Item Analysis, Matrices
Ho, Andrew D.; Haertel, Edward H. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2006
Problems of scale typically arise when comparing test score trends, gaps, and gap trends across different tests. To overcome some of these difficulties, we can express the difference between the observed test performance of two groups with graphs or statistics that are metric-free (i.e., invariant under positive monotonic transformations of the…
Descriptors: Testing Programs, Test Results, Comparative Testing, Multidimensional Scaling
Doody, Evelyn N. – 1985
The effects of varying degrees of correlation between abilities and of various correlation configurations between item parameters on ability and item parameter estimation using the three parameter logistic model were examined. Ten two-trait configurations and one unidimensional test configuration for 30 item tests were simulated. Each…
Descriptors: Computer Simulation, Estimation (Mathematics), Factor Structure, Item Analysis
Peer reviewed
Jackson, Douglas N.; Helmes, Edward – Applied Psychological Measurement, 1979
A basic structure approach is proposed for obtaining multidimensional scale values for attitude, achievement, or personality items from response data. The technique permits the unconfounding of scale values due to response bias and content and partitions item indices of popularity or difficulty among a number of relevant dimensions. (Author/BH)
Descriptors: Higher Education, Interest Inventories, Item Analysis, Mathematical Models
Bart, William M.; Airasian, Peter W. – 1976
The question of whether test factor structure is indicative of the test item hierarchy was examined. Data from 1,000 subjects on two sets of five bivalued Law School Admission Test items, which were analyzed with latent trait methods of Bock and Lieberman and of Christoffersson in Psychometrika, were analyzed with an ordering-theoretic method to…
Descriptors: Comparative Analysis, Correlation, Factor Analysis, Factor Structure