Showing all 3 results
Peer reviewed
Gierl, Mark J.; Lai, Hollis; Pugh, Debra; Touchie, Claire; Boulais, André-Philippe; De Champlain, André – Applied Measurement in Education, 2016
Item development is a time- and resource-intensive process. Automatic item generation integrates cognitive modeling with computer technology to systematically generate test items. To date, however, items generated using cognitive modeling procedures have received limited use in operational testing situations. As a result, the psychometric…
Descriptors: Psychometrics, Multiple Choice Tests, Test Items, Item Analysis
Peer reviewed
Kim, Seonghoon; Kolen, Michael J. – Applied Measurement in Education, 2006
Four item response theory linking methods (2 moment methods and 2 characteristic curve methods) were compared to concurrent (CO) calibration with the focus on the degree of robustness to format effects (FEs) when applying the methods to multidimensional data that reflected the FEs associated with mixed-format tests. Based on the quantification of…
Descriptors: Item Response Theory, Robustness (Statistics), Test Format, Comparative Analysis
Peer reviewed
Stecher, Brian M.; Klein, Stephen P.; Solano-Flores, Guillermo; McCaffrey, Dan; Robyn, Abby; Shavelson, Richard J.; Haertel, Edward – Applied Measurement in Education, 2000
Studied content domain, format, and level of inquiry as factors contributing to the large variation in student performance across open-ended measures. Results for more than 1,200 eighth graders do not support the hypothesis that tasks similar in content, format, and level of inquiry would correlate more highly with each other than with measures…
Descriptors: Correlation, Inquiry, Junior High School Students, Junior High Schools