Showing all 10 results
Peer reviewed
Kim, Rae Yeong; Yoo, Yun Joo – Journal of Educational Measurement, 2023
In cognitive diagnostic models (CDMs), a set of fine-grained attributes is required to characterize complex problem solving and provide detailed diagnostic information about an examinee. However, it is challenging to ensure reliable estimation and control computational complexity when the test aims to identify the examinee's attribute profile in a…
Descriptors: Models, Diagnostic Tests, Adaptive Testing, Accuracy
Peer reviewed
DeCarlo, Lawrence T. – Journal of Educational Measurement, 2023
A conceptualization of multiple-choice exams in terms of signal detection theory (SDT) leads to simple measures of item difficulty and item discrimination that are closely related to, but also distinct from, those used in classical item analysis (CIA). The theory defines a "true split," depending on whether or not examinees know an item,…
Descriptors: Multiple Choice Tests, Test Items, Item Analysis, Test Wiseness
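The SDT framing in this abstract can be illustrated with the standard equal-variance signal-detection measures. The formulas below (a distance `d` and a criterion `c` computed from hit and false-alarm rates) are the common textbook parameterization, assumed here purely for illustration; they are related to, but not necessarily identical to, the measures DeCarlo derives:

```python
from statistics import NormalDist

def sdt_item_measures(hit_rate, false_alarm_rate):
    """Equal-variance SDT measures for a multiple-choice item.

    hit_rate:         P(correct | examinee knows the item)
    false_alarm_rate: P(correct | examinee does not know the item)

    Returns (d, c): d is a discrimination-like distance between the
    "know" and "don't know" groups; c is a criterion-like location,
    loosely analogous to item difficulty.
    """
    z = NormalDist().inv_cdf  # probit (inverse standard-normal CDF)
    d = z(hit_rate) - z(false_alarm_rate)
    c = -0.5 * (z(hit_rate) + z(false_alarm_rate))
    return d, c

# An item that knowers answer correctly 90% of the time and
# non-knowers only 20% of the time separates the groups well:
d, c = sdt_item_measures(0.9, 0.2)
```

In this sketch, a larger `d` means the item better separates examinees who know it from those who do not, echoing the abstract's point that SDT yields discrimination-like quantities close to, but distinct from, classical item-analysis statistics.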
Peer reviewed
Haberman, Shelby; Yao, Lili – Journal of Educational Measurement, 2015
Admission decisions frequently rely on multiple assessments. As a consequence, it is important to explore rational approaches to combine the information from different educational tests. For example, U.S. graduate schools usually receive both TOEFL iBT® scores and GRE® General scores of foreign applicants for admission; however, little guidance…
Descriptors: College Entrance Examinations, Repetition, Methods, Error of Measurement
Peer reviewed
Oshima, Takako C.; Miller, M. David – Journal of Educational Measurement, 1990
A bidimensional 2-parameter logistic model was applied to data generated for 2 groups on a 40-item test. Item parameters were the same across groups; correlation across the 2 traits varied. Results indicate the need for caution in using item-response theory (IRT)-based invariance indexes with multidimensional data for these groups. (TJH)
Descriptors: Computer Simulation, Correlation, Discriminant Analysis, Item Response Theory
Peer reviewed
Menne, John W.; Tolsma, Robert J. – Journal of Educational Measurement, 1971
Descriptors: Discriminant Analysis, Group Testing, Item Analysis, Psychometrics
Peer reviewed
Woodson, M. I. Charles E. – Journal of Educational Measurement, 1974
The basis for selecting the calibration sample determines the kind of scale that will be developed. A random sample from a population of individuals leads to a norm-referenced scale, and a sample representative of a range of abilities or characteristics leads to a criterion-referenced scale. (Author/BB)
Descriptors: Criterion Referenced Tests, Discriminant Analysis, Item Analysis, Test Construction
Peer reviewed
Pyrczak, Fred – Journal of Educational Measurement, 1973
Despite the numerous individual illustrations in the literature showing how the discrimination index may be used to identify items with faults, its overall effectiveness as a measure of item quality, defined in terms of the presence or absence of faults, is not clear. This study investigates its validity. (Author/RK)
Descriptors: Correlation, Discriminant Analysis, Item Banks, Rating Scales
Peer reviewed
Miller, Timothy R.; Spray, Judith A. – Journal of Educational Measurement, 1993
Presents logistic discriminant analysis as a means of detecting differential item functioning (DIF) in items that are polytomously scored. Provides examples of DIF detection using a 27-item mathematics test with 1,977 examinees. The proposed method is simpler and more practical than polytomous extensions of the logistic regression DIF procedure.…
Descriptors: Discriminant Analysis, Item Bias, Mathematical Models, Mathematics Tests
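The logistic-discriminant DIF idea described in this abstract regresses group membership on examinees' total score and their score on the studied item. The sketch below uses a plain gradient-ascent logistic regression on hypothetical data (the dataset, learning rate, and iteration count are all illustrative assumptions, not values from Miller and Spray):

```python
import math

def fit_logistic(X, y, lr=0.05, iters=8000):
    """Plain gradient-ascent logistic regression:
    P(y=1 | x) = 1 / (1 + exp(-(b0 + b . x)))."""
    n, k = len(X), len(X[0])
    w = [0.0] * (k + 1)            # w[0] is the intercept
    for _ in range(iters):
        grad = [0.0] * (k + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            grad[0] += yi - p
            for j, xj in enumerate(xi):
                grad[j + 1] += (yi - p) * xj
        w = [wj + lr * gj / n for wj, gj in zip(w, grad)]
    return w

# Hypothetical records: [total test score, polytomous item score],
# with y = 1 for the focal group and 0 for the reference group.
X = [[20, 2], [22, 1], [25, 2], [28, 3],
     [21, 3], [23, 3], [26, 4], [27, 4]]
y = [0, 0, 0, 0, 1, 1, 1, 1]

# Center the predictors for numerical stability before fitting.
means = [sum(col) / len(col) for col in zip(*X)]
Xc = [[xj - mj for xj, mj in zip(xi, means)] for xi in X]
w = fit_logistic(Xc, y)
# A non-negligible item-score coefficient w[2], after conditioning
# on total score via w[1], would flag the item for possible DIF.
```

The appeal noted in the abstract is that one logistic discriminant fit handles a polytomous item directly, whereas polytomous extensions of the logistic regression DIF procedure require a more elaborate model.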
Peer reviewed
Doolittle, Allen E.; Cleary, T. Anne – Journal of Educational Measurement, 1987
Eight randomly equivalent samples of high school seniors were each given a unique form of the ACT Assessment Mathematics Usage Test (ACTM). Signed measures of differential item performance (DIP) were obtained for each item in the eight ACTM forms. DIP estimates were analyzed and a significant item category effect was found. (Author/LMO)
Descriptors: Analysis of Variance, College Entrance Examinations, Discriminant Analysis, High School Seniors
Peer reviewed
Goldman, Roy D.; Warren, Rebecca M. – Journal of Educational Measurement, 1973
Paper discusses a study technique questionnaire given to students enrolled in upper division college classes. Multivariate analysis of variance was employed to compare the centroids of questionnaire responses of students in four major fields who had above and below average grades. (Author)
Descriptors: Academic Achievement, Analysis of Variance, Discriminant Analysis, Grades (Scholastic)