Unlu, Ali; Sargin, Anatol – Applied Psychological Measurement, 2009
Mondrian is state-of-the-art statistical data visualization software featuring modern interactive visualization techniques for a wide range of data types. This article reviews the capabilities, functionality, and interactive properties of this software package. Key features of Mondrian are illustrated with data from the Programme for International…
Descriptors: Statistical Data, Computer Graphics, Computer Software, Item Analysis

Bolt, Daniel M.; Johnson, Timothy R. – Applied Psychological Measurement, 2009
A multidimensional item response theory model that accounts for response style factors is presented. The model, a multidimensional extension of Bock's nominal response model, is shown to allow for the study and control of response style effects in ordered rating scale data so as to reduce bias in measurement of the intended trait. In the current…
Descriptors: Response Style (Tests), Rating Scales, Item Response Theory, Individual Differences

Meade, Adam W.; Lautenschlager, Gary J.; Johnson, Emily C. – Applied Psychological Measurement, 2007
This article highlights issues associated with the use of the differential functioning of items and tests (DFIT) methodology for assessing measurement invariance (or differential functioning) with Likert-type data. Monte Carlo analyses indicate relatively low sensitivity of the DFIT methodology for identifying differential item functioning (DIF)…
Descriptors: Measures (Individuals), Monte Carlo Methods, Likert Scales, Effect Size

Lautenschlager, Gary J.; Park, Dong-Gun – Applied Psychological Measurement, 1988
The consequences of using item response theory (IRT) item bias detecting procedures with multidimensional IRT item data are examined. Limitations in procedures for detecting item bias are discussed. (SLD)
Descriptors: Item Analysis, Latent Trait Theory, Mathematical Models, Multidimensional Scaling

Penfield, Randall D. – Applied Psychological Measurement, 2005
Differential item functioning (DIF) is an important consideration in assessing the validity of test scores (Camilli & Shepard, 1994). A variety of statistical procedures have been developed to assess DIF in tests of dichotomous (Hills, 1989; Millsap & Everson, 1993) and polytomous (Penfield & Lam, 2000; Potenza & Dorans, 1995) items. Some of these…
Descriptors: Test Bias, Item Analysis, Psychological Studies, Evaluation Methods

Finch, Holmes – Applied Psychological Measurement, 2005
This study compares the ability of the multiple indicators, multiple causes (MIMIC) confirmatory factor analysis model to correctly identify cases of differential item functioning (DIF) with more established methods. Although the MIMIC model might have application in identifying DIF for multiple grouping variables, there has been little…
Descriptors: Identification, Factor Analysis, Test Bias, Models

van Heerden, J.; Hoogstraten, Joh. – Applied Psychological Measurement, 1979
In a replication of an earlier study, a questionnaire with items lacking content and merely containing answer possibilities was administered to a sample of Dutch freshmen psychology students. Subjects showed a preference for positive options over negative options. (Author/JKS)
Descriptors: Content Analysis, Foreign Countries, Higher Education, Item Analysis

Donlon, Thomas F.; And Others – Applied Psychological Measurement, 1980
The scope and nature of sex differences in the Graduate Record Examination are explored by identifying individual test items that differ from the other items in terms of the magnitude of the difference in item difficulty for the sexes. In general, limited evidence of differences was established. (Author/CTM)
Descriptors: Aptitude Tests, College Entrance Examinations, Graduate Students, Higher Education