Peer reviewed
ERIC Number: EJ1460456
Record Type: Journal
Publication Date: 2025-Mar
Pages: 11
Abstractor: As Provided
ISBN: N/A
ISSN: 0731-1745
EISSN: 1745-3992
Available Date: 2024-10-27
Investigating Approaches to Controlling Item Position Effects in Computerized Adaptive Tests
Ye Ma [1]; Deborah J. Harris [2]
Educational Measurement: Issues and Practice, v44 n1 p44-54 2025
Item position effect (IPE) refers to situations where an item performs differently when it is administered in different positions on a test. Most previous research has investigated IPE under linear testing; IPE research under adaptive testing remains scarce. In addition, the existence of IPE might violate Item Response Theory (IRT)'s item parameter invariance assumption, which underpins applications of IRT in psychometric tasks such as computerized adaptive testing (CAT). Ignoring IPE might therefore lead to inaccurate ability estimation in CAT. This article extends research on IPE by proposing and evaluating approaches to controlling position effects in an item-level computerized adaptive test via a simulation study. The results show that adjusting for IPE via a pretesting design (approach 3) or a pool design (approach 4) yields better ability estimation accuracy than no adjustment (the baseline approach) or item-level adjustment (approach 2). Practical implications of each approach and future research directions are also discussed.
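The abstract's central mechanism, namely that a position-driven shift in item functioning biases IRT ability estimates when scoring ignores it, can be illustrated with a minimal simulation. The sketch below is not the authors' design: it assumes a 2PL model, a hypothetical linear difficulty drift across item positions, and grid-search maximum-likelihood scoring, purely to show how ignoring versus adjusting for the shift changes the ability estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_correct(theta, a, b):
    """2PL probability of a correct response."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def mle_theta(resp, a, b):
    """Grid-search maximum-likelihood estimate of ability."""
    grid = np.linspace(-4.0, 4.0, 801)
    p = p_correct(grid[:, None], a[None, :], b[None, :])
    loglik = (resp * np.log(p) + (1 - resp) * np.log(1.0 - p)).sum(axis=1)
    return grid[np.argmax(loglik)]

# Hypothetical setup: 40 items; IPE modeled as a linear difficulty drift,
# so items administered later in the test are effectively harder.
n_items, true_theta, drift = 40, 0.5, 0.02
a = rng.uniform(0.8, 1.6, n_items)
b = rng.normal(0.0, 1.0, n_items)
b_shifted = b + drift * np.arange(n_items)

# Responses are generated under the position-shifted difficulties.
resp = (rng.random(n_items) < p_correct(true_theta, a, b_shifted)).astype(int)

# Scoring with the original bank parameters ignores the position effect;
# scoring with the shifted parameters adjusts for it.
theta_ignore = mle_theta(resp, a, b)
theta_adjust = mle_theta(resp, a, b_shifted)
```

Under this toy data-generating process, the unadjusted estimate is systematically pulled downward on average, since the examinee's misses on late, drifted items are attributed to low ability rather than to position; the paper's pretesting- and pool-design approaches address the analogous problem in an operational CAT.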
Wiley. Available from: John Wiley & Sons, Inc. 111 River Street, Hoboken, NJ 07030. Tel: 800-835-6770; e-mail: cs-journals@wiley.com; Web site: https://www.wiley.com/en-us
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: [1] Amazon Web Services; [2] The University of Iowa