Publication Date
In 2025 (1)
Since 2024 (1)
Since 2021, last 5 years (1)
Since 2016, last 10 years (3)
Since 2006, last 20 years (4)
Descriptor
Computer Assisted Testing (5)
Context Effect (5)
Item Response Theory (5)
Test Items (5)
Adaptive Testing (4)
Computation (2)
Psychometrics (2)
Scoring (2)
College Entrance Examinations (1)
Comparative Analysis (1)
Early Childhood Education (1)
Source
ETS Research Report Series (1)
Educational Measurement: Issues and Practice (1)
Educational Testing Service (1)
Educational and Psychological Measurement (1)
Grantee Submission (1)
Author
Davey, Tim (2)
Albano, Anthony D. (1)
Brown, Anna (1)
Cai, Liuhan (1)
Harris, Deborah J. (1)
Herbert, Erin (1)
Lease, Erin M. (1)
Lee, Yi-Hsuan (1)
Lin, Yin (1)
McConnell, Scott R. (1)
Rizavi, Saba (1)
Publication Type
Reports - Research (5)
Journal Articles (4)
Education Level
Early Childhood Education (1)
Higher Education (1)
Postsecondary Education (1)
Location
United Kingdom (1)
Assessments and Surveys
Graduate Record Examinations (1)

Ma, Ye; Harris, Deborah J. – Educational Measurement: Issues and Practice, 2025
Item position effect (IPE) refers to situations where an item performs differently when it is administered in different positions on a test. The majority of previous research studies have focused on investigating IPE under linear testing. There is a lack of IPE research under adaptive testing. In addition, the existence of IPE might violate Item…
Descriptors: Computer Assisted Testing, Adaptive Testing, Item Response Theory, Test Items
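A minimal sketch, not drawn from the study above, of how an item position effect can be pictured under a two-parameter logistic (2PL) model: the same item's effective difficulty is allowed to drift with its serial position. The linear drift form and every numeric value are assumptions made only for illustration.

    import math

    def p_correct_2pl(theta, a, b):
        # Probability of a correct response under a two-parameter logistic (2PL) IRT model.
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    def p_correct_at_position(theta, a, b, position, ref_position=1, drift=0.02):
        # Illustrative position effect: difficulty drifts linearly with serial position.
        # The linear form and the 0.02-per-position drift are assumptions, not study values.
        b_effective = b + drift * (position - ref_position)
        return p_correct_2pl(theta, a, b_effective)

    # The same item looks harder to the same examinee when it appears later in the test.
    for pos in (1, 20, 40):
        print(pos, round(p_correct_at_position(theta=0.0, a=1.2, b=0.0, position=pos), 3))
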
Albano, Anthony D.; McConnell, Scott R.; Lease, Erin M.; Cai, Liuhan – Grantee Submission, 2020
Research has shown that the context of practice tasks can have a significant impact on learning, with long-term retention and transfer improving when tasks of different types are mixed by interleaving (abcabcabc) compared with grouping together in blocks (aaabbbccc). This study examines the influence of context via interleaving from a psychometric…
Descriptors: Context Effect, Test Items, Preschool Children, Computer Assisted Testing
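The two practice schedules named in the abstract above are easy to picture with a small sketch; the function names and the three task types are hypothetical.

    from itertools import chain

    def blocked(task_types, reps):
        # Blocked schedule, e.g. aaabbbccc: finish all repetitions of one task type before the next.
        return [t for t in task_types for _ in range(reps)]

    def interleaved(task_types, reps):
        # Interleaved schedule, e.g. abcabcabc: cycle through the task types on every pass.
        return list(chain.from_iterable([list(task_types)] * reps))

    print("".join(blocked("abc", 3)))      # aaabbbccc
    print("".join(interleaved("abc", 3)))  # abcabcabc
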
Lin, Yin; Brown, Anna – Educational and Psychological Measurement, 2017
A fundamental assumption in computerized adaptive testing is that item parameters are invariant with respect to context--items surrounding the administered item. This assumption, however, may not hold in forced-choice (FC) assessments, where explicit comparisons are made between items included in the same block. We empirically examined the…
Descriptors: Personality Measures, Measurement Techniques, Context Effect, Test Items
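A hypothetical sketch of why the invariance assumption is fragile in forced-choice blocks: under a simple logistic (Bradley-Terry style) pairwise comparison, responses to the same statement shift with the partner it is compared against. The utilities below are invented, not estimates from the study.

    import math

    def p_prefer(util_a, util_b):
        # Probability of endorsing statement A over statement B in a forced-choice pair,
        # modelled here with a simple logistic (Bradley-Terry style) comparison.
        return 1.0 / (1.0 + math.exp(-(util_a - util_b)))

    # Invented latent utilities for three statements.
    utilities = {"A": 0.5, "B": -0.2, "C": 1.4}

    # Statement A draws very different response rates depending on its block partner,
    # so parameters estimated in one block context need not carry over to another.
    print(round(p_prefer(utilities["A"], utilities["B"]), 3))  # A paired with B
    print(round(p_prefer(utilities["A"], utilities["C"]), 3))  # A paired with C
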
Davey, Tim; Lee, Yi-Hsuan – ETS Research Report Series, 2011
Both theoretical and practical considerations have led the revision of the Graduate Record Examinations® (GRE®) revised General Test, here called the rGRE, to adopt a multistage adaptive design that will be continuously or nearly continuously administered and that can provide immediate score reporting. These circumstances sharply constrain the…
Descriptors: Context Effect, Scoring, Equated Scores, College Entrance Examinations
Rizavi, Saba; Way, Walter D.; Davey, Tim; Herbert, Erin – Educational Testing Service, 2004
Item parameter estimates vary for a variety of reasons, including estimation error, characteristics of the examinee samples, and context effects (e.g., item location effects, section location effects, etc.). Although we expect variation based on theory, there is reason to believe that observed variation in item parameter estimates exceeds what…
Descriptors: Adaptive Testing, Test Items, Computation, Context Effect
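A toy simulation, not taken from the report, of one source of that variation: the same fixed 2PL item administered to examinee samples drawn from different ability distributions yields visibly different observed statistics. Classical proportion-correct stands in for a full IRT calibration purely to keep the sketch short.

    import math
    import random

    random.seed(1)

    def p_correct_2pl(theta, a=1.0, b=0.0):
        # Response probability for one fixed item under a 2PL model.
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    def observed_difficulty(sample_mean_theta, n=500):
        # Proportion correct for the item in a sample drawn from N(sample_mean_theta, 1).
        thetas = [random.gauss(sample_mean_theta, 1.0) for _ in range(n)]
        return sum(random.random() < p_correct_2pl(t) for t in thetas) / n

    # Estimation error plus examinee-sample characteristics move the observed statistic
    # around even though the item itself never changes.
    for mean in (-0.5, 0.0, 0.5):
        print(mean, round(observed_difficulty(mean), 3))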