Yuan, Lu; Huang, Yingshi; Li, Shuhang; Chen, Ping – Journal of Educational Measurement, 2023
Online calibration is a key technique for calibrating new items in computerized adaptive testing (CAT) and has been widely used across forms of CAT, including unidimensional CAT, multidimensional CAT (MCAT), CAT with polytomously scored items, and cognitive diagnostic CAT. However, as multidimensional and polytomous assessment data become more…
Descriptors: Computer Assisted Testing, Adaptive Testing, Computation, Test Items
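
The abstract stops short of the proposal itself, but the basic setup of online calibration is straightforward to illustrate: a new pretest item is seeded into operational tests, examinee abilities estimated from the operational items are treated as fixed, and the new item's parameters are estimated from the observed responses. Below is a minimal sketch under a unidimensional Rasch model; the simulated data, sample size, and plain MLE approach are illustrative assumptions, not the method the paper proposes.

    import numpy as np
    from scipy.optimize import minimize_scalar

    # Illustrative sketch only (not the authors' method): online calibration of
    # one new pretest item's difficulty under a Rasch model, with examinee
    # abilities from the operational items treated as fixed and known.
    rng = np.random.default_rng(42)

    true_b = 0.5                        # difficulty of the new item (assumed)
    thetas = rng.normal(0.0, 1.0, 500)  # abilities of examinees who saw it

    # Simulate responses: P(correct) = 1 / (1 + exp(-(theta - b)))
    p = 1.0 / (1.0 + np.exp(-(thetas - true_b)))
    responses = rng.binomial(1, p)

    def neg_log_lik(b):
        """Negative Rasch log-likelihood in b, with abilities held fixed."""
        q = 1.0 / (1.0 + np.exp(-(thetas - b)))
        return -np.sum(responses * np.log(q) + (1 - responses) * np.log(1 - q))

    est = minimize_scalar(neg_log_lik, bounds=(-4, 4), method="bounded")
    print(f"true b = {true_b:.2f}, estimated b = {est.x:.2f}")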
Albano, Anthony D.; Cai, Liuhan; Lease, Erin M.; McConnell, Scott R. – Journal of Educational Measurement, 2019
Studies have shown that item difficulty can vary significantly with an item's context within a test form. In particular, item position may be associated with practice and fatigue effects that influence item parameter estimation. The purpose of this research was to examine the relevance of item position specifically for assessments used in…
Descriptors: Test Items, Computer Assisted Testing, Item Analysis, Difficulty Level
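
As a rough illustration of the practice and fatigue effects the abstract describes, the sketch below simulates a single item whose effective difficulty drifts upward the later it appears in a form, then compares proportion correct for early versus late placements. The drift size, form length, and Rasch response model are hypothetical choices, not the study's design.

    import numpy as np

    # Illustrative sketch only (not the study's analysis): a fatigue-style
    # position effect in which an item becomes effectively harder when it is
    # administered later in the test form.
    rng = np.random.default_rng(7)

    n_examinees, n_positions = 2000, 40
    base_b = 0.0   # the item's "true" difficulty (assumed)
    drift = 0.02   # hypothetical difficulty increase per position

    thetas = rng.normal(0.0, 1.0, n_examinees)
    positions = rng.integers(0, n_positions, n_examinees)  # random placement

    b_eff = base_b + drift * positions  # position-shifted difficulty
    p = 1.0 / (1.0 + np.exp(-(thetas - b_eff)))
    responses = rng.binomial(1, p)

    # The early-vs-late gap in proportion correct is the position effect.
    early = responses[positions < n_positions // 2].mean()
    late = responses[positions >= n_positions // 2].mean()
    print(f"proportion correct, early: {early:.3f}, late: {late:.3f}")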

Bridgeman, Brent – Journal of Educational Measurement, 1992
Examinees in a regular administration of the quantitative portion of the Graduate Record Examination responded to particular items in a machine-scannable multiple-choice format. Volunteers (n=364) used a computer to answer open-ended counterparts of these items. Scores for both formats demonstrated similar correlational patterns. (SLD)
Descriptors: Answer Sheets, College Entrance Examinations, College Students, Comparative Testing
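
The comparison reported here rests on correlating scores from the two formats with each other and with external criteria. A toy version of that style of analysis, using simulated data rather than the study's and a hypothetical external criterion:

    import numpy as np

    # Illustrative sketch only (simulated data): comparing the correlational
    # patterns of multiple-choice and open-ended scores.
    rng = np.random.default_rng(1)
    n = 364  # volunteer sample size from the abstract

    ability = rng.normal(0.0, 1.0, n)
    mc_score = ability + rng.normal(0.0, 0.5, n)    # machine-scannable MC format
    open_score = ability + rng.normal(0.0, 0.6, n)  # computer-entered open-ended
    criterion = ability + rng.normal(0.0, 0.8, n)   # hypothetical external criterion

    print(f"r(MC, criterion)   = {np.corrcoef(mc_score, criterion)[0, 1]:.2f}")
    print(f"r(open, criterion) = {np.corrcoef(open_score, criterion)[0, 1]:.2f}")
    print(f"r(MC, open)        = {np.corrcoef(mc_score, open_score)[0, 1]:.2f}")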