Kim, Sooyeon; Walker, Michael E. – Educational Testing Service, 2011
This study examines the use of subpopulation invariance indices to evaluate the appropriateness of using a multiple-choice (MC) item anchor in mixed-format tests, which include both MC and constructed-response (CR) items. Linking functions were derived in the nonequivalent groups with anchor test (NEAT) design using an MC-only anchor set for 4…
Descriptors: Test Format, Multiple Choice Tests, Test Items, Gender Differences
Katz, Irvin R.; Xi, Xiaoming; Kim, Hyun-Joo; Cheng, Peter C. H. – Educational Testing Service, 2004
This research applied a cognitive model to identify item features that lead to irrelevant variance on the Test of Spoken English™ (TSE®). The TSE is an assessment of English oral proficiency and includes an item that elicits a description of a statistical graph. This item type sometimes appears to tap graph-reading skills--an irrelevant…
Descriptors: Test Format, English, Test Items, Language Proficiency