Publication Date
In 2025 | 0
Since 2024 | 0
Since 2021 (last 5 years) | 0
Since 2016 (last 10 years) | 0
Since 2006 (last 20 years) | 2
Descriptor
Item Analysis | 5
Test Items | 5
College Entrance Examinations | 4
Difficulty Level | 4
Models | 4
Graduate Study | 3
Item Response Theory | 3
Psychometrics | 3
High Stakes Tests | 2
Test Construction | 2
Academic Aptitude | 1
Author
Albano, Anthony D. | 1
Embretson, Susan E. | 1
Gorin, Joanna S. | 1
Graf, Edith Aurora | 1
Johnson, Matthew | 1
Kingston, Neal M. | 1
Lawless, René | 1
Peterson, Stephen | 1
Sinharay, Sandip | 1
Steffen, Manfred | 1
Publication Type
Journal Articles | 4
Reports - Research | 4
Reports - Descriptive | 1
Speeches/Meeting Papers | 1
Education Level
Higher Education | 3
Postsecondary Education | 3
Audience
Researchers | 1
Assessments and Surveys
Graduate Record Examinations | 5
Albano, Anthony D. – Journal of Educational Measurement, 2013
In many testing programs it is assumed that the context or position in which an item is administered does not have a differential effect on examinee responses to the item. Violations of this assumption may bias item response theory estimates of item and person parameters. This study examines the potentially biasing effects of item position. A…
Descriptors: Test Items, Item Response Theory, Test Format, Questioning Techniques
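The abstract does not state the model, but one common way to formalize an item-position effect (an illustrative assumption, not necessarily Albano's specification) is to add a position term to a Rasch-type model:

P(X_ij = 1 | θ_j) = exp(θ_j − b_i − δ_k) / (1 + exp(θ_j − b_i − δ_k))

where θ_j is examinee ability, b_i is item difficulty, and δ_k is the effect of administering the item in position k. If δ_k is nonzero but the calibration model omits it, the position effect is absorbed into the estimates of b_i and θ_j, which is the bias the abstract refers to.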
Sinharay, Sandip; Johnson, Matthew – ETS Research Report Series, 2005
"Item models" (LaDuca, Staples, Templeton, & Holzman, 1986) are classes from which it is possible to generate/produce items that are equivalent/isomorphic to other items from the same model (e.g., Bejar, 1996; Bejar, 2002). They have the potential to produce large number of high-quality items at reduced cost. This paper introduces…
Descriptors: Item Analysis, Test Items, Scoring, Psychometrics
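As a concrete illustration of the item-model idea described above, here is a minimal Python sketch built around a hypothetical template (not the LaDuca et al. or Bejar formulation): the stem and solution path are fixed, and only the surface values change, so every instantiation is intended to be isomorphic to the others.

import random

# Hypothetical item model: a stem with variable slots plus constraints.
# Each generated item has the same structure and solution path,
# but different surface values.
STEM = "A train travels {d} miles in {t} hours. What is its average speed in miles per hour?"

def generate_item(rng):
    t = rng.randint(2, 6)          # keep the numbers small
    speed = rng.randint(20, 80)    # the keyed answer
    d = speed * t                  # guarantees a whole-number answer
    return {"stem": STEM.format(d=d, t=t), "key": speed}

rng = random.Random(0)
for item in (generate_item(rng) for _ in range(3)):
    print(item["stem"], "->", item["key"])

Generating items this way is what makes the cost argument plausible: once a model has been written and validated, each additional item is nearly free.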
Gorin, Joanna S.; Embretson, Susan E. – Applied Psychological Measurement, 2006
Recent assessment research joining cognitive psychology and psychometric theory has introduced a new technology, item generation. In algorithmic item generation, items are systematically created based on specific combinations of features that underlie the processing required to correctly solve a problem. Reading comprehension items have been more…
Descriptors: Difficulty Level, Test Items, Modeling (Psychology), Paragraph Composition
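One standard way to formalize feature-based difficulty (the linear logistic test model; the snippet does not say whether Gorin and Embretson use exactly this specification) is to write each item's difficulty as a weighted sum of the features used to generate it:

b_i = Σ_k q_ik η_k

where q_ik indicates whether item i contains feature k and η_k is the difficulty contribution of feature k. Under such a model, items generated from the same combination of features are expected to be approximately equally difficult, which is what makes algorithmic generation psychometrically manageable.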
Graf, Edith Aurora; Peterson, Stephen; Steffen, Manfred; Lawless, René – ETS Research Report Series, 2005
We describe the item modeling development and evaluation process as applied to a quantitative assessment with high-stakes outcomes. In addition to expediting the item-creation process, a model-based approach may reduce pretesting costs if the difficulty and discrimination of model-generated items can be predicted to a predefined level of…
Descriptors: Psychometrics, Accuracy, Item Analysis, High Stakes Tests
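The prediction step the abstract alludes to can be illustrated with a small regression of pretest item parameters on item-model features. The data below are invented for illustration and are not from the ETS study.

import numpy as np

# Rows: pretested items generated from item models.
# Columns: binary model features (e.g., multi-step arithmetic required,
# misconception-based distractor present, information given in a table).
features = np.array([
    [1, 0, 0],
    [1, 1, 0],
    [0, 1, 1],
    [1, 1, 1],
    [0, 0, 1],
], dtype=float)
pretest_difficulty = np.array([-0.4, 0.3, 0.7, 1.2, 0.1])  # invented IRT b-values

# Ordinary least squares: how much of the difficulty do the features explain?
X = np.column_stack([np.ones(len(features)), features])     # add an intercept
coef, *_ = np.linalg.lstsq(X, pretest_difficulty, rcond=None)

# Predict the difficulty of a new, untested item from the same model family.
new_item = np.array([1.0, 1.0, 0.0, 1.0])                   # intercept + features
print("feature effects:", coef)
print("predicted difficulty:", new_item @ coef)

If predictions of this kind reach a predefined level of accuracy, some model-generated items could be fielded with reduced (or no) pretesting, which is the cost argument in the abstract.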
Kingston, Neal M. – 1985
Birnbaum's three-parameter logistic item response model was used to study the guessing behavior of low-ability examinees on the Graduate Record Examinations (GRE) General Test, Verbal Measure. GRE scoring procedures had recently changed from a scoring formula that corrected for guessing to number-right scoring. The three-parameter theory was used…
Descriptors: Academic Aptitude, Analysis of Variance, College Entrance Examinations, Difficulty Level
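For reference, the three-parameter logistic model named in the abstract gives the probability that an examinee with ability θ answers item i correctly as

P(X_i = 1 | θ) = c_i + (1 − c_i) · exp(D a_i (θ − b_i)) / (1 + exp(D a_i (θ − b_i)))

where a_i is the discrimination, b_i the difficulty, c_i the lower asymptote (pseudo-guessing) parameter, and D a scaling constant (often 1.7). The c_i parameter is what makes the model useful for studying guessing: as θ decreases, the probability of a correct response approaches c_i rather than zero, so low-ability examinees' success rates reflect guessing behavior.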