Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 1 |
Since 2016 (last 10 years) | 1 |
Since 2006 (last 20 years) | 3 |
Descriptor
Sampling | 8 |
Equated Scores | 6 |
Item Response Theory | 6 |
Statistical Analysis | 3 |
Difficulty Level | 2 |
Error of Measurement | 2 |
Evaluation Methods | 2 |
Item Analysis | 2 |
Sample Size | 2 |
Simulation | 2 |
Test Items | 2 |
Source
Applied Measurement in Education | 2 |
ETS Research Report Series | 2 |
Applied Psychological Measurement | 1 |
Educational Testing Service | 1 |
Author
Dorans, Neil J. | 8 |
Lawrence, Ida M. | 2 |
Guo, Hongwen | 1 |
Haberman, Shelby J. | 1 |
Hammond, Shelby | 1 |
Liu, Jinghua | 1 |
Livingston, Samuel A. | 1 |
Lu, Ru | 1 |
Tateneni, Krishna | 1 |
Yang, Wen-Ling | 1 |
Publication Type
Journal Articles | 5 |
Reports - Evaluative | 4 |
Reports - Research | 3 |
Reports - Descriptive | 1 |
Speeches/Meeting Papers | 1 |
Assessments and Surveys
SAT (College Admission Test) | 2 |
Advanced Placement… | 1 |
Lu, Ru; Guo, Hongwen; Dorans, Neil J. – ETS Research Report Series, 2021
Two families of analysis methods can be used for differential item functioning (DIF) analysis. One family is based on observed scores, such as the Mantel-Haenszel (MH) procedure and the standardized proportion-correct metric; the other is based on latent ability, in which the statistic is a measure of departure from…
Descriptors: Robustness (Statistics), Weighted Scores, Test Items, Item Analysis
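The two observed-score DIF statistics named in this abstract can be illustrated with a short sketch. The code below is a minimal, non-operational version: it stratifies examinees on total score, accumulates the Mantel-Haenszel common odds ratio (reported on the ETS delta metric as MH D-DIF), and computes the focal-group-weighted standardized proportion-correct difference. The function and variable names are assumptions for illustration, not taken from the report.

```python
import numpy as np

def mh_and_std_p_dif(total_score, group, item_correct):
    """Observed-score DIF sketch: Mantel-Haenszel D-DIF and standardized P-DIF.

    total_score : matching variable (e.g., total test score), 1-D int array
    group       : 1 = focal group, 0 = reference group
    item_correct: 1 = correct response to the studied item, 0 = incorrect
    """
    total_score = np.asarray(total_score)
    group = np.asarray(group)
    item_correct = np.asarray(item_correct)

    num_mh, den_mh = 0.0, 0.0          # Mantel-Haenszel accumulators
    num_std, den_std = 0.0, 0.0        # standardized P-DIF accumulators
    for k in np.unique(total_score):   # stratify on the matching score
        at_k = total_score == k
        ref, foc = at_k & (group == 0), at_k & (group == 1)
        n_k = at_k.sum()
        if ref.sum() == 0 or foc.sum() == 0:
            continue                   # skip strata missing either group
        a = item_correct[ref].sum()    # reference correct
        b = ref.sum() - a              # reference incorrect
        c = item_correct[foc].sum()    # focal correct
        d = foc.sum() - c              # focal incorrect
        num_mh += a * d / n_k
        den_mh += b * c / n_k
        w = foc.sum()                  # weight by focal-group size at k
        num_std += w * (c / foc.sum() - a / ref.sum())
        den_std += w

    alpha_mh = num_mh / den_mh                 # common odds ratio
    mh_d_dif = -2.35 * np.log(alpha_mh)        # ETS delta metric
    std_p_dif = num_std / den_std              # focal-weighted p-difference
    return mh_d_dif, std_p_dif
```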
Haberman, Shelby J.; Dorans, Neil J. – Educational Testing Service, 2011
For testing programs that administer multiple forms within a year and across years, score equating is used to ensure that scores can be used interchangeably. In an ideal world, sample sizes are large and representative of populations that hardly change over time, and very reliable alternate test forms are built with nearly identical psychometric…
Descriptors: Scores, Reliability, Equated Scores, Test Construction
Dorans, Neil J.; Liu, Jinghua; Hammond, Shelby – Applied Psychological Measurement, 2008
This exploratory study was built on research spanning three decades. Petersen, Marco, and Stewart (1982) conducted a major empirical investigation of the efficacy of different equating methods. The studies reported in Dorans (1990) examined how different equating methods performed across samples selected in different ways. Recent population…
Descriptors: Test Format, Equated Scores, Sampling, Evaluation Methods

Lawrence, Ida M.; Dorans, Neil J. – Applied Measurement in Education, 1990
The sample-invariant properties of five anchor test equating methods are addressed. Equating results across two sampling conditions--representative sampling and new-form matched sampling--are compared for Tucker and Levine equally reliable linear equating, item response theory true-score equating, and two equipercentile methods. (SLD)
Descriptors: Equated Scores, Item Response Theory, Sampling, Statistical Analysis
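Tucker linear equating, one of the anchor-test methods compared in the entry above, can be sketched compactly. The function below estimates synthetic-population means and variances from the new-form and old-form groups using the anchor-score regressions, then returns the linear conversion of new-form scores to the old-form scale. The array names and the equal default weighting of the two groups are illustrative assumptions, not a reproduction of the study's procedures.

```python
import numpy as np

def tucker_linear_equate(x_new, v_new, y_old, v_old, w_new=0.5):
    """Tucker linear anchor-test equating, a minimal sketch.

    x_new, v_new : new-form total and anchor scores (new-form group)
    y_old, v_old : old-form total and anchor scores (old-form group)
    w_new        : synthetic-population weight given to the new-form group
    Returns a function mapping new-form scores onto the old-form scale.
    """
    w1, w2 = w_new, 1.0 - w_new
    # Within-group regression slopes of total score on anchor score
    g1 = np.cov(x_new, v_new)[0, 1] / np.var(v_new, ddof=1)
    g2 = np.cov(y_old, v_old)[0, 1] / np.var(v_old, ddof=1)
    dv_mu = np.mean(v_new) - np.mean(v_old)
    dv_var = np.var(v_new, ddof=1) - np.var(v_old, ddof=1)
    # Synthetic-population moments under the Tucker assumptions
    mu_x = np.mean(x_new) - w2 * g1 * dv_mu
    mu_y = np.mean(y_old) + w1 * g2 * dv_mu
    var_x = np.var(x_new, ddof=1) - w2 * g1**2 * dv_var + w1 * w2 * (g1 * dv_mu) ** 2
    var_y = np.var(y_old, ddof=1) + w1 * g2**2 * dv_var + w1 * w2 * (g2 * dv_mu) ** 2
    slope = np.sqrt(var_y / var_x)
    return lambda x: mu_y + slope * (np.asarray(x) - mu_x)
```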
Yang, Wen-Ling; Dorans, Neil J.; Tateneni, Krishna – 2002
Scores on the multiple-choice sections of alternate forms are equated through anchor-test equating for the Advanced Placement Program (AP) examinations. There is no linkage of free-response sections since different free-response items are given yearly. However, the free-response and multiple-choice sections are combined to produce a composite.…
Descriptors: Cutting Scores, Equated Scores, Multiple Choice Tests, Sample Size
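The composite described in this entry, equated multiple-choice scores combined with unequated free-response scores and then compared with cut scores, can be mocked up in a few lines. The weights, cut values, and names below are assumptions for illustration only, not the operational AP scoring rules.

```python
import numpy as np

def composite_and_grade(mc_equated, fr_raw, mc_weight, fr_weight, cuts):
    """Weighted composite of an equated multiple-choice score and a raw
    free-response score, mapped to a reported grade through ascending cut
    scores (cuts[0] is the minimum composite for the second-lowest grade)."""
    composite = mc_weight * np.asarray(mc_equated) + fr_weight * np.asarray(fr_raw)
    grade = 1 + np.searchsorted(cuts, composite, side="right")
    return composite, grade

# Hypothetical usage: two examinees, four cut scores separating grades 1-5.
scores, grades = composite_and_grade([42.0, 58.5], [21.0, 30.0],
                                     mc_weight=1.0, fr_weight=1.5,
                                     cuts=[50, 70, 90, 110])
```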
Lawrence, Ida M.; Dorans, Neil J. – 1988
This paper addresses the sample-invariant properties of four equating methods (Tucker and Levine linear equating, equipercentile equating through an anchor test, and three-parameter item response theory equating). Data from several national administrations of the Scholastic Aptitude Test provided the source data for the study. Equating results…
Descriptors: Ability, College Entrance Examinations, Comparative Analysis, Equated Scores
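Equipercentile equating through an anchor test, one of the four methods listed above, is often carried out as a chained conversion: new-form scores are mapped to the anchor metric in the new-form group, and anchor scores are mapped to the old form in the old-form group. The sketch below is a bare-bones version using unsmoothed empirical distributions; the names and interpolation choices are assumptions, and operational work would presmooth the score distributions.

```python
import numpy as np

def _equipercentile(scores_from, scores_to):
    """Map values of scores_from to the scores_to scale by matching
    percentile ranks in the two empirical distributions."""
    def convert(x):
        x = np.asarray(x, dtype=float)
        # percentile rank of x in the 'from' distribution
        pr = np.searchsorted(np.sort(scores_from), x, side="right") / len(scores_from)
        # score in the 'to' distribution with that percentile rank
        return np.quantile(scores_to, np.clip(pr, 0.0, 1.0))
    return convert

def chained_equipercentile(x_new, v_new, y_old, v_old):
    """Chained equipercentile equating through an anchor test:
    new-form X -> anchor V (new-form group), then V -> old-form Y (old-form group)."""
    x_to_v = _equipercentile(x_new, v_new)
    v_to_y = _equipercentile(v_old, y_old)
    return lambda x: v_to_y(x_to_v(x))
```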
Livingston, Samuel A.; Dorans, Neil J. – ETS Research Report Series, 2004
This paper describes an approach to item analysis that is based on the estimation of a set of response curves for each item. The response curves show, at a glance, the difficulty and the discriminating power of the item and the popularity of each distractor, at any level of the criterion variable (e.g., total score). The curves are estimated by…
Descriptors: Item Analysis, Computation, Difficulty Level, Test Items
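The kind of item analysis described here, estimating a response curve for each option as a function of a criterion such as total score, can be approximated with simple kernel smoothing. The sketch below smooths the proportion of examinees choosing each option across a grid of total-score values; the Gaussian kernel, bandwidth, and names are assumptions for illustration and are not the estimation procedure used in the report itself.

```python
import numpy as np

def option_response_curves(total_score, responses, options, bandwidth=2.0):
    """Empirical option response curves: for each response option, the smoothed
    proportion of examinees choosing it at each level of the criterion
    (here, total score), estimated with a Gaussian kernel."""
    total_score = np.asarray(total_score, dtype=float)
    grid = np.arange(total_score.min(), total_score.max() + 1)
    # Gaussian weight of every examinee at every grid point, normalized per point
    w = np.exp(-0.5 * ((grid[:, None] - total_score[None, :]) / bandwidth) ** 2)
    w /= w.sum(axis=1, keepdims=True)
    curves = {}
    for opt in options:
        chose = (np.asarray(responses) == opt).astype(float)
        curves[opt] = w @ chose   # smoothed P(choose opt | score) on the grid
    return grid, curves
```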

Dorans, Neil J. – Applied Measurement in Education, 1990
The equating methods and sampling designs used in the empirical studies in this special issue on the use of matched samples for test equating are described. Four requisites for equating are listed, and the requisite that equating functions be invariant across samples is identified as the focus of this issue. (SLD)
Descriptors: Equated Scores, Equations (Mathematics), Evaluation Methods, Item Response Theory