Holland, Paul W.; Thayer, Dorothy T. – Journal of Educational Statistics, 1985
Section pre-equating (SPE) equates a new test to an old test prior to the new test's actual use, by making extensive use of experimental sections of the testing instrument. SPE theory is extended to allow for practice effects on both the old and new tests. (Author/BS)
Descriptors: Equated Scores, Mathematical Models, Statistical Studies, Test Construction

Harris, Deborah J.; Subkoviak, Michael J. – Educational and Psychological Measurement, 1986
This study examined three statistical methods for selecting items for mastery tests: (1) pretest-posttest; (2) latent trait; and (3) agreement statistics. The correlation between the latent trait method and agreement statistics, proposed here as an alternative, was substantial. Results for the pretest-posttest method confirmed its reputed…
Descriptors: Computer Simulation, Correlation, Item Analysis, Latent Trait Theory

Ackerman, Terry A. – 1987
One of the important underlying assumptions of all item response theory (IRT) models is local independence. This assumption requires that the response to an item on a test not be influenced by the response to any other item. The assumption is often taken for granted, with little or no scrutiny of the response process required to answer…
Descriptors: Computer Software, Correlation, Estimation (Mathematics), Latent Trait Theory

McKinley, Robert L.; Reckase, Mark D. – 1984
To assess the effects of correlated abilities on test characteristics, and to explore their effects on the use of a multidimensional item response theory model that does not explicitly account for such a correlation, two tests were constructed. One had two relatively unidimensional subsets of items; the other had all…
Descriptors: Ability, Correlation, Factor Structure, Item Analysis

Cattell, Raymond B.; Krug, Samuel E. – Educational and Psychological Measurement, 1986
Critics have occasionally asserted that the number of factors in the 16PF tests is too large. This study discusses factor-analytic methodology and reviews more than 50 studies in the field. It concludes that the number of important primaries encapsulated in the series is no fewer than the stated number. (Author/JAZ)
Descriptors: Correlation, Cross Cultural Studies, Factor Analysis, Maximum Likelihood Statistics

Ackerman, Terry A. – 1987
The purpose of this study was to investigate the effect of using multidimensional items in a computer adaptive testing (CAT) setting that assumes a unidimensional item response theory (IRT) framework. Previous research has suggested that the composite of multidimensional abilities estimated by a unidimensional IRT model is not constant…
Descriptors: Adaptive Testing, College Entrance Examinations, Computer Assisted Testing, Computer Simulation
Weiss, David J., Ed. – 1985
This report contains the Proceedings of the 1982 Item Response Theory and Computerized Adaptive Testing Conference. The papers and their discussions are organized into eight sessions: (1) "Developments in Latent Trait Theory," with papers by Fumiko Samejima and Michael V. Levine; (2) "Parameter Estimation," with papers by…
Descriptors: Achievement Tests, Adaptive Testing, Branching, Computer Assisted Testing