Wilson, Mark; Wright, Benjamin D. – 1983
A common problem in practical educational research is that of perfect scores which result when latent trait models are used. A simple procedure for managing the perfect and zero response problem encountered in converting test scores into measures is presented. It allows the test user to choose among two or three reasonable finite representations of…
Descriptors: Factor Analysis, Item Analysis, Latent Trait Theory, Mathematical Models
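The perfect-score problem this entry addresses arises because the Rasch raw-score-to-logit conversion is undefined at the extremes. The sketch below illustrates the problem and one common finite representation (a half-score adjustment); the function name and the specific correction are illustrative, not necessarily the procedure Wilson and Wright propose.

```python
import math

def raw_score_to_logit(r, L):
    """Convert raw score r on an L-item test to a Rasch-style logit measure.

    A perfect (r = L) or zero (r = 0) score makes ln(r / (L - r))
    infinite, so no finite measure exists for those examinees.
    """
    if 0 < r < L:
        return math.log(r / (L - r))
    # One common finite representation: pull the extreme score in by
    # half a score point before converting (an illustrative choice,
    # not the paper's specific procedure).
    r_adj = 0.5 if r == 0 else L - 0.5
    return math.log(r_adj / (L - r_adj))
```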
Garrison, Wayne M.; Stanwyck, Douglas J. – 1979
The susceptibility to faking on the Tennessee Self Concept Scale was examined among college students. Additionally, groups of respondents, instructed to respond in a "random" fashion to pre-determined numbers of items in the TSCS, were subjected to a plausibility analysis of their test response vectors using the Rasch measurement model.…
Descriptors: College Students, Higher Education, Item Analysis, Response Style (Tests)
Shannon, Gregory A. – 1983
Rescoring of Center for Occupational and Professional Assessment objective-referenced tests is decided largely by content experts selected by client organizations. A few of the test items, statistically flagged for review, are not rescored. Some of this incongruence could be due to the use of the biserial correlation (r-biserial) as an…
Descriptors: Adults, Criterion Referenced Tests, Item Analysis, Occupational Tests
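The biserial correlation this entry questions relates item performance to total score under the assumption that a latent normal variable underlies the dichotomous response. A minimal sketch of the textbook estimator (function name and data are illustrative):

```python
import math
import statistics

def biserial(scores, correct):
    """Biserial correlation between total scores and a 0/1 item vector.

    Assumes a normal latent variable underlies the dichotomy; unlike the
    point-biserial, the estimate can exceed 1.0 when that assumption fails.
    """
    p = sum(correct) / len(correct)          # proportion answering correctly
    q = 1 - p
    m1 = statistics.mean(s for s, c in zip(scores, correct) if c == 1)
    m0 = statistics.mean(s for s, c in zip(scores, correct) if c == 0)
    sd = statistics.pstdev(scores)
    nd = statistics.NormalDist()
    y = nd.pdf(nd.inv_cdf(q))                # normal ordinate at the threshold
    return (m1 - m0) * p * q / (y * sd)
```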
Wainer, Howard; Wright, Benjamin D. – 1980
The pure Rasch model was compared with four modifications of the model in a number of different simulations in order to ascertain the comparative efficiencies of the parameter estimations of these modifications. Because there is always noise in test score data, some individuals may have response patterns that do not fit the model and their…
Descriptors: Error of Measurement, Guessing (Tests), Item Analysis, Latent Trait Theory

Kingston, Neal M. – 1985
This research investigated the effect on estimated lower asymptotes of the instructions to Graduate Record Examination (GRE) examinees about how the test would be scored. This effect was assessed for four different verbal item types (analogies, antonyms, sentence completion, and reading comprehension) using a two-way, unweighted means analysis of…
Descriptors: Analysis of Variance, College Entrance Examinations, Guessing (Tests), Higher Education
Rubin, Lois S.; Mott, David E. W. – 1984
An investigation of the effect on the difficulty value of an item due to position placement within a test was made. Using a 60-item operational test comprised of 5 subtests, 60 items were placed as experimental items on a number of spiralled test forms in three different positions (first, middle, last) within the subtest composed of like items.…
Descriptors: Difficulty Level, Item Analysis, Minimum Competency Testing, Reading Tests
Siskind, Theresa G.; Anderson, Lorin W. – 1982
The study was designed to examine the similarity of response options generated by different item writers using a systematic approach to item writing. The similarity of response options to student responses for the same item stems presented in an open-ended format was also examined. A non-systematic (subject matter expertise) approach and a…
Descriptors: Algorithms, Item Analysis, Multiple Choice Tests, Quality Control
Chipman, Susan F. – 1988
The problem of sex bias in mathematics word problems is discussed, with references to the appropriate literature. Word problems are assessed via cognitive science analysis of word problem solving. It has been suggested that five basic semantic relations are adequate to classify nearly all story problems, namely, change, combine, compare, vary, and…
Descriptors: Arithmetic, Elementary Secondary Education, Item Analysis, Mathematics Tests
Klein, Stephen P.; Bolus, Roger – 1983
One way to reduce the likelihood of one examinee copying another's answers on large-scale tests that require all examinees to answer the same set of questions is to use multiple test forms that differ in terms of item ordering. This study was conducted to determine whether varying the sequence in which blocks of items were presented to…
Descriptors: Adults, Cheating, Cost Effectiveness, Item Analysis
Doolittle, Allen E. – 1985
Differential item performance (DIP) is discussed as a concept that does not necessarily imply item bias or unfairness to subgroups of examinees. With curriculum-based achievement tests, DIP is presented as a valid reflection of group differences in requisite skills and instruction. Using data from a national testing of the ACT Assessment, this…
Descriptors: Achievement Tests, High Schools, Item Analysis, Mathematics Achievement
Lutkus, Anthony D.; Laskaris, George – 1981
Analyses of student responses to Introductory Psychology test questions were discussed. The publisher supplied a two-thousand-item test bank on computer tape. Instructors selected questions for fifteen-item tests. The test questions were labeled by the publisher as factual or conceptual. The semester course used a mastery learning format in which…
Descriptors: Difficulty Level, Higher Education, Item Analysis, Item Banks
Jaeger, Richard M. – 1980
Five statistical indices are developed and described which may be used for determining (1) when linear equating of two approximately parallel tests is adequate, and (2) when a more complex method such as equipercentile equating must be used. The indices were based on: (1) similarity of cumulative score distributions; (2) shape of the raw-score to…
Descriptors: College Entrance Examinations, Difficulty Level, Equated Scores, Higher Education
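Linear equating, the simpler of the two methods this entry contrasts, maps scores on one form to the scale of another by matching the means and standard deviations of the two score distributions. A minimal sketch (names and data are illustrative):

```python
import statistics

def linear_equate(x, form_x_scores, form_y_scores):
    """Map a score x on form X to the form-Y scale by matching the
    first two moments (mean and SD) of the two score distributions."""
    mx = statistics.mean(form_x_scores)
    my = statistics.mean(form_y_scores)
    sx = statistics.pstdev(form_x_scores)
    sy = statistics.pstdev(form_y_scores)
    return my + (sy / sx) * (x - mx)
```

Equipercentile equating instead matches entire cumulative distributions, which is why the indices described above compare distribution shapes before settling for the linear method.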
Lynch, Michael – 1977
The way in which diagnostic reading test scores were used in planning a remedial program is examined in this report. Among the topics discussed are the administration of a diagnostic test in 13 Irish schools; the organization of a remedial program for students identified as having severe reading problems; weaknesses of diagnostic tests currently…
Descriptors: Diagnostic Teaching, Diagnostic Tests, Elementary Secondary Education, Item Analysis
Reckase, Mark D.; And Others – 1985
Factor analysis is the traditional method for studying the dimensionality of test data. However, under common conditions, the factor analysis of tetrachoric correlations does not recover the underlying structure of dichotomous data. The purpose of this paper is to demonstrate that the factor analysis of tetrachoric correlations is unlikely to…
Descriptors: Correlation, Difficulty Level, Factor Analysis, Item Analysis
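The tetrachoric correlation at issue estimates the correlation between the continuous normal variables assumed to underlie two dichotomous items. The classical cosine-pi approximation gives the flavor; it is only an approximation to the maximum-likelihood estimator, and not necessarily the computation the paper examines.

```python
import math

def tetrachoric_cos_pi(a, b, c, d):
    """Cosine-pi approximation to the tetrachoric correlation for a
    2x2 table of two dichotomous items: a and d are the concordant
    cell counts (both right / both wrong), b and c the discordant
    counts.  Exact only in special cases."""
    if b == 0 or c == 0:
        return 1.0  # perfect association
    return math.cos(math.pi / (1.0 + math.sqrt((a * d) / (b * c))))
```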
Secolsky, Charles – 1980
Undergraduates responded to an objective test in electronics and classified each item by domain (one of 14 topics covered in their text), and by type of knowledge (definition, fact, principle, or interpretation). These judgments were compared to their instructor's "standard" judgments. From these data, an index of item-domain divergence…
Descriptors: Ambiguity, Criterion Referenced Tests, Electronics, Higher Education