Showing all 11 results
Peer reviewed
Plake, Barbara S.; Huntley, Renee M. – Educational and Psychological Measurement, 1984
Two studies examined the effect of making the correct answer of a multiple-choice test item grammatically consistent with the item. American College Testing Assessment experimental items were constructed to investigate grammatical compliance for plural-singular and vowel-consonant agreement. Results suggest…
Descriptors: Grammar, Higher Education, Item Analysis, Multiple Choice Tests
Peer reviewed
Plake, Barbara S. – Applied Measurement in Education, 1998
Credentialing programs were surveyed to determine the procedures they use to set performance standards on multiple-choice and open-ended assessments. Implications of the various standard-setting approaches for the National Assessment of Educational Progress are discussed, and it is asserted that generalizing from standard-setting in professional…
Descriptors: Certification, Credentials, Elementary Secondary Education, Licensing Examinations (Professions)
Peer reviewed
Plake, Barbara S.; And Others – Journal of Experimental Education, 1981
Number-right and elimination scores were analyzed on a college-level mathematics exam assembled from pretest data. Anxiety measures were administered along with the experimental forms to undergraduates. Results suggest that neither test scores nor attitudes are influenced by item order, knowledge thereof, or anxiety level. (Author/GK)
Descriptors: College Mathematics, Difficulty Level, Higher Education, Multiple Choice Tests
Plake, Barbara S.; And Others – 1980
Number-right and elimination scores were analyzed on a 48-item college-level mathematics test that was assembled from pretest data in three forms by varying the item orderings: easy-hard, uniform, or random. Half of the forms contained information explaining the item arrangement and suggesting strategies for taking the test. Several anxiety…
Descriptors: Difficulty Level, Higher Education, Multiple Choice Tests, Quantitative Tests
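The two entries above compare number-right and elimination scoring. For readers unfamiliar with the latter, the sketch below (illustrative only, not code from these studies) implements number-right scoring alongside a Coombs-style elimination score: one point for each incorrect option the examinee crosses out, with a penalty of -(k - 1) on any item where the keyed answer itself is eliminated. The function names, the five-option default, and the exact penalty convention are assumptions.

    # Illustrative sketch, not taken from the studies above: number-right scoring
    # and a Coombs-style elimination score for k-option multiple-choice items.
    # The -(k - 1) penalty convention is an assumption.

    def number_right(responses, key):
        """Count of items answered with the keyed option."""
        return sum(1 for r, k in zip(responses, key) if r == k)

    def elimination_score(eliminations, key, n_options=5):
        """+1 per incorrect option eliminated; -(n_options - 1) on any item
        where the keyed answer itself was eliminated."""
        total = 0
        for crossed, keyed in zip(eliminations, key):
            if keyed in crossed:
                total -= n_options - 1
            else:
                total += len(crossed)
        return total

    # Example: a three-item, five-option test.
    key = ["B", "D", "A"]
    responses = ["B", "C", "A"]                      # number-right = 2
    eliminations = [{"A", "C", "E"}, {"D"}, set()]   # +3, -(5-1), +0 = -1
    print(number_right(responses, key), elimination_score(eliminations, key))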
Peer reviewed
Plake, Barbara S.; Ansorge, Charles J. – Educational and Psychological Measurement, 1984
Scores representing number of items right and self-perceptions were analyzed for a nonquantitative examination that was assembled into three forms. Multivariate ANCOVA revealed no significant effects for the cognitive measure. However, significant sex and sex × order effects were found for perception scores, not parallel to those reported…
Descriptors: Analysis of Covariance, Higher Education, Multiple Choice Tests, Scores
Plake, Barbara S.; Melican, Gerald J. – 1985
A methodology for investigating the influence of correction-for-guessing directions and formula scoring on test performance was studied. Experts in the test content field used a judgmental item appraisal system to estimate the knowledge of the minimally competent candidate (MCC) and to predict those items that the MCC would omit on the test under…
Descriptors: College Students, Guessing (Tests), Higher Education, Mathematics Tests
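The correction-for-guessing (formula) scoring referenced in this entry is the standard rights-minus-wrongs adjustment, FS = R - W/(k - 1) for k-option items, with omitted items neither rewarded nor penalized. A minimal sketch of that general formula follows; it is a reminder of the technique only, not the judgmental item appraisal system the study describes, and the parameter names are assumptions.

    # Standard formula (correction-for-guessing) score for k-option items:
    #   FS = R - W / (k - 1), where R = number right and W = number wrong;
    # omitted items are neither rewarded nor penalized. Names are illustrative.

    def formula_score(n_right, n_wrong, n_options=5):
        return n_right - n_wrong / (n_options - 1)

    # Example: 30 right, 10 wrong, 8 omits on a 48-item, five-option test.
    print(formula_score(30, 10, n_options=5))   # 30 - 10/4 = 27.5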
Melican, Gerald; Plake, Barbara S. – 1984
The validity of combining a correction for guessing with the Nedelsky-based cutscore was investigated. A five-option multiple-choice Mathematics Achievement Test was used in the study. Items were selected to meet several criteria. These included: the capability of measuring mathematics concepts related to performance in introductory statistics;…
Descriptors: Cutting Scores, Guessing (Tests), Higher Education, Multiple Choice Tests
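For context on the Nedelsky-based cutscore named above: judges flag, for each item, the options a minimally competent candidate could rule out as clearly wrong; the item's expected chance score is the reciprocal of the number of options that remain, and the cutscore is the sum of those values across items. The sketch below follows that standard formulation; the variable names and example values are illustrative, not data from the study.

    # Nedelsky standard setting (standard formulation, illustrative names):
    # each item contributes 1 / (options the minimally competent candidate
    # cannot eliminate); the cutscore is the sum of these values over items.

    def nedelsky_cutscore(remaining_options):
        """remaining_options: per item, count of options left after the
        judges' eliminations (at least 1, at most the number of options)."""
        return sum(1.0 / r for r in remaining_options)

    # Example: four five-option items where judges eliminate 3, 2, 0, and 4
    # options, leaving 2, 3, 5, and 1 remaining.
    print(nedelsky_cutscore([2, 3, 5, 1]))   # 0.5 + 0.333... + 0.2 + 1.0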
Buckendahl, Chad W.; Plake, Barbara S.; Impara, James C. – 1999
Many school districts are developing assessments that incorporate both selected response and constructed response formats. Scores on these assessments can be used for a variety of purposes ranging from subject remediation to promotion decisions. These policy decisions are informed by recommendations for Minimum Passing Scores (MPSs) from standard…
Descriptors: Academic Standards, Constructed Response, Cutting Scores, Educational Assessment
Huntley, Renee M.; Plake, Barbara S. – 1988
The combinational-format item (CFI), a multiple-choice item with combinations of alternatives presented as response choices, was studied to determine whether CFIs differ from regular multiple-choice items in item characteristics or in cognitive processing demands. Three undergraduate Foundations of Education classes (consisting of a total of…
Descriptors: Cognitive Processes, Computer Assisted Testing, Difficulty Level, Educational Psychology
Huntley, Renee M.; Plake, Barbara S. – 1980
Guidelines for test item-writing have traditionally recommended making the correct answer of a multiple-choice item grammatically consistent with its stem. To investigate the effects of adhering to this practice, certain item formats were designed to determine whether the practice of providing relevant grammatical clues, in itself, created cue…
Descriptors: College Entrance Examinations, Cues, Difficulty Level, Grammar
Plake, Barbara S.; Wise, Steven L. – 1986
One question regarding the utility of adaptive testing is the effect of individualized item arrangements on examinee test scores. The purpose of this study was to analyze the item difficulty choices by examinees as a function of previous item performance. The examination was a 25-item test of basic algebra skills given to 36 students in an…
Descriptors: Adaptive Testing, Algebra, College Students, Computer Assisted Testing