Showing all 14 results
Peer reviewed
Kaliski, Pamela K.; Wind, Stefanie A.; Engelhard, George, Jr.; Morgan, Deanna L.; Plake, Barbara S.; Reshetar, Rosemary A. – Educational and Psychological Measurement, 2013
The many-faceted Rasch (MFR) model has been used to evaluate the quality of ratings on constructed response assessments; however, it can also be used to evaluate the quality of judgments from panel-based standard setting procedures. The current study illustrates the use of the MFR model for examining the quality of ratings obtained from a standard…
Descriptors: Item Response Theory, Models, Standard Setting (Scoring), Science Tests
Peer reviewed
Plake, Barbara S. – Journal of Experimental Education, 1980
Three item orderings and two levels of knowledge of ordering were used to study differences in test results, students' perceptions of the test's fairness and difficulty, and students' estimations of their test performance. No significant order effect was found. (Author/GK)
Descriptors: Difficulty Level, Higher Education, Scores, Test Format
Plake, Barbara S.; Hoover, H. D. – 1978
A method of investigating for possible bias in test items is proposed that uses analysis of variance for item data based on groups that have been selected to have identical test score distributions. The item data used are arcsin transformations of item difficulties. The methodological procedure has the following advantages: (1) The arcsin…
Descriptors: Achievement Tests, Analysis of Variance, Difficulty Level, Item Analysis
Plake, Barbara S.; And Others – 1983
Differential test performance by undergraduate males and females enrolled in a developmental educational psychology course (n=167) was reported on a quantitative examination as a function of item arrangement. Males were expected to perform better than females on tests whose items were arranged from easy to hard. Plake and Ansorge (1982) speculated this may…
Descriptors: Difficulty Level, Feedback, Higher Education, Scoring
Plake, Barbara S.; And Others – 1981
Effects of item arrangement (easy-hard, uniform, and random), test anxiety, and sex on a 48-item multiple-choice mathematics test assembled from items of the American College Testing Program and taken by motivated upper level undergraduates and beginning graduate students were investigated. Four measures of anxiety were used: the Achievement Test…
Descriptors: Academic Achievement, Achievement Tests, Difficulty Level, Higher Education
Peer reviewed
Plake, Barbara S.; And Others – Journal of Experimental Education, 1981
Number right and elimination scores were analyzed on a college level mathematics exam assembled from pretest data. Anxiety measures were administered along with the experimental forms to undergraduates. Results suggest that neither test scores nor attitudes are influenced by item order, knowledge thereof, or anxiety level. (Author/GK)
Descriptors: College Mathematics, Difficulty Level, Higher Education, Multiple Choice Tests
Plake, Barbara S.; And Others – 1980
Number right and elimination scores were analyzed on a 48-item college level mathematics test that was assembled from pretest data in three forms by varying the item orderings: easy-hard, uniform, or random. Half of the forms contained information explaining the item arrangement and suggesting strategies for taking the test. Several anxiety…
Descriptors: Difficulty Level, Higher Education, Multiple Choice Tests, Quantitative Tests
PDF pending restoration
Plake, Barbara S.; And Others – 1994
In self-adapted testing (SAT), examinees select the difficulty level of items administered. This study investigated three variations of prior information provided when taking an SAT: (1) no information (examinees selected item difficulty levels without prior information); (2) view (examinees inspected a typical item from each difficulty level…
Descriptors: Adaptive Testing, College Students, Computer Assisted Testing, Difficulty Level
Peer reviewed
Plake, Barbara S.; And Others – Educational and Psychological Measurement, 1995
No significant differences in self-adapted test performance or anxiety were found among college students (n=218) who selected item difficulty without any prior information, inspected an item before selecting, or answered a typical item and received performance feedback. (SLD)
Descriptors: Achievement, Adaptive Testing, College Students, Computer Assisted Testing
Peer reviewed
Impara, James C.; Plake, Barbara S. – Journal of Educational Measurement, 1998
Sixth-grade teachers (n=26) estimated item performance for their students (724 total students) on a 50-item district-wide science test. Teachers were more accurate in estimating performance of the total group than of the borderline group, but in neither case was their accuracy high. Estimating proportion-correct values using the Angoff standard…
Descriptors: Difficulty Level, Elementary School Teachers, Grade 6, Intermediate Grades
Peer reviewed
Plake, Barbara S.; Melican, Gerald J. – Educational and Psychological Measurement, 1989
The impact of overall test length and difficulty on expert judgments of item performance made using the Nedelsky method was studied. Five university-level instructors predicting the performance of minimally competent candidates on a mathematics examination were fairly consistent in their assessments regardless of the length or difficulty of the test.…
Descriptors: Difficulty Level, Estimation (Mathematics), Evaluators, Higher Education
Huntley, Renee M.; Plake, Barbara S. – 1988
The combinational-format item (CFI)--a multiple-choice item with combinations of alternatives presented as response choices--was studied to determine whether CFIs differed from regular multiple-choice items in item characteristics or in cognitive processing demands. Three undergraduate Foundations of Education classes (consisting of a total of…
Descriptors: Cognitive Processes, Computer Assisted Testing, Difficulty Level, Educational Psychology
Huntley, Renee M.; Plake, Barbara S. – 1980
Guidelines for test item-writing have traditionally recommended making the correct answer of a multiple-choice item grammatically consistent with its stem. To investigate the effects of adhering to this practice, certain item formats were designed to determine whether the practice of providing relevant grammatical clues, in itself, created cue…
Descriptors: College Entrance Examinations, Cues, Difficulty Level, Grammar
Plake, Barbara S.; Wise, Steven L. – 1986
One question regarding the utility of adaptive testing is the effect of individualized item arrangements on examinee test scores. The purpose of this study was to analyze the item difficulty choices by examinees as a function of previous item performance. The examination was a 25-item test of basic algebra skills given to 36 students in an…
Descriptors: Adaptive Testing, Algebra, College Students, Computer Assisted Testing