Showing all 5 results
Peer reviewed
Bennett, Randy Elliot; Steffen, Manfred; Singley, Mark Kevin; Morley, Mary; Jacquemin, Daniel – Journal of Educational Measurement, 1997
Scoring accuracy and item functioning were studied for an open-ended response type test in which correct answers can take many different surface forms. Results with 1,864 graduate school applicants showed automated scoring to approximate the accuracy of multiple-choice scoring. Items functioned similarly to other item types being considered. (SLD)
Descriptors: Adaptive Testing, Automation, College Applicants, Computer Assisted Testing
Peer reviewed
Bennett, Randy Elliot; Rock, Donald A. – Journal of Educational Measurement, 1995
Examined the generalizability, validity, and examinee perceptions of a computer-delivered version of 8 formulating-hypotheses tasks administered to 192 graduate students. Results support previous research suggesting that formulating-hypotheses items can broaden the abilities measured by graduate admissions tests. (SLD)
Descriptors: Admission (School), College Entrance Examinations, Computer Assisted Testing, Generalizability Theory
Peer reviewed
Enright, Mary K.; Rock, Donald A.; Bennett, Randy Elliot – Journal of Educational Measurement, 1998
Examined alternative item types and section configurations for improving the discriminant and convergent validity of the Graduate Record Examination (GRE) General Test, using a computer-based test given to 388 examinees who had previously taken the GRE. Adding new variations of logical meaning appeared to decrease discriminant validity. (SLD)
Descriptors: Admission (School), College Entrance Examinations, College Students, Computer Assisted Testing
Peer reviewed
Bennett, Randy Elliot; Morley, Mary; Quardt, Dennis; Rock, Donald A.; Singley, Mark K.; Katz, Irvin R.; Nhouyvanisvong, Adisack – Journal of Educational Measurement, 1999
Evaluated a computer-delivered response type for measuring quantitative skill, the "Generating Examples" (GE) response type, which presents under-determined problems that can have many correct answers. Results from 257 graduate students and applicants indicate that GE scores are reasonably reliable but only moderately related to Graduate…
Descriptors: College Applicants, Computer Assisted Testing, Graduate Students, Graduate Study
Peer reviewed
Bennett, Randy Elliot; Sebrechts, Marc M. – Journal of Educational Measurement, 1997
A computer-delivered problem-solving task grounded in the cognitive research literature was developed, and its validity for graduate admissions assessment was studied with 107 undergraduates. The findings supported use of the test, which asked examinees to sort word-problem stems by prototype. (SLD)
Descriptors: Admission (School), College Entrance Examinations, Computer Assisted Testing, Graduate Study