Showing all 5 results
Peer reviewed
Attali, Yigal – ETS Research Report Series, 2014
Previous research on calculator use in standardized assessments of quantitative ability focused on the effect of calculator availability on item difficulty and on whether test developers can predict these effects. With the introduction of an on-screen calculator on the Quantitative Reasoning measure of the GRE® revised General Test, it…
Descriptors: College Entrance Examinations, Graduate Study, Calculators, Test Items
Lin, Miao-Hsiang – 1986
Specific questions addressed in this study include how time limits affect a test's construct and predictive validities, how time limits affect an examinee's time allocation and test performance, and whether the assumption about how examinees answer items is valid. Interactions involving an examinee's sex and age are studied. Two parallel forms of…
Descriptors: Age Differences, Computer Assisted Testing, Construct Validity, Difficulty Level
Ragosta, Marjorie; Kaplan, Bruce A. – 1986
The Survey of Special Test Administrations was administered to people with disabilities to gather their responses to special testing accommodations, both in college admissions testing generally and on the Scholastic Aptitude Test (SAT) and Graduate Record Examinations (GRE). The questionnaires were developed to evaluate testing accommodations for disabled people…
Descriptors: Achievement Tests, College Entrance Examinations, Difficulty Level, Disabilities
Peer reviewed
Freedle, Roy; Kostin, Irene – Journal of Educational Measurement, 1990
The importance of item difficulty (equated delta) was explored as a predictor of differential item functioning of Black versus White examinees for 4 verbal item types using 13 Graduate Record Examination forms and 11 Scholastic Aptitude Test forms. Several significant racial differences were found. (TJH)
Descriptors: Black Students, College Bound Students, College Entrance Examinations, Comparative Testing
Peer reviewed
Scheuneman, Janice Dowd; Gerritz, Kalle – Journal of Educational Measurement, 1990
Differential item functioning (DIF) methodology for revealing sources of item difficulty and performance characteristics of different groups was explored. A total of 150 Scholastic Aptitude Test items and 132 Graduate Record Examination general test items were analyzed. DIF was evaluated for males and females and Blacks and Whites. (SLD)
Descriptors: Black Students, College Entrance Examinations, College Students, Comparative Testing