Showing all 11 results
Peer reviewed
Attali, Yigal; Bridgeman, Brent; Trapani, Catherine – Journal of Technology, Learning, and Assessment, 2010
A generic approach to automated essay scoring produces scores that have the same meaning across all prompts, existing or new, of a writing assessment. This is accomplished by using a single set of linguistic indicators (or features), a consistent way of combining and weighting these features into essay scores, and a focus on features that are not…
Descriptors: Writing Evaluation, Writing Tests, Scoring, Test Scoring Machines
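The Attali, Bridgeman, and Trapani entry above describes generic automated essay scoring as one fixed set of linguistic features combined with one fixed set of weights, applied identically to every prompt. As a minimal sketch of that idea only (the feature names, weights, and values below are invented for illustration, not the authors' actual model):

```python
# Minimal sketch of generic weighted-feature essay scoring.
# Feature names and weights are hypothetical, not the model from
# Attali, Bridgeman, and Trapani (2010).

FEATURE_WEIGHTS = {
    "grammar_errors_per_100_words": -0.8,
    "mean_word_length": 0.5,
    "log_essay_length": 1.2,
    "vocabulary_level": 0.9,
}

def generic_score(features: dict) -> float:
    """Combine standardized feature values with one fixed weight set,
    so the resulting score has the same meaning across all prompts."""
    return sum(w * features[name] for name, w in FEATURE_WEIGHTS.items())

# The same weights apply regardless of which prompt the essay answers.
essay = {
    "grammar_errors_per_100_words": -0.3,  # standardized (z-score) values
    "mean_word_length": 0.4,
    "log_essay_length": 1.1,
    "vocabulary_level": 0.2,
}
print(round(generic_score(essay), 2))  # 1.94
```

Because the weights are prompt-independent, a given score carries the same interpretation whether the prompt is existing or new, which is the point of the generic approach.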
Peer reviewed
Young, I. Phillip – Journal of Research on Leadership Education, 2008
Empirical studies addressing admission to and graduation from doctoral programs focusing on educational leadership are noticeably absent from the professional literature, and this study seeks to partially fill that void by testing specific hypotheses. Archival data were used to conduct a three-group discriminant analysis where the…
Descriptors: Grade Point Average, Predictive Validity, Doctoral Programs, Sampling
Peer reviewed
Stricker, Lawrence J.; Rock, Donald A. – Developmental Psychology, 1987
Evaluated the extent to which the Graduate Record Examinations General Test measures the same constructs for older test takers as for younger examinees. Results suggest that the convergent validity of the test is similar across the age groups, but discriminant validity is somewhat different for older examinees. (Author/RWB)
Descriptors: Adults, Age Differences, Comparative Testing, Factor Analysis
Peer reviewed
Bridgeman, Brent; Rock, Donald A. – Journal of Educational Measurement, 1993
Exploratory and confirmatory factor analyses were used to explore relationships among existing item types and three new computer-administered item types for the analytical scale of the Graduate Record Examination General Test. Results with 349 students indicate which constructs the item types measure. (SLD)
Descriptors: College Entrance Examinations, College Students, Comparative Testing, Computer Assisted Testing
Peer reviewed
Gu, Lixiong; Drake, Samuel; Wolfe, Edward W. – Journal of Technology, Learning, and Assessment, 2006
This study seeks to determine whether item features are related to observed differences in item difficulty (DIF) between computer- and paper-based test delivery media. Examinees responded to 60 quantitative items similar to those found on the GRE General Test in either a computer-based or paper-based medium. Thirty-eight percent of the items were…
Descriptors: Test Bias, Test Items, Educational Testing, Student Evaluation
Peer reviewed
Freedle, Roy; Kostin, Irene – Journal of Educational Measurement, 1990
The importance of item difficulty (equated delta) was explored as a predictor of differential item functioning for Black versus White examinees on four verbal item types, using 13 Graduate Record Examination forms and 11 Scholastic Aptitude Test forms. Several significant racial differences were found. (TJH)
Descriptors: Black Students, College Bound Students, College Entrance Examinations, Comparative Testing
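The Freedle and Kostin entry above treats item difficulty (equated delta) as a predictor of DIF, which is in essence a regression of an item-level DIF statistic on delta. A minimal sketch of that analysis idea, with invented data (the actual forms, items, and values are in the cited study, not here):

```python
# Sketch only: regress an item-level DIF statistic on item difficulty
# (equated delta). All data values below are invented for illustration.
import statistics

deltas = [8.1, 9.4, 10.2, 11.5, 12.3, 13.0]    # hypothetical item difficulties
dif_values = [-0.4, -0.2, 0.0, 0.3, 0.5, 0.7]  # hypothetical DIF statistics

fit = statistics.linear_regression(deltas, dif_values)  # Python 3.10+
print(f"DIF ~ {fit.intercept:.2f} + {fit.slope:.2f} * delta")
```

A positive slope in such a fit would mean that harder items tend to show DIF in one direction, which is the kind of relationship the study examines.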
Wild, Cheryl L. – 1979
Research on three sections of the Graduate Record Examinations (GRE) Aptitude Test was reviewed before the introduction of the restructured test in October 1977: (1) the GRE-Verbal section; (2) the GRE-Quantitative section; and (3) a planned third section measuring analytical thinking skills. Research in all three areas focused on test…
Descriptors: Abstract Reasoning, Aptitude Tests, Cognitive Processes, College Entrance Examinations
Peer reviewed
Bridgeman, Brent – Journal of Educational Measurement, 1992
Examinees in a regular administration of the quantitative portion of the Graduate Record Examination responded to particular items in a machine-scannable multiple-choice format. Volunteers (n=364) used a computer to answer open-ended counterparts of these items. Scores for both formats demonstrated similar correlational patterns. (SLD)
Descriptors: Answer Sheets, College Entrance Examinations, College Students, Comparative Testing
Parshall, Cynthia G.; Kromrey, Jeffrey D. – 1993
This paper studies whether examinee characteristics are systematically related to mode effects across paper and computer versions of the same instrument, using data from the Educational Testing Service's 1991 Computer-Based Testing Pilot Study of the Graduate Record Examination (GRE). The following characteristics of 1,114 examinees were…
Descriptors: Age Differences, College Entrance Examinations, College Students, Comparative Testing
Mowsesian, Richard; Hays, William L. – 1985
The predictive validity of the revised Graduate Record Examination Analytical Test (GRE-A) was compared with that of the experimental form of the GRE-A for graduate school admission decisions as well as advancement to Ph.D. candidacy. Prior to 1974 the Graduate Record Examination included only a verbal and a quantitative test; in 1974 it was…
Descriptors: Admission Criteria, Cognitive Tests, College Admission, College Entrance Examinations
Peer reviewed
Scheuneman, Janice Dowd; Gerritz, Kalle – Journal of Educational Measurement, 1990
Differential item functioning (DIF) methodology for revealing sources of item difficulty and performance characteristics of different groups was explored. A total of 150 Scholastic Aptitude Test items and 132 Graduate Record Examination General Test items were analyzed. DIF was evaluated for males versus females and for Blacks versus Whites. (SLD)
Descriptors: Black Students, College Entrance Examinations, College Students, Comparative Testing
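The Scheuneman and Gerritz entry does not say which DIF statistic was computed for the male/female and Black/White comparisons; the Mantel-Haenszel procedure is a standard choice for two-group DIF, and a minimal sketch of it (with invented counts) is:

```python
# Mantel-Haenszel DIF sketch. Shown as a standard example of two-group
# DIF methodology; the source abstract does not name the statistic used.
import math

def mh_d_dif(strata):
    """strata: list of (a, b, c, d) counts per total-score stratum, where
    a/b = reference group correct/incorrect on the studied item and
    c/d = focal group correct/incorrect. Returns MH D-DIF on the ETS
    delta scale; negative values indicate DIF against the focal group."""
    num = den = 0.0
    for a, b, c, d in strata:
        n = a + b + c + d
        if n == 0:
            continue
        num += a * d / n
        den += b * c / n
    alpha_mh = num / den               # common odds-ratio estimate
    return -2.35 * math.log(alpha_mh)  # ETS delta-scale transformation

# Invented counts for three ability strata of one item.
print(round(mh_d_dif([(40, 10, 30, 20), (30, 20, 20, 30), (20, 30, 10, 40)]), 2))
# -2.15: the item is harder for the focal group at comparable ability.
```

Stratifying on total score is what separates item-level DIF from overall group differences in ability, which is the distinction DIF methodology is designed to make.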