Showing all 13 results
Peer reviewed
Buzick, Heather; Oliveri, Maria Elena; Attali, Yigal; Flor, Michael – Applied Measurement in Education, 2016
Automated essay scoring is a developing technology that can provide efficient scoring of large numbers of written responses. Its use in higher education admissions testing provides an opportunity to collect validity and fairness evidence to support current uses and inform its emergence in other areas such as K-12 large-scale assessment. In this…
Descriptors: Essays, Learning Disabilities, Attention Deficit Hyperactivity Disorder, Scoring
Peer reviewed
Davison, Mark L.; Semmes, Robert; Huang, Lan; Close, Catherine N. – Educational and Psychological Measurement, 2012
Data from 181 college students were used to assess whether math reasoning item response times in computerized testing can provide valid and reliable measures of a speed dimension. The alternate forms reliability of the speed dimension was .85. A two-dimensional structural equation model suggests that the speed dimension is related to the accuracy…
Descriptors: Computer Assisted Testing, Reaction Time, Reliability, Validity
Peer reviewed
Lewandowski, Lawrence; Gathje, Rebecca A.; Lovett, Benjamin J.; Gordon, Michael – Journal of Psychoeducational Assessment, 2013
College students with attention deficit hyperactivity disorder (ADHD) often request and receive extended time to complete high-stakes exams and classroom tests. This study examined the performances and behaviors of college students on computerized simulations of high-stakes exams. Thirty-five college students with ADHD were compared to 185 typical…
Descriptors: Attention Deficit Disorders, Comparative Analysis, Testing, Vocabulary
Peer reviewed
Enright, Mary K.; Rock, Donald A.; Bennett, Randy Elliot – Journal of Educational Measurement, 1998
Examined alternative-item types and section configurations for improving the discriminant and convergent validity of the Graduate Record Examination (GRE) general test using a computer-based test given to 388 examinees who had taken the GRE previously. Adding new variations of logical meaning appeared to decrease discriminant validity. (SLD)
Descriptors: Admission (School), College Entrance Examinations, College Students, Computer Assisted Testing
Slater, Sharon C.; Schaeffer, Gary A. – 1996
The General Computer Adaptive Test (CAT) of the Graduate Record Examinations (GRE) includes three operational sections that are separately timed and scored. A "no score" is reported if the examinee answers fewer than 80% of the items or if the examinee does not answer all of the items and leaves the section before time expires. The 80%…
Descriptors: Adaptive Testing, College Students, Computer Assisted Testing, Equal Education
Kobrin, Jennifer L. – 2000
The comparability of computerized and paper-and-pencil tests was examined from a cognitive perspective, using verbal protocols, rather than psychometric methods, as the primary mode of inquiry. Reading comprehension items from the Graduate Record Examinations were completed by 48 college juniors and seniors, half of whom took the computerized test…
Descriptors: Cognitive Processes, College Students, Computer Assisted Testing, Higher Education
Peer reviewed
Bridgeman, Brent; Rock, Donald A. – Journal of Educational Measurement, 1993
Exploratory and confirmatory factor analyses were used to explore relationships among existing item types and three new computer-administered item types for the analytical scale of the Graduate Record Examination General Test. Results from 349 students indicate the constructs the item types are measuring. (SLD)
Descriptors: College Entrance Examinations, College Students, Comparative Testing, Computer Assisted Testing
Bridgeman, Brent; Rock, Donald A. – 1993
Three new computer-administered item types for the analytical scale of the Graduate Record Examination (GRE) General Test were developed and evaluated. One item type was a free-response version of the current analytical reasoning item type. The second item type was a somewhat constrained free-response version of the pattern identification (or…
Descriptors: Adaptive Testing, College Entrance Examinations, College Students, Computer Assisted Testing
Powers, Donald E.; Potenza, Maria T. – 1996
The degree to which laptop and standard-size desktop computers are likely to produce comparable test results for the Graduate Record Examination (GRE) General Test was studied. Verbal, quantitative, and writing sections of a retired version of the GRE were used, since it was expected that performance on reading passages or mathematics items might…
Descriptors: College Students, Comparative Analysis, Computer Assisted Testing, Higher Education
Sebrechts, Marc M.; And Others – 1991
This study evaluated agreement between expert system and human scores on 12 algebra word problems taken by Graduate Record Examinations (GRE) General Test examinees from a general sample of 285 and a study sample of 30. Problems were drawn from three content classes (rate x time, work, and interest) and presented in four constructed-response…
Descriptors: Algebra, Automation, College Students, Computer Assisted Testing
Bennett, Randy Elliot; Sebrechts, Marc M. – 1994
This study evaluated expert system diagnoses of examinees' solutions to complex constructed-response algebra word problems. Problems were presented to three samples (30 college students each), each of which had taken the Graduate Record Examinations General Test. One sample took the problems in paper-and-pencil form and the other two on computer.…
Descriptors: Algebra, Automation, Classification, College Entrance Examinations
Peer reviewed
Bridgeman, Brent – Journal of Educational Measurement, 1992
Examinees in a regular administration of the quantitative portion of the Graduate Record Examination responded to particular items in a machine-scannable multiple-choice format. Volunteers (n=364) used a computer to answer open-ended counterparts of these items. Scores for both formats demonstrated similar correlational patterns. (SLD)
Descriptors: Answer Sheets, College Entrance Examinations, College Students, Comparative Testing
Parshall, Cynthia G.; Kromrey, Jeffrey D. – 1993
This paper studies whether examinee characteristics are systematically related to mode effect across paper and computer versions of the same instrument, using data from the Graduate Record Examination (GRE) of the Educational Testing Service in its Computer-Based Testing Pilot Study of 1991. The following characteristics of 1,114 examinees were…
Descriptors: Age Differences, College Entrance Examinations, College Students, Comparative Testing