Showing all 7 results
Peer reviewed
Buzick, Heather; Oliveri, Maria Elena; Attali, Yigal; Flor, Michael – Applied Measurement in Education, 2016
Automated essay scoring is a developing technology that can provide efficient scoring of large numbers of written responses. Its use in higher education admissions testing provides an opportunity to collect validity and fairness evidence to support current uses and inform its emergence in other areas such as K-12 large-scale assessment. In this…
Descriptors: Essays, Learning Disabilities, Attention Deficit Hyperactivity Disorder, Scoring
Peer reviewed
Attali, Yigal; Sinharay, Sandip – ETS Research Report Series, 2015
The "e-rater"® automated essay scoring system is used operationally to score the argument and issue tasks that form the Analytical Writing measure of the "GRE"® General Test. For each of these two tasks, this study explored the value added of reporting four trait scores over the total e-rater score.…
Descriptors: Scores, Computer Assisted Testing, Computer Software, Grammar
Peer reviewed
Lewandowski, Lawrence; Gathje, Rebecca A.; Lovett, Benjamin J.; Gordon, Michael – Journal of Psychoeducational Assessment, 2013
College students with attention deficit hyperactivity disorder (ADHD) often request and receive extended time to complete high-stakes exams and classroom tests. This study examined the performances and behaviors of college students on computerized simulations of high-stakes exams. Thirty-five college students with ADHD were compared to 185 typical…
Descriptors: Attention Deficit Disorders, Comparative Analysis, Testing, Vocabulary
Peer reviewed
Powers, Donald E. – Journal of Educational Computing Research, 2001
Tests the hypothesis that the introduction of computer-adaptive testing may help to alleviate test anxiety and diminish the relationship between test anxiety and test performance. Compares a sample of Graduate Record Examinations (GRE) General Test takers who took the computer-adaptive version of the test with another sample who took the…
Descriptors: Comparative Analysis, Computer Assisted Testing, Nonprint Media, Performance
Peer reviewed
Vogel, Lora Ann – Journal of Educational Computing Research, 1994
Reports on a study conducted to evaluate how individual differences in anxiety levels affect performance on computer versus paper-and-pencil forms of verbal sections of the Graduate Record Examination. Contrary to the research hypothesis, analysis of scores revealed that extroverted and less computer anxious subjects scored significantly lower on…
Descriptors: Comparative Analysis, Computer Anxiety, Computer Assisted Testing, Computer Attitudes
Powers, Donald E.; Potenza, Maria T. – 1996
The degree to which laptop and standard-size desktop computers are likely to produce comparable test results for the Graduate Record Examination (GRE) General Test was studied. Verbal, quantitative, and writing sections of a retired version of the GRE were used, since it was expected that performance on reading passages or mathematics items might…
Descriptors: College Students, Comparative Analysis, Computer Assisted Testing, Higher Education
Schaeffer, Gary A.; And Others – 1995
This report summarizes the results from two studies. The first assessed the comparability of scores derived from linear computer-based (CBT) and computer adaptive (CAT) versions of the three Graduate Record Examinations (GRE) General Test measures. A verbal CAT was taken by 1,507, a quantitative CAT by 1,354, and an analytical CAT by 995…
Descriptors: Adaptive Testing, Comparative Analysis, Computer Assisted Testing, Equated Scores