Showing all 5 results
Quinlan, Thomas; Higgins, Derrick; Wolff, Susanne – Educational Testing Service, 2009
This report evaluates the construct coverage of the e-rater[R] scoring engine. The matter of construct coverage depends on whether one defines writing skill in terms of process or product. Originally, the e-rater engine consisted of a large set of components with a proven ability to predict human holistic scores. By organizing these capabilities…
Descriptors: Guides, Writing Skills, Factor Analysis, Writing Tests
Peer reviewed
Gu, Lixiong; Drake, Samuel; Wolfe, Edward W. – Journal of Technology, Learning, and Assessment, 2006
This study seeks to determine whether item features are related to observed differences in item difficulty (DIF) between computer- and paper-based test delivery media. Examinees responded to 60 quantitative items similar to those found on the GRE general test in either a computer-based or paper-based medium. Thirty-eight percent of the items were…
Descriptors: Test Bias, Test Items, Educational Testing, Student Evaluation
Bennett, Randy Elliot; Sebrechts, Marc M. – 1994
This study evaluated expert system diagnoses of examinees' solutions to complex constructed-response algebra word problems. Problems were presented to three samples (30 college students each), each of which had taken the Graduate Record Examinations General Test. One sample took the problems in paper-and-pencil form and the other two on computer…
Descriptors: Algebra, Automation, Classification, College Entrance Examinations
Lin, Miao-Hsiang – 1986
Specific questions addressed in this study include how time limits affect a test's construct and predictive validities, how time limits affect an examinee's time allocation and test performance, and whether the assumption about how examinees answer items is valid. Interactions involving an examinee's sex and age are studied. Two parallel forms of…
Descriptors: Age Differences, Computer Assisted Testing, Construct Validity, Difficulty Level
Sebrechts, Marc M.; And Others – 1993
This report describes the development of a new tool for assessment research in graduate education. The tool, the Algebra Assessment System, is based on GIDE, a preexisting program that diagnostically analyzes complex constructed responses to algebra word problems. The project had three goals. The first goal was to build a generically usable…
Descriptors: Algebra, College Entrance Examinations, Computer Assisted Testing, Computer Interfaces