Kump, Ann – 1992
Directions are given for scoring typing tests taken on a typewriter or on a computer with special software. The speed score (gross words per minute) is obtained by counting the total number of strokes typed and dividing by 25. The accuracy score is obtained by comparing the examinee's test paper with the appropriate scoring key and counting the…
Descriptors: Computer Assisted Testing, Employment Qualifications, Guidelines, Job Applicants
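The speed-score arithmetic above can be sketched in a few lines. This is an illustrative reconstruction, not the guide's own software: it assumes the conventional 5 strokes per word and a 5-minute test, which together account for the divisor of 25 (5 strokes/word × 5 minutes). The function name and parameters are hypothetical.

```python
# Hedged sketch of the gross-WPM calculation described in the abstract.
# Assumptions (not stated in the source): 5 strokes per word, 5-minute test,
# so total strokes / 25 yields gross words per minute.

def gross_wpm(total_strokes: int, minutes: int = 5) -> float:
    """Gross words per minute: total strokes / 5 strokes-per-word / minutes."""
    strokes_per_word = 5
    return total_strokes / (strokes_per_word * minutes)

# Example: 3000 strokes in a 5-minute test -> 3000 / 25 = 120 gross wpm
print(gross_wpm(3000))  # 120.0
```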
Newsom, Robert S.; And Others – Evaluation Quarterly, 1978
For the training and placement of professional workers, multiple-choice instruments are the norm for wide-scale measurement and evaluation efforts. These instruments, however, have fundamental problems. Computer-based management simulations may provide solutions to these problems: they appear scoreable and reliable, offer increased validity, and are better…
Descriptors: Computer Assisted Testing, Multiple Choice Tests, Occupational Tests, Personnel Evaluation
Hansen, Kim – 1992
One hundred sixteen test administrators in Job Service offices throughout the United States who were currently using the automated typing test software were contacted by telephone about it. Sixty-nine percent had used the software for less than 1 year, and 21 percent for more than 1 year. In 78 percent of the offices, there is…
Descriptors: Computer Assisted Testing, Computer Software, Computer Software Evaluation, Employment Qualifications
Yang, Yongwei; Buckendahl, Chad W.; Juszkiewicz, Piotr J.; Bhola, Dennison S. – Journal of Applied Testing Technology, 2005
With the continual progress of computer technologies, computer automated scoring (CAS) has become a popular tool for evaluating writing assessments. Research on applying these methodologies to new types of performance assessments is still emerging. While research has generally shown high agreement between CAS-generated scores and those…
Descriptors: Scoring, Validity, Interrater Reliability, Comparative Analysis
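One common way to quantify the "high agreement" between CAS-generated and human-assigned scores is the exact-agreement rate over a set of responses scored by both. The sketch below is purely illustrative: the function name and the sample score lists are invented for the example, and the article itself may use other indices (e.g., correlations or kappa).

```python
# Illustrative sketch (assumed, not from the article): exact-agreement rate
# between computer-automated scores and human rater scores on the same essays.

def exact_agreement(cas_scores, human_scores):
    """Fraction of responses where the CAS score equals the human score."""
    if len(cas_scores) != len(human_scores):
        raise ValueError("score lists must be the same length")
    matches = sum(c == h for c, h in zip(cas_scores, human_scores))
    return matches / len(cas_scores)

# Hypothetical scores on a 1-5 rubric for six essays
cas = [4, 3, 5, 2, 4, 3]
human = [4, 3, 4, 2, 4, 2]
print(round(exact_agreement(cas, human), 3))  # 0.667 (4 of 6 match exactly)
```

Adjacent-agreement (scores within one point) is another index often reported alongside this one.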