Showing all 8 results
Peer reviewed
Bodmann, Shawn M.; Robinson, Daniel H. – Journal of Educational Computing Research, 2004
This study investigated the effect of several different modes of test administration on scores and completion times. In Experiment 1, paper-based assessment was compared to computer-based assessment. Undergraduates completed the computer-based assessment faster than the paper-based assessment, with no difference in scores. Experiment 2 assessed…
Descriptors: Computer Assisted Testing, Higher Education, Undergraduate Students, Evaluation Methods
Walls, Richard T.; And Others – 1987
An experiment involving 20 graduate and undergraduate students (7 males and 13 females) at West Virginia University (Morgantown) assessed "fan network structures" of recognition memory. A fan in a network memory structure occurs when several facts are connected to a single node (concept). The more links from that concept to various…
Descriptors: College Students, Computer Assisted Testing, Dialogs (Language), Distractors (Tests)
Lazarte, Alejandro A. – 1999
Two experiments reproduced, in a simulated computerized test-taking situation, the effect of two of the main determinants of answering an item on a test: the difficulty of the item and the time available to answer it. A model is proposed for the time to respond to or abandon an item and for the probability of abandoning it or answering it correctly. In…
Descriptors: Computer Assisted Testing, Difficulty Level, Higher Education, Probability
Slater, Sharon C.; Schaeffer, Gary A. – 1996
The General Computer Adaptive Test (CAT) of the Graduate Record Examinations (GRE) includes three operational sections that are separately timed and scored. A "no score" is reported if the examinee answers fewer than 80% of the items or if the examinee does not answer all of the items and leaves the section before time expires. The 80%…
Descriptors: Adaptive Testing, College Students, Computer Assisted Testing, Equal Education
Peer reviewed
Llabre, Maria Magdalena; Froman, Terry Wayne – Journal of Experimental Education, 1987
This study compared 38 Hispanic and 28 Anglo college students with respect to the amount of time allocated to items on a reasoning test administered by microcomputer. Results suggested that a time constraint may penalize Hispanic examinees. The applicability of computerized testing for studying test-taking strategy problems is illustrated.…
Descriptors: Analysis of Variance, Anglo Americans, College Students, Computer Assisted Testing
Schnipke, Deborah L.; Scrams, David J. – 1999
The availability of item response times made possible by computerized testing represents an entirely new type of information about test items. This study explores the issue of how to represent response-time information in item banks. Empirical response-time distribution functions can be fit with statistical distribution functions with known…
Descriptors: Adaptive Testing, Admission (School), Arithmetic, College Entrance Examinations
Lin, Miao-Hsiang – 1986
Specific questions addressed in this study include how time limits affect a test's construct and predictive validities, how time limits affect an examinee's time allocation and test performance, and whether the assumption about how examinees answer items is valid. Interactions involving an examinee's sex and age are studied. Two parallel forms of…
Descriptors: Age Differences, Computer Assisted Testing, Construct Validity, Difficulty Level
Legg, Sue M.; Buhr, Dianne C. – 1990
Possible causes of a 16-point mean score increase in reading for the computer adaptive form of the College Level Academic Skills Test (CLAST) over the paper-and-pencil test (PPT) are examined. The adaptive form of the CLAST was used in a state-wide field test in which reading, writing, and computation scores for approximately 1,000…
Descriptors: Adaptive Testing, College Entrance Examinations, Community Colleges, Comparative Testing