Descriptor: Timed Tests (9); Test Items (7); Computer Assisted Testing (6); Responses (6); Guessing (Tests) (4); Adaptive Testing (3); College Entrance Examinations (3); Difficulty Level (3); Item Response Theory (3); Law Schools (3); Admission (School) (2)
Source: Applied Psychological Measurement (1)
Author: Schnipke, Deborah L. (9); Scrams, David J. (5); van der Linden, Wim J. (2); Pashley, Peter J. (1)
Publication Type: Reports - Research (4); Speeches/Meeting Papers (4); Reports - Evaluative (3); Reports - Descriptive (2); Journal Articles (1)
Assessments and Surveys: Law School Admission Test (3); Graduate Record Examinations (2); Armed Services Vocational Aptitude Battery (1)
Schnipke, Deborah L.; Scrams, David J. – 1999
Speededness refers to the extent to which time limits affect test takers' performance. With regard to the Law School Admission Test (LSAT), speededness is currently measured by calculating the proportion of test takers who do not reach each item on the test. These proportions typically increase slightly toward the end of the test, indicating that…
Descriptors: Admission (School), College Entrance Examinations, Guessing (Tests), Law Schools
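As a rough illustration of the proportion-not-reached index this abstract describes, the Python sketch below (the data layout and names are assumptions, not from the paper) computes, for each item position, the share of test takers whose response record ends before that item.

```python
def proportion_not_reached(responses):
    """responses[i][j] is None if test taker i never reached item j."""
    n_people = len(responses)
    n_items = len(responses[0])
    return [sum(resp[j] is None for resp in responses) / n_people
            for j in range(n_items)]

# Toy data: 4 test takers, 5 items; the last two run out of time.
data = [
    [1, 0, 1, 1, 0],
    [1, 1, 0, 1, 1],
    [0, 1, 1, None, None],
    [1, 0, None, None, None],
]
print(proportion_not_reached(data))  # [0.0, 0.0, 0.25, 0.5, 0.5] -- rises toward the end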
van der Linden, Wim J.; Scrams, David J.; Schnipke, Deborah L. – 2003
This paper proposes an item selection algorithm that can be used to neutralize the effect of time limits in computer adaptive testing. The method is based on a statistical model for the test takers' response-time distributions on the items in the pool, which is updated each time a new item has been administered. Predictions from the model are…
Descriptors: Adaptive Testing, Algorithms, Computer Assisted Testing, Linear Programming
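To make the idea concrete, here is a deliberately simplified sketch, not the authors' algorithm, of selecting the next item under a response-time constraint: a lognormal model predicts how long each candidate item would take, and the most informative item that still fits in the remaining time is chosen. The item fields (a, b, alpha, beta) and the selection rule are assumptions for illustration only.

```python
import math

def predicted_seconds(item, tau):
    # Lognormal response-time model: log T ~ N(beta - tau, 1/alpha^2),
    # so E[T] = exp(beta - tau + 0.5 / alpha^2).
    return math.exp(item["beta"] - tau + 0.5 / item["alpha"] ** 2)

def fisher_information(item, theta):
    # Two-parameter logistic item information at ability theta.
    p = 1.0 / (1.0 + math.exp(-item["a"] * (theta - item["b"])))
    return item["a"] ** 2 * p * (1.0 - p)

def select_next_item(pool, theta, tau, seconds_left):
    fits = [it for it in pool if predicted_seconds(it, tau) <= seconds_left]
    if not fits:  # nothing is expected to fit: fall back to the quickest item
        return min(pool, key=lambda it: predicted_seconds(it, tau))
    return max(fits, key=lambda it: fisher_information(it, theta))

pool = [
    {"a": 1.2, "b": 0.3, "alpha": 2.0, "beta": 4.0},
    {"a": 0.8, "b": -0.5, "alpha": 1.5, "beta": 3.2},
]
print(select_next_item(pool, theta=0.0, tau=0.1, seconds_left=60.0))
```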

van der Linden, Wim J.; Scrams, David J.; Schnipke, Deborah L. – Applied Psychological Measurement, 1999
Proposes an item-selection algorithm for neutralizing the differential effects of time limits on computerized adaptive test scores. Uses a statistical model for distributions of examinees' response times on items in a bank that is updated each time an item is administered. Demonstrates the method using an item bank from the Armed Services…
Descriptors: Adaptive Testing, Algorithms, Computer Assisted Testing, Item Banks
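A minimal sketch, under assumed bookkeeping, of the "updated each time an item is administered" part: each item keeps a running mean and variance of its log response times (Welford's method), which is enough to refit a lognormal for prediction. None of this comes from the article itself.

```python
import math

class ItemTimeRecord:
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, seconds):
        # Online update of mean and sum of squared deviations of log time.
        x = math.log(seconds)
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def lognormal_params(self):
        var = self.m2 / (self.n - 1) if self.n > 1 else 0.0
        return self.mean, var  # mu and sigma^2 of log response time

rec = ItemTimeRecord()
for t in (42.0, 55.3, 38.7, 61.2):
    rec.update(t)
print(rec.lognormal_params())
```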
Schnipke, Deborah L. – 1996
When running out of time on a multiple-choice test, some examinees are likely to respond rapidly to the remaining unanswered items in an attempt to get some items right by chance. Because these responses will tend to be incorrect, the presence of "rapid-guessing behavior" could cause these items to appear to be more difficult than they…
Descriptors: Difficulty Level, Estimation (Mathematics), Guessing (Tests), Item Response Theory
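The core claim, that mostly-incorrect rapid guesses make an item look harder than it is, can be seen in a toy simulation (not the study's data or design): random guesses on a five-option item are right only about 20% of the time, so including them drags down the observed proportion correct.

```python
import random
random.seed(0)

n_solvers, n_guessers, n_options = 800, 200, 5
true_p_correct = 0.70  # proportion correct under solution behavior

solved = [random.random() < true_p_correct for _ in range(n_solvers)]
guessed = [random.random() < 1 / n_options for _ in range(n_guessers)]

p_all = (sum(solved) + sum(guessed)) / (n_solvers + n_guessers)
p_solvers_only = sum(solved) / n_solvers
print(f"p-value with rapid guesses included: {p_all:.2f}")   # item looks harder
print(f"p-value, solution behavior only:     {p_solvers_only:.2f}")
```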
Schnipke, Deborah L. – 1999
When running out of time on a multiple-choice test such as the Law School Admission Test (LSAT), some test takers are likely to respond rapidly to the remaining unanswered items in an attempt to get some items right by chance. Because these responses will tend to be incorrect, the presence of rapid-guessing behavior could cause these items to…
Descriptors: College Entrance Examinations, Difficulty Level, Estimation (Mathematics), Guessing (Tests)
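One common way to operationalize rapid-guessing detection, shown here purely as an illustration and not necessarily the method used in this paper, is a response-time threshold: responses faster than a cutoff are flagged as likely guesses and excluded before item difficulty is estimated. The cutoff and data below are made-up placeholders.

```python
def flag_rapid_guesses(times, correct, threshold_seconds=5.0):
    """Keep responses at or above the threshold; return (kept_correct, n_flagged)."""
    kept = [c for t, c in zip(times, correct) if t >= threshold_seconds]
    return kept, len(correct) - len(kept)

times   = [31.2, 2.1, 44.8, 1.4, 27.9, 3.0]
correct = [1,    0,   1,    0,   0,    1]
kept, n_flagged = flag_rapid_guesses(times, correct)
print(f"flagged {n_flagged} rapid guesses; "
      f"difficulty from remaining responses: {sum(kept) / len(kept):.2f}")
```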
Making Use of Response Times in Standardized Tests: Are Accuracy and Speed Measuring the Same Thing?
Scrams, David J.; Schnipke, Deborah L. – 1997
Response accuracy and response speed provide separate measures of performance. Psychometricians have tended to focus on accuracy with the goal of characterizing examinees on the basis of their ability to respond correctly to items from a given content domain. With the advent of computerized testing, response times can now be recorded unobtrusively…
Descriptors: Computer Assisted Testing, Difficulty Level, Item Response Theory, Psychometrics
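A crude way to probe the "are accuracy and speed measuring the same thing?" question is to compare per-examinee number-correct scores with per-examinee response-time summaries; the sketch below (with fabricated placeholder numbers) simply reports their Pearson correlation. Note that `statistics.correlation` requires Python 3.10+.

```python
from statistics import correlation  # Pearson correlation, Python 3.10+

scores       = [28, 35, 22, 31, 40, 25, 33, 29]            # number correct
median_times = [41.0, 35.5, 52.3, 38.2, 30.1, 48.7, 36.9, 44.4]  # seconds per item

print(f"score/speed correlation: {correlation(scores, median_times):.2f}")
# A weak correlation would suggest accuracy and speed carry different information.
```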
Schnipke, Deborah L. – 1995
Time limits on tests often prevent some examinees from finishing all of the items on the test; the extent of this effect has been called the "speededness" of the test. Traditional speededness indices focus on the number of unreached items, but other examinees in the same situation rapidly fill in answers in the hope of getting some of the…
Descriptors: Computer Assisted Testing, Educational Assessment, Evaluation Methods, Guessing (Tests)
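For concreteness, the sketch below computes two traditional unreached-item summaries: the share of examinees who reach the final item and the share who reach at least 75% of the items. The exact cutoffs used in operational practice vary, and the function and names here are illustrative assumptions, not taken from the report.

```python
def speededness_indices(last_item_reached, n_items):
    """last_item_reached[i]: 1-based index of the last item examinee i reached."""
    n = len(last_item_reached)
    reached_end   = sum(r == n_items for r in last_item_reached) / n
    reached_75pct = sum(r >= 0.75 * n_items for r in last_item_reached) / n
    return reached_end, reached_75pct

print(speededness_indices([50, 50, 47, 38, 50, 44], n_items=50))  # (0.5, 1.0)
```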
Schnipke, Deborah L.; Pashley, Peter J. – 1997
Differences in test performance on time-limited tests may be due in part to differential response-time rates between subgroups, rather than to real differences in the knowledge, skills, or developed abilities of interest. With computer-administered tests, response times are available and may be used to address this issue. This study investigates…
Descriptors: Computer Assisted Testing, Data Analysis, English, High Stakes Tests
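As a minimal illustration of "differential response-time rates between subgroups" (with invented numbers and a deliberately simple contrast, not the study's analysis), the sketch below compares two groups on the mean of log response times, which translates into a multiplicative difference in typical time per item.

```python
import math
from statistics import mean

group_a = [38.2, 45.1, 29.7, 51.0, 33.4]  # seconds per item, subgroup A (toy data)
group_b = [44.9, 58.3, 39.5, 62.1, 47.8]  # seconds per item, subgroup B (toy data)

log_diff = mean(map(math.log, group_b)) - mean(map(math.log, group_a))
print(f"group B takes about {math.exp(log_diff):.2f}x as long per item on average")
```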
Schnipke, Deborah L.; Scrams, David J. – 1999
The availability of item response times made possible by computerized testing represents an entirely new type of information about test items. This study explores the issue of how to represent response-time information in item banks. Empirical response-time distribution functions can be fit with statistical distribution functions with known…
Descriptors: Adaptive Testing, Admission (School), Arithmetic, College Entrance Examinations
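The storage idea described here, summarizing an item's empirical response-time distribution with a fitted parametric form, can be sketched as follows: fit a two-parameter lognormal (mean and SD of log times) and compare its CDF with the empirical one at a few time points. The data and the goodness-of-fit check are illustrative assumptions only.

```python
import math
from statistics import mean, stdev

times = [22.4, 31.7, 28.0, 45.2, 19.8, 36.5, 27.1, 33.9]  # seconds (toy data)
logs = [math.log(t) for t in times]
mu, sigma = mean(logs), stdev(logs)  # the two stored lognormal parameters

def lognormal_cdf(t):
    return 0.5 * (1 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2))))

for t in (25, 35, 45):
    empirical = sum(x <= t for x in times) / len(times)
    print(f"P(T <= {t}s): model {lognormal_cdf(t):.2f}  empirical {empirical:.2f}")
```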