| Publication Date | Records |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 0 |
| Since 2007 (last 20 years) | 2 |
| Descriptor | Records |
| --- | --- |
| Computer Assisted Testing | 5 |
| Scores | 5 |
| Higher Education | 3 |
| Test Items | 3 |
| Correlation | 2 |
| Grade 8 | 2 |
| Graduate Students | 2 |
| Hypothesis Testing | 2 |
| Measurement Techniques | 2 |
| National Competency Tests | 2 |
| Scoring | 2 |
| Author | Records |
| --- | --- |
| Bennett, Randy Elliot | 5 |
| Rock, Donald A. | 2 |
| Braswell, James | 1 |
| Jenkins, Frank | 1 |
| Kaplan, Bruce | 1 |
| Kaplan, Randy M. | 1 |
| Katz, Irvin R. | 1 |
| Morley, Mary | 1 |
| Nhouyvanisvong, Adisack | 1 |
| Oranje, Andreas | 1 |
| Persky, Hilary | 1 |
| Publication Type | Records |
| --- | --- |
| Journal Articles | 3 |
| Reports - Research | 3 |
| Reports - Evaluative | 2 |
| Tests/Questionnaires | 1 |
| Education Level | Records |
| --- | --- |
| Grade 8 | 2 |
| Elementary Secondary Education | 1 |
| Assessments and Surveys | Records |
| --- | --- |
| Graduate Record Examinations | 1 |
| National Assessment of… | 1 |
Bennett, Randy Elliot; Persky, Hilary; Weiss, Andy; Jenkins, Frank – Journal of Technology, Learning, and Assessment, 2010
This paper describes a study intended to demonstrate how an emerging skill, problem solving with technology, might be measured in the National Assessment of Educational Progress (NAEP). Two computer-delivered assessment scenarios were designed, one on solving science-related problems through electronic information search and the other on solving…
Descriptors: National Competency Tests, Problem Solving, Technology Uses in Education, Computer Assisted Testing
Bennett, Randy Elliot; Braswell, James; Oranje, Andreas; Sandene, Brent; Kaplan, Bruce; Yan, Fred – Journal of Technology, Learning, and Assessment, 2008
This article describes selected results from the Math Online (MOL) study, one of three field investigations sponsored by the National Center for Education Statistics (NCES) to explore the use of new technology in NAEP. Of particular interest in the MOL study was the comparability of scores from paper- and computer-based tests. A nationally…
Descriptors: National Competency Tests, Familiarity, Computer Assisted Testing, Mathematics Tests
Kaplan, Randy M.; Bennett, Randy Elliot – 1994
This study explores the potential for using a computer-based scoring procedure for the formulating-hypotheses (F-H) item. This item type presents a situation and asks the examinee to generate explanations for it. Each explanation is judged right or wrong, and the number of creditable explanations is summed to produce an item score. Scores were…
Descriptors: Automation, Computer Assisted Testing, Correlation, Higher Education
Bennett, Randy Elliot; Morley, Mary; Quardt, Dennis; Rock, Donald A.; Singley, Mark K.; Katz, Irvin R.; Nhouyvanisvong, Adisack – Journal of Educational Measurement, 1999 (peer reviewed)
Evaluated a computer-delivered response type for measuring quantitative skill, the "Generating Examples" (GE) response type, which presents under-determined problems that can have many right answers. Results from 257 graduate students and applicants indicate that GE scores are reasonably reliable, but only moderately related to Graduate…
Descriptors: College Applicants, Computer Assisted Testing, Graduate Students, Graduate Study
Bennett, Randy Elliot; Rock, Donald A. – 1993
Formulating-Hypotheses (F-H) items present a situation and ask the examinee to generate as many explanations for it as possible. This study examined the generalizability, validity, and examinee perceptions of a computer-delivered version of the task. Eight F-H questions were administered to 192 graduate students. Half of the items restricted…
Descriptors: Computer Assisted Testing, Difficulty Level, Generalizability Theory, Graduate Students

