Vuk, Jasna; Morse, David T. – Research in the Schools, 2009
In this study we observed college students' behavior on two self-tailored, multiple-choice exams. Self-tailoring was defined as an option to omit up to five items from being scored on an exam. Participants, 80 undergraduate college students enrolled in two sections of an educational psychology course, statistically significantly improved their…
Descriptors: College Students, Educational Psychology, Academic Achievement, Correlation
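The self-tailoring option described in this abstract (omitting up to five items from scoring) amounts to rescoring the exam over the items that remain. Below is a minimal Python sketch of that rescoring, assuming simple percent-correct scoring and made-up responses; the study's actual scoring procedure is not given in the snippet.

# Minimal sketch of self-tailored scoring (an illustration, not the
# study's actual procedure): the examinee flags up to five items to
# omit, and percent-correct is recomputed over the remaining items.

MAX_OMITS = 5

def self_tailored_score(responses, omitted):
    """responses: dict item_id -> bool (answered correctly)
    omitted: set of item_ids the examinee chose to drop."""
    if len(omitted) > MAX_OMITS:
        raise ValueError(f"at most {MAX_OMITS} items may be omitted")
    scored = [ok for item, ok in responses.items() if item not in omitted]
    return 100.0 * sum(scored) / len(scored)

# Example: 40-item exam with 30 correct answers. Omitting five missed
# items raises percent-correct from 75.0 to roughly 85.7.
responses = {item: item < 30 for item in range(40)}          # items 30-39 missed
print(self_tailored_score(responses, set()))                 # 75.0
print(self_tailored_score(responses, {30, 31, 32, 33, 34}))  # ~85.7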

Wise, Steven L.; Finney, Sara J.; Enders, Craig K.; Freeman, Sharon A.; Severance, Donald D. – Applied Measurement in Education, 1999
Examined whether providing item review on a computerized adaptive test could be used by examinees to inflate their scores. Two studies involving 139 undergraduates suggest that examinees are not highly proficient at discriminating item difficulty. A simulation study showed the usefulness of a strategy identified by G. Kingsbury (1996) as a way to…
Descriptors: Adaptive Testing, Computer Assisted Testing, Difficulty Level, Higher Education

Roos, Linda L.; And Others – 1992
Computerized adaptive (CA) testing uses an algorithm to match examinee ability to item difficulty, while self-adapted (SA) testing allows the examinee to choose the difficulty of his or her items. Research comparing SA and CA testing has shown that examinees experience lower anxiety and improved performance with SA testing. All previous research…
Descriptors: Ability Identification, Adaptive Testing, Algebra, Algorithms
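The distinction drawn in this abstract, algorithmic item selection (CA) versus examinee-chosen difficulty (SA), can be sketched in a few lines of Python. The Rasch-style response model and the crude up/down ability update below are illustrative assumptions, not the estimation procedures used in these studies.

import math
import random

# Toy item bank: difficulties on an arbitrary logit-like scale.
ITEM_BANK = [-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0]

def ca_select(ability_estimate, remaining):
    """CA testing: the algorithm picks the unused item whose difficulty
    is closest to the current ability estimate."""
    return min(remaining, key=lambda b: abs(b - ability_estimate))

def sa_select(chosen_difficulty, remaining):
    """SA testing: the examinee names a difficulty level and receives
    the closest unused item."""
    return min(remaining, key=lambda b: abs(b - chosen_difficulty))

def run_ca_test(true_ability=0.3, n_items=5, step=0.5):
    ability, remaining = 0.0, list(ITEM_BANK)
    for _ in range(n_items):
        b = ca_select(ability, remaining)
        remaining.remove(b)
        # Rasch-style probability of answering this item correctly.
        p_correct = 1.0 / (1.0 + math.exp(b - true_ability))
        correct = random.random() < p_correct
        ability += step if correct else -step   # crude up/down update
    return ability

print(run_ca_test())                      # CA: algorithm drives difficulty
print(sa_select(1.0, list(ITEM_BANK)))    # SA: examinee asks for a hard item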

Wise, Steven L.; And Others – Journal of Educational Measurement, 1992
Performance of 156 undergraduate and 48 graduate students on a self-adapted test (SFAT)--students choose the difficulty level of their test items--was compared with performance on a computer-adapted test (CAT). Those taking the SFAT obtained higher ability scores and reported lower posttest state anxiety than did CAT takers. (SLD)
Descriptors: Adaptive Testing, Comparative Testing, Computer Assisted Testing, Difficulty Level