Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 1
Since 2006 (last 20 years): 2
Descriptor
Difficulty Level: 8
Multiple Choice Tests: 8
Test Items: 7
Higher Education: 6
Test Format: 4
Test Reliability: 3
College Students: 2
Comparative Analysis: 2
Foreign Countries: 2
Item Analysis: 2
Scoring Formulas: 2
Source
Journal of Experimental…: 8
Author
Weiten, Wayne: 2
Bulut, Okan: 1
DiBattista, David: 1
Foos, Paul W.: 1
Fortuna, Glenda: 1
Gierl, Mark J.: 1
Hansen, Richard: 1
Hsu, Tse-Chi: 1
Plake, Barbara S.: 1
Shin, Jinnie: 1
Sinnige-Egger, Jo-Anne: 1
Publication Type
Journal Articles: 8
Reports - Research: 8
Education Level
Higher Education: 2
Postsecondary Education: 2
Location
Canada: 2
Assessments and Surveys
State Trait Anxiety Inventory: 1
Watson Glaser Critical…: 1
Shin, Jinnie; Bulut, Okan; Gierl, Mark J. – Journal of Experimental Education, 2020
The arrangement of response options in multiple-choice (MC) items, especially the location of the most attractive distractor, is considered critical in constructing high-quality MC items. In the current study, a sample of 496 undergraduate students taking an educational assessment course was given three test forms consisting of the same items but…
Descriptors: Foreign Countries, Undergraduate Students, Multiple Choice Tests, Item Response Theory

DiBattista, David; Sinnige-Egger, Jo-Anne; Fortuna, Glenda – Journal of Experimental Education, 2014
The authors assessed the effects of using "none of the above" as an option in a 40-item, general-knowledge multiple-choice test administered to undergraduate students. Examinees who selected "none of the above" were given an incentive to write the correct answer to the question posed. Using "none of the above" as the…
Descriptors: Multiple Choice Tests, Testing, Undergraduate Students, Test Items

Vidler, Derek; Hansen, Richard – Journal of Experimental Education, 1980
Relationships among patterns of answer changing and item characteristics on multiple-choice tests are discussed. Results obtained were similar to those found in previous studies but pointed to further relationships among these variables. (Author/GK)
Descriptors: College Students, Difficulty Level, Higher Education, Multiple Choice Tests

Weiten, Wayne – Journal of Experimental Education, 1982
A comparison of double versus single multiple-choice questions yielded significant differences in item difficulty, item discrimination, and internal reliability, but not in concurrent validity. (Author/PN)
Descriptors: Difficulty Level, Educational Testing, Higher Education, Multiple Choice Tests

Weiten, Wayne – Journal of Experimental Education, 1984
The effects of violating four item construction principles were examined to assess the validity of the principles and the importance of students' test wiseness. While flawed items were significantly less difficult than sound items, differences in item discrimination, test reliability, and concurrent validity were not observed. (Author/BW)
Descriptors: Difficulty Level, Higher Education, Item Analysis, Multiple Choice Tests
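The item statistics these studies repeatedly compare (item difficulty and item discrimination) have standard classical-test-theory definitions: difficulty as the proportion of examinees answering correctly, and discrimination as the correlation between item score and total test score. A minimal generic sketch of those two indices, not code from any of the cited papers:

```python
# Classical item analysis: difficulty (p-value) and discrimination
# (correlation of 0/1 item score with total test score).
# Generic illustration only; not taken from any cited study.
import statistics

def item_difficulty(item_scores):
    """Proportion of examinees answering the item correctly (0/1 scores)."""
    return sum(item_scores) / len(item_scores)

def item_discrimination(item_scores, total_scores):
    """Pearson correlation between item score and total test score."""
    n = len(item_scores)
    mi = statistics.mean(item_scores)
    mt = statistics.mean(total_scores)
    cov = sum((i - mi) * (t - mt)
              for i, t in zip(item_scores, total_scores)) / (n - 1)
    return cov / (statistics.stdev(item_scores) * statistics.stdev(total_scores))

# Five examinees' 0/1 responses to one item, with their total scores:
item = [1, 1, 0, 1, 0]
totals = [38, 35, 20, 30, 18]
print(item_difficulty(item))                       # 0.6
print(round(item_discrimination(item, totals), 3)) # 0.944
```

Here a high positive discrimination means the item tends to be answered correctly by high scorers and missed by low scorers, which is the property the flawed-versus-sound item comparisons above test for.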

Plake, Barbara S.; And Others – Journal of Experimental Education, 1981
Number right and elimination scores were analyzed on a college-level mathematics exam assembled from pretest data. Anxiety measures were administered along with the experimental forms to undergraduates. Results suggest that neither test scores nor attitudes are influenced by item order, knowledge thereof, or anxiety level. (Author/GK)
Descriptors: College Mathematics, Difficulty Level, Higher Education, Multiple Choice Tests

Hsu, Tse-Chi; And Others – Journal of Experimental Education, 1984
The indices of item difficulty and discrimination, the coefficients of effective length, and the average item information for both single- and multiple-answer items using six different scoring formulas were computed and compared. These formulas vary in terms of the assignment of partial credit and the correction for guessing. (Author/BW)
Descriptors: College Entrance Examinations, Comparative Analysis, Difficulty Level, Guessing (Tests)
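The scoring formulas compared in studies like this typically include plain number-right scoring and the classic correction for guessing, S = R - W/(k - 1) for k-option items, which penalizes wrong answers to offset the expected gain from blind guessing. A minimal sketch of those two generic formulas (not the specific variants examined in the study):

```python
# Two common multiple-choice scoring formulas (generic illustration;
# not the specific formulas compared in the cited study).

def number_right(right, wrong):
    """Ignore wrong answers; the score is simply the count correct."""
    return right

def corrected_for_guessing(right, wrong, k):
    """S = R - W/(k-1): subtract the expected number of lucky guesses
    implied by W wrong answers on k-option items (omits are not penalized)."""
    return right - wrong / (k - 1)

# 40-item test, 4 options each: 28 right, 8 wrong, 4 omitted.
print(number_right(28, 8))               # 28
print(corrected_for_guessing(28, 8, 4))  # 28 - 8/3 ≈ 25.33
```

Under blind guessing on a 4-option item, a wrong answer implies roughly one lucky hit per three misses, which is why the penalty divisor is k - 1; partial-credit variants like those in the study instead award fractional credit per option rather than penalizing.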

Foos, Paul W. – Journal of Experimental Education, 1992
Effects of expected form and expected difficulty of a test were examined for 84 college students expecting an easy or difficult multiple-choice or essay examination but taking a combined test. Results support the hypothesis that individuals work harder, rather than reduce their effort, when difficult work is expected. (SLD)
Descriptors: College Students, Difficulty Level, Essay Tests, Expectation