Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 0 |
Since 2016 (last 10 years) | 2 |
Since 2006 (last 20 years) | 5 |
Descriptor
Difficulty Level | 16 |
Responses | 16 |
Test Format | 16 |
Multiple Choice Tests | 13 |
Test Items | 12 |
Test Construction | 6 |
Achievement Tests | 3 |
Foreign Countries | 3 |
Grade 4 | 3 |
College Students | 2 |
Comparative Analysis | 2 |
Author
Albanese, Mark A. | 1 |
Allen, Nancy L. | 1 |
Bennett, Randy Elliot | 1 |
Berger, Aliza E. | 1 |
Burton, Nancy W. | 1 |
Currie, Michael | 1 |
Frisbie, David A. | 1 |
Henning, Grant | 1 |
Hohensinn, Christine | 1 |
Höhne, Jan Karem | 1 |
Ilhan, Mustafa | 1 |
Publication Type
Reports - Research | 13 |
Journal Articles | 10 |
Speeches/Meeting Papers | 4 |
Information Analyses | 3 |
Reports - Evaluative | 2 |
Tests/Questionnaires | 1 |
Education Level
Higher Education | 2 |
Elementary Education | 1 |
Elementary Secondary Education | 1 |
Grade 4 | 1 |
Grade 8 | 1 |
Intermediate Grades | 1 |
Junior High Schools | 1 |
Middle Schools | 1 |
Postsecondary Education | 1 |
Secondary Education | 1 |
Assessments and Surveys
National Assessment of… | 2 |
Advanced Placement… | 1 |
Test of English as a Foreign… | 1 |
Trends in International… | 1 |
Ilhan, Mustafa; Öztürk, Nagihan Boztunç; Sahin, Melek Gülsah – Participatory Educational Research, 2020
In this research, the effect of an item's type and cognitive level on its difficulty index was investigated. The data source of the study consisted of the responses of the 12,535 students in the Turkey sample of TIMSS 2015 (6,079 and 6,456 students from the eighth and fourth grades, respectively). The responses covered a total of 215 items at the eighth-grade…
Descriptors: Test Items, Difficulty Level, Cognitive Processes, Responses
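The difficulty index referred to here is the classical index: the proportion of examinees answering an item correctly. As a minimal sketch (not the authors' TIMSS 2015 analysis; the data, column names, and groupings below are hypothetical), such an index can be computed per item and then compared across item types and cognitive levels:

```python
import numpy as np
import pandas as pd

# Hypothetical scored responses: rows = examinees, columns = items (1 = correct, 0 = incorrect).
rng = np.random.default_rng(0)
scores = pd.DataFrame(rng.integers(0, 2, size=(200, 6)),
                      columns=[f"item{i}" for i in range(1, 7)])

# Hypothetical item metadata: the format and cognitive level of each item.
meta = pd.DataFrame({
    "item": scores.columns,
    "item_type": ["MC", "MC", "MC", "CR", "CR", "CR"],
    "cognitive_level": ["knowing", "applying", "reasoning"] * 2,
})

# Classical difficulty index: proportion of correct responses per item.
meta["p_value"] = scores.mean(axis=0).to_numpy()

# Average difficulty by item type and cognitive level.
print(meta.groupby(["item_type", "cognitive_level"])["p_value"].mean())
```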
Höhne, Jan Karem; Schlosser, Stephan; Krebs, Dagmar – Field Methods, 2017
Measuring attitudes and opinions with agree/disagree (A/D) questions is a common method in social research because it appears to be possible to measure different constructs with identical response scales. However, theoretical considerations suggest that A/D questions require considerable cognitive processing. Item-specific (IS) questions,…
Descriptors: Online Surveys, Test Format, Test Items, Difficulty Level
Thanyapa, Inadaphat; Currie, Michael – Language Testing in Asia, 2014
The overall study from which findings are presented gave stem-equivalent short answer and multiple choice tests of English structure and reading to students at Prince of Songkla University, Thailand. A comparison of scores, facility, discrimination, reliability and validity from 3-, 4- and 5-option versions of the multiple choice test found little…
Descriptors: Foreign Countries, Multiple Choice Tests, College Students, Language Tests
Hohensinn, Christine; Kubinger, Klaus D. – Educational and Psychological Measurement, 2011
In aptitude and achievement tests, different response formats are usually used. A fundamental distinction must be made between the class of multiple-choice formats and the constructed response formats. Previous studies have examined the impact of different response formats applying traditional statistical approaches, but these influences can also…
Descriptors: Item Response Theory, Multiple Choice Tests, Responses, Test Format
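The IRT-based comparison alluded to here places items on a logit scale rather than relying on proportion correct. The sketch below shows only a first-order (log-odds) approximation to Rasch item difficulties from a 0/1 score matrix; the data are simulated and a full calibration would require iterative estimation, so this is illustrative rather than the authors' method:

```python
import numpy as np

# Hypothetical 0/1 score matrix: rows = persons, columns = items.
rng = np.random.default_rng(1)
scores = rng.integers(0, 2, size=(300, 8))

# Classical difficulty: proportion correct per item.
p = scores.mean(axis=0)

# Rasch-style difficulty on a logit scale: log-odds of an incorrect response,
# centered so the mean item difficulty is zero.  A full Rasch calibration would
# iterate (e.g. conditional or joint maximum likelihood); this is only a
# first-order approximation for illustration.
difficulty = np.log((1 - p) / p)
difficulty -= difficulty.mean()

for i, (pc, d) in enumerate(zip(p, difficulty), start=1):
    print(f"item {i}: proportion correct = {pc:.2f}, difficulty = {d:+.2f} logits")
```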
Kim, Sooyeon; Walker, Michael E.; McHale, Frederick – ETS Research Report Series, 2008
This study examined variations of a nonequivalent groups equating design used with constructed-response (CR) tests to determine which design was most effective in producing equivalent scores across the two tests to be equated. Using data from a large-scale exam, the study investigated the use of anchor CR item rescoring in the context of classical…
Descriptors: Equated Scores, Comparative Analysis, Test Format, Responses
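A nonequivalent-groups equating design of this kind typically links the two forms through the common anchor. The following is a hedged sketch of chained linear equating under such a design; the summary statistics are invented, and the study itself may have used other classical methods (e.g., Tucker or frequency estimation):

```python
# Hypothetical summary statistics from a nonequivalent-groups-with-anchor design:
# group P took form X plus the anchor; group Q took form Y plus the same anchor.
mean_x, sd_x = 30.0, 6.0     # form X scores, group P
mean_ap, sd_ap = 15.0, 3.0   # anchor scores, group P
mean_y, sd_y = 32.0, 5.5     # form Y scores, group Q
mean_aq, sd_aq = 16.0, 2.8   # anchor scores, group Q

def linear_link(x, mean_from, sd_from, mean_to, sd_to):
    """Linear transformation matching means and standard deviations."""
    return mean_to + (sd_to / sd_from) * (x - mean_from)

def chained_linear_equate(x):
    """Equate a form X score to the form Y scale by chaining X -> anchor -> Y."""
    anchor_score = linear_link(x, mean_x, sd_x, mean_ap, sd_ap)      # group P link
    return linear_link(anchor_score, mean_aq, sd_aq, mean_y, sd_y)   # group Q link

for score in (20, 30, 40):
    print(f"X = {score} -> Y-equivalent = {chained_linear_equate(score):.1f}")
```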

Albanese, Mark A. – Educational Measurement: Issues and Practice, 1993
A comprehensive review is given of evidence bearing on the recommendation to avoid the use of complex multiple-choice (CMC) items. Avoiding Type K items (four primary responses and five secondary choices) seems warranted, but the evidence against CMC in general is less clear. (SLD)
Descriptors: Cues, Difficulty Level, Multiple Choice Tests, Responses

Knowles, Susan L.; Welch, Cynthia A. – Educational and Psychological Measurement, 1992
A meta-analysis of the difficulty and discrimination of the "none-of-the-above" (NOTA) test option was conducted with 12 articles (20 effect sizes) for difficulty and 7 studies (11 effect sizes) for discrimination. Findings indicate that using the NOTA option does not result in items of lesser quality. (SLD)
Descriptors: Difficulty Level, Effect Size, Meta Analysis, Multiple Choice Tests
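A meta-analytic summary of this kind usually pools effect sizes with inverse-variance weights. A minimal fixed-effect sketch, using invented effect sizes rather than the 20 reported in the article:

```python
import numpy as np

# Hypothetical standardized mean differences (NOTA vs. non-NOTA item versions)
# and their sampling variances; the article's actual effect sizes are not reproduced here.
effects = np.array([0.10, -0.05, 0.20, 0.02, -0.12])
variances = np.array([0.02, 0.05, 0.03, 0.01, 0.04])

# Fixed-effect meta-analysis: inverse-variance weighted mean and its standard error.
weights = 1.0 / variances
mean_effect = np.sum(weights * effects) / np.sum(weights)
se = np.sqrt(1.0 / np.sum(weights))

print(f"weighted mean effect size = {mean_effect:.3f} (SE = {se:.3f})")
print(f"95% CI = [{mean_effect - 1.96 * se:.3f}, {mean_effect + 1.96 * se:.3f}]")
```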

Katz, Irvin R.; Bennett, Randy Elliot; Berger, Aliza E. – Journal of Educational Measurement, 2000
Studied the solution strategies of 55 high school students who solved parallel constructed response and multiple-choice items that differed only in the presence of response options. Differences in difficulty between response formats did not correspond to differences in strategy choice. Interprets results in light of the relative comprehension…
Descriptors: College Entrance Examinations, Constructed Response, Difficulty Level, High School Students
Tollefson, Nona; Tripp, Alice – 1983
This study compared the item difficulty and item discrimination of three multiple choice item formats. The formats studied were: a complex alternative ("none of the above") as the correct answer, a complex alternative as a foil, and the one-correct-answer format. One hundred four graduate students were randomly assigned to complete…
Descriptors: Analysis of Variance, Difficulty Level, Graduate Students, Higher Education
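The two classical statistics compared in studies like this are the difficulty index (proportion correct) and a discrimination index such as the corrected item-total (point-biserial) correlation. A small sketch with simulated responses (the 0/1 matrix is hypothetical, not the study's data):

```python
import numpy as np

# Hypothetical scored responses: rows = examinees, columns = items (1 = correct, 0 = incorrect).
rng = np.random.default_rng(2)
scores = rng.integers(0, 2, size=(100, 10))

total = scores.sum(axis=1)

for i in range(scores.shape[1]):
    item = scores[:, i]
    difficulty = item.mean()                          # classical difficulty index (proportion correct)
    rest = total - item                               # total score with the item removed
    discrimination = np.corrcoef(item, rest)[0, 1]    # corrected point-biserial correlation
    print(f"item {i + 1:2d}: difficulty = {difficulty:.2f}, discrimination = {discrimination:+.2f}")
```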
Wang, Xiang-bo; And Others – 1993
An increasingly popular test format allows examinees to choose the items they will answer from among a larger set. When examinee choice is allowed, fairness requires that the different test forms thus formed be equated for their possible differential difficulty. For this equating to be possible, it is necessary to know how well examinees would have…
Descriptors: Adaptive Testing, Advanced Placement, Difficulty Level, Equated Scores
Henning, Grant – 1991
In order to evaluate the Test of English as a Foreign Language (TOEFL) vocabulary item format and to determine the effectiveness of alternative vocabulary test items, this study investigated the functioning of eight different multiple-choice formats that differed with regard to: (1) length and inference-generating quality of the stem; (2) the…
Descriptors: Adults, Context Effect, Difficulty Level, English (Second Language)

Frisbie, David A. – Educational Measurement: Issues and Practice, 1992
Literature related to the multiple true-false (MTF) item format is reviewed. Each answer cluster of an MTF item may have several true items, and the correctness of each is judged independently. MTF tests appear efficient and reliable, although they are a bit harder than multiple choice items for examinees. (SLD)
Descriptors: Achievement Tests, Difficulty Level, Literature Reviews, Multiple Choice Tests
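Reliability claims of this kind are usually based on an internal-consistency coefficient. A brief sketch of Cronbach's alpha (equivalent to KR-20 for dichotomously scored true/false judgments), computed on simulated data rather than any MTF data set from the review:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an examinees-by-items score matrix.
    For dichotomous (0/1) items this coincides with KR-20."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Simulated 0/1 judgments of 20 true/false statements by 150 examinees,
# made to correlate through a common ability factor.
rng = np.random.default_rng(3)
ability = rng.normal(size=(150, 1))
scores = (rng.normal(size=(150, 20)) < ability).astype(int)

print(f"estimated internal-consistency reliability: {cronbach_alpha(scores):.2f}")
```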
Burton, Nancy W.; And Others – 1976
Assessment exercises (items) in three different formats--multiple-choice with an "I don't know" (IDK) option, multiple-choice without the IDK, and open-ended--were placed at the beginning, middle and end of 45-minute assessment packages (instruments). A balanced incomplete blocks analysis of variance was computed to determine the biasing…
Descriptors: Age Differences, Difficulty Level, Educational Assessment, Guessing (Tests)

Kumar, V. K.; And Others – Contemporary Educational Psychology, 1979
Ninth-graders read a passage for a test to be taken the next day, anticipating a recall test, a multiple-choice test, and a retention test. Half received either a recall or a recognition test regardless of prior instructions. Subjects did better on the recognition tests in all conditions. (Author/RD)
Descriptors: Difficulty Level, Educational Testing, Expectation, Junior High Schools
Kleinke, David J. – 1979
Four forms of a 36-item adaptation of the Stanford Achievement Test were administered to 484 fourth graders. External factors potentially influencing test performance were examined, namely: (1) item order (easy-to-difficult vs. uniform); (2) response location (left column vs. right column); (3) handedness, which may interact with response location;…
Descriptors: Achievement Tests, Answer Sheets, Difficulty Level, Eye Hand Coordination