Showing 1 to 15 of 70 results
Peer reviewed
Omarov, Nazarbek Bakytbekovich; Mohammed, Aisha; Alghurabi, Ammar Muhi Khleel; Alallo, Hajir Mahmood Ibrahim; Ali, Yusra Mohammed; Hassan, Aalaa Yaseen; Demeuova, Lyazat; Viktorovna, Shvedova Irina; Nazym, Bekenova; Al Khateeb, Nashaat Sultan Afif – International Journal of Language Testing, 2023
The Multiple-choice (MC) item format is commonly used in educational assessments due to its economy and effectiveness across a variety of content domains. However, numerous studies have examined the quality of MC items in high-stakes and higher-education assessments and found many flawed items, especially in terms of distractors. These faulty…
Descriptors: Test Items, Multiple Choice Tests, Item Response Theory, English (Second Language)
Peer reviewed
Stiller, Jurik; Hartmann, Stefan; Mathesius, Sabrina; Straube, Philipp; Tiemann, Rüdiger; Nordmeier, Volkhard; Krüger, Dirk; Upmeier zu Belzen, Annette – Assessment & Evaluation in Higher Education, 2016
The aim of this study was to improve the criterion-related test score interpretation of a text-based assessment of scientific reasoning competencies in higher education by evaluating factors which systematically affect item difficulty. To provide evidence about the specific demands which test items of various difficulty make on pre-service…
Descriptors: Logical Thinking, Scientific Concepts, Difficulty Level, Test Items
Peer reviewed
Vidler, Derek; Hansen, Richard – Journal of Experimental Education, 1980
Relationships among patterns of answer changing and item characteristics on multiple-choice tests are discussed. Results obtained were similar to those found in previous studies but pointed to further relationships among these variables. (Author/GK)
Descriptors: College Students, Difficulty Level, Higher Education, Multiple Choice Tests
Peer reviewed
Green, Kathy – Educational and Psychological Measurement, 1984
Two factors, language difficulty and option set convergence, were experimentally manipulated and their effects on item difficulty assessed. Option convergence was found to have a significant effect on item difficulty while the effect of language difficulty was not significant. (Author/BW)
Descriptors: Difficulty Level, Error Patterns, Higher Education, Multiple Choice Tests
Peer reviewed
Tollefson, Nona – Educational and Psychological Measurement, 1987
This study compared the item difficulty, item discrimination, and test reliability of three forms of multiple-choice items: (1) one correct answer; (2) "none of the above" as a foil; and (3) "none of the above" as the correct answer. Twelve items in the three formats were administered in a college statistics examination. (BS)
Descriptors: Difficulty Level, Higher Education, Item Analysis, Multiple Choice Tests
Peer reviewed
Newman, Dianna L.; And Others – Applied Measurement in Education, 1988
The effect of using statistical and cognitive item difficulty to determine item order on multiple-choice tests was examined, using 120 undergraduate students. Students performed better when items were ordered by increasing cognitive difficulty rather than decreasing difficulty. The statistical ordering of difficulty had little effect on…
Descriptors: Cognitive Tests, Difficulty Level, Higher Education, Multiple Choice Tests
Peer reviewed
Rocklin, Thomas; Thompson, Joan M. – Journal of Educational Psychology, 1985
Interactive effects of item difficulty, test anxiety, and failure feedback are examined in a study using multiple choice verbal aptitude items. Results indicate that ability estimates can be affected by the examinee's improved performance, especially for students given an easy test. (Author/DWH)
Descriptors: Academic Ability, Difficulty Level, Feedback, Higher Education
Peer reviewed
Blumberg, Phyllis; And Others – Educational and Psychological Measurement, 1982
First year medical students answered parallel multiple-choice questions at different taxonomic levels as part of their diagnostic examinations. The results show that when content is held constant, students perform as well on interpretation and problem-solving questions as on recall questions. (Author/BW)
Descriptors: Classification, Cognitive Processes, Difficulty Level, Higher Education
Peer reviewed
McMillan, James R.; And Others – Delta Pi Epsilon Journal, 1989
An investigation analyzed difficulty and discrimination statistics for 91 multiple-choice tests written by 46 business administration instructors and administered to 7,511 students. A large percentage of the tests failed the difficulty and discrimination standards proposed by several testing experts, implying that teachers need more preparation in…
Descriptors: Business Administration Education, Difficulty Level, Discriminant Analysis, Higher Education
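Several entries above (e.g., McMillan et al. and Tollefson) report item difficulty and item discrimination statistics. For reference, the classical definitions — difficulty as the proportion answering correctly, discrimination as the point-biserial correlation between an item and the rest-of-test score — can be sketched in Python. The function name and data layout are illustrative, not drawn from any of the cited studies:

```python
import numpy as np

def item_analysis(responses):
    """Classical item statistics for a 0/1-scored response matrix.

    responses: 2-D array, rows = examinees, columns = items.
    Returns (difficulty, discrimination) per item: difficulty is the
    proportion correct (the p-value), discrimination is the point-biserial
    correlation between the item and the corrected (rest-of-test) score.
    """
    responses = np.asarray(responses, dtype=float)
    difficulty = responses.mean(axis=0)
    total = responses.sum(axis=1)
    n_items = responses.shape[1]
    discrimination = np.empty(n_items)
    for j in range(n_items):
        rest = total - responses[:, j]   # exclude item j from the criterion score
        discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
    return difficulty, discrimination
```

Guidelines such as those alluded to in the McMillan et al. abstract typically flag items with difficulty near 0 or 1, or discrimination near or below zero, as candidates for revision.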
Peer reviewed
Weiten, Wayne – Journal of Experimental Education, 1982
A comparison of double as opposed to single multiple-choice questions yielded significant differences in regard to item difficulty, item discrimination, and internal reliability, but not concurrent validity. (Author/PN)
Descriptors: Difficulty Level, Educational Testing, Higher Education, Multiple Choice Tests
Peer reviewed
Jensen, Murray; Duranczyk, Irene; Staats, Susan; Moore, Randy; Hatch, Jay; Somdahl, Chas – American Biology Teacher, 2006
This paper describes and evaluates a new type of multiple-choice test question that is relatively easy to construct and that challenges students' understandings of biological concepts. The questions involve a small narrative of scientific text that students must evaluate for accuracy. These are termed "You are the Teacher" questions because the…
Descriptors: Reciprocal Teaching, Multiple Choice Tests, Biology, Evaluation
Green, Kathy – 1981
Item response changing as a function of test anxiety was investigated. Seventy graduate students enrolled in a basic statistics course completed 73 multiple-choice items on the course content and the Test Anxiety Scale (TAS). The TAS consisted of 25 items that students indicated were descriptive (true) or not descriptive (false) of themselves…
Descriptors: Difficulty Level, Graduate Students, Higher Education, Multiple Choice Tests
Peer reviewed
Green, Kathy E. – Educational and Psychological Measurement, 1983
This study was concerned with the reliability and validity of subjective judgments about five characteristics of multiple-choice test items from an introductory college-level astronomy test: (1) item difficulty, (2) language complexity, (3) content importance or relevance, (4) response set convergence, and (5) process complexity. (Author)
Descriptors: Achievement Tests, Astronomy, Difficulty Level, Evaluative Thinking
Peer reviewed
Straton, Ralph G.; Catts, Ralph M. – Educational and Psychological Measurement, 1980
Multiple-choice tests composed entirely of two-, three-, or four-choice items were investigated. Results indicated that number of alternatives per item was inversely related to item difficulty, but directly related to item discrimination. Reliability and standard error of measurement of three-choice item tests were equivalent or superior…
Descriptors: Difficulty Level, Error of Measurement, Foreign Countries, Higher Education
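Straton and Catts compare test forms on both reliability and standard error of measurement; the two are linked by the standard classical-test-theory relation SEM = SD × √(1 − reliability), sketched here for reference (the function is illustrative, not from the study):

```python
import math

def sem(score_sd: float, reliability: float) -> float:
    """Standard error of measurement: SEM = SD * sqrt(1 - reliability)."""
    return score_sd * math.sqrt(1.0 - reliability)

# A test with score SD 10 and reliability .84 has SEM = 4.0 score points.
```

The relation shows why a shorter three-choice-item test can match a four-choice test: equal reliability at equal score spread implies an equivalent SEM.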
Freedle, Roy; Kostin, Irene – 1992
This study examines the predictability of Graduate Record Examinations (GRE) reading item difficulty (equated delta) for the three major reading item types: main idea, inference, and explicit statement items. Each item type is analyzed separately, using 110 GRE reading passages and their associated 244 reading items; selective analyses of 285…
Descriptors: College Entrance Examinations, Correlation, Difficulty Level, Higher Education