Showing all 9 results
Peer reviewed
DeCarlo, Lawrence T. – Journal of Educational Measurement, 2023
A conceptualization of multiple-choice exams in terms of signal detection theory (SDT) leads to simple measures of item difficulty and item discrimination that are closely related to, but also distinct from, those used in classical item analysis (CIA). The theory defines a "true split," depending on whether or not examinees know an item,…
Descriptors: Multiple Choice Tests, Test Items, Item Analysis, Test Wiseness
Wilhite, Stephen C. – 1986
A study examined the effect of headings and adjunct questions embedded in an expository text on the delayed multiple-choice test performance of 88 undergraduate students enrolled in psychology courses. The subject of the passage read by the students was the settling of Anglo America; the subheadings in the passage listed names of major subtopics…
Descriptors: Higher Education, Motivation, Multiple Choice Tests, Questioning Techniques
Peer reviewed
Fagley, N. S. – Journal of Educational Psychology, 1987
This article investigates positional response bias, testwiseness, and guessing strategy as components of variance in test responses on multiple-choice tests. University students responded to two content exams, a testwiseness measure, and a guessing strategy measure. The proportion of variance in test scores accounted for by positional response…
Descriptors: Achievement Tests, Guessing (Tests), Higher Education, Multiple Choice Tests
Zin, Than Than; Williams, John – 1991
Brief explanations are presented of some of the different methods used to score multiple-choice tests; and some studies of partial information, guessing strategies, and test-taking behaviors are reviewed. Studies are grouped in three categories of effort to improve scoring: (1) those that require extra effort from the examinee to answer…
Descriptors: Educational Research, Estimation (Mathematics), Guessing (Tests), Literature Reviews
Torrence, David R. – 1986
This replicative study was initiated with a journeyman-level certification instrument for an international union, after industry monitors were observed suggesting to examinees to "go with your first response." The question arose whether this was a research-based practice. If not, wouldn't this practice inject constant error…
Descriptors: Adults, Correlation, Error of Measurement, Guessing (Tests)
White, David M. – 1986
This book discusses tricks for answering questions on the Law School Admission Test (LSAT). The tricks are based on an analysis of 12 editions of the LSAT which have been made public pursuant to New York's Truth in Testing Law. Sample LSAT questions published by the Law School Admission Council are referenced to exemplify the tricks' applications…
Descriptors: College Entrance Examinations, Higher Education, Multiple Choice Tests, Pretesting
White, David M. – 1985
This book discusses tricks for answering questions on the Graduate Management Admission Test (GMAT). The tricks are based on an analysis of 20 editions of the GMAT which have been made public pursuant to New York's Truth in Testing Law. Sample GMAT questions published by the Graduate Management Admission Council are referenced to exemplify the…
Descriptors: College Entrance Examinations, Higher Education, Multiple Choice Tests, Pretesting
Peer reviewed
Mitchell, G.; And Others – Medical Teacher, 1986
Describes a study designed to determine if the amount of time allocated for answering multiple true/false type questions affects the grades of the medical students taking the tests. Students who had 2-1/4 minutes to answer each question scored significantly better than those who had 1-1/2 minutes or 3 minutes. (TW)
Descriptors: Biochemistry, College Science, Higher Education, Medical Education
Peer reviewed
Barnett-Foster, Debora; Nagy, Philip – Alberta Journal of Educational Research, 1995
Analysis of response strategies employed by 261 undergraduate chemistry students when answering multiple-choice and stem-equivalent constructed-response questions revealed no significant differences in types of solution strategies or types of errors across test format. However, analysis of student oral reports revealed a higher frequency of…
Descriptors: Chemistry, Constructed Response, Educational Research, Educational Testing