Peer reviewed: Cohen, Andrew D. – Language Testing, 1984
Discusses methods for obtaining verbal report data on second language test-taking strategies. Reports on the findings obtained from unpublished studies dealing with how language learners take reading tests. Concludes that there should be a closer fit between how test constructors intend their tests to be taken and how respondents actually take…
Descriptors: Cloze Procedure, Language Tests, Multiple Choice Tests, Reading Tests
Peer reviewed: Fabrey, Lawrence J.; Case, Susan M. – Journal of Medical Education, 1985
The effect on test scores of changing answers to multiple-choice questions was studied and compared to earlier research. The current setting was a nationally administered, in-training, specialty examination for medical residents in obstetrics and gynecology. Both low and high scorers improved their scores when they changed answers. (SW)
Descriptors: Educational Testing, Graduate Medical Students, Guessing (Tests), Gynecology
Peer reviewed: Bradbard, David A.; Green, Samuel B. – Journal of Experimental Education, 1986
The effectiveness of the Coombs elimination procedure was evaluated with 29 college students enrolled in a statistics course. Five multiple-choice tests were employed and scored using the Coombs procedure. Results suggest that the Coombs procedure decreased guessing, and this effect increased over the grading period. (Author/LMO)
Descriptors: Analysis of Variance, College Students, Guessing (Tests), Higher Education
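The Coombs elimination procedure described above can be sketched in a few lines. In elimination scoring, the examinee crosses out every alternative believed to be incorrect; each correctly eliminated distractor earns a point, while eliminating the keyed answer incurs a penalty equal to the number of distractors. The function below is an illustrative sketch of that scoring rule, not the exact scoring used in the study:

```python
def coombs_score(eliminated, correct, n_options):
    """Score one item under a Coombs-style elimination rule.

    eliminated: set of option labels the examinee crossed out
    correct:    label of the keyed (correct) option
    n_options:  total number of alternatives on the item
    """
    score = 0
    for opt in eliminated:
        if opt == correct:
            # Eliminating the keyed answer costs all the points
            # a fully correct elimination would have earned.
            score -= n_options - 1
        else:
            # Each distractor correctly ruled out earns one point.
            score += 1
    return score

# A 4-option item keyed 'd': eliminating all three distractors
# earns the maximum of 3; eliminating the key loses 3.
print(coombs_score({'a', 'b', 'c'}, 'd', 4))  # 3
print(coombs_score({'d'}, 'd', 4))            # -3
```

The symmetry of reward and penalty is the point of the design: an examinee who eliminates options at random expects a score of zero, which is why the procedure discourages guessing.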
Peer reviewed: Skinner, Nicholas F. – Teaching of Psychology, 1983
Because of a belief that the alternatives they had chosen initially were probably correct, most subjects were reluctant to change answers and did so only when they were highly confident in the change. More than half of the changes made were correct. (RM)
Descriptors: Decision Making, Educational Research, Females, Higher Education
Peer reviewed: Frederiksen, Norman – American Psychologist, 1984
Argues that because widely used multiple choice tests do not measure more complex cognitive skills, such skills are not taught. Suggests that greater costs of tests in other formats can be justified by their value for instruction--to encourage teaching of higher level cognitive skills and provide practice with feedback. (CMG)
Descriptors: Cognitive Ability, Curriculum, Educational Testing, Elementary Secondary Education
Peer reviewed: Plake, Barbara S.; Huntley, Renee M. – Educational and Psychological Measurement, 1984
Two studies examined the effect of making the correct answer of a multiple choice test item grammatically consistent with the item. American College Testing Assessment experimental items were constructed to investigate grammatical compliance for plural-singular and vowel-consonant agreement. Results suggest…
Descriptors: Grammar, Higher Education, Item Analysis, Multiple Choice Tests
Arizona Department of Education, 2006
Arizona's Instrument to Measure Standards Dual Purpose Assessment (AIMS DPA) is a combination of two separate tests. One test is AIMS, which measures how well the student knows the reading, writing, and mathematics content that all Arizona students at the third grade level are expected to know and be able to do. AIMS includes multiple-choice…
Descriptors: Grade 3, Test Items, Scoring, Academic Standards
Arizona Department of Education, 2006
Arizona's Instrument to Measure Standards Dual Purpose Assessment (AIMS DPA) is a combination of two separate tests. One test is AIMS, which measures how well the student knows the reading, writing, and mathematics content that all Arizona students in the student's grade level are expected to know and be able to do. AIMS DPA includes…
Descriptors: Grade 4, Scoring, Academic Standards, Writing Tests
Arizona Department of Education, 2006
Arizona's Instrument to Measure Standards Dual Purpose Assessment (AIMS DPA) is a combination of two separate tests. One test is AIMS, which measures how well the student knows the reading, writing, and mathematics content that all Arizona students in the student's grade level are expected to know and be able to do. AIMS DPA includes…
Descriptors: Grade 5, Scoring, Academic Standards, Writing Tests
Arizona Department of Education, 2006
Arizona's Instrument to Measure Standards Dual Purpose Assessment (AIMS DPA) is a combination of two separate tests. One test is AIMS, which measures how well students know the reading, writing, and mathematics content that all Arizona students at the sixth grade level are expected to know and be able to do. AIMS DPA includes multiple-choice…
Descriptors: Grade 6, Scoring, Academic Standards, Writing Tests
Schnipke, Deborah L. – 1999
When running out of time on a multiple-choice test such as the Law School Admission Test (LSAT), some test takers are likely to respond rapidly to the remaining unanswered items in an attempt to get some items right by chance. Because these responses will tend to be incorrect, the presence of rapid-guessing behavior could cause these items to…
Descriptors: College Entrance Examinations, Difficulty Level, Estimation (Mathematics), Guessing (Tests)
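The rapid-guessing behavior Schnipke describes is commonly detected from item response times: answers produced faster than some threshold are unlikely to reflect genuine solution behavior, and since blind guesses are correct only at about chance level, leaving them in the calibration sample distorts item statistics. The sketch below illustrates the idea; the function names and the 5-second threshold are illustrative assumptions, not values from the study:

```python
def flag_rapid_guesses(response_times, threshold=5.0):
    """Flag responses faster than `threshold` seconds as likely
    rapid guesses rather than genuine solution behavior."""
    return [t < threshold for t in response_times]

def expected_guess_accuracy(n_options):
    """Under blind guessing, accuracy is roughly chance level:
    one correct option out of n_options alternatives."""
    return 1.0 / n_options

# Three responses: 2.1 s and 4.9 s look like rapid guesses,
# 30.0 s looks like solution behavior.
print(flag_rapid_guesses([2.1, 30.0, 4.9]))  # [True, False, True]
print(expected_guess_accuracy(4))            # 0.25
```

Because flagged responses are correct only about 1/k of the time on a k-option item, a cluster of rapid guesses at the end of a speeded test drags down the apparent difficulty-free performance on those items, which is the distortion the paper is concerned with.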
Johnson, Matthew S.; Sinharay, Sandip – 2003
For complex educational assessments, there is an increasing use of "item families," which are groups of related items. However, calibration or scoring for such an assessment requires fitting models that take into account the dependence structure inherent among the items that belong to the same item family. C. Glas and W. van der Linden…
Descriptors: Bayesian Statistics, Constructed Response, Educational Assessment, Estimation (Mathematics)
Nasser, Fadia – 2000
This exploratory study examined errors that students commit in solving multiple-choice questions about descriptive statistics and basic concepts in research methods. The sample consisted of 81 undergraduate students in an introductory statistics course. The majority (85%) of the participants were female students enrolled in education, sociology,…
Descriptors: Concept Formation, Females, Foreign Countries, Higher Education
Peer reviewed: Traub, Ross E.; Hambleton, Ronald K. – Educational and Psychological Measurement, 1973
Descriptors: Grade 8, Guessing (Tests), Multiple Choice Tests, Pacing
Peer reviewed: Hoffmann, Hans G. – Zielsprache Englisch, 1973
Descriptors: Administration, Diagnostic Tests, English (Second Language), Evaluation


