Publication Date
| Range | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 81 |
| Since 2022 (last 5 years) | 449 |
| Since 2017 (last 10 years) | 1237 |
| Since 2007 (last 20 years) | 2511 |
Audience
| Audience | Count |
| --- | --- |
| Practitioners | 122 |
| Teachers | 105 |
| Researchers | 64 |
| Students | 46 |
| Administrators | 14 |
| Policymakers | 7 |
| Counselors | 3 |
| Parents | 3 |
Location
| Location | Count |
| --- | --- |
| Canada | 134 |
| Turkey | 130 |
| Australia | 123 |
| Iran | 66 |
| Indonesia | 61 |
| United Kingdom | 51 |
| Germany | 50 |
| Taiwan | 46 |
| United States | 43 |
| China | 39 |
| California | 34 |
What Works Clearinghouse Rating
| Rating | Count |
| --- | --- |
| Meets WWC Standards without Reservations | 3 |
| Meets WWC Standards with or without Reservations | 5 |
| Does not meet standards | 6 |
Stokes, Michael T.; And Others – Journal of Computer-Based Instruction, 1988
Results of a study that assessed the effect of requiring students to wait for a short time interval before responding to computer-generated multiple choice test items support the notion that moderate delays enhance user performance on cognitive tasks. Three conditions of computer lockout were examined in a university psychology course. (12…
Descriptors: Academic Achievement, Analysis of Variance, Computer Assisted Testing, Higher Education
Peer reviewed
Lennon, Paul – ELT Journal, 1989
Analysis of advanced English-as-a-second-language students' responses to proficiency tests and conversational cloze tests after a six-month residency in England revealed that, while written multiple-choice tests clearly showed linguistic improvement, the oral cloze tests separated out subjects more effectively. (Author/CB)
Descriptors: Advanced Students, Cloze Procedure, English (Second Language), Foreign Countries
Peer reviewed
Foos, Paul W. – Journal of Experimental Education, 1995
Performances of 75 college students, matched for total study time, who wrote 1, 2, or no summaries while studying a text for recall were compared. Results support the hypothesis that less frequent summarizing (only 1) produces better performance. The effect can be obtained for recognition as well as recall. (SLD)
Descriptors: College Students, Higher Education, Multiple Choice Tests, Recall (Psychology)
Peer reviewed
Harasym, P. H.; And Others – Evaluation and the Health Professions, 1992
Findings from a study with approximately 200 first-year University of Calgary (Canada) nursing students provide evidence that the use of negation (e.g., not, except) should be limited in stems of multiple-choice test items and that a single-response negatively worded item should be converted to a multiple-response positively worded item. (SLD)
Descriptors: College Students, Foreign Countries, Higher Education, Multiple Choice Tests
Peer reviewed
Schwarz, Shirley P.; And Others – Journal of Educational Measurement, 1991
Interviews were conducted with 104 students in master's-level classes to determine their reasons for changing test answers. Subjects had previously been instructed in answer-changing strategies. Most changes were made for thought-out reasons; few were due to clerical errors. Reconsideration of test items is probably underestimated in…
Descriptors: Achievement Gains, Graduate Students, Guessing (Tests), Higher Education
Peer reviewed
Wainer, Howard; Thissen, David – Applied Measurement in Education, 1993
Because assessment instruments of the future may well be composed of a combination of types of questions, a way to combine those scores effectively is discussed. Two new graphic tools are presented that show that it may not be practical to equalize the reliability of different components. (SLD)
Descriptors: Constructed Response, Educational Assessment, Graphs, Item Response Theory
Peer reviewed
Beckwith, J. B. – Higher Education, 1991
Relationships between three approaches to learning (surface, deep, and achieving), prior knowledge of subject area, and performance on a multiple-choice test following a unit in basic psychology were investigated with 105 college freshmen. Approaches to learning were unrelated to test performance. Prior knowledge did not relate to a deep approach…
Descriptors: Cognitive Style, College Freshmen, Educational Attitudes, Goal Orientation
Peer reviewed
Frary, Robert B. – Applied Measurement in Education, 1991
The use of the "none-of-the-above" option (NOTA) in 20 college-level multiple-choice tests was evaluated for classes with 100 or more students. Eight academic disciplines were represented, and 295 NOTA and 724 regular test items were used. It appears that NOTA items can be compatible with good classroom measurement. (TJH)
Descriptors: College Students, Comparative Testing, Difficulty Level, Discriminant Analysis
Peer reviewed
Anbar, Michael – Academic Medicine, 1991
Interactive computerized tests accepting unrestricted natural-language input were used to assess knowledge of clinical biophysics at the State University of New York at Buffalo. Comparison of responses to open-ended sequential questions and multiple-choice questions on the same material found the two formats test different aspects of competence.…
Descriptors: Biology, Comparative Analysis, Computer Assisted Testing, Higher Education
Peer reviewed
Laufer, Batia – Applied Linguistics, 1990
Native-speaking learners of English were compared with foreign learners with regard to confusion of "synforms" (similar lexical forms). Synform-induced errors were similar in both groups, indicating that all learners, native and foreign, follow coinciding developmental sequences. (24 references)…
Descriptors: Comparative Analysis, English (Second Language), Error Analysis (Language), Language Research
Peer reviewed
Lewis, Robert; Berghoff, Paul; Pheeney, Pierette – Innovative Higher Education, 1999
Three professors share techniques for helping students focus on assessments required in classes. Charts are used to show students the specific concepts, principles, and problems that will be included on multiple-choice tests; rubrics developed for assigned work are used to increase student expectations and direct their explorations; and negotiated…
Descriptors: Academic Standards, Assignments, Attention Control, Charts
Peer reviewed
Ercikan, Kadriye; Schwartz, Richard D.; Julian, Marc W.; Burket, George R.; Weber, Melba M.; Link, Valerie – Journal of Educational Measurement, 1998
Discusses and demonstrates combining scores from multiple-choice (MC) and constructed-response (CR) items to create a common scale using Item Response Theory methodology. Provides empirical results using a set of tests in reading, language, mathematics, and science in three grades. (SLD)
Descriptors: Constructed Response, Elementary Secondary Education, Item Response Theory, Language Arts
Peer reviewed
Katz, Irvin R.; Bennett, Randy Elliot; Berger, Aliza E. – Journal of Educational Measurement, 2000
Studied the solution strategies of 55 high school students who solved parallel constructed response and multiple-choice items that differed only in the presence of response options. Differences in difficulty between response formats did not correspond to differences in strategy choice. Interprets results in light of the relative comprehension…
Descriptors: College Entrance Examinations, Constructed Response, Difficulty Level, High School Students
Stiggins, Richard J. – School Administrator, 1998
Today's teachers are unprepared to meet increasingly complex assessment challenges. The poor state of assessment literacy arises from naive assumptions about standardized testing and student motivation, fear of being held accountable for student achievement, parents' nostalgic views of testing, and confusion over achievement expectations for high…
Descriptors: Accountability, Educational Objectives, Elementary Secondary Education, Misconceptions
Peer reviewed
Wheeler, Patricia H. – Evaluation Practice, 1995
This volume is the fourth in a series for college faculty and advanced graduate students, "Survival Skills for Scholars." It offers practical advice for developing, using, and grading classroom examinations, focusing on traditional multiple-choice and constructed-response tests rather than alternative assessments. (SLD)
Descriptors: College Faculty, Constructed Response, Grading, Higher Education