| Publication Date | Results |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 81 |
| Since 2022 (last 5 years) | 449 |
| Since 2017 (last 10 years) | 1237 |
| Since 2007 (last 20 years) | 2511 |
| Audience | Results |
| --- | --- |
| Practitioners | 122 |
| Teachers | 105 |
| Researchers | 64 |
| Students | 46 |
| Administrators | 14 |
| Policymakers | 7 |
| Counselors | 3 |
| Parents | 3 |
| Location | Results |
| --- | --- |
| Canada | 134 |
| Turkey | 130 |
| Australia | 123 |
| Iran | 66 |
| Indonesia | 61 |
| United Kingdom | 51 |
| Germany | 50 |
| Taiwan | 46 |
| United States | 43 |
| China | 39 |
| California | 34 |
| What Works Clearinghouse Rating | Results |
| --- | --- |
| Meets WWC Standards without Reservations | 3 |
| Meets WWC Standards with or without Reservations | 5 |
| Does not meet standards | 6 |
Harris, Diana K.; And Others – Educational Gerontology, 1996 (peer reviewed)
Multiple-choice and true-false versions of Palmore's first Facts on Aging Quiz were completed by 501 college students. Multiple choice reduced the chances of guessing and had less measurement error for average and above-average respondents. (Author/SK)
Descriptors: Aging (Individuals), College Students, Error of Measurement, Guessing (Tests)
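The comparison above hinges on the probability of answering correctly by blind guessing, which falls as answer options are added. A minimal arithmetic sketch follows; the item and option counts are hypothetical, not taken from the study.

```python
# Expected number of items answered correctly by pure guessing.
# The quiz length and option counts below are hypothetical.

def expected_chance_score(n_items: int, n_options: int) -> float:
    return n_items / n_options

n_items = 25
print(expected_chance_score(n_items, 2))  # true-false: 12.5 items correct by chance
print(expected_chance_score(n_items, 4))  # four-option multiple choice: 6.25 items
```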
Ingram, Ella L.; Nelson, Craig E. – American Biology Teacher, 2006
Multiple choice questions are a common method of assessing student understanding. In this article, the authors discuss and evaluate a student-focused collaborative learning strategy for working with such questions that results in greater student learning and allows instructors to better understand student thinking and ultimately to write better…
Descriptors: Multiple Choice Tests, Misconceptions, Cooperative Learning, Teaching Methods
Kim, Seonghoon; Kolen, Michael J. – Applied Measurement in Education, 2006
Four item response theory linking methods (2 moment methods and 2 characteristic curve methods) were compared to concurrent (CO) calibration with the focus on the degree of robustness to format effects (FEs) when applying the methods to multidimensional data that reflected the FEs associated with mixed-format tests. Based on the quantification of…
Descriptors: Item Response Theory, Robustness (Statistics), Test Format, Comparative Analysis
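The abstract does not spell out the linking formulas; as background, here is a minimal sketch of one common moment method (mean/sigma linking of common-item difficulties), with made-up parameter estimates. It illustrates the general technique, not the authors' procedure.

```python
# Mean/sigma linking: place new-form IRT parameters on the base-form scale
# using the means and standard deviations of common-item difficulty estimates.
import statistics

def mean_sigma_link(b_new, b_base):
    """Return slope A and intercept B for theta_base = A * theta_new + B."""
    A = statistics.stdev(b_base) / statistics.stdev(b_new)
    B = statistics.mean(b_base) - A * statistics.mean(b_new)
    return A, B

# Hypothetical common-item difficulties from two separate calibrations.
b_new = [-1.2, -0.3, 0.4, 1.1]
b_base = [-1.0, -0.1, 0.7, 1.5]
A, B = mean_sigma_link(b_new, b_base)

# Rescale remaining new-form parameters: theta* = A*theta + B, a* = a/A, b* = A*b + B.
print(A * 0.5 + B)  # an ability of 0.5 on the new scale, expressed on the base scale
```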
Briggs, Derek C.; Alonzo, Alicia C.; Schwab, Cheryl; Wilson, Mark – Educational Assessment, 2006
In this article we describe the development, analysis, and interpretation of a novel item format we call Ordered Multiple-Choice (OMC). A unique feature of OMC items is that they are linked to a model of student cognitive development for the construct being measured. Each of the possible answer choices in an OMC item is linked to developmental…
Descriptors: Diagnostic Tests, Multiple Choice Tests, Cognitive Development, Item Response Theory
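The abstract describes each OMC answer choice as linked to a level of a developmental model. One plausible way to represent and score such an item is sketched below; the item content, levels, and level-as-score rule are illustrative assumptions, not the published instrument.

```python
from dataclasses import dataclass

@dataclass
class OMCItem:
    stem: str
    option_levels: dict[str, int]  # option label -> developmental level it reflects

    def score(self, response: str) -> int:
        """Partial credit equal to the developmental level of the chosen option."""
        return self.option_levels[response]

# Hypothetical item: options ordered from naive (level 1) to most advanced (level 4).
item = OMCItem(
    stem="Why does the Moon appear to change shape over a month?",
    option_levels={"A": 1, "B": 2, "C": 3, "D": 4},
)
print(item.score("C"))  # 3
```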
Bush, Martin E. – Quality Assurance in Education: An International Perspective, 2006
Purpose: To provide educationalists with an understanding of the key quality issues relating to multiple-choice tests, and a set of guidelines for the quality assurance of such tests. Design/methodology/approach: The discussion of quality issues is structured to reflect the order in which those issues naturally arise. It covers the design of…
Descriptors: Multiple Choice Tests, Test Reliability, Educational Quality, Quality Control
Hautau, Briana; Turner, Haley C.; Carroll, Erin; Jaspers, Kathryn; Krohn, Katy; Parker, Megan; Williams, Robert L. – Journal of Behavioral Education, 2006
Students (N=153) in three equivalent sections of an undergraduate human development course compared pairs of related concepts via either written or oral discussion at the beginning of most class sessions. A writing-for-random-credit section achieved significantly higher ratings on the writing activities than did a writing-for-no-credit section.…
Descriptors: Writing Exercises, Multiple Choice Tests, Undergraduate Study, Credits
Bleske-Rechek, April; Zeug, Nicole; Webb, Rose Mary – Assessment & Evaluation in Higher Education, 2007
We conducted correlational and performance discrepancy analyses on exam and achievement data taken from students in three psychology courses. Across courses, the same findings emerged. First, only a small fraction of students consistently performed more strongly on one type of assessment (e.g., multiple-choice) than on another (e.g., short…
Descriptors: Psychology, Scores, Academic Aptitude, Academic Achievement
Hartley, James; Betts, Lucy; Murray, Wayne – Psychology Teaching Review, 2007
Background: Recent changes in higher education in the UK have led to much discussion about the performance of men and women students with different methods of assessment. Aim: To see whether or not there were differences between the marks awarded to men and women final-year psychology students as a function of the modes of assessment used. Method:…
Descriptors: Student Evaluation, Females, Psychology, Males
Emurian, Henry H. – Behavior Analyst Today, 2007
At the beginning of a Java computer programming course, nine students in an undergraduate class and nine students in a graduate class completed a web-based programmed instruction tutoring system that taught a simple computer program. All students exited the tutor with an identical level of skill, at least as determined by the tutor's required…
Descriptors: Multiple Choice Tests, Computer Software, Computers, Program Effectiveness
Badgett, John L.; Christmann, Edwin P. – Corwin, 2009
While today's curriculum is largely driven by standards, many teachers find the lack of specificity in the standards to be confounding and even intimidating. Now this practical book provides middle and high school teachers with explicit guidance on designing specific objectives and developing appropriate formative and summative assessments to…
Descriptors: Test Items, Student Evaluation, Knowledge Level, National Standards
Kehoe, Jerard – 1995
This digest presents a list of recommendations for writing multiple-choice test items based on psychometric principles. Item statistics are typically provided by a measurement, or test scoring, service where tests are machine-scored, or by testing software packages. Test makers can capitalize on the fact that "bad" items can be differentiated from…
Descriptors: Item Analysis, Item Banks, Measurement Techniques, Multiple Choice Tests
Lyu, C. Felicia; And Others – 1995
A smoothed version of standardization, which merges kernel smoothing with the traditional standardization differential item functioning (DIF) approach, was used to examine DIF for student-produced response (SPR) items on the Scholastic Assessment Test (SAT) I mathematics test at both the item and testlet levels. This nonparametric technique avoids…
Descriptors: Aptitude Tests, Item Bias, Mathematics Tests, Multiple Choice Tests
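For readers unfamiliar with the baseline technique, the sketch below implements a kernel-smoothed version of the standardization DIF index (a weighted mean of focal-minus-reference conditional proportions correct). The matching variable, bandwidth, weighting, and data are illustrative assumptions rather than the study's procedure.

```python
import numpy as np

def kernel_smooth(scores, correct, grid, bandwidth=2.0):
    """Gaussian-kernel estimate of P(correct | matching score) over a score grid."""
    scores, correct = np.asarray(scores, float), np.asarray(correct, float)
    w = np.exp(-0.5 * ((grid[:, None] - scores[None, :]) / bandwidth) ** 2)
    return (w * correct).sum(axis=1) / w.sum(axis=1)

def std_p_dif(focal_scores, focal_correct, ref_scores, ref_correct, grid):
    """Focal-group-weighted mean of smoothed focal-minus-reference proportions correct."""
    p_f = kernel_smooth(focal_scores, focal_correct, grid)
    p_r = kernel_smooth(ref_scores, ref_correct, grid)
    weights = np.histogram(focal_scores, bins=len(grid), range=(grid[0], grid[-1]))[0]
    return float(np.average(p_f - p_r, weights=weights + 1e-12))

# Simulated data on a hypothetical 0-40 matching-score scale.
rng = np.random.default_rng(0)
grid = np.arange(0, 41)
ref_s, foc_s = rng.integers(0, 41, 500), rng.integers(0, 41, 400)
ref_c, foc_c = rng.random(500) < 0.60, rng.random(400) < 0.55
print(std_p_dif(foc_s, foc_c, ref_s, ref_c, grid))  # smoothed DIF index for the simulated groups
```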
Ammeraal, Brenda – 1997
A study examined the correlation between students' placement test scores on a multiple-choice test and their passing rate on the Advanced Placement (AP) language exam. Statistics show that the number of students taking advanced placement tests is increasing, and a review of the literature supports the need for further research in the area of…
Descriptors: Advanced Placement, Catholic Schools, Correlation, English
Dorans, Neil J.; Potenza, Maria T. – 1994
Educational reform efforts have led to increased use of alternatives to the traditional binary-scored multiple choice item. Many stimuli employed by these alternative assessments yield complex responses that require complex scoring rules. Some of these new item types can be polytomously-scored. Differential item functioning (DIF) assessment is a…
Descriptors: Classification, Educational Assessment, Educational Change, Equal Education
Bonnot, Lois; And Others – 1992
This document contains four certification examinations (Forms A-D) for a certified nurse assistant in Missouri. Each of the four test booklets contains general instructions for completing the information section of the answer sheet, general instructions for taking the certification examination, and 100 questions. All test problems are in a…
Descriptors: Allied Health Occupations Education, Certification, Licensing Examinations (Professions), Multiple Choice Tests

