Showing 1 to 15 of 39 results
Peer reviewed
Moon, Jung Aa; Keehner, Madeleine; Katz, Irvin R. – Educational Assessment, 2020
We investigated how item formats influence test takers' response tendencies under uncertainty. Adult participants solved content-equivalent math items in three formats: multiple-selection multiple-choice, grid with forced-choice (true-false) options, and grid with non-forced-choice options. Participants showed a greater tendency to commit (rather…
Descriptors: College Students, Test Wiseness, Test Format, Test Items
Peer reviewed
Moon, Jung Aa; Keehner, Madeleine; Katz, Irvin R. – Educational Measurement: Issues and Practice, 2019
The current study investigated how item formats and their inherent affordances influence test-takers' cognition under uncertainty. Adult participants solved content-equivalent math items in multiple-selection multiple-choice and four alternative grid formats. The results indicated that participants' affirmative response tendency (i.e., judge the…
Descriptors: Affordances, Test Items, Test Format, Test Wiseness
Peer reviewed
Brassil, Chad E.; Couch, Brian A. – International Journal of STEM Education, 2019
Background: Within undergraduate science courses, instructors often assess student thinking using closed-ended question formats, such as multiple-choice (MC) and multiple-true-false (MTF), where students provide answers with respect to predetermined response options. While MC and MTF questions both consist of a question stem followed by a series…
Descriptors: Multiple Choice Tests, Objective Tests, Student Evaluation, Thinking Skills
Peer reviewed
Lahner, Felicitas-Maria; Lörwald, Andrea Carolin; Bauer, Daniel; Nouns, Zineb Miriam; Krebs, René; Guttormsen, Sissel; Fischer, Martin R.; Huwendiek, Sören – Advances in Health Sciences Education, 2018
Multiple true-false (MTF) items are a widely used supplement to the commonly used single-best answer (Type A) multiple choice format. However, an optimal scoring algorithm for MTF items has not yet been established, as existing studies yielded conflicting results. Therefore, this study analyzes two questions: What is the optimal scoring algorithm…
Descriptors: Scoring Formulas, Scoring Rubrics, Objective Tests, Multiple Choice Tests
Haladyna, Thomas M. – IDEA Center, Inc., 2018
Writing multiple-choice test items to measure student learning in higher education is a challenge. Based on extensive scholarly research and experience, the author describes various item formats, offers guidelines for creating these items, and provides many examples of both good and bad test items. He also suggests some shortcuts for developing…
Descriptors: Test Construction, Multiple Choice Tests, Test Items, Higher Education
Peer reviewed
Kolomuç, Ali – Asia-Pacific Forum on Science Learning and Teaching, 2017
This study aimed to discover subject-specific science teachers' views of alternative assessment. The questionnaire by Okur (2008) was adapted and deployed for data collection. The sample consisted of 80 subject-specific science teachers drawn from the cities of Trabzon, Rize and Erzurum in Turkey. In analyzing data, descriptive analysis was…
Descriptors: Science Teachers, Teacher Attitudes, Alternative Assessment, Foreign Countries
Peer reviewed
Tetteh, Godson Ayertei; Sarpong, Frederick Asafo-Adjei – Journal of International Education in Business, 2015
Purpose: The purpose of this paper is to explore the influence of constructivism on assessment approach, where the type of question (true or false, multiple-choice, calculation or essay) is used productively. Although the student's approach to learning and the teacher's approach to teaching are concepts that have been widely researched, few…
Descriptors: Foreign Countries, Outcomes of Education, Student Evaluation, Test Format
Peer reviewed
Schaap, Lydia; Verkoeijen, Peter; Schmidt, Henk – Assessment & Evaluation in Higher Education, 2014
This study investigated the effects of two different true-false questions on memory awareness and long-term retention of knowledge. Participants took four subsequent knowledge tests on curriculum learning material that they studied at different retention intervals prior to the start of this study (i.e. prior to the first test). At the first and…
Descriptors: Objective Tests, Test Items, Memory, Long Term Memory
Westhuizen, Duan vd – Commonwealth of Learning, 2016
This work starts with a brief overview of education in developing countries, to contextualise the use of the guidelines. Although this document is intended to be a practical tool, it is necessary to include some theoretical analysis of the concept of online assessment. This is given in Sections 3 and 4, together with the identification and…
Descriptors: Guidelines, Student Evaluation, Computer Assisted Testing, Evaluation Methods
Peer reviewed
Shaha, Steven H. – Educational and Psychological Measurement, 1984
It was hypothesized that matching test formats would reduce test anxiety. Three experiments were conducted in which high school juniors and seniors took parallel matching and multiple-choice tests covering topics of prior knowledge or recently learned information. Results showed that matching tests were superior to multiple-choice formats…
Descriptors: High Schools, Multiple Choice Tests, Objective Tests, Scores
Peer reviewed
Kolstad, Rosemarie K.; And Others – Educational Research Quarterly, 1983
Complex multiple choice (CMC) items are frequently used to test knowledge about repetitive information. In two independent comparisons, performance on the CMC items surpassed that of the multiple true-false clusters. Data indicate that performance on CMC items is inflated, and distractors on CMC items fail to prevent guessing. (Author/PN)
Descriptors: Guessing (Tests), Higher Education, Multiple Choice Tests, Objective Tests
Hogan, Thomas P. – 1981
Do choice-type tests (multiple-choice, true-false, etc.) measure the same abilities or traits as free response (essay, recall, completion, etc.) tests? A large number of studies conducted with several different methodologies and spanning a long period of time have addressed this question. In this review, attention will be focused almost…
Descriptors: Achievement Tests, Correlation, Essay Tests, Measurement Techniques
Peer reviewed
Wilcox, Rand R.; Wilcox, Karen Thompson – Journal of Educational Measurement, 1988
Use of latent class models to examine strategies that examinees (92 college students) use for a specific task is illustrated, via a multiple-choice test of spatial ability. Under an answer-until-correct scoring procedure, models representing an improvement over simplistic random guessing are proposed. (SLD)
Descriptors: College Students, Decision Making, Guessing (Tests), Multiple Choice Tests
Peer reviewed
Haladyna, Thomas M.; Downing, Steven M. – Applied Measurement in Education, 1989
A taxonomy of 43 rules for writing multiple-choice test items is presented, based on a consensus of 46 textbooks. These guidelines are presented as complete and authoritative, with solid consensus apparent for 33 of the rules. Four rules lack consensus, and 5 rules were cited fewer than 10 times. (SLD)
Descriptors: Classification, Interrater Reliability, Multiple Choice Tests, Objective Tests
Ebel, Robert L. – 1981
An alternate-choice test item is a simple declarative sentence, one portion of which is given with two different wordings. For example, "Foundations like Ford and Carnegie tend to be (1) eager (2) hesitant to support innovative solutions to educational problems." The examinee's task is to choose the alternative that makes the sentence…
Descriptors: Comparative Testing, Difficulty Level, Guessing (Tests), Multiple Choice Tests