Showing 1 to 15 of 86 results
Peer reviewed | PDF on ERIC
McGuire, Michael J. – International Journal for the Scholarship of Teaching and Learning, 2023
College students in a lower-division psychology course made metacognitive judgments by predicting and postdicting performance for true-false, multiple-choice, and fill-in-the-blank question sets on each of three exams. This study investigated which question format would result in the most accurate metacognitive judgments. Extending Koriat's (1997)…
Descriptors: Metacognition, Multiple Choice Tests, Accuracy, Test Format
Peer reviewed | PDF on ERIC
Herrmann-Abell, Cari F.; Hardcastle, Joseph; DeBoer, George E. – Grantee Submission, 2022
As implementation of the "Next Generation Science Standards" moves forward, there is a need for new assessments that can measure students' integrated three-dimensional science learning. The National Research Council has suggested that these assessments be multicomponent tasks that utilize a combination of item formats including…
Descriptors: Multiple Choice Tests, Conditioning, Test Items, Item Response Theory
Peer reviewed | PDF on ERIC
Wolkowitz, Amanda A.; Foley, Brett; Zurn, Jared – Practical Assessment, Research & Evaluation, 2023
The purpose of this study is to introduce a method for converting scored 4-option multiple-choice (MC) items into scored 3-option MC items without re-pretesting the 3-option items. The study describes a six-step process for achieving this goal. Data from a professional credentialing exam were used, and the method was applied to 24…
Descriptors: Multiple Choice Tests, Test Items, Accuracy, Test Format
Peer reviewed | PDF on ERIC
Gorney, Kylie; Wollack, James A. – Practical Assessment, Research & Evaluation, 2022
Unlike the traditional multiple-choice (MC) format, the discrete-option multiple-choice (DOMC) format does not necessarily reveal all answer options to an examinee. The purpose of this study was to determine whether the reduced exposure of item content affects test security. We conducted an experiment in which participants were allowed to view…
Descriptors: Test Items, Test Format, Multiple Choice Tests, Item Analysis
Peer reviewed | Direct link
Liao, Ray J. T. – Language Testing, 2023
Among the variety of selected-response formats used in L2 reading assessment, multiple-choice (MC) is the most commonly adopted, primarily due to its efficiency and objectivity. Given the impact of assessment results on teaching and learning, it is necessary to investigate the degree to which the MC format reliably measures learners' L2 reading…
Descriptors: Reading Tests, Language Tests, Second Language Learning, Second Language Instruction
Peer reviewed | PDF on ERIC
Wang, Lin – ETS Research Report Series, 2019
Rearranging response options across different versions of a multiple-choice test can be an effective strategy against cheating. This study investigated whether rearranging response options would affect item performance and test score comparability. A study test was assembled as the base version, from which 3 variant versions were…
Descriptors: Multiple Choice Tests, Test Items, Test Format, Scores
Peer reviewed | Direct link
Bulut, Okan; Bulut, Hatice Cigdem; Cormier, Damien C.; Ilgun Dibek, Munevver; Sahin Kursad, Merve – Educational Assessment, 2023
Some statewide testing programs allow students to receive corrective feedback and revise their answers during testing. Despite its pedagogical benefits, the effects of providing revision opportunities remain unknown in the context of alternate assessments. Therefore, this study examined student data from a large-scale alternate assessment that…
Descriptors: Error Correction, Alternative Assessment, Feedback (Response), Multiple Choice Tests
Peer reviewed | PDF on ERIC
Opstad, Leiv – Athens Journal of Education, 2021
The discussion of whether multiple-choice questions can replace the traditional exam with essays and constructed questions in introductory courses has only just begun in Norway. There is no easy answer: the findings depend on the pattern of the questions, so conclusions must be drawn with care. In this research, one will explore a…
Descriptors: Multiple Choice Tests, Essay Tests, Introductory Courses, Foreign Countries
Peer reviewed | Direct link
Shear, Benjamin R. – Journal of Educational Measurement, 2023
Large-scale standardized tests are regularly used to measure student achievement overall and for student subgroups. These uses assume tests provide comparable measures of outcomes across student subgroups, but prior research suggests score comparisons across gender groups may be complicated by the type of test items used. This paper presents…
Descriptors: Gender Bias, Item Analysis, Test Items, Achievement Tests
Peer reviewed | Direct link
Papenberg, Martin; Diedenhofen, Birk; Musch, Jochen – Journal of Experimental Education, 2021
Testwiseness may introduce construct-irrelevant variance to multiple-choice test scores. Presenting response options sequentially has been proposed as a potential solution to this problem. In an experimental validation, we determined the psychometric properties of a test based on the sequential presentation of response options. We created a strong…
Descriptors: Test Wiseness, Test Validity, Test Reliability, Multiple Choice Tests
Peer reviewed | PDF on ERIC
Çakiroglu, Ünal; Saylan, Esin; Çevik, Isak; Özkan, Adem – International Review of Research in Open and Distributed Learning, 2022
This quasi-experimental study explored how different online exam types differentiate learners' academic achievement and perceived learning. The participants comprised 95 undergraduate students enrolled in an English course at a Turkish university, divided into three groups, each taking a different type of quiz: multiple-choice, open-ended, and mixed…
Descriptors: Test Format, Computer Assisted Testing, Electronic Learning, English (Second Language)
Peer reviewed | Direct link
Moon, Jung Aa; Keehner, Madeleine; Katz, Irvin R. – Educational Assessment, 2020
We investigated how item formats influence test takers' response tendencies under uncertainty. Adult participants solved content-equivalent math items in three formats: multiple-selection multiple-choice, grid with forced-choice (true-false) options, and grid with non-forced-choice options. Participants showed a greater tendency to commit (rather…
Descriptors: College Students, Test Wiseness, Test Format, Test Items
Peer reviewed | Direct link
O'Grady, Stefan – Language Teaching Research, 2023
The current study explores the impact of varying multiple-choice question preview and presentation formats in a test of second language listening proficiency targeting different levels of text comprehension. In a between-participant design, participants completed a 30-item test of listening comprehension featuring implicit and explicit information…
Descriptors: Language Tests, Multiple Choice Tests, Scores, Second Language Learning
Peer reviewed | PDF on ERIC
Turhan, Nihan Sölpük – International Journal of Progressive Education, 2020
Measurement tools used in education are important factors affecting students' course success and motivation. This study aims to determine the opinions of high school students on different question types. As subgoals, the research examines the reasons for multiple-choice test preference and its effect on…
Descriptors: Test Items, Preferences, High School Students, Learning Motivation
Peer reviewed | Direct link
Wörner, Salome; Becker, Sebastian; Küchemann, Stefan; Scheiter, Katharina; Kuhn, Jochen – Physical Review Physics Education Research, 2022
Optics is a core field in the curricula of secondary physics education. In this study, we present the development and validation of a test instrument in the field of optics, the ray optics in converging lenses concept inventory (ROC-CI). It was developed for and validated with middle school students, but can also be adapted for use in higher…
Descriptors: Optics, Physics, Science Instruction, Concept Formation