Publication Date
  In 2025: 0
  Since 2024: 0
  Since 2021 (last 5 years): 1
  Since 2016 (last 10 years): 10
  Since 2006 (last 20 years): 19
Descriptor
  Correlation: 26
  Difficulty Level: 26
  Multiple Choice Tests: 26
  Test Items: 17
  Foreign Countries: 9
  Comparative Analysis: 7
  Item Analysis: 6
  Test Format: 6
  Test Reliability: 6
  Cognitive Processes: 5
  College Entrance Examinations: 5
Author
  Kobrin, Jennifer L.: 2
  Abu Kassim, Noor Lide: 1
  Alsma, Jelmer: 1
  Anderson, Paul S.: 1
  Armstrong, Norris: 1
  Arth, Thomas O.: 1
  Badrasawi, Kamal J. I.: 1
  Bielinski, John: 1
  Biria, Reza: 1
  Brickman, Peggy: 1
  Burr, W. S.: 1
Publication Type
  Journal Articles: 19
  Reports - Research: 18
  Reports - Evaluative: 6
  Speeches/Meeting Papers: 2
  Tests/Questionnaires: 2
  Information Analyses: 1
  Non-Print Media: 1
  Reference Materials - General: 1
  Reports - Descriptive: 1
Education Level
  Higher Education: 10
  Postsecondary Education: 7
  Secondary Education: 3
  Elementary Education: 1
  Elementary Secondary Education: 1
  Grade 5: 1
  Intermediate Grades: 1
  Middle Schools: 1
Location
  Canada: 2
  Germany: 2
  Iran (Tehran): 1
  Malaysia: 1
  Netherlands: 1
  Turkey: 1
Assessments and Surveys
  SAT (College Admission Test): 4
  Graduate Record Examinations: 2
  National Assessment of…: 2
  Test of English as a Foreign…: 1
  Trends in International…: 1
Slepkov, A. D.; Van Bussel, M. L.; Fitze, K. M.; Burr, W. S. – SAGE Open, 2021
There is a broad literature in multiple-choice test development, both in terms of item-writing guidelines and psychometric functionality as a measurement tool. However, most of the published literature concerns multiple-choice testing in the context of expert-designed high-stakes standardized assessments, with little attention being paid to the…
Descriptors: Foreign Countries, Undergraduate Students, Student Evaluation, Multiple Choice Tests

Wikle, Jocelyn S.; West, Richard E. – International Journal on E-Learning, 2019
Each year students spend thousands of hours participating in discussion forums. Yet there has not been sufficient scholarship on the observable effects of discussion forums on learning outcomes, and there is not a clear understanding of the role discussion forums play in transmitting knowledge of course concepts to students. This study…
Descriptors: Student Participation, Discussion Groups, Outcomes of Education, Difficulty Level

Papenberg, Martin; Musch, Jochen – Applied Measurement in Education, 2017
In multiple-choice tests, the quality of distractors may be more important than their number. We therefore examined the joint influence of distractor quality and quantity on test functioning by providing a sample of 5,793 participants with five parallel test sets consisting of items that differed in the number and quality of distractors.…
Descriptors: Multiple Choice Tests, Test Items, Test Validity, Test Reliability

Smith, J. Alexander; Dickinson, John R. – International Journal for Business Education, 2017
Published banks of multiple-choice questions are ubiquitous, and the questions in those banks are often classified into levels of difficulty. The specific level of difficulty into which a question is classified might, or arguably should, be a function of the question's substance. Possibly, though, insubstantive aspects of the question, such as the incidence of…
Descriptors: Correlation, Multiple Choice Tests, Difficulty Level, Classification

Szulewski, Adam; Gegenfurtner, Andreas; Howes, Daniel W.; Sivilotti, Marco L. A.; van Merriënboer, Jeroen J. G. – Advances in Health Sciences Education, 2017
In general, researchers attempt to quantify cognitive load using physiologic and psychometric measures. Although the construct measured by both of these metrics is thought to represent overall cognitive load, there is a paucity of studies that compare these techniques to one another. The authors compared data obtained from one physiologic tool…
Descriptors: Physicians, Cognitive Processes, Difficulty Level, Physiology

Krell, Moritz – Cogent Education, 2017
This study evaluates a 12-item instrument for subjective measurement of mental load (ML) and mental effort (ME) by analysing different sources of validity evidence. The findings of an expert judgement (N = 8) provide "evidence based on test content" that the formulation of the items corresponds to the meaning of ML and ME. An empirical…
Descriptors: Cognitive Processes, Test Validity, Secondary School Students, Multiple Choice Tests

Sener, Nilay; Tas, Erol – Journal of Education and Learning, 2017
The purpose of this study is to prepare a multiple-choice achievement test with high reliability and validity for the "Let's Solve the Puzzle of Our Body" unit. For this purpose, a multiple-choice achievement test consisting of 46 items was applied to 178 fifth-grade students in total. As a result of the test and material analysis…
Descriptors: Achievement Tests, Grade 5, Science Instruction, Biology

Badrasawi, Kamal J. I.; Abu Kassim, Noor Lide; Daud, Nuraihan Mat – Malaysian Journal of Learning and Instruction, 2017
Purpose: The study sought to determine the hierarchical nature of reading skills. Whether reading is a "unitary" or "multi-divisible" skill is still a contentious issue. So is the hierarchical order of reading skills. Determining the hierarchy of reading skills is challenging as item difficulty is greatly influenced by factors…
Descriptors: Foreign Countries, Secondary School Students, Reading Tests, Test Items

DiBattista, David; Sinnige-Egger, Jo-Anne; Fortuna, Glenda – Journal of Experimental Education, 2014
The authors assessed the effects of using "none of the above" as an option in a 40-item, general-knowledge multiple-choice test administered to undergraduate students. Examinees who selected "none of the above" were given an incentive to write the correct answer to the question posed. Using "none of the above" as the…
Descriptors: Multiple Choice Tests, Testing, Undergraduate Students, Test Items

Hoshino, Yuko – Language Testing in Asia, 2013
This study compares the effect of different kinds of distractors on the level of difficulty of multiple-choice (MC) vocabulary tests in sentential contexts. This type of test is widely used in practical testing, but it has received little attention so far. Furthermore, although distractors, which represent the unique characteristics of MC tests,…
Descriptors: Vocabulary Development, Comparative Analysis, Difficulty Level, Multiple Choice Tests

Liaghat, Farahnaz; Biria, Reza – International Journal of Instruction, 2018
This study aimed at exploring the impact of mentor text modelling on Iranian English as a Foreign Language (EFL) learners' accuracy and fluency in writing tasks with different cognitive complexity, in comparison with two conventional approaches to teaching writing, namely the process-based and product-based approaches. To this end, 60 Iranian EFL…
Descriptors: Foreign Countries, Comparative Analysis, Teaching Methods, Writing Instruction

Wolkowitz, Amanda A.; Skorupski, William P. – Educational and Psychological Measurement, 2013
When missing values are present in item response data, there are a number of ways one might impute a correct or incorrect response to a multiple-choice item. There are significantly fewer methods for imputing the actual response option an examinee may have provided if he or she had not omitted the item either purposely or accidentally. This…
Descriptors: Multiple Choice Tests, Statistical Analysis, Models, Accuracy

Dankbaar, Mary E. W.; Alsma, Jelmer; Jansen, Els E. H.; van Merrienboer, Jeroen J. G.; van Saase, Jan L. C. M.; Schuit, Stephanie C. E. – Advances in Health Sciences Education, 2016
Simulation games are becoming increasingly popular in education, but more insight into their critical design features is needed. This study investigated the effects of the fidelity of open patient cases, used as an adjunct to an instructional e-module, on students' cognitive skills and motivation. We set up a three-group randomized post-test-only design: a…
Descriptors: Experimental Groups, Thinking Skills, Computer Games, Motivation

Cawthon, Stephanie – American Annals of the Deaf, 2011
Linguistic complexity of test items is one test format element that has been studied in the context of struggling readers and their participation in paper-and-pencil tests. The present article presents findings from an exploratory study on the potential relationship between linguistic complexity and test performance for deaf readers. A total of 64…
Descriptors: Language Styles, Test Content, Syntax, Linguistics

Kibble, Jonathan D.; Johnson, Teresa – Advances in Physiology Education, 2011
The purpose of this study was to evaluate whether multiple-choice item difficulty could be predicted either by a subjective judgment by the question author or by applying a learning taxonomy to the items. Eight physiology faculty members teaching an upper-level undergraduate human physiology course consented to participate in the study. The…
Descriptors: Test Items, Hidden Curriculum, Reliability, Physiology