Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 1
Since 2006 (last 20 years): 6
Descriptor
Educational Testing: 16
Multiple Choice Tests: 16
Test Format: 16
Test Items: 10
Test Construction: 7
Higher Education: 6
Responses: 3
Scores: 3
Test Reliability: 3
Testing Problems: 3
Achievement Tests: 2
Author
Arneson, Amy: 1
Barnett-Foster, Debora: 1
Curren, Randall R.: 1
Gulliksen, Harold: 1
Hagge, Sarah Lynn: 1
Holley, Charles D.: 1
Huang, Yi-Min: 1
Joseph, Dane Christian: 1
Kim, Sooyeon: 1
Kon, Jane Heckley: 1
Kumar, V. K.: 1
Publication Type
Journal Articles: 11
Reports - Research: 9
Dissertations/Theses -…: 3
Reports - Evaluative: 3
Opinion Papers: 2
Historical Materials: 1
Speeches/Meeting Papers: 1
Education Level
Elementary Secondary Education: 3
Postsecondary Education: 3
Higher Education: 2
High Schools: 1
Secondary Education: 1
Audience
Practitioners: 1
Teachers: 1
Location
Canada: 1
Laws, Policies, & Programs
Assessments and Surveys
Advanced Placement…: 1
Arneson, Amy – ProQuest LLC, 2019
This three-paper dissertation explores item cluster-based assessments, first in general as they relate to modeling, and then specific issues surrounding a particular item cluster-based assessment design. There should be a reasonable analogy between the structure of a psychometric model and the cognitive theory that the assessment is based upon.…
Descriptors: Item Response Theory, Test Items, Critical Thinking, Cognitive Tests
Joseph, Dane Christian – ProQuest LLC, 2010
Multiple-choice item-writing guideline research is in its infancy, and Haladyna (2004) calls for a science of item-writing guideline research. This study responds to that call by examining the impact of student ability and the method for varying the location of correct answers in classroom multiple-choice…
Descriptors: Evidence, Test Format, Guessing (Tests), Program Effectiveness
Kim, Sooyeon; Walker, Michael E.; McHale, Frederick – Journal of Educational Measurement, 2010
In this study we examined variations of the nonequivalent groups equating design for tests containing both multiple-choice (MC) and constructed-response (CR) items to determine which design was most effective in producing equivalent scores across the two tests to be equated. Using data from a large-scale exam, this study investigated the use of…
Descriptors: Measures (Individuals), Scoring, Equated Scores, Test Bias
Ventouras, Errikos; Triantis, Dimos; Tsiakas, Panagiotis; Stergiopoulos, Charalampos – Computers & Education, 2010
The aim of the present research was to compare the use of multiple-choice questions (MCQs) as an examination method with examinations based on constructed-response questions (CRQs). Although MCQs have the advantage of objectivity in grading and speed in producing results, they also introduce an error in the final…
Descriptors: Computer Assisted Instruction, Scoring, Grading, Comparative Analysis
Hagge, Sarah Lynn – ProQuest LLC, 2010
Mixed-format tests containing both multiple-choice and constructed-response items are widely used on educational tests. Such tests combine the broad content coverage and efficient scoring of multiple-choice items with the assessment of higher-order thinking skills thought to be provided by constructed-response items. However, the combination of…
Descriptors: Test Format, True Scores, Equated Scores, Psychometrics
Huang, Yi-Min; Trevisan, Mike; Storfer, Andrew – International Journal for the Scholarship of Teaching and Learning, 2007
Despite the prevalence of multiple choice items in educational testing, there is a dearth of empirical evidence for multiple choice item writing rules. The purpose of this study was to expand the base of empirical evidence by examining the use of the "all-of-the-above" option in a multiple choice examination in order to assess how…
Descriptors: Multiple Choice Tests, Educational Testing, Ability Grouping, Test Format

Weiten, Wayne – Journal of Experimental Education, 1982
A comparison of double as opposed to single multiple-choice questions yielded significant differences in regard to item difficulty, item discrimination, and internal reliability, but not concurrent validity. (Author/PN)
Descriptors: Difficulty Level, Educational Testing, Higher Education, Multiple Choice Tests

Mentzer, Thomas L. – Educational and Psychological Measurement, 1982
Evidence of biases in the correct answers in multiple-choice test item files was found, including an "all of the above" bias, in which that answer was correct more than 25 percent of the time, and a bias in which the longest answer was correct too frequently. Seven bias types were studied. (Author/CM)
Descriptors: Educational Testing, Higher Education, Multiple Choice Tests, Psychology

Lederman, Marie Jean – Journal of Basic Writing, 1988
Explores the history of testing, motivations for testing, testing procedures, and the inevitable limitations of testing. Argues that writing program faculty and administrators must clarify and profess their values, decide what they want students to know and what sort of thinkers they should be, and develop tests reflecting those needs. (SR)
Descriptors: Educational Objectives, Educational Testing, Essay Tests, Multiple Choice Tests

Kon, Jane Heckley; Martin-Kniep, Giselle O. – Social Education, 1992
Describes a case study to determine whether performance tests are a feasible alternative to multiple-choice tests. Examines the difficulties of administering and scoring performance assessments. Explains that the study employed three performance tests and one multiple-choice test. Concludes that performance test administration and scoring was no…
Descriptors: Educational Objectives, Educational Research, Educational Testing, Geography Instruction
Gulliksen, Harold – 1985
This article presents the perspective that the quality of teacher-made, small classroom tests has not improved, and may have declined in recent years. This decline may be due to the fact that teachers have come to believe that the kinds of objective items used in national standardized tests are the only item types appropriate for classroom use.…
Descriptors: Adults, Classroom Techniques, Educational Testing, Educational Trends

Kumar, V. K.; And Others – Contemporary Educational Psychology, 1979
Ninth-graders read a passage for a test to be taken the next day, anticipating a recall test, a multiple-choice test, and a retention test. Half received either a recall or a recognition test regardless of prior instructions. Subjects did better on the recognition tests in all conditions. (Author/RD)
Descriptors: Difficulty Level, Educational Testing, Expectation, Junior High Schools

Holley, Charles D.; And Others – Contemporary Educational Psychology, 1979
College students were trained on a hierarchical mapping technique designed to facilitate prose processing. The students studied a geology passage and five days later were given four types of tests. The treatment group significantly outperformed a control group; the major differences were attributable to concept cloze and essay exams. (Author/RD)
Descriptors: Cloze Procedure, Educational Testing, Essay Tests, Higher Education
Curren, Randall R. – Theory and Research in Education, 2004
This article addresses the capacity of high stakes tests to measure the most significant kinds of learning. It begins by examining a set of philosophical arguments pertaining to construct validity and alleged conceptual obstacles to attributing specific knowledge and skills to learners. The arguments invoke philosophical doctrines of holism and…
Descriptors: Test Items, Educational Testing, Construct Validity, High Stakes Tests

Barnett-Foster, Debora; Nagy, Philip – Alberta Journal of Educational Research, 1995
Analysis of response strategies employed by 261 undergraduate chemistry students when answering multiple-choice and stem-equivalent constructed-response questions revealed no significant differences in types of solution strategies or types of errors across test format. However, analysis of student oral reports revealed a higher frequency of…
Descriptors: Chemistry, Constructed Response, Educational Research, Educational Testing