Shin, Jinnie; Bulut, Okan; Gierl, Mark J. – Journal of Experimental Education, 2020
The arrangement of response options in multiple-choice (MC) items, especially the location of the most attractive distractor, is considered critical in constructing high-quality MC items. In the current study, a sample of 496 undergraduate students taking an educational assessment course was given three test forms consisting of the same items but…
Descriptors: Foreign Countries, Undergraduate Students, Multiple Choice Tests, Item Response Theory
Andrich, David; Marais, Ida; Humphry, Stephen – Journal of Educational and Behavioral Statistics, 2012
Andersen (1995, 2002) proves a theorem relating variances of parameter estimates from samples and subsamples and shows its use as an adjunct to standard statistical analyses. The authors show an application where the theorem is central to the hypothesis tested, namely, whether random guessing to multiple choice items affects their estimates in the…
Descriptors: Test Items, Item Response Theory, Multiple Choice Tests, Guessing (Tests)
Sadler, Philip M.; Coyle, Harold; Cook Smith, Nancy; Miller, Jaimie; Mintzes, Joel; Tanner, Kimberly; Murray, John – CBE-Life Sciences Education, 2013
We report on the development of an item test bank and associated instruments based on the National Research Council (NRC) K-8 life sciences content standards. Utilizing hundreds of studies in the science education research literature on student misconceptions, we constructed 476 unique multiple-choice items that measure the degree to which test…
Descriptors: National Standards, Knowledge Level, Biological Sciences, Item Banks
Haladyna, Thomas M.; And Others – 1987
This paper discusses the development and use of "item shells" in constructing multiple-choice tests. An item shell is a "hollow" item that contains the syntactic structure and context of an item without specific content. Item shells are empirically developed from successfully used items selected from an existing item pool. Use…
Descriptors: Difficulty Level, Health Personnel, Item Banks, Multiple Choice Tests
New York State Education Dept., Albany. Div. of Research. – 1977
The "Test Development Notebook" is a resource designed for the preparation of tests of literal comprehension for students in grades 1 through 12. This volume contains 120 multiple-choice cloze exercises taken from front page stories, feature stories, and editorials of newspapers; and the accompanying answer key. Each exercise carries the code…
Descriptors: Answer Keys, Cloze Procedure, Difficulty Level, Editorials
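The cloze exercises described in the "Test Development Notebook" volumes are built by deleting words from a source passage. As a rough illustration only (not the Department's actual procedure), the classic deletion step can be sketched in Python with an every-n-th-word rule; real multiple-choice cloze items would additionally supply distractor options for each blank:

```python
def make_cloze(text, n=7):
    """Delete every n-th word of a passage, replacing it with a blank.

    Returns the gapped passage and the answer key (the deleted words).
    This is the classic fixed-ratio cloze deletion rule; n=7 is a
    commonly cited default, passed here as a parameter.
    """
    words = text.split()
    gapped, key = [], []
    for i, word in enumerate(words, start=1):
        if i % n == 0:
            key.append(word)       # record the correct answer
            gapped.append("_____")  # replace the word with a blank
        else:
            gapped.append(word)
    return " ".join(gapped), key
```

For a multiple-choice version, each entry in the answer key would be paired with distractors drawn from the same word class, which is a separate authoring step.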
New York State Education Dept., Albany. Div. of Research. – 1977
The "Test Development Notebook" is a resource designed for the preparation of tests of literal comprehension for students in grades 1 through 12. This volume contains 120 multiple-choice cloze exercises taken from news magazines, and the accompanying answer key. Each exercise carries the code letter of the section to which it belongs.…
Descriptors: Answer Keys, Cloze Procedure, Difficulty Level, Editorials
San Diego Community Coll. District, CA. Continuing Education Centers. – 1984
The test items in this booklet are sample life skill items representative of those found in the California Adult Student Assessment System (CASAS) Item Bank. The items are arranged in four sections, A through D, from easy to difficult. A four-digit coding system is used to define each item. The first digit represents the content area: consumer…
Descriptors: Adult Education, Adults, Daily Living Skills, Difficulty Level
Lutkus, Anthony D.; Laskaris, George – 1981
Analyses of student responses to Introductory Psychology test questions are discussed. The publisher supplied a 2,000-item test bank on computer tape, from which instructors selected questions for fifteen-item tests. The publisher labeled each test question as factual or conceptual. The semester course used a mastery learning format in which…
Descriptors: Difficulty Level, Higher Education, Item Analysis, Item Banks
Richichi, Rudolph V. – 1996
An item analysis study was conducted for two multiple-choice tests of introductory psychology created from a publisher's test bank. A 47-item multiple-choice test was administered to 247 students, and 433 students took a different 42-item multiple-choice test. Participants were from a large northeastern suburban community college. The difficulty…
Descriptors: College Students, Community Colleges, Difficulty Level, Higher Education
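Several entries above (Lutkus and Laskaris; Richichi; Tollefson and Tripp) report classical item-analysis statistics: item difficulty (the proportion of examinees answering correctly) and item discrimination (the point-biserial correlation between the item score and the rest of the test). The sketch below is a generic illustration of those statistics, not the procedure any of these studies actually used:

```python
import statistics

def pearson(x, y):
    """Pearson correlation; with a dichotomous x this is the point-biserial."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def item_analysis(responses):
    """Classical item statistics for a 0/1 response matrix
    (rows = examinees, columns = items).

    Returns a list of (difficulty, discrimination) pairs. Discrimination
    is computed against the corrected total (total score minus the item
    itself) so an item is not correlated with its own contribution.
    """
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    results = []
    for j in range(n_items):
        item = [row[j] for row in responses]
        rest = [t - x for t, x in zip(totals, item)]  # corrected totals
        difficulty = sum(item) / len(item)
        results.append((difficulty, pearson(item, rest)))
    return results
```

A positively discriminating item is one that higher-scoring examinees answer correctly more often than lower-scoring ones; items near 0 (or negative) are the ones these studies flag for revision.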
Tollefson, Nona; Tripp, Alice – 1983
This study compared the item difficulty and item discrimination of three multiple choice item formats. The multiple choice formats studied were: a complex alternative (none of the above) as the correct answer; a complex alternative as a foil, and the one-correct answer format. One hundred four graduate students were randomly assigned to complete…
Descriptors: Analysis of Variance, Difficulty Level, Graduate Students, Higher Education
Gershon, Richard C.; And Others – 1994
A 1992 study by R. Gershon found discrepancies when comparing the theoretical Rasch item characteristic curve with the average empirical curve for 1,304 vocabulary items administered to 7,711 students. When person-item mismatches were deleted (for any person-item interaction where the ability of the person was much higher or much lower than the…
Descriptors: Adaptive Testing, Computer Assisted Testing, Difficulty Level, Elementary Education
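The theoretical curve in Gershon's comparison is the Rasch item characteristic curve, a one-parameter logistic function of the difference between person ability and item difficulty (both on the same logit scale). A minimal sketch of that function:

```python
import math

def rasch_p(theta, b):
    """Rasch model: probability that a person of ability theta answers
    an item of difficulty b correctly.

    P(correct) = 1 / (1 + exp(-(theta - b))). When theta == b the
    probability is exactly 0.5; it rises toward 1 as ability exceeds
    difficulty and falls toward 0 as difficulty exceeds ability.
    """
    return 1.0 / (1.0 + math.exp(-(theta - b)))
```

An empirical curve like the one in the study would be built by grouping examinees into ability strata and plotting each stratum's observed proportion correct against this prediction; large person-item mismatches are the discrepancies the abstract describes deleting.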
New York State Education Dept., Albany. Div. of Research. – 1977
The "Test Development Notebook" is a resource designed for the preparation of tests of literal comprehension for students in grades 1 through 12. This volume contains 100 multiple-choice cloze exercises taken from test instructions, instructional magazines, encyclopedias, reference books, and children's magazines; and the accompanying answer key.…
Descriptors: Answer Keys, Cloze Procedure, Difficulty Level, Elementary Secondary Education
New York State Education Dept., Albany. Div. of Research. – 1977
The "Test Development Notebook" is a resource designed for the preparation of tests of literal comprehension for students in grades 1 through 12. This volume contains 100 multiple-choice cloze exercises taken from consumer magazines, catalogs, instructions, advertisements, and contracts; and the accompanying answer key. Each exercise…
Descriptors: Advertising, Answer Keys, Catalogs, Cloze Procedure
Roid, Gale H.; Wendler, Cathy L. W. – 1983
The development of the emerging technology of item writing was motivated in part by the desire to reduce potential subjectivity and bias between different item writers who attempt to construct parallel achievement tests. The present study contrasts four test forms constructed by the combined efforts of six item writers using four methods of item…
Descriptors: Achievement Tests, Difficulty Level, Intermediate Grades, Item Analysis
Rachor, Robert E.; Gray, George T. – 1996
Two frequently cited guidelines for writing multiple choice test item stems are: (1) the stem can be written in either a question or statement-to-be-completed format; and (2) only positively worded stems should be used. These guidelines were evaluated in a survey of the test item banks of 13 nationally administered examinations in the physician…
Descriptors: Allied Health Personnel, Difficulty Level, High Achievement, Item Banks