| Publication Date | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 75 |
| Since 2022 (last 5 years) | 443 |
| Since 2017 (last 10 years) | 1231 |
| Since 2007 (last 20 years) | 2505 |
| Audience | Count |
| --- | --- |
| Practitioners | 122 |
| Teachers | 105 |
| Researchers | 64 |
| Students | 46 |
| Administrators | 14 |
| Policymakers | 7 |
| Counselors | 3 |
| Parents | 3 |
| Location | Count |
| --- | --- |
| Canada | 134 |
| Turkey | 130 |
| Australia | 123 |
| Iran | 66 |
| Indonesia | 61 |
| United Kingdom | 51 |
| Germany | 50 |
| Taiwan | 46 |
| United States | 43 |
| China | 39 |
| California | 34 |
| What Works Clearinghouse Rating | Count |
| --- | --- |
| Meets WWC Standards without Reservations | 3 |
| Meets WWC Standards with or without Reservations | 5 |
| Does not meet standards | 6 |
Mazzeo, John; And Others – 1993
This report describes three exploratory studies of the performance of males and females on the multiple-choice and constructed-response sections of four Advanced Placement Examinations: United States History, Biology, Chemistry, and English Language and Composition. Analyses were carried out for each racial or ethnic group with a sample size of at…
Descriptors: Advanced Placement, College Entrance Examinations, Constructed Response, Ethnic Groups
Braswell, James S.; Jackson, Carol A. – 1995
A new free-response item type for mathematics tests is described. The item type, referred to as the Student-Produced Response (SPR), was first introduced into the Preliminary Scholastic Aptitude Test/National Merit Scholarship Qualifying Test in 1993 and into the Scholastic Aptitude Test in 1994. Students solve a problem and record the answer by…
Descriptors: Computer Assisted Testing, Educational Assessment, Guessing (Tests), Mathematics Tests
Kitao, S. Kathleen; Kitao, Kenji – 1996
A good knowledge of English vocabulary is important for anyone who wants to use the language, so knowledge of vocabulary is often tested. A test may test one or all of the types of vocabulary (active spoken, active written, passive listening, and passive reading), but the test maker should be aware of the differences among the types and which is…
Descriptors: English (Second Language), Foreign Countries, Knowledge Level, Language Proficiency
Martinez, Michael E. – 1990
In contrast to multiple-choice test questions, figural response items call for constructed responses and rely upon figural material, such as illustrations and graphs, as the response medium. Figural response questions in various science domains were created and administered to a sample of 347 fourth, 365 eighth, and 322 twelfth graders. Data were…
Descriptors: Comparative Analysis, Constructed Response, Difficulty Level, Elementary Education
Ohio State Dept. of Education, Columbus. – 1996
A practice test has been prepared to give students experience with test items that are similar to those on the Ohio Ninth-grade Proficiency Tests. The practice test consists of one writing prompt, 20 reading questions, 20 mathematics questions, 25 citizenship questions, and 20 science questions. This manual contains instructions to the teacher, as…
Descriptors: Achievement Tests, Citizenship Education, Grade 9, Graduation Requirements
Triska, Olive H.; And Others – 1996
The dominance of the information-processing approach has shifted research from problem-solving strategies to the structure and organization of knowledge that characterizes expertise. The purpose of this study was to compare the reasoning processes of 12 clinicians and 40 medical students as they responded to 6 positively stated multiple choice…
Descriptors: Clinical Diagnosis, Cognitive Processes, College Faculty, Foreign Countries
Hansche, Linda – 1994
Setting standards on performance measures is discussed in the context of the State Collaborative on Assessment and Student Standards (SCASS) initiative supported by the Council of Chief State School Officers. The usual item-based methods for standard setting, those developed by Nedelsky (1954), Angoff (1971), and Ebel (1972), were developed…
Descriptors: Decision Making, Educational Assessment, Educational Policy, Elementary Secondary Education
Trevisan, Michael S.; Sax, Gilbert – 1991
The purpose of this study was to compare the reliabilities of two-, three-, four-, and five-choice tests using an incremental option paradigm. Test forms were created incrementally, a method approximating actual test construction procedures. Participants were 154 12th-grade students from the Portland (Oregon) area. A 45-item test with two options…
Descriptors: Comparative Testing, Distractors (Tests), Estimation (Mathematics), Grade 12
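As a purely illustrative aside on the kind of analysis the Trevisan and Sax entry describes, the sketch below computes a KR-20 internal-consistency coefficient for one dichotomously scored multiple-choice form. This is a standard reliability index, not the authors' actual procedure; the response matrix is randomly generated, and only the dimensions (154 examinees, 45 items) are taken from the abstract.

```python
import numpy as np

def kr20(responses: np.ndarray) -> float:
    """Kuder-Richardson 20 reliability for 0/1-scored items.

    responses: examinees x items matrix of dichotomous scores.
    """
    k = responses.shape[1]                          # number of items
    p = responses.mean(axis=0)                      # proportion correct per item
    q = 1.0 - p
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

# Hypothetical data: 154 examinees and 45 items match the abstract's test
# dimensions, but the responses themselves are random, purely to make the
# sketch runnable.
rng = np.random.default_rng(0)
scores = (rng.random((154, 45)) < 0.6).astype(int)
print(f"KR-20 = {kr20(scores):.3f}")
```

Comparing forms built with two, three, four, or five options would amount to running a coefficient like this on each form and contrasting the results.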
Finch, Fredrick; Foertsch, Mary – 1993
Performance assessment is reviewed as an emerging form of alternative assessment, focusing on how it has been defined in the research literature, the criteria for evaluating its authenticity, the measurement of process and product, and the link between assessment and instruction. Three important dimensions that must be considered in describing…
Descriptors: Alternative Assessment, Educational Assessment, Elementary Secondary Education, Evaluation Methods
Finch, F. L.; Dost, Marcia A. – 1992
Many state and local entities are developing and using performance assessment programs. Because these initiatives are so diverse, it is very difficult to understand what they are doing, or to compare them in any meaningful way. Multiple-choice tests are contrasted with performance assessments, and preliminary classifications are suggested to…
Descriptors: Alternative Assessment, Classification, Comparative Analysis, Constructed Response
Using Multiple Choice Examination Items To Measure Teachers' Content-Specific Pedagogical Knowledge.
Kromrey, Jeffrey D.; Renfrow, Donata D. – 1991
The use of multiple-choice test items measuring content-specific pedagogical knowledge (C-P) as a viable method of increasing the validity of teacher tests is described. The purposes of the paper are to: (1) present examples of multiple-choice test items used for the assessment of C-P and contrast these items with items used for assessing content…
Descriptors: Content Validity, Elementary Secondary Education, Higher Education, Instructional Effectiveness
Hill, Clifford; Larsen, Eric – 1983
This study examines how third and fourth grade children work with a representative sample of test items designed to measure reading comprehension. The developmental and ethnocultural problems that children experience (whether they are rooted in the passage, the tasks, or the entire item) are discussed. From a passage-based perspective, the items…
Descriptors: Elementary Education, Grade 3, Grade 4, Item Analysis
Graff, Martin – Electronic Journal of e-Learning, 2006
This study sought to investigate four separate issues regarding student performance in a blended learning environment in the delivery of a Psychology course to 140 university undergraduates. Firstly, to investigate the relationship between student performance on three different coursework assignments and their performance on interim online…
Descriptors: Computer Mediated Communication, Educational Technology, Online Courses, Electronic Learning
McNamara, Thomas C. – 1990
An attempt was made to develop self-directedness in students through teaching them how to write questions in multiple-choice format, and by making students aware that phrasing things in multiple-choice format is a way of asking questions that is conducive to developing thinking, writing, and test-taking skills. The study, a form of reflective…
Descriptors: Cognitive Style, Elementary Education, Elementary School Students, Grade 2
Yachimowicz, David J.; And Others – 1990
The psychometric properties of a paper-and-pencil instrument for assessing individual differences in cerebral dominance are explored. The instrument, Your Style of Learning and Thinking (SOLAT), contains 50 multiple-choice questions. The study subjects consisted of three groups: 235 undergraduate and graduate students, 124 undergraduate and…
Descriptors: Adults, Brain Hemisphere Functions, College Students, Comparative Testing
