Arneson, Amy – ProQuest LLC, 2019
This three-paper dissertation explores item cluster-based assessments, first in general as they relate to modeling, and then with respect to specific issues surrounding a particular item cluster-based assessment design. There should be a reasonable analogy between the structure of a psychometric model and the cognitive theory that the assessment is based upon.…
Descriptors: Item Response Theory, Test Items, Critical Thinking, Cognitive Tests
Haladyna, Thomas M. – IDEA Center, Inc., 2018
Writing multiple-choice test items to measure student learning in higher education is a challenge. Based on extensive scholarly research and experience, the author describes various item formats, offers guidelines for creating these items, and provides many examples of both good and bad test items. He also suggests some shortcuts for developing…
Descriptors: Test Construction, Multiple Choice Tests, Test Items, Higher Education
Leber, Jasmin; Renkl, Alexander; Nückles, Matthias; Wäschle, Kristin – Learning: Research and Practice, 2018
According to the model of constructive alignment, learners adjust their learning strategies to the announced assessment (backwash effect). Hence, when teaching for understanding, the assessment method should be aligned with this teaching goal to ensure that learners engage in corresponding learning strategies. A quasi-experimental field study with…
Descriptors: Learning Strategies, Testing Problems, Educational Objectives, Learning Motivation
National Assessment Governing Board, 2017
The National Assessment of Educational Progress (NAEP) is the only continuing and nationally representative measure of trends in academic achievement of U.S. elementary and secondary school students in various subjects. For more than four decades, NAEP assessments have been conducted periodically in reading, mathematics, science, writing, U.S.…
Descriptors: Mathematics Achievement, Multiple Choice Tests, National Competency Tests, Educational Trends
Hudson, Ross D. – Science Education International, 2012
This research inquires into the effectiveness of the two predominant forms of questions--multiple-choice questions and short-answer questions--used in the State University Entrance Examination for Chemistry including the relationship between performance and gender. It examines not only the style of question but also the content type examined…
Descriptors: Chemistry, Science Achievement, Gender Differences, College Entrance Examinations
Cawthon, Stephanie – American Annals of the Deaf, 2011
Linguistic complexity of test items is one test format element that has been studied in the context of struggling readers and their participation in paper-and-pencil tests. The present article presents findings from an exploratory study on the potential relationship between linguistic complexity and test performance for deaf readers. A total of 64…
Descriptors: Language Styles, Test Content, Syntax, Linguistics
Hendrickson, Amy; Patterson, Brian; Ewing, Maureen – College Board, 2010
The psychometric considerations and challenges associated with including constructed response items on tests are discussed along with how these issues affect the form assembly specifications for mixed-format exams. Reliability and validity, security and fairness, pretesting, content and skills coverage, test length and timing, weights, statistical…
Descriptors: Multiple Choice Tests, Test Format, Test Construction, Test Validity
Kobrin, Jennifer L.; Kim, Rachel; Sackett, Paul – College Board, 2011
There is much debate on the merits and pitfalls of standardized tests for college admission, with questions regarding the format (multiple-choice versus constructed response), cognitive complexity, and content of these assessments (achievement versus aptitude) at the forefront of the discussion. This study addressed these questions by…
Descriptors: College Entrance Examinations, Mathematics Tests, Test Items, Predictive Validity
Hertenstein, Matthew J.; Wayand, Joseph F. – Journal of Instructional Psychology, 2008
Many psychology instructors present videotaped examples of behavior at least occasionally during their courses. However, few include video clips during examinations. We provide examples of video-based questions, offer guidelines for their use, and discuss their benefits and drawbacks. In addition, we provide empirical evidence to support the use…
Descriptors: Student Evaluation, Video Technology, Evaluation Methods, Test Construction
Wu, Yuh-Yin; Guei, I-Fen – 2000
A study was conducted to investigate: (1) the relationships between the results from various forms of assessment and the patterns of correlation across content areas; (2) how cognitive components correlate with the test results from different classroom assessments; and (3) how content areas affected the relationships. Data were collected from a…
Descriptors: Cognitive Processes, Cognitive Tests, Correlation, Elementary School Students
Haladyna, Thomas A. – Applied Measurement in Education, 1992
Several multiple-choice item formats are examined in the current climate of test reform. The reform movement is discussed as it affects use of the following formats: (1) complex multiple-choice; (2) alternate choice; (3) true-false; (4) multiple true-false; and (5) the context dependent item set. (SLD)
Descriptors: Cognitive Psychology, Comparative Testing, Context Effect, Educational Change
Hamilton, Laura S.; Snow, Richard E. – 1998
This study explores methods for detecting gender-based differential item functioning (DIF) on the 12th grade multiple-choice and constructed-response science tests administered as part of the National Education Longitudinal Study of 1988 (NELS:88). Several combinations of conditioning variables were explored for DIF detection on both tests, and…
Descriptors: Achievement Tests, Constructed Response, Grade 12, High School Seniors
Ory, John C.; Ryan, Katherine E. – 1993
This book for college faculty provides a resource for developing, using, and grading classroom exams. The first chapter addresses ways to determine what content should be included on an exam. The second chapter identifies testing considerations such as number of exams, difficulty level of items, and test length. Chapters 3 and 4 provide guidelines…
Descriptors: Classroom Techniques, Codes of Ethics, Essay Tests, Evaluation Methods