Showing 1 to 15 of 23 results
Peer reviewed
Wang, Yu; Chiu, Chia-Yi; Köhn, Hans Friedrich – Journal of Educational and Behavioral Statistics, 2023
The multiple-choice (MC) item format has been widely used in educational assessments across diverse content domains. MC items purportedly allow for collecting richer diagnostic information. The effectiveness and economy of administering MC items may have further contributed to their popularity not just in educational assessment. The MC item format…
Descriptors: Multiple Choice Tests, Nonparametric Statistics, Test Format, Educational Assessment
Mbella, Kinge Keka – ProQuest LLC, 2012
Mixed-format assessments are increasingly being used in large scale standardized assessments to measure a continuum of skills ranging from basic recall to higher order thinking skills. These assessments are usually comprised of a combination of (a) multiple-choice items which can be efficiently scored, have stable psychometric properties, and…
Descriptors: Educational Assessment, Test Format, Evaluation Methods, Multiple Choice Tests
Hagge, Sarah Lynn – ProQuest LLC, 2010
Mixed-format tests containing both multiple-choice and constructed-response items are widely used in educational testing. Such tests combine the broad content coverage and efficient scoring of multiple-choice items with the assessment of higher-order thinking skills thought to be provided by constructed-response items. However, the combination of…
Descriptors: Test Format, True Scores, Equated Scores, Psychometrics
Peer reviewed
Pajares, Frank; Miller, M. David – Journal of Experimental Education, 1997
The mathematics self-efficacy and problem-solving performance of 327 middle school students were assessed with multiple-choice and open-ended methods. No differences in self-efficacy resulted from the different forms of assessment, although those who took the multiple-choice test had higher scores and better calibration of ability. (SLD)
Descriptors: Ability, Educational Assessment, Mathematics, Middle School Students
Peer reviewed
Wilson, Mark; Wang, Wen-chung – Applied Psychological Measurement, 1995
Data from the California Learning Assessment System mathematics assessment were used to examine issues that arise when scores from different assessment modes are combined. Multiple-choice, open-ended, and investigation items were combined in a test across three test forms. Results illustrate the difficulties faced in evaluating combined…
Descriptors: Educational Assessment, Equated Scores, Evaluation Methods, Item Response Theory
Martinez, Michael E.; Katz, Irvin R. – 1992
Contrasts between constructed response items and stem-equivalent multiple-choice counterparts typically have involved averaging item characteristics, and this aggregation has masked differences in statistical properties at the item level. Moreover, even aggregated format differences have not been explained in terms of differential cognitive…
Descriptors: Architecture, Cognitive Processes, Construct Validity, Constructed Response
Peer reviewed
Hancock, Gregory R. – Journal of Experimental Education, 1994
To investigate the ability of multiple-choice tests to assess higher order thinking skills, examinations were constructed as half multiple choice and half constructed response. Results with 90 undergraduate and graduate students indicate that the 2 formats measure similar constructs at different levels of complexity. (SLD)
Descriptors: Cognitive Processes, Comparative Analysis, Constructed Response, Educational Assessment
Burton, Nancy W.; And Others – 1976
Assessment exercises (items) in three different formats--multiple-choice with an "I don't know" (IDK) option, multiple-choice without the IDK, and open-ended--were placed at the beginning, middle and end of 45-minute assessment packages (instruments). A balanced incomplete blocks analysis of variance was computed to determine the biasing…
Descriptors: Age Differences, Difficulty Level, Educational Assessment, Guessing (Tests)
Buckendahl, Chad W.; Plake, Barbara S.; Impara, James C. – 1999
Many school districts are developing assessments that incorporate both selected response and constructed response formats. Scores on these assessments can be used for a variety of purposes ranging from subject remediation to promotion decisions. These policy decisions are informed by recommendations for Minimum Passing Scores (MPSs) from standard…
Descriptors: Academic Standards, Constructed Response, Cutting Scores, Educational Assessment
Jones, Russell W. – 1994
One of the most influential contemporary trends in educational evaluation in the United States is the move away from traditional testing methods toward "authentic assessments," which are designed to measure student performance of skills, abilities, and knowledge directly. While there is no consensus as to precisely what constitutes authentic…
Descriptors: Alternative Assessment, Educational Assessment, Educational Trends, Evaluation Methods
Peer reviewed
Birenbaum, Menucha; And Others – Applied Psychological Measurement, 1992
The effect of multiple-choice (MC) or open-ended (OE) response format on diagnostic assessment of algebra test performance was investigated with 231 eighth and ninth graders in Tel Aviv (Israel) using bug or rule space analysis. Both analyses indicated closer similarity between parallel OE subsets than between stem-equivalent OE and MC subsets.…
Descriptors: Algebra, Comparative Testing, Educational Assessment, Educational Diagnosis
Braswell, James S.; Jackson, Carol A. – 1995
A new free-response item type for mathematics tests is described. The item type, referred to as the Student-Produced Response (SPR), was first introduced into the Preliminary Scholastic Aptitude Test/National Merit Scholarship Qualifying Test in 1993 and into the Scholastic Aptitude Test in 1994. Students solve a problem and record the answer by…
Descriptors: Computer Assisted Testing, Educational Assessment, Guessing (Tests), Mathematics Tests
Finch, Fredrick; Foertsch, Mary – 1993
Performance assessment is reviewed as an emerging form of alternative assessment, focusing on how it has been defined in the research literature, the criteria for evaluating its authenticity, the measurement of process and product, and the link between assessment and instruction. Three important dimensions that must be considered in describing…
Descriptors: Alternative Assessment, Educational Assessment, Elementary Secondary Education, Evaluation Methods
Finch, F. L.; Dost, Marcia A. – 1992
Many state and local entities are developing and using performance assessment programs. Because these initiatives are so diverse, it is very difficult to understand what they are doing, or to compare them in any meaningful way. Multiple-choice tests are contrasted with performance assessments, and preliminary classifications are suggested to…
Descriptors: Alternative Assessment, Classification, Comparative Analysis, Constructed Response
Pollack, Judith M. – 1990
This paper summarizes an investigation of applications and issues in free response (FR) testing during 1989. It draws on ideas from the results of the National Educational Longitudinal Study 1988 (NELS:88) field test, a seminar series at the Educational Testing Service (ETS), working papers prepared for several FR testing applications, and…
Descriptors: Comparative Analysis, Costs, Educational Assessment, Elementary Secondary Education