Merrel, Jeremy D.; Cirillo, Pier F.; Schwartz, Pauline M.; Webb, Jeffrey A. – Higher Education Studies, 2015
Multiple-choice testing is a common but often ineffective method for evaluating learning. A newer approach, however, using Immediate Feedback Assessment Technique (IF-AT®, Epstein Educational Enterprise, Inc.) forms, offers several advantages. In particular, a student learns immediately whether his or her answer is correct and, in the case of an…
Descriptors: Multiple Choice Tests, Feedback (Response), Evaluation Methods, Guessing (Tests)

Lord, Frederic M. – Journal of Educational Measurement, 1975
The assumption that examinees either know the answer to a test item or else guess at random is usually totally implausible. A different assumption is outlined, under which formula scoring is found to be clearly superior to number right scoring. (Author)
Descriptors: Guessing (Tests), Multiple Choice Tests, Response Style (Tests), Scoring

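Formula scoring, as contrasted with number-right scoring here, is the classical correction for guessing: for items with C options, S = R − W/(C − 1), which makes blind random guessing worthless in expectation. A minimal sketch, with hypothetical item counts:

```python
def formula_score(rights: int, wrongs: int, n_options: int) -> float:
    """Correction-for-guessing formula score: S = R - W/(C - 1).

    Omitted items neither add nor subtract, so a blind guess on a
    C-option item has expected contribution
    (1/C)*1 - ((C-1)/C)*(1/(C-1)) = 0.
    """
    return rights - wrongs / (n_options - 1)

# Hypothetical examinee: 60 items, 4 options each.
# Knows 40 answers outright and guesses blindly on the other 20,
# which on average yields 5 extra rights and 15 wrongs.
score = formula_score(rights=40 + 5, wrongs=15, n_options=4)
print(score)  # 40.0 -- guessing adds nothing in expectation
```

Under Lord's point, this equivalence with number-right scoring holds only when the random-guessing assumption does; under more realistic response behavior the two rankings diverge.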
Wilcox, Rand R. – 1979
In the past, several latent structure models have been proposed for handling problems associated with measuring the achievement of examinees. Typically, however, these models describe a specific examinee in terms of an item domain or they describe a few items in terms of a population of examinees. In this paper, a model is proposed which allows a…
Descriptors: Achievement Tests, Guessing (Tests), Mathematical Models, Multiple Choice Tests

Frary, Robert B. – Applied Psychological Measurement, 1980
Six scoring methods for assigning weights to right or wrong responses according to various instructions given to test takers are analyzed with respect to expected chance scores and the effect of various levels of information and misinformation. Three of the methods provide feedback to the test taker. (Author/CTM)
Descriptors: Guessing (Tests), Knowledge Level, Multiple Choice Tests, Scores

Duncan, George T.; Milton, E. O. – Psychometrika, 1978
A multiple-answer multiple-choice test is one which offers several alternate choices for each stem and any number of those choices may be considered to be correct. In this article, a class of scoring procedures called the binary class is discussed. (Author/JKS)
Descriptors: Answer Keys, Measurement Techniques, Multiple Choice Tests, Scoring Formulas
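For a multiple-answer item of the kind Duncan and Milton describe, one natural baseline procedure (a simple illustration, not necessarily a member of the authors' binary class) scores each option as an independent true/false judgment:

```python
def per_option_score(keyed: set, marked: set, options: list) -> int:
    """Score a multiple-answer item one option at a time: +1 for every
    option whose marked/unmarked status agrees with the answer key."""
    return sum((opt in keyed) == (opt in marked) for opt in options)

options = ["A", "B", "C", "D"]
key = {"A", "C"}  # hypothetical item with two correct options

print(per_option_score(key, {"A", "C"}, options))  # 4: perfect agreement
print(per_option_score(key, {"A"}, options))       # 3: missed C
print(per_option_score(key, {"B", "D"}, options))  # 0: every judgment wrong
```

Scoring procedures in this family differ mainly in how option-level agreements and disagreements are weighted before summing.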

Essex, Diane L. – Journal of Medical Education, 1976
Two multiple-choice scoring schemes--a partial-credit scheme and a dichotomous approach--were compared by analyzing means, variances, and reliabilities on alternate measures, along with student reactions. Students preferred the partial-credit approach, which is recommended if rewarding partial knowledge is an important concern. (Editor/JT)
Descriptors: Higher Education, Medical Students, Multiple Choice Tests, Reliability

Boldt, Robert F. – 1974
One formulation of confidence scoring requires the examinee to report, as a number, his personal probability that each alternative in a multiple-choice test is correct. Under this formulation, the expected value of a linear transformation of the logarithm of the probability assigned to the correct response is maximized when the examinee accurately reports his personal probability. To equate…
Descriptors: Confidence Testing, Guessing (Tests), Multiple Choice Tests, Probability
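The property Boldt invokes is that the logarithmic rule is a proper scoring rule: reporting one's true subjective probabilities maximizes the expected score. A small numerical check, with hypothetical beliefs over a three-option item:

```python
import math

def expected_log_score(true_p, reported_r):
    """Expected log score when the examinee's true probabilities over the
    options are true_p but the reported values are reported_r:
    sum_i p_i * log(r_i)."""
    return sum(p * math.log(r) for p, r in zip(true_p, reported_r))

true_p = [0.7, 0.2, 0.1]  # examinee's actual beliefs (hypothetical)
honest = expected_log_score(true_p, true_p)
hedged = expected_log_score(true_p, [0.5, 0.3, 0.2])
exaggerated = expected_log_score(true_p, [0.9, 0.05, 0.05])

# Honest reporting yields the highest expected score.
assert honest > hedged and honest > exaggerated
```

Any positive linear transformation a + b·log(r) with b > 0 preserves this property, which is why the formulation in the abstract is stated up to a linear transformation.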

Diamond, James J. – Journal of Educational Measurement, 1975
Investigates the reliability and validity of scores yielded from a new scoring formula. (Author/DEP)
Descriptors: Guessing (Tests), Multiple Choice Tests, Objective Tests, Scoring

Arkin, Robert M.; Walts, Elizabeth A. – Journal of Educational Psychology, 1983
The effects of corrective testing, and how such feedback might affect high- and low-test-anxious students differently, are investigated. Subjects were 286 college students in three classes--one using mastery testing and two using multiple choice tests. (Author/PN)
Descriptors: Attribution Theory, Feedback, Higher Education, Mastery Tests

Aiken, Lewis R.; Williams, Newsom – Educational and Psychological Measurement, 1978
Seven formulas for scoring test items with two options (true-false or multiple choice with only two choices) were investigated. Several conditions, such as varying directions for guessing and whether testees had prior knowledge of the proportions of false items on the test were also investigated. (Author/JKS)
Descriptors: Guessing (Tests), Higher Education, Knowledge Level, Multiple Choice Tests

Echternacht, Gary J.; And Others – 1971
This handbook presents instructions for implementing a confidence testing program in technical training situations, identification of possible areas of application, techniques for evaluating confidence information, advantages and disadvantages of confidence testing, time considerations, and problem areas. Complete instructions for "Pick-One" and…
Descriptors: Confidence Testing, Educational Diagnosis, Guessing (Tests), Measurement Techniques

Kansup, Wanlop; Hakstian, A. Ralph – Journal of Educational Measurement, 1975
Effects of logically weighting incorrect item options in conventional tests and different scoring functions with confidence tests on reliability and validity were examined. Ninth graders took conventionally administered Verbal and Mathematical Reasoning tests, scored conventionally and by a procedure assigning degree-of-correctness weights to…
Descriptors: Comparative Analysis, Confidence Testing, Junior High School Students, Multiple Choice Tests

Hakstian, A. Ralph; Kansup, Wanlop – Journal of Educational Measurement, 1975
A comparison of reliability and validity was made for three testing procedures: 1) responding conventionally to Verbal Ability and Mathematical Reasoning tests; 2) using a confidence weighting response procedure with the same tests; and 3) using the elimination response method. The experimental testing procedures were not psychometrically superior…
Descriptors: Comparative Analysis, Confidence Testing, Guessing (Tests), Junior High School Students

Donlon, Thomas F. – 1975
This study empirically determined the optimizing weight to be applied to the Wrongs Total score in scoring formulas of the general form S = R - kW, where S is the score, R the Rights Total, k the weight, and W the Wrongs Total, if reliability is to be maximized. As is well known, the traditional formula score rests on a theoretical framework which is…
Descriptors: Achievement Tests, Comparative Analysis, Guessing (Tests), Multiple Choice Tests

Sibley, William L. – 1974
The use of computers in the testing, selection, and placement processes for those in military services' training programs is examined in this paper. Also discussed are the motivational and theoretical foundation of admissible probability testing, the role of the computer in admissible probability testing, and the authors' experience…
Descriptors: Computer Oriented Programs, Computers, Interaction, Military Training
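Admissible probability testing rests on proper ("reproducing") scoring rules, under which an examinee maximizes expected score only by reporting true personal probabilities. The quadratic rule is a standard example alongside the logarithmic rule; a small sketch with hypothetical beliefs:

```python
def expected_quadratic_score(true_p, reported_r):
    """Expected quadratic (Brier-type) score under true beliefs true_p when
    reporting reported_r; the realized score when option i is correct is
    2*r_i - sum_j r_j**2."""
    penalty = sum(r * r for r in reported_r)
    return sum(p * (2 * r - penalty) for p, r in zip(true_p, reported_r))

beliefs = [0.6, 0.3, 0.1]  # hypothetical true probabilities over 3 options
honest = expected_quadratic_score(beliefs, beliefs)
uniform = expected_quadratic_score(beliefs, [1 / 3] * 3)
all_in = expected_quadratic_score(beliefs, [1.0, 0.0, 0.0])

# The honest report maximizes expected score, the defining property of an
# admissible (proper) rule.
assert honest > uniform and honest > all_in
```

Setting the derivative of the expected score to zero gives r_i = p_i, so truthful reporting is the unique maximizer.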