DeSanto, Dan; Nichols, Aaron – College & Research Libraries, 2017
This article presents the results of a faculty survey conducted at the University of Vermont during academic year 2014-2015. The survey asked faculty about: familiarity with scholarly metrics, metric-seeking habits, help-seeking habits, and the role of metrics in their department's tenure and promotion process. The survey also gathered faculty…
Descriptors: College Faculty, Teacher Surveys, Knowledge Level, Use Studies
Ravesloot, C. J.; Van der Schaaf, M. F.; Muijtjens, A. M. M.; Haaring, C.; Kruitwagen, C. L. J. J.; Beek, F. J. A.; Bakker, J.; Van Schaik, J.P.J.; Ten Cate, Th. J. – Advances in Health Sciences Education, 2015
Formula scoring (FS) is the use of a don't know option (DKO) with subtraction of points for wrong answers. Its effect on the construct validity and reliability of progress test scores is a subject of discussion. Choosing a DKO may be affected not only by knowledge level but also by risk-taking tendency, and may thus introduce construct-irrelevant…
Descriptors: Scoring Formulas, Tests, Scores, Construct Validity
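The formula-scoring rule described in the abstract above can be sketched in a few lines. This is an illustrative model, not the paper's own implementation: it assumes k-option items and the conventional guessing penalty of 1/(k-1) points per wrong answer, with a DKO response scoring zero.

```python
def formula_score(responses, n_options):
    """Formula scoring (FS): +1 per correct answer, 0 for a
    "don't know" (DKO) response, and -1/(k-1) per wrong answer,
    where k = n_options. The 1/(k-1) penalty is the conventional
    correction for guessing, assumed here for illustration."""
    score = 0.0
    for r in responses:  # each r is "correct", "wrong", or "dko"
        if r == "correct":
            score += 1.0
        elif r == "wrong":
            score -= 1.0 / (n_options - 1)
    return score

# Four-option items: two right, one wrong, one DKO -> 2 - 1/3 points.
print(formula_score(["correct", "correct", "wrong", "dko"], 4))
```

Under this rule an examinee with no knowledge gains nothing, on average, by guessing instead of choosing the DKO — which is exactly why risk-taking tendency, and not only knowledge, can drive the choice.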
Jancarík, Antonín; Kostelecká, Yvona – Electronic Journal of e-Learning, 2015
Electronic testing has become a regular part of online courses. Most learning management systems offer a wide range of tools that can be used in electronic tests. With respect to time demands, the most efficient tools are those that allow automatic assessment. The presented paper focuses on one of these tools: matching questions in which one…
Descriptors: Online Courses, Computer Assisted Testing, Test Items, Scoring Formulas
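For automatically assessed matching questions such as those discussed above, one simple scoring rule is the fraction of correctly matched pairs. This is only a hedged sketch of a common convention — the paper's own scoring formula is not given in the excerpt and may well differ:

```python
def match_score(answer, key):
    """Score a matching question as the fraction of pairs the
    student matched correctly (one common automatic-assessment
    rule; illustrative only). `answer` and `key` map left-hand
    items to right-hand targets."""
    correct = sum(1 for item, target in answer.items()
                  if key.get(item) == target)
    return correct / len(key)

# Student matched 2 of 3 pairs correctly -> 2/3.
print(match_score({"a": 1, "b": 2, "c": 2}, {"a": 1, "b": 2, "c": 3}))
```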

Hutchinson, T. P. – Contemporary Educational Psychology, 1980
In scoring multiple-choice tests, a score of 1 is given to right answers, 0 to unanswered questions, and some negative score to wrong answers. This paper discusses the relation of this negative score to the assumption made about the partial knowledge the subjects may have. (Author/GDC)
Descriptors: Guessing (Tests), Knowledge Level, Multiple Choice Tests, Scoring Formulas
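The link between the wrong-answer penalty and assumed partial knowledge can be made concrete with a standard elimination model (an illustration of the general idea, not necessarily Hutchinson's own analysis): an examinee who can rule out some distractors guesses uniformly among the rest.

```python
def expected_item_score(k, eliminated, penalty):
    """Expected score on one k-option item under a
    right = 1 / blank = 0 / wrong = -penalty rule, for an examinee
    who rules out `eliminated` distractors and guesses uniformly
    among the remaining options. Illustrative model only."""
    remaining = k - eliminated
    p_correct = 1.0 / remaining
    return p_correct * 1.0 - (1.0 - p_correct) * penalty

# With the usual penalty 1/(k-1), blind guessing breaks even
# (expected score ~ 0), while partial knowledge pays off:
print(expected_item_score(4, 0, 1/3))  # no options eliminated
print(expected_item_score(4, 2, 1/3))  # two distractors eliminated
```

Changing the penalty shifts the break-even point: a harsher penalty rewards only examinees whose partial knowledge exceeds a higher threshold, which is the relation the abstract refers to.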

Frary, Robert B. – Applied Measurement in Education, 1989
Multiple-choice response and scoring methods that attempt to determine an examinee's degree of knowledge about each item in order to produce a total test score are reviewed. There is apparently little advantage to such schemes; however, they may have secondary benefits such as providing feedback to enhance learning. (SLD)
Descriptors: Knowledge Level, Multiple Choice Tests, Scoring, Scoring Formulas

Hamdan, M. A.; Krutchkoff, R. G. – Journal of Experimental Education, 1975
The separation level of grades on a multiple-choice examination, introduced by Krutchkoff, is a quantitative probabilistic criterion for the correct classification of students by the examination. (Author)
Descriptors: Educational Research, Knowledge Level, Multiple Choice Tests, Scoring Formulas

Frary, Robert B. – Applied Psychological Measurement, 1980
Six scoring methods for assigning weights to right or wrong responses according to various instructions given to test takers are analyzed with respect to expected chance scores and the effect of various levels of information and misinformation. Three of the methods provide feedback to the test taker. (Author/CTM)
Descriptors: Guessing (Tests), Knowledge Level, Multiple Choice Tests, Scores
Pugh, Richard C.; Brunza, J. Jay – 1974
An examinee is required to express his confidence in the correctness of each choice of a multiple-choice item in a probabilistic test. For the responses to be valid indicators, the confidence expressed in each choice should be determined by the examinee's knowledge. This study assessed the relationship of the certainty of examinees' responses to…
Descriptors: Behavior, Confidence Testing, Guessing (Tests), Individual Characteristics

Aiken, Lewis R.; Williams, Newsom – Educational and Psychological Measurement, 1978
Seven formulas for scoring test items with two options (true-false or multiple choice with only two choices) were investigated. Several conditions, such as varying directions for guessing and whether testees had prior knowledge of the proportions of false items on the test were also investigated. (Author/JKS)
Descriptors: Guessing (Tests), Higher Education, Knowledge Level, Multiple Choice Tests
Education Commission of the States, Denver, CO. National Assessment of Educational Progress. – 1983
Exercises from the National Assessment of Educational Progress (NAEP) third mathematics assessment are provided in this released exercise set. Exercises were administered to 9-year-olds, 13-year-olds, and 17-year-olds. Some exercises were administered to only one age group, others to two or more age groups. The set is divided into two parts: text…
Descriptors: Elementary School Mathematics, Elementary Secondary Education, Item Banks, Knowledge Level
Bruno, James E.; Opp, Ronald D. – 1985
The admissible probability measurement (APM) format was used to score a criterion-referenced language arts test administered in an inner-city junior high school. Its 30 items covered capitalization, punctuation, parts of speech, and sentence analysis. With APM, students indicate their confidence in their answer choice, and guessing is heavily…
Descriptors: Confidence Testing, Criterion Referenced Tests, Educational Testing, Equivalency Tests
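Confidence-based formats like APM reward honest probability reports and penalize confident wrong answers. The exact APM rule is not given in the excerpt, so the sketch below uses the standard logarithmic proper scoring rule purely as an illustration of how such penalties work:

```python
import math

def log_score(prob_on_truth, floor=0.01):
    """Logarithmic scoring for a probabilistic (confidence-based)
    response: the examinee assigns probability p to the correct
    option and earns log(p). Confident wrong answers (small p on
    the truth) are penalized heavily, which discourages blind
    guessing. A floor caps the penalty at log(floor). This is a
    standard proper scoring rule, not necessarily the APM rule."""
    return math.log(max(prob_on_truth, floor))

# Honest uncertainty (p = 0.5) costs far less than a confident
# wrong bet (p = 0.05 on the true answer):
print(log_score(0.5))
print(log_score(0.05))
```

Because the rule is proper, an examinee maximizes expected score by reporting exactly the confidence they hold — the property that makes confidence testing informative about knowledge level rather than risk appetite.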