Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 0 |
Since 2016 (last 10 years) | 2 |
Since 2006 (last 20 years) | 2 |
Descriptor
Answer Sheets | 6 |
Multiple Choice Tests | 6 |
Test Items | 6 |
Test Construction | 4 |
Computer Assisted Testing | 3 |
Difficulty Level | 3 |
Foreign Countries | 3 |
Higher Education | 3 |
Research Reports | 3 |
College Students | 2 |
Responses | 2 |
Author
Kuneshka, Loreta | 2 |
Teneqexhi, Romeo | 2 |
Bridgeman, Brent | 1 |
Catts, Ralph | 1 |
Gilmer, Jerry S. | 1 |
Kleinke, David J. | 1 |
Naço, Adrian | 1 |
Qirko, Margarita | 1 |
Sharko, Genci | 1 |
Vrapi, Fatmir | 1 |
Publication Type
Reports - Research | 4 |
Speeches/Meeting Papers | 4 |
Reports - Descriptive | 2 |
Journal Articles | 1 |
Assessments and Surveys
Graduate Record Examinations | 1 |
Teneqexhi, Romeo; Kuneshka, Loreta; Naço, Adrian – International Association for Development of the Information Society, 2018
Organizing exams or competitions with multiple-choice questions assessed by technology is now common practice in educational institutions around the world. Such exams or tests are, as a rule, completed on a so-called answer-sheet form, on which each student or exam participant is obliged to…
Descriptors: Foreign Countries, Competition, Multiple Choice Tests, Computer Assisted Testing

Teneqexhi, Romeo; Qirko, Margarita; Sharko, Genci; Vrapi, Fatmir; Kuneshka, Loreta – International Association for Development of the Information Society, 2017
Exam assessment is one of the most tedious tasks for university teachers all over the world. Multiple-choice tests make assessment somewhat easier, but the teacher cannot prepare more than 3-4 variants; in this case, the possibility of students cheating from one another becomes a risk to the "objective assessment outcome." On…
Descriptors: Testing, Computer Assisted Testing, Test Items, Test Construction

Catts, Ralph – 1978
The reliability of multiple-choice tests containing different numbers of response options was investigated for 260 students enrolled in technical college economics courses. Four test forms, constructed from previously used four-option items, were administered, consisting of (1) 60 two-option items, with two distractors randomly discarded; (2) 40…
Descriptors: Answer Sheets, Difficulty Level, Foreign Countries, Higher Education

Gilmer, Jerry S. – 1979
Sixty college students from classes in educational measurement were divided into two groups. Each group was administered the same criterion test, except that one group received feedback after every item and the other received no feedback. The students were also divided into three ability levels. Each test item was classified in two ways: by item…
Descriptors: Academic Ability, Answer Keys, Answer Sheets, College Students

Bridgeman, Brent – Journal of Educational Measurement, 1992
Examinees in a regular administration of the quantitative portion of the Graduate Record Examination responded to particular items in a machine-scannable multiple-choice format. Volunteers (n=364) used a computer to answer open-ended counterparts of these items. Scores for both formats demonstrated similar correlational patterns. (SLD)
Descriptors: Answer Sheets, College Entrance Examinations, College Students, Comparative Testing

Kleinke, David J. – 1979
Four forms of a 36-item adaptation of the Stanford Achievement Test were administered to 484 fourth graders. External factors potentially influencing test performance were examined, namely: (1) item order (easy-to-difficult vs. uniform); (2) response location (left column vs. right column); (3) handedness, which may interact with response location;…
Descriptors: Achievement Tests, Answer Sheets, Difficulty Level, Eye Hand Coordination