Showing all 6 results
Peer reviewed
Teneqexhi, Romeo; Kuneshka, Loreta; Naço, Adrian – International Association for Development of the Information Society, 2018
Organizing exams or competitions with multiple-choice questions and technology-assisted scoring is now common in many educational institutions around the world. Such exams or tests are, as a rule, taken by answering questions on a so-called answer sheet form. On this form, each student or participant in the exam is obliged to…
Descriptors: Foreign Countries, Competition, Multiple Choice Tests, Computer Assisted Testing
Peer reviewed
Teneqexhi, Romeo; Qirko, Margarita; Sharko, Genci; Vrapi, Fatmir; Kuneshka, Loreta – International Association for Development of the Information Society, 2017
Exam assessment is one of the most tedious tasks for university teachers all over the world. Multiple-choice tests make exam assessment somewhat easier, but the teacher cannot prepare more than 3-4 variants; in this case, the possibility of students cheating from one another becomes a risk to the "objective assessment outcome." On…
Descriptors: Testing, Computer Assisted Testing, Test Items, Test Construction
Catts, Ralph – 1978
The reliability of multiple choice tests--containing different numbers of response options--was investigated for 260 students enrolled in technical college economics courses. Four test forms, constructed from previously used four-option items, were administered, consisting of (1) 60 two-option items--two distractors randomly discarded; (2) 40…
Descriptors: Answer Sheets, Difficulty Level, Foreign Countries, Higher Education
Gilmer, Jerry S. – 1979
Sixty college students from classes in educational measurement were divided into two groups. Each group was administered the same criterion test, except that one group received feedback after every item and the other received no feedback. The students were also divided into three ability levels. Each test item was classified in two ways: by item…
Descriptors: Academic Ability, Answer Keys, Answer Sheets, College Students
Peer reviewed
Bridgeman, Brent – Journal of Educational Measurement, 1992
Examinees in a regular administration of the quantitative portion of the Graduate Record Examination responded to particular items in a machine-scannable multiple-choice format. Volunteers (n=364) used a computer to answer open-ended counterparts of these items. Scores for both formats demonstrated similar correlational patterns. (SLD)
Descriptors: Answer Sheets, College Entrance Examinations, College Students, Comparative Testing
Kleinke, David J. – 1979
Four forms of a 36-item adaptation of the Stanford Achievement Test were administered to 484 fourth graders. External factors potentially influencing test performance were examined, namely: (1) item order (easy-to-difficult vs. uniform); (2) response location (left column vs. right column); (3) handedness, which may interact with response location;…
Descriptors: Achievement Tests, Answer Sheets, Difficulty Level, Eye Hand Coordination