Showing all 4 results
Peer reviewed
PDF on ERIC: Download full text
Teneqexhi, Romeo; Kuneshka, Loreta; Naço, Adrian – International Association for Development of the Information Society, 2018
Exams and competitions with multiple-choice questions, assessed by technology, are now common in educational institutions around the world. Such exams or tests are typically completed on a so-called answer sheet form, on which each student or participant in the exam is obliged to…
Descriptors: Foreign Countries, Competition, Multiple Choice Tests, Computer Assisted Testing
Peer reviewed
PDF on ERIC: Download full text
Chang, Frederic Tao-Yi; Li, Mao-Neng Fred – International Journal of Educational Technology, 2019
The Immediate Feedback Assessment Technique (IF-AT, also called answer-until-correct) is a modified form of traditional multiple-choice testing, created to provide instant feedback during testing and to tap learners' partial knowledge. The purposes of this meta-analytic study were to compare the effectiveness of the IF-AT with the traditional multiple-choice test; to assess…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Feedback (Response), Answer Sheets
Peer reviewed
PDF on ERIC: Download full text
Teneqexhi, Romeo; Qirko, Margarita; Sharko, Genci; Vrapi, Fatmir; Kuneshka, Loreta – International Association for Development of the Information Society, 2017
Exam assessment is one of the most tedious tasks for university teachers all over the world. Multiple-choice tests make exam assessment somewhat easier, but a teacher cannot prepare more than 3-4 variants; in that case, the possibility of students cheating from one another becomes a risk to an "objective assessment outcome." On…
Descriptors: Testing, Computer Assisted Testing, Test Items, Test Construction
Peer reviewed
Bridgeman, Brent – Journal of Educational Measurement, 1992
Examinees in a regular administration of the quantitative portion of the Graduate Record Examination responded to particular items in a machine-scannable multiple-choice format. Volunteers (n=364) used a computer to answer open-ended counterparts of these items. Scores for both formats demonstrated similar correlational patterns. (SLD)
Descriptors: Answer Sheets, College Entrance Examinations, College Students, Comparative Testing