Peer reviewed
Jeske, Heimo J.; Lall, Manoj; Kogeda, Okuthe P. – Journal of Information Technology Education: Innovations in Practice, 2018
Aim/Purpose: The aim of this article is to develop a tool to detect plagiarism in real time amongst students being evaluated for learning in a computer-based assessment setting. Background: Cheating by copying all or part of the source code of a program is a serious concern to academic institutions. Many academic institutions apply a combination of…
Descriptors: Plagiarism, Identification, Computer Software, Computer Assisted Testing
Peer reviewed
Veerasamy, Ashok Kumar; D'Souza, Daryl; Laakso, Mikko-Jussi – Journal of Educational Technology Systems, 2016
This article presents a study aimed at examining novice student answers in an introductory programming final e-exam to identify misconceptions and types of errors. Our study used the Delphi concept inventory to identify student misconceptions and the skill-, rule-, and knowledge-based errors approach to identify the types of errors made by novices…
Descriptors: Computer Science Education, Programming, Novices, Misconceptions
Peer reviewed
Gekara, Victor Oyaro; Bloor, Michael; Sampson, Helen – Journal of Vocational Education and Training, 2011
Vocational education and training (VET) concerns the cultivation and development of specific skills and competencies, in addition to broad underpinning knowledge relating to paid employment. VET assessment is, therefore, designed to determine the extent to which a trainee has effectively acquired the knowledge, skills, and competencies required by…
Descriptors: Marine Education, Occupational Safety and Health, Computer Assisted Testing, Vocational Education
Peer reviewed
Nevo, Dorit; McClean, Ron; Nevo, Saggi – Journal of Information Systems Education, 2010
This paper discusses the relative advantage offered by online Students' Evaluations of Teaching (SET) and describes a study conducted at a Canadian university to identify critical success factors of online evaluations from students' point of view. Factors identified as important by the students include anonymity, ease of use (of both SET survey…
Descriptors: Student Evaluation of Teacher Performance, Information Technology, Surveys, Evaluation Methods