Showing all 6 results
Peer reviewed
Download full text (PDF on ERIC)
Pagram, Jeremy; Cooper, Martin; Jin, Huifen; Campbell, Alistair – Education Sciences, 2018
The Centre for Schooling and Learning Technologies (CSaLT) at Edith Cowan University (ECU) was asked in 2016 to be the Western Australian arm of a national e-exam project. This project used a bespoke exam system installed on a USB-drive to deliver what would have been traditional paper-based exams in an enclosed computer-based environment that was…
Descriptors: Foreign Countries, Computer Assisted Testing, Computer Science Education, Preservice Teachers
Peer reviewed
Download full text (PDF on ERIC)
Sullivan, Daniel P. – Online Learning, 2016
Cheating, left untended, erodes the validity of evaluation and, ultimately, corrupts the legitimacy of a course. We profile an approach to manage, with an eye toward preempting, cheating on asynchronous, objective, online quizzes. This approach taps various technological and social solutions to academic dishonesty, integrating them into a…
Descriptors: Graduate Students, Cheating, Prevention, Business Administration Education
Peer reviewed
Direct link
Hamilton, Ian Robert – Campus-Wide Information Systems, 2009
Purpose: The purpose of this paper is to report on the rationale behind the use of a unique paper-based individualised accounting assignment, which automated the provision to students of immediate formative and timely summative feedback. Design/methodology/approach: As students worked towards completing their assignment, the package provided…
Descriptors: Feedback (Response), Assignments, Accounting, Test Construction
Peer reviewed
Bennett, Randy Elliot; Morley, Mary; Quardt, Dennis; Rock, Donald A.; Singley, Mark K.; Katz, Irvin R.; Nhouyvanisvong, Adisack – Journal of Educational Measurement, 1999
Evaluated a computer-delivered response type for measuring quantitative skill, the "Generating Examples" (GE) response type, which presents under-determined problems that can have many right answers. Results from 257 graduate students and applicants indicate that GE scores are reasonably reliable, but only moderately related to Graduate…
Descriptors: College Applicants, Computer Assisted Testing, Graduate Students, Graduate Study
Peer reviewed
Direct link
Burrow, Michael; Evdorides, Harry; Hallam, Barbara; Freer-Hewish, Richard – European Journal of Engineering Education, 2005
This paper outlines an approach taken to produce computer-based formative assessments for two modules in a one-year taught MSc programme in Road Management and Engineering. It presents the aims of the assessments, the taxonomy adopted to ensure that the formulation of the questions addressed learning outcomes related to the development of higher…
Descriptors: Evaluation Methods, Formative Evaluation, Psychometrics, Engineering Education
Peer reviewed
Wise, Steven L.; And Others – Journal of Educational Measurement, 1992
Performance of 156 undergraduate and 48 graduate students on a self-adapted test (SFAT)--students choose the difficulty level of their test items--was compared with performance on a computer-adapted test (CAT). Those taking the SFAT obtained higher ability scores and reported lower posttest state anxiety than did CAT takers. (SLD)
Descriptors: Adaptive Testing, Comparative Testing, Computer Assisted Testing, Difficulty Level