Showing all 4 results
Peer reviewed
Osman Tat; Abdullah Faruk Kilic – Turkish Online Journal of Distance Education, 2024
The widespread availability of internet access in daily life has resulted in a greater acceptance of online assessment methods. E-assessment platforms offer various features such as randomizing questions and answers, utilizing extensive question banks, setting time limits, and managing access during online exams. Electronic assessment enables…
Descriptors: Test Construction, Test Validity, Test Reliability, Anxiety
Peer reviewed
Parker, Mark A. J.; Hedgeland, Holly; Jordan, Sally E.; Braithwaite, Nicholas St. J. – European Journal of Science and Mathematics Education, 2023
The study covers the development and testing of the alternative mechanics survey (AMS), a modified force concept inventory (FCI), which used automatically marked free-response questions. Data were collected over a period of three academic years from 611 participants who were taking physics classes at high school and university level. A total of…
Descriptors: Test Construction, Scientific Concepts, Physics, Test Reliability
Peer reviewed
Kiddle, Thom; Kormos, Judit – Language Assessment Quarterly, 2011
This article reports on a study of 42 participants at a Chilean university that examined the effect of response mode on test performance and on test-taker perceptions of test features, comparing a semi-direct online version of a speaking test with a direct face-to-face version. Candidate performances on both test versions…
Descriptors: Student Attitudes, Test Theory, Foreign Countries, Evaluation Methods
Peer reviewed
Xu, Yuejin; Iran-Nejad, Asghar; Thoma, Stephen J. – Journal of Interactive Online Learning, 2007
The purpose of the study was to determine the comparability of an online version of the Defining Issues Test 2 (DIT2) with the original paper-and-pencil version. The study employed methods from both Classical Test Theory (CTT) and Item Response Theory (IRT). Findings from the CTT analyses supported the reliability and discriminant validity of both versions.…
Descriptors: Computer Assisted Testing, Test Format, Comparative Analysis, Test Theory