Showing all 3 results
Peer reviewed | PDF on ERIC
Brinkhuis, Matthieu J. S.; Savi, Alexander O.; Hofman, Abe D.; Coomans, Frederik; van der Maas, Han L. J.; Maris, Gunter – Journal of Learning Analytics, 2018
With the advent of computers in education, and the ample availability of online learning and practice environments, enormous amounts of data on learning become available. The purpose of this paper is to present a decade of experience with analyzing and improving an online practice environment for math, which has thus far recorded over a billion…
Descriptors: Data Analysis, Mathematics Instruction, Accuracy, Reaction Time
Peer reviewed | Direct link
Costagliola, Gennaro; Fuccella, Vittorio – International Journal of Distance Education Technologies, 2009
To evaluate learners' knowledge correctly, it is important to administer tests composed of good-quality question items. By the term "quality" we mean an item's potential to discriminate effectively between skilled and untrained students and to achieve the tutor's desired difficulty level. This article presents a rule-based e-testing system…
Descriptors: Difficulty Level, Test Items, Computer Assisted Testing, Item Response Theory
Peer reviewed | Direct link
Lai, Ah-Fur; Chen, Deng-Jyi; Chen, Shu-Ling – Journal of Educational Multimedia and Hypermedia, 2008
IRT (Item Response Theory) has been studied and applied in computer-based testing for decades. However, almost all existing studies focus exclusively on test questions presented in text-based (or static text/graphic) form. In this paper, we present our study on test questions using both…
Descriptors: Elementary School Students, Semantics, Difficulty Level, Item Response Theory