Showing all 9 results
Peer reviewed
Direct link
Yang Du; Susu Zhang – Journal of Educational and Behavioral Statistics, 2025
Item compromise has long posed challenges in educational measurement, jeopardizing both test validity and test security of continuous tests. Detecting compromised items is therefore crucial to address this concern. The present literature on compromised item detection reveals two notable gaps: First, the majority of existing methods are based upon…
Descriptors: Item Response Theory, Item Analysis, Bayesian Statistics, Educational Assessment
Chen, Dandan – Online Submission, 2023
Technology-driven shifts have created opportunities to improve efficiency and quality of assessments. Meanwhile, they may have exacerbated underlying socioeconomic issues in relation to educational equity. The increased implementation of technology-based assessments during the COVID-19 pandemic compounds the concern about the digital divide, as…
Descriptors: Technology Uses in Education, Computer Assisted Testing, Alternative Assessment, Test Format
Peer reviewed
PDF on ERIC (download full text)
National Assessment of Educational Progress (NAEP), 2025
Also known as The Nation's Report Card, the National Assessment of Educational Progress (NAEP) is the largest nationally representative and continuing assessment of student achievement and students' learning experiences in various subjects for the nation, states, and 27 urban districts. The National Center for Education Statistics (NCES) is currently…
Descriptors: National Competency Tests, Innovation, Futures (of Society), Achievement Tests
Peer reviewed
PDF on ERIC (download full text)
Robertson, Sarah N.; Humphrey, Samia M.; Steele, John P. – Journal of Educators Online, 2019
Assessment is and has been a deliberate and essential piece of education. However, with the recent emergence and popularity of online education, faculty members have to find new ways to engage online learners with formative assessments. While much of the online learning environment can be self-guided, faculty interventions can make the content…
Descriptors: Computer Assisted Testing, Formative Evaluation, Student Evaluation, Undergraduate Students
Peer reviewed
Direct link
Luo, Xin; Reckase, Mark D.; He, Wei – AERA Online Paper Repository, 2016
While dichotomous items dominate applications of computerized adaptive testing (CAT), polytomous and set-based items hold promise for incorporation into CAT. However, assembling a CAT that contains mixed item formats is challenging. This study investigated: (1) how the mixed-format CAT works compared with the dichotomous-item-based CAT; (2)…
Descriptors: Test Items, Test Format, Computer Assisted Testing, Adaptive Testing
Crabtree, Ashleigh R. – ProQuest LLC, 2016
The purpose of this research is to provide information about the psychometric properties of technology-enhanced (TE) items and the effects these items have on the content validity of an assessment. Specifically, this research investigated the impact that the inclusion of TE items has on the construct of a mathematics test, the technical properties…
Descriptors: Psychometrics, Computer Assisted Testing, Test Items, Test Format
Wang, Xinrui – ProQuest LLC, 2013
Computer-adaptive multistage testing (ca-MST) has been developed as an alternative to computerized adaptive testing (CAT) and has been increasingly adopted in large-scale assessments. Current research and practice focus only on ca-MST panels for credentialing purposes; the ca-MST test mode is therefore designed to gauge a single scale. The…
Descriptors: Computer Assisted Testing, Adaptive Testing, Diagnostic Tests, Comparative Analysis
Peer reviewed
Direct link
Wan, Lei; Henly, George A. – Applied Measurement in Education, 2012
Many innovative item formats have been proposed over the past decade, but little empirical research has been conducted on their measurement properties. This study examines the reliability, efficiency, and construct validity of two innovative item formats--the figural response (FR) and constructed response (CR) formats used in a K-12 computerized…
Descriptors: Test Items, Test Format, Computer Assisted Testing, Measurement
Peer reviewed
Lloyd, D.; And Others – Assessment & Evaluation in Higher Education, 1996
In an engineering technology course at Coventry University (England), the utility of computer-assisted tests was compared with that of traditional paper-based tests. It was found that the computer-based technique was acceptable to students, produced valid results, and demonstrated potential for saving staff time. (Author/MSE)
Descriptors: Comparative Analysis, Computer Assisted Testing, Efficiency, Engineering Education