Showing 1 to 15 of 146 results
Peer reviewed
PDF on ERIC: Download full text
Martin Braun – New Directions in the Teaching of Natural Sciences, 2024
During the COVID pandemic, universities around the globe had to move not only their content delivery but also their assessments online. Because COVID caused significant upheaval in Higher Education (HE), this enforced experiment also afforded an opportunity to reflect on traditional, invigilated, closed-book exams (ICBE), resulting in research and…
Descriptors: COVID-19, Pandemics, Computer Assisted Testing, Educational Technology
Peer reviewed
Direct link
Dongkwang Shin; Jang Ho Lee – ELT Journal, 2024
Although automated item generation has gained a considerable amount of attention in a variety of fields, it is still a relatively new technology in ELT contexts. Therefore, the present article aims to provide an accessible introduction to this powerful resource for language teachers based on a review of the available research. In particular, it…
Descriptors: Language Tests, Artificial Intelligence, Test Items, Automation
Peer reviewed
PDF on ERIC: Download full text
Daniel M. Settlage; Jim R. Wollscheid – Journal of the Scholarship of Teaching and Learning, 2024
The examination of the testing mode effect has received increased attention as higher education has shifted to remote testing during the COVID-19 pandemic. We believe the testing mode effect consists of four components: the ability to physically write on the test, the method of answer recording, the proctoring/testing environment, and the effect…
Descriptors: College Students, Macroeconomics, Tests, Answer Sheets
Peer reviewed
Direct link
Lei Jiang; Na Yu – Education and Information Technologies, 2024
This research aims to address the challenges of digital transformation in education by understanding the digital competence of teachers through a mixed-methods approach. Grounded theory is employed to develop the Teachers' Digital Competence Model (TDCM), which is structured around three facets: development, pedagogy, and ethics. Within these…
Descriptors: Educational Technology, Teacher Competencies, Technological Literacy, Ethics
Peer reviewed
Direct link
Falcão, Filipe; Costa, Patrício; Pêgo, José M. – Advances in Health Sciences Education, 2022
Background: Current demand for multiple-choice questions (MCQs) in medical assessment is greater than the supply. Consequently, there is an urgent need for new item development methods. Automatic Item Generation (AIG) promises to overcome this burden by generating calibrated items using computer algorithms. Despite the promising scenario,…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Test Items, Medical Education
Peer reviewed
Direct link
Lauritz Schewior; Marlit Annalena Lindner – Educational Psychology Review, 2024
Studies have indicated that pictures in test items can impact item-solving performance, information processing (e.g., time on task), and metacognition, as well as test-taking affect and motivation. The present review aims to better organize the existing and somewhat scattered research on multimedia effects in testing and problem solving while…
Descriptors: Multimedia Materials, Computer Assisted Testing, Test Items, Pictorial Stimuli
Rogers, Christopher M.; Ressa, Virginia A.; Thurlow, Martha L.; Lazarus, Sheryl S. – National Center on Educational Outcomes, 2022
This report provides an update on the state of the research on testing accommodations. Previous reports by the National Center on Educational Outcomes (NCEO) have covered research published since 1999. In this report, we summarize the research published in 2020. During 2020, 11 research studies addressed testing accommodations in the U.S. K-12…
Descriptors: Elementary Secondary Education, Testing Accommodations, Students with Disabilities, Computer Assisted Testing
Peer reviewed
Direct link
Huawei, Shi; Aryadoust, Vahid – Education and Information Technologies, 2023
Automated writing evaluation (AWE) systems are developed based on interdisciplinary research and technological advances such as natural language processing, computer science, and latent semantic analysis. Despite a steady increase in research publications in this area, the results of AWE investigations are often mixed, and their validity may be…
Descriptors: Writing Evaluation, Writing Tests, Computer Assisted Testing, Automation
Peer reviewed
Direct link
Suzumura, Nana – Language Assessment Quarterly, 2022
The present study is part of a larger mixed-methods project that investigated the speaking section of the Advanced Placement (AP) Japanese Language and Culture Exam. It examined assumptions for the evaluation inference through a content analysis of test taker responses. Results of the content analysis were integrated with those of a many-facet…
Descriptors: Content Analysis, Test Wiseness, Advanced Placement, Computer Assisted Testing
Peer reviewed
Direct link
Chan, Cecilia Ka Yuk – Assessment & Evaluation in Higher Education, 2023
With advances in technology, possessing digital and information literacy is crucial for the selection of candidates by employers in this digital AI era. For most students, receiving and outputting electronic text has become the norm, and thus examinations with writing components done by hand may not accurately reflect their abilities. It…
Descriptors: Test Format, Handwriting, Stakeholders, Feedback (Response)
Chen, Dandan – Online Submission, 2023
Technology-driven shifts have created opportunities to improve efficiency and quality of assessments. Meanwhile, they may have exacerbated underlying socioeconomic issues in relation to educational equity. The increased implementation of technology-based assessments during the COVID-19 pandemic compounds the concern about the digital divide, as…
Descriptors: Technology Uses in Education, Computer Assisted Testing, Alternative Assessment, Test Format
Peer reviewed
Direct link
Debarati Mukherjee; Supriya Bhavnani; Georgia Lockwood Estrin; Vaisnavi Rao; Jayashree Dasgupta; Hiba Irfan; Bhismadev Chakrabarti; Vikram Patel; Matthew K. Belmonte – Autism: The International Journal of Research and Practice, 2024
Current challenges in early identification of autism spectrum disorder lead to significant delays in starting interventions, thereby compromising outcomes. Digital tools can potentially address this barrier as they are accessible, can measure autism-relevant phenotypes and can be administered in children's natural environments by non-specialists.…
Descriptors: Autism Spectrum Disorders, Young Children, Evaluation, Technology
Peer reviewed
Direct link
Rios, Joseph A.; Deng, Jiayi – Large-scale Assessments in Education, 2021
Background: In testing contexts that are predominantly concerned with power, rapid guessing (RG) has the potential to undermine the validity of inferences made from educational assessments, as such responses are unreflective of the knowledge, skills, and abilities assessed. Given this concern, practitioners/researchers have utilized a multitude of…
Descriptors: Test Wiseness, Guessing (Tests), Reaction Time, Computer Assisted Testing
Peer reviewed
Direct link
Alhadi, Moosa; Zhang, Dake; Wang, Ting; Maher, Carolyn A. – Computers in the Schools, 2023
This research synthesizes studies that used a Digitalized Interactive Component (DIC) to assess K-12 student performance during computer-based assessments (CBAs) in mathematics. A systematic search identified ten studies, including four that provided language assistance and six that provided response-construction support. We reported on the one…
Descriptors: Computer Assisted Testing, Mathematics Tests, Student Evaluation, Elementary Secondary Education
Pearson, 2019
Pearson Test of English Academic (PTE Academic) is a computer-based international English language test. Pearson developed PTE Academic in response to demand from higher education, governments, and other customers for a test that could more accurately measure the English communication skills of international students in an academic environment.…
Descriptors: Language Tests, English for Academic Purposes, Computer Assisted Testing, Communication Skills