Publication Date
| Period | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 127 |
| Since 2022 (last 5 years) | 753 |
| Since 2017 (last 10 years) | 1648 |
| Since 2007 (last 20 years) | 2823 |
Descriptor
| Descriptor | Count |
| --- | --- |
| Computer Assisted Testing | 2823 |
| Foreign Countries | 1004 |
| Test Items | 613 |
| Language Tests | 581 |
| Scores | 557 |
| Test Construction | 492 |
| Second Language Learning | 471 |
| English (Second Language) | 451 |
| Comparative Analysis | 413 |
| Test Validity | 410 |
| Student Evaluation | 402 |
Audience
| Audience | Count |
| --- | --- |
| Teachers | 29 |
| Researchers | 15 |
| Administrators | 12 |
| Practitioners | 12 |
| Policymakers | 7 |
| Students | 6 |
| Parents | 2 |
Location
| Location | Count |
| --- | --- |
| Australia | 77 |
| China | 67 |
| Turkey | 67 |
| Germany | 64 |
| United Kingdom | 61 |
| Canada | 51 |
| Iran | 48 |
| Taiwan | 48 |
| Spain | 46 |
| Japan | 44 |
| Indonesia | 39 |
What Works Clearinghouse Rating
| Rating | Count |
| --- | --- |
| Meets WWC Standards without Reservations | 2 |
| Meets WWC Standards with or without Reservations | 2 |
| Does not meet standards | 2 |
Ying Xu; Xiaodong Li; Jin Chen – Language Testing, 2025
This article provides a detailed review of the Computer-based English Listening Speaking Test (CELST) used in Guangdong, China, as part of the National Matriculation English Test (NMET) to assess students' English proficiency. The CELST measures listening and speaking skills as outlined in the "English Curriculum for Senior Middle…
Descriptors: Computer Assisted Testing, English (Second Language), Language Tests, Listening Comprehension Tests
Yi-Jui I. Chen; Yi-Jhen Wu; Yi-Hsin Chen; Robin Irey – Journal of Psychoeducational Assessment, 2025
A short form of the 60-item computer-based orthographic processing assessment (long-form COPA or COPA-LF) was developed. The COPA-LF consists of five skills, including rapid perception, access, differentiation, correction, and arrangement. Thirty items from the COPA-LF were selected for the short-form COPA (COPA-SF) based on cognitive diagnostic…
Descriptors: Computer Assisted Testing, Test Length, Test Validity, Orthographic Symbols
Nathaniel Owen; Ananda Senel – Review of Education, 2025
Transparency in high-stakes English language assessment has become crucial for ensuring fairness and maintaining assessment validity in language testing. However, our understanding of how transparency is conceptualised and implemented remains fragmented, particularly in relation to stakeholder experiences and technological innovations. This study…
Descriptors: Accountability, High Stakes Tests, Language Tests, Computer Assisted Testing
Stefan O'Grady – International Journal of Listening, 2025
Language assessment is increasingly computer-mediated. This development presents opportunities with new task formats and, equally, a need for renewed scrutiny of established conventions. Recent recommendations to increase integrated skills assessment in lecture comprehension tests are premised on empirical research that demonstrates enhanced construct…
Descriptors: Language Tests, Lecture Method, Listening Comprehension Tests, Multiple Choice Tests
Militsa G. Ivanova; Hanna Eklöf; Michalis P. Michaelides – Journal of Applied Testing Technology, 2025
Digital administration of assessments allows for the collection of process data indices, such as response time, which can serve as indicators of rapid-guessing and examinee test-taking effort. Setting a time threshold is essential to distinguish effortful from effortless behavior using item response times. Threshold identification methods may…
Descriptors: Test Items, Computer Assisted Testing, Reaction Time, Achievement Tests
Sukru Murat Cebeci; Selcuk Acar – Journal of Creative Behavior, 2025
This study presents the Cebeci Test of Creativity (CTC), a novel computerized assessment tool designed to address the limitations of traditional open-ended paper-and-pencil creativity tests. The CTC is designed to overcome the challenges associated with the administration and manual scoring of traditional paper and pencil creativity tests. In this…
Descriptors: Creativity, Creativity Tests, Test Construction, Test Validity
Jun-ichiro Yasuda; Michael M. Hull; Naohiro Mae; Kentaro Kojima – Physical Review Physics Education Research, 2025
Although conceptual assessment tests are commonly administered at the beginning and end of a semester, this pre-post approach has inherent limitations. Specifically, education researchers and instructors have limited ability to observe the progression of students' conceptual understanding throughout the course. Furthermore, instructors are limited…
Descriptors: Computer Assisted Testing, Adaptive Testing, Science Tests, Scientific Concepts
Sarah N. Shakir; Ashley M. Virabouth; Mallory M. Rice – American Biology Teacher, 2025
Exam anxiety has been well-documented to reduce student performance in undergraduate biology courses, especially for students from marginalized groups, which can contribute to achievement gaps. Our exploratory study surveyed 61 undergraduate biology students to better understand how exams affect their anxiety levels, focusing on the impact of exam…
Descriptors: Undergraduate Students, College Science, Biology, Student Attitudes
Joanna Williamson – Research Matters, 2025
Teachers, examiners and assessment experts know from experience that some candidates annotate exam questions. "Annotation" includes anything the candidate writes or draws outside of the designated response space, such as underlining, jotting, circling, sketching and calculating. Annotations are of interest because they may evidence…
Descriptors: Mathematics, Tests, Documentation, Secondary Education
Junlan Pan; Emma Marsden – Language Testing, 2024
"Tests of Aptitude for Language Learning" (TALL) is an openly accessible internet-based battery to measure the multifaceted construct of foreign language aptitude, using language domain-specific instruments and L1-sensitive instructions and stimuli. This brief report introduces the components of this theory-informed battery and…
Descriptors: Language Tests, Aptitude Tests, Second Language Learning, Test Construction
Ildiko Porter-Szucs; Cynthia J. Macknish; Suzanne Toohey – John Wiley & Sons, Inc, 2025
"A Practical Guide to Language Assessment" helps educators at every level redefine their approach to language assessment. Grounded in extensive research and aligned with the latest advances in language education, this comprehensive guide introduces foundational concepts and explores key principles in test development and item writing.…
Descriptors: Student Evaluation, Language Tests, Test Construction, Test Items
Achmad Rante Suparman; Eli Rohaeti; Sri Wening – Journal on Efficiency and Responsibility in Education and Science, 2024
This study focuses on developing a five-tier chemical diagnostic test delivered as a computer-based test, with 11 assessment categories scored from 0 to 10. The 20 items produced were validated by education experts, material experts, measurement experts, and media experts, and an average index of the Aiken test > 0.70 was…
Descriptors: Chemistry, Diagnostic Tests, Computer Assisted Testing, Credits
Jeff Allen; Jay Thomas; Stacy Dreyer; Scott Johanningmeier; Dana Murano; Ty Cruce; Xin Li; Edgar Sanchez – ACT Education Corp., 2025
This report describes the process of developing and validating the enhanced ACT. The report describes the changes made to the test content and the processes by which these design decisions were implemented. The authors describe how they shared the overall scope of the enhancements, including the initial blueprints, with external expert panels,…
Descriptors: College Entrance Examinations, Testing, Change, Test Construction
Nese Öztürk Gübes – International Journal of Assessment Tools in Education, 2025
The Trends in International Mathematics and Science Study (TIMSS) was administered via computer, eTIMSS, for the first time in 2019. The purpose of this study was to investigate item block position and item format effect on eighth grade mathematics item easiness in low- and high-achieving countries of eTIMSS 2019. Item responses from Chile, Qatar,…
Descriptors: Foreign Countries, International Assessment, Achievement Tests, Mathematics Achievement
Andreas Frey; Christoph König; Aron Fink – Journal of Educational Measurement, 2025
The highly adaptive testing (HAT) design is introduced as an alternative test design for the Programme for International Student Assessment (PISA). The principle of HAT is to be as adaptive as possible when selecting items while accounting for PISA's nonstatistical constraints and addressing issues concerning PISA such as item position effects.…
Descriptors: Adaptive Testing, Test Construction, Alternative Assessment, Achievement Tests