Gruss, Richard; Clemons, Josh – Journal of Computer Assisted Learning, 2023
Background: The sudden growth in online instruction due to COVID-19 restrictions has given renewed urgency to questions about remote learning that have remained unresolved. Web-based assessment software provides instructors with an array of options for varying testing parameters, but the pedagogical impacts of some of these variations have yet to be…
Descriptors: Test Items, Test Format, Computer Assisted Testing, Mathematics Tests
Che Lah, Noor Hidayah; Tasir, Zaidatun; Jumaat, Nurul Farhana – Educational Studies, 2023
The aim of the study was to evaluate the extended version of the Problem-Solving Inventory (PSI) for online learning settings, known as the Online Problem-Solving Inventory (OPSI), through the lens of Rasch Model analysis. To date, there is no extended version of the PSI for online settings even though many researchers have used it; thus, this…
Descriptors: Problem Solving, Measures (Individuals), Electronic Learning, Item Response Theory
Junlan Pan; Emma Marsden – Language Testing, 2024
"Tests of Aptitude for Language Learning" (TALL) is an openly accessible internet-based battery to measure the multifaceted construct of foreign language aptitude, using language domain-specific instruments and L1-sensitive instructions and stimuli. This brief report introduces the components of this theory-informed battery and…
Descriptors: Language Tests, Aptitude Tests, Second Language Learning, Test Construction
Wijanarko, Bambang Dwi; Heryadi, Yaya; Toba, Hapnes; Budiharto, Widodo – Education and Information Technologies, 2021
Automated question generation is a task to generate questions from structured or unstructured data. The increasing popularity of online learning in recent years has given momentum to automated question generation in the education field for facilitating the learning process, learning-material retrieval, and computer-based testing. This paper reports on the…
Descriptors: Foreign Countries, Undergraduate Students, Engineering Education, Computer Software
Parker, Mark A. J.; Hedgeland, Holly; Jordan, Sally E.; Braithwaite, Nicholas St. J. – European Journal of Science and Mathematics Education, 2023
The study covers the development and testing of the alternative mechanics survey (AMS), a modified force concept inventory (FCI), which used automatically marked free-response questions. Data were collected over a period of three academic years from 611 participants who were taking physics classes at high school and university level. A total of…
Descriptors: Test Construction, Scientific Concepts, Physics, Test Reliability
Maarten T. P. Beerepoot – Journal of Chemical Education, 2023
Digital automated assessment is a valuable and time-efficient tool for educators to provide immediate and objective feedback to learners. Automated assessment, however, puts high demands on the quality of the questions, alignment with the intended learning outcomes, and the quality of the feedback provided to the learners. We here describe the…
Descriptors: Formative Evaluation, Summative Evaluation, Chemistry, Science Instruction
Olsho, Alexis; Smith, Trevor I.; Eaton, Philip; Zimmerman, Charlotte; Boudreaux, Andrew; White Brahmia, Suzanne – Physical Review Physics Education Research, 2023
We developed the Physics Inventory of Quantitative Literacy (PIQL) to assess students' quantitative reasoning in introductory physics contexts. The PIQL includes several "multiple-choice/multiple-response" (MCMR) items (i.e., multiple-choice questions for which more than one response may be selected) as well as traditional single-response…
Descriptors: Multiple Choice Tests, Science Tests, Physics, Measures (Individuals)
Gorney, Kylie; Wollack, James A. – Practical Assessment, Research & Evaluation, 2022
Unlike the traditional multiple-choice (MC) format, the discrete-option multiple-choice (DOMC) format does not necessarily reveal all answer options to an examinee. The purpose of this study was to determine whether the reduced exposure of item content affects test security. We conducted an experiment in which participants were allowed to view…
Descriptors: Test Items, Test Format, Multiple Choice Tests, Item Analysis
Fadillah, Sarah Meilani; Ha, Minsu; Nuraeni, Eni; Indriyanti, Nurma Yunita – Malaysian Journal of Learning and Instruction, 2023
Purpose: Researchers discovered that when students were given the opportunity to change their answers, a majority changed their responses from incorrect to correct, and this change often increased the overall test results. What prompts students to modify their answers? This study aims to examine answer modification on a scientific reasoning test, with…
Descriptors: Science Tests, Multiple Choice Tests, Test Items, Decision Making
Krzic, Maja; Brown, Sandra – Natural Sciences Education, 2022
The transition of our large (approximately 300-student) introductory soil science course to the online setting created several challenges, including engaging first- and second-year students, providing meaningful hands-on learning activities, and setting up online exams. The objective of this paper is to describe the development and use of…
Descriptors: Introductory Courses, Social Sciences, Online Courses, Educational Change
Goolsby-Cole, Cody; Bass, Sarah M.; Stanwyck, Liz; Leupen, Sarah; Carpenter, Tara S.; Hodges, Linda C. – Journal of College Science Teaching, 2023
During the pandemic, the use of question pools for online testing was recommended to mitigate cheating, exposing multitudes of science, technology, engineering, and mathematics (STEM) students across the globe to this practice. Yet instructors may be unfamiliar with the ways that seemingly small changes between questions in a pool can expose…
Descriptors: Science Instruction, Computer Assisted Testing, Cheating, STEM Education
Viskotová, Lenka; Hampel, David – Mathematics Teaching Research Journal, 2022
Computer-aided assessment is an important tool that reduces the workload of teachers and increases the efficiency of their work. The multiple-choice test is considered to be one of the most common forms of computer-aided testing, and its application in mid-term exams has indisputable advantages. For the purposes of a high-quality and responsible…
Descriptors: Undergraduate Students, Mathematics Tests, Computer Assisted Testing, Faculty Workload
Uhl, Juli D.; Sripathi, Kamali N.; Meir, Eli; Merrill, John; Urban-Lurain, Mark; Haudek, Kevin C. – CBE - Life Sciences Education, 2021
The focus of biology education has shifted from memorization to conceptual understanding of core biological concepts such as matter and energy relationships. To examine undergraduate learning about matter and energy, we incorporated constructed-response (CR) questions into an interactive computer-based tutorial. The objective of this tutorial is…
Descriptors: Computer Assisted Testing, Writing Evaluation, Science Education, Biology
Sahin, Muhittin; Aydin, Furkan; Sulak, Sema; Müftüoglu, Cennet Terzi; Tepgeç, Mustafa; Yilmaz, Gizem Karaoglan; Yilmaz, Ramazan; Yurdugül, Halil – International Association for Development of the Information Society, 2021
The use of technology for teaching and learning has created a paradigm shift in learning environments and learning processes, and this shift has also affected assessment. In addition, online environments provide more opportunities to assess learners. In this study, the Adaptive Mastery Testing (AMT)…
Descriptors: Teaching Methods, Learning Processes, Adaptive Testing, Computer Assisted Testing
Gu, Lin; Ling, Guangming; Liu, Ou Lydia; Yang, Zhitong; Li, Guirong; Kardanova, Elena; Loyalka, Prashant – Assessment & Evaluation in Higher Education, 2021
We examine the effects of computer-based versus paper-based assessment of critical thinking skills, adapted from English (in the U.S.) to Chinese. Using data collected based on a random assignment between the two modes in multiple Chinese colleges, we investigate mode effects from multiple perspectives: mean scores, measurement precision, item…
Descriptors: Critical Thinking, Tests, Test Format, Computer Assisted Testing