Showing 1 to 15 of 36 results
Peer reviewed
Direct link
Vy Le; Jayson M. Nissen; Xiuxiu Tang; Yuxiao Zhang; Amirreza Mehrabi; Jason W. Morphew; Hua Hua Chang; Ben Van Dusen – Physical Review Physics Education Research, 2025
In physics education research, instructors and researchers often use research-based assessments (RBAs) to assess students' skills and knowledge. In this paper, we support the development of a mechanics cognitive diagnostic to test and implement effective and equitable pedagogies for physics instruction. Adaptive assessments using cognitive…
Descriptors: Physics, Science Education, Scientific Concepts, Diagnostic Tests
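The abstract is truncated, but cognitive diagnostics of this kind are commonly built on cognitive diagnosis models such as DINA (deterministic input, noisy AND). A minimal sketch of the DINA item response rule, with invented parameters rather than anything from the authors' instrument:

```python
# Minimal DINA sketch: P(correct) depends on whether the student has mastered
# every attribute the item requires (per the Q-matrix), plus slip/guess noise.
# All values here are illustrative, not from the paper.
import numpy as np

def dina_correct_prob(alpha, q_row, slip, guess):
    """alpha: 0/1 attribute-mastery vector; q_row: 0/1 Q-matrix row."""
    eta = np.all(alpha >= q_row)  # mastered all required attributes?
    return (1 - slip) if eta else guess

alpha = np.array([1, 1, 1])   # student has mastered attributes 0, 1, 2
q_row = np.array([1, 0, 1])   # item requires attributes 0 and 2
print(dina_correct_prob(alpha, q_row, slip=0.1, guess=0.2))  # -> 0.9
```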
Peer reviewed
PDF on ERIC
Achmad Rante Suparman; Eli Rohaeti; Sri Wening – Journal on Efficiency and Responsibility in Education and Science, 2024
This study focuses on developing a five-tier chemical diagnostic test, delivered as a computer-based test, with 11 assessment categories scored from 0 to 10. A total of 20 items were produced and validated by education experts, material experts, measurement experts, and media experts, and an average Aiken index > 0.70 was…
Descriptors: Chemistry, Diagnostic Tests, Computer Assisted Testing, Credits
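For readers unfamiliar with the "Aiken index" threshold above: Aiken's V is a content-validity index computed from expert ratings, V = Σ(r − lo) / (n · (hi − lo)). A minimal sketch with invented ratings:

```python
# Aiken's V content-validity index; the expert ratings below are made up.
def aikens_v(ratings, lo, hi):
    """V = sum(r - lo) / (n * (hi - lo)); ranges from 0 (no validity) to 1."""
    return sum(r - lo for r in ratings) / (len(ratings) * (hi - lo))

# Five hypothetical experts rate an item 4, 5, 4, 5, 5 on a 1-5 relevance scale.
print(aikens_v([4, 5, 4, 5, 5], lo=1, hi=5))  # -> 0.9, above the 0.70 cutoff
```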
Peer reviewed
Direct link
Saputra, H.; Suhandi, A.; Setiawan, A.; Permanasari, A.; Firmansyah, J. – Physics Education, 2023
This study proposes an online practicum model supported by a Wireless Sensor Network (WSN) to implement a physics practicum after the COVID-19 pandemic. The system is guided by exploratory inquiry questions to help structure students' thinking in answering investigative questions. The online practicum is also integrated with video conferencing, chat,…
Descriptors: Science Tests, Computer Assisted Testing, Physics, Practicums
Peer reviewed
PDF on ERIC
Marli Crabtree; Kenneth L. Thompson; Ellen M. Robertson – HAPS Educator, 2024
Research has suggested that changing one's answers on multiple-choice examinations is more likely to improve scores than to harm them. This study aimed to further understand the relationship between changing answer selections and item attributes, student performance, and time within a population of 158 first-year medical students enrolled in a…
Descriptors: Anatomy, Science Tests, Medical Students, Medical Education
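Studies like this typically begin by tallying how often changed answers go from wrong to right versus right to wrong. An illustrative tally over invented records, not the paper's data:

```python
# Tally answer changes by direction; records are invented for illustration.
from collections import Counter

# (first_response_correct, final_response_correct) for each changed answer
changes = [(False, True), (False, True), (True, False), (False, True),
           (False, False), (True, False), (False, True)]

counts = Counter(
    "wrong->right" if (not a and b)
    else "right->wrong" if (a and not b)
    else "wrong->wrong"
    for a, b in changes
)
print(counts)  # Counter({'wrong->right': 4, 'right->wrong': 2, 'wrong->wrong': 1})
```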
Maarten T. P. Beerepoot – Journal of Chemical Education, 2023
Digital automated assessment is a valuable and time-efficient tool for educators to provide immediate and objective feedback to learners. Automated assessment, however, puts high demands on the quality of the questions, alignment with the intended learning outcomes, and the quality of the feedback provided to the learners. We here describe the…
Descriptors: Formative Evaluation, Summative Evaluation, Chemistry, Science Instruction
Peer reviewed
PDF on ERIC
Endang Susantini; Yurizka Melia Sari; Prima Vidya Asteria; Muhammad Ilyas Marzuqi – Journal of Education and Learning (EduLearn), 2025
Assessing preservice teachers' higher-order thinking skills (HOTS) in science and mathematics is essential. Teachers' HOTS ability is closely related to their ability to create HOTS-type science and mathematics problems. Among the various types of HOTS is Bloomian HOTS. To help preservice teachers create problems in those subjects, an Android…
Descriptors: Content Validity, Mathematics Instruction, Decision Making, Thinking Skills
Hong Jiao, Editor; Robert W. Lissitz, Editor – IAP - Information Age Publishing, Inc., 2024
With the exponential increase in digital assessment, different types of data in addition to item responses become available in the measurement process. One salient feature of digital assessment is that process data can be easily collected. This non-conventional structured or unstructured data source may bring new perspectives to better…
Descriptors: Artificial Intelligence, Natural Language Processing, Psychometrics, Computer Assisted Testing
Peer reviewed
Direct link
Olsho, Alexis; Smith, Trevor I.; Eaton, Philip; Zimmerman, Charlotte; Boudreaux, Andrew; White Brahmia, Suzanne – Physical Review Physics Education Research, 2023
We developed the Physics Inventory of Quantitative Literacy (PIQL) to assess students' quantitative reasoning in introductory physics contexts. The PIQL includes several "multiple-choice/multiple-response" (MCMR) items (i.e., multiple-choice questions for which more than one response may be selected) as well as traditional single-response…
Descriptors: Multiple Choice Tests, Science Tests, Physics, Measures (Individuals)
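Because MCMR items allow more than one selected response, they need a scoring rule beyond simple right/wrong. Two common options are sketched below; the PIQL's actual scoring scheme may differ:

```python
# Two common MCMR scoring rules; option labels and keys are invented.
def score_all_or_nothing(selected, keyed):
    """1 only if the selected set exactly matches the keyed set."""
    return int(set(selected) == set(keyed))

def score_partial_credit(selected, keyed, options):
    """Fraction of options classified correctly (selected vs. not selected)."""
    selected, keyed = set(selected), set(keyed)
    return sum((o in selected) == (o in keyed) for o in options) / len(options)

options = ["A", "B", "C", "D", "E"]
print(score_all_or_nothing(["A", "C"], ["A", "C"]))           # -> 1
print(score_partial_credit(["A", "D"], ["A", "C"], options))  # -> 0.6
```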
Crisp, Victoria; Shaw, Stuart – Research Matters, 2020
For assessment contexts where both a paper-based test and an on-screen assessment are available as alternatives, it is still common for the paper-based test to be prepared first with questions later transferred into an on-screen testing platform. One challenge with this is that some questions cannot be transferred. One solution might be for…
Descriptors: Computer Assisted Testing, Test Items, Test Construction, Mathematics Tests
Peer reviewed
Direct link
Yasuda, Jun-ichiro; Hull, Michael M.; Mae, Naohiro – Physical Review Physics Education Research, 2022
This paper presents improvements made to a computerized adaptive testing (CAT)-based version of the FCI (FCI-CAT) with regard to test security and test efficiency. First, we discuss measures to enhance test security by controlling for item overexposure, decreasing the risk that respondents may (i) memorize the content of a pretest for use on…
Descriptors: Adaptive Testing, Computer Assisted Testing, Test Items, Risk Management
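One widely used remedy for item overexposure in CAT is "randomesque" selection: drawing at random from the several most informative items instead of always administering the single best one. A hedged sketch of the general technique, not necessarily the FCI-CAT's mechanism:

```python
# Randomesque exposure control: pick randomly among the k most informative
# items at the current ability estimate. Item names and values are invented.
import random

def select_item(information, k=5):
    top_k = sorted(information, key=information.get, reverse=True)[:k]
    return random.choice(top_k)  # spreads exposure across near-optimal items

info = {"q1": 2.1, "q2": 1.9, "q3": 1.8, "q4": 0.4, "q5": 1.7, "q6": 1.6}
print(select_item(info))  # one of q1, q2, q3, q5, q6
```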
Peer reviewed
PDF on ERIC
Fadillah, Sarah Meilani; Ha, Minsu; Nuraeni, Eni; Indriyanti, Nurma Yunita – Malaysian Journal of Learning and Instruction, 2023
Purpose: Researchers discovered that when students were given the opportunity to change their answers, a majority changed their responses from incorrect to correct, and this change often improved overall test scores. What prompts students to modify their answers? This study aims to examine answer modification on a scientific reasoning test, with…
Descriptors: Science Tests, Multiple Choice Tests, Test Items, Decision Making
He, Wei – NWEA, 2022
To ensure that student academic growth in a subject area is accurately captured, it is imperative that the underlying scale remains stable over time. As item parameter stability constitutes one of the factors that affects scale stability, NWEA® periodically conducts studies to check for the stability of the item parameter estimates for MAP®…
Descriptors: Achievement Tests, Test Items, Test Reliability, Academic Achievement
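A typical stability check of the kind described recalibrates items and flags those whose difficulty estimates shift noticeably between occasions. An illustrative sketch; the item values and the 0.5-logit cutoff are invented, not NWEA's:

```python
# Flag items whose difficulty estimate (b) drifted between two calibrations.
# All numbers are invented for illustration.
old_b = {"item1": -0.52, "item2": 0.10, "item3": 1.34, "item4": -1.05}
new_b = {"item1": -0.48, "item2": 0.15, "item3": 0.70, "item4": -1.02}

drift = {item: new_b[item] - old_b[item] for item in old_b}
flagged = [item for item, d in drift.items() if abs(d) > 0.5]  # illustrative cutoff
print(flagged)  # -> ['item3']
```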
Peer reviewed
Direct link
Goolsby-Cole, Cody; Bass, Sarah M.; Stanwyck, Liz; Leupen, Sarah; Carpenter, Tara S.; Hodges, Linda C. – Journal of College Science Teaching, 2023
During the pandemic, the use of question pools for online testing was recommended to mitigate cheating, exposing multitudes of science, technology, engineering, and mathematics (STEM) students across the globe to this practice. Yet instructors may be unfamiliar with the ways that seemingly small changes between questions in a pool can expose…
Descriptors: Science Instruction, Computer Assisted Testing, Cheating, STEM Education
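One way such small differences surface is in the proportion correct (p-value) on each variant served from a question pool. A toy audit with invented counts:

```python
# Compare difficulty across variants served from one question pool.
# Counts are invented; a large spread in p suggests the pool is not equivalent.
pool = {"v1": (88, 100), "v2": (85, 100), "v3": (52, 100)}  # variant: (correct, served)

for variant, (right, n) in pool.items():
    print(variant, f"p = {right / n:.2f}")
# v3's much lower p would flag a "small" wording change that made it harder.
```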
Peer reviewed
Direct link
Zhang, Lishan; VanLehn, Kurt – Interactive Learning Environments, 2021
Despite their drawbacks, multiple-choice questions are an enduring feature of instruction because they can be answered more rapidly than open-response questions and are easily scored. However, it can be difficult to generate good incorrect choices (called "distractors"). We designed an algorithm to generate distractors from a…
Descriptors: Semantics, Networks, Multiple Choice Tests, Teaching Methods
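The truncated sentence points toward mining distractors from a semantic network; the general idea is to select concepts near, but distinct from, the correct answer. A toy sketch of that idea, not the paper's algorithm:

```python
# Breadth-first walk of a toy concept graph from the keyed answer; nearby
# concepts make plausible related-but-wrong distractors. Graph is invented.
graph = {
    "mitosis": ["meiosis", "cell cycle", "cytokinesis"],
    "meiosis": ["mitosis", "gamete", "crossing over"],
    "cell cycle": ["mitosis", "interphase"],
}

def generate_distractors(key, n=3):
    seen, queue, out = {key}, list(graph.get(key, [])), []
    while queue and len(out) < n:
        node = queue.pop(0)
        if node not in seen:
            seen.add(node)
            out.append(node)
            queue.extend(graph.get(node, []))
    return out

print(generate_distractors("mitosis"))  # -> ['meiosis', 'cell cycle', 'cytokinesis']
```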
Nixi Wang – ProQuest LLC, 2022
Measurement errors attributable to cultural issues are complex and challenging for educational assessments. We need assessment tests that are sensitive to the cultural heterogeneity of populations, and psychometric methods appropriate for addressing fairness and equity concerns. Building on research on culturally responsive assessment, this dissertation…
Descriptors: Culturally Relevant Education, Testing, Equal Education, Validity
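Fairness work of this kind often screens items for differential item functioning (DIF): comparing groups of comparable ability on each item. A crude illustrative screen with invented counts; real analyses would use Mantel-Haenszel or IRT-based DIF:

```python
# Compare proportion correct for reference vs. focal group within matched
# ability strata on one item. All counts are invented.
strata = {
    "low":  {"ref": (30, 60), "focal": (18, 60)},
    "mid":  {"ref": (45, 60), "focal": (32, 60)},
    "high": {"ref": (55, 60), "focal": (44, 60)},
}
for level, groups in strata.items():
    p_ref = groups["ref"][0] / groups["ref"][1]
    p_foc = groups["focal"][0] / groups["focal"][1]
    print(f"{level}: ref {p_ref:.2f} vs focal {p_foc:.2f} (gap {p_ref - p_foc:+.2f})")
# A gap that persists at every ability level is a red flag for DIF on the item.
```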