Showing 1 to 15 of 99 results
Peer reviewed
Direct link
Jila Niknejad; Margaret Bayer – International Journal of Mathematical Education in Science and Technology, 2025
In Spring 2020, the need for redesigning online assessments to preserve integrity became a priority for many educators. Many of us found methods to proctor examinations using Zoom and proctoring software. Such examinations pose their own issues. To reduce the technical difficulties and cost, many Zoom-proctored examination sessions were shortened;…
Descriptors: Mathematics Instruction, Mathematics Tests, Computer Assisted Testing, Computer Software
Peer reviewed
Direct link
Tugba Uygun; Pinar Guner; Irfan Simsek – International Journal of Mathematical Education in Science and Technology, 2024
This study was conducted to reveal potential sources of students' difficulty and misconceptions about geometrical concepts with the help of eye tracking. In this study, students' geometrical misconceptions were explored through their answers to questions on a geometry test prepared on the basis of the literature, and their test-taking processes were represented…
Descriptors: Eye Movements, Geometric Concepts, Mathematics Instruction, Misconceptions
Peer reviewed
PDF on ERIC Download full text
Laura Kuusemets; Kristin Parve; Kati Ain; Tiina Kraav – International Journal of Education in Mathematics, Science and Technology, 2024
Using multiple-choice questions as learning and assessment tools is standard at all levels of education. However, among the positive and negative aspects of their use, the time and complexity involved in producing plausible distractor options emerge as a disadvantage that offsets the time savings in relation to feedback. The article…
Descriptors: Program Evaluation, Artificial Intelligence, Computer Assisted Testing, Man Machine Systems
Peer reviewed
Direct link
Gruss, Richard; Clemons, Josh – Journal of Computer Assisted Learning, 2023
Background: The sudden growth in online instruction due to COVID-19 restrictions has given renewed urgency to questions about remote learning that have remained unresolved. Web-based assessment software provides instructors with an array of options for varying testing parameters, but the pedagogical impacts of some of these variations have yet to be…
Descriptors: Test Items, Test Format, Computer Assisted Testing, Mathematics Tests
Peer reviewed
Direct link
Urrutia, Felipe; Araya, Roberto – Journal of Educational Computing Research, 2024
Written answers to open-ended questions can have a higher long-term effect on learning than multiple-choice questions. However, it is critical that teachers immediately review the answers and ask students to redo those that are incoherent. This can be a difficult and time-consuming task for teachers. A possible solution is to automate the detection…
Descriptors: Elementary School Students, Grade 4, Elementary School Mathematics, Mathematics Tests
Anna Caroline Keefe – ProQuest LLC, 2022
Computer-assisted assessment continues to be incorporated into more and more mathematics courses. As this method of testing spreads, questions are created for use in computer-assisted assessment. This study analyzed two types of questions used in computer-assisted assessment in Calculus I, II, and III courses. The first question type was…
Descriptors: Psychometrics, Computer Assisted Testing, Technology Integration, Calculus
Peer reviewed
PDF on ERIC Download full text
Zhang, Mengxue; Heffernan, Neil; Lan, Andrew – International Educational Data Mining Society, 2023
Automated scoring of student responses to open-ended questions, including short-answer questions, has great potential to scale to a large number of responses. Recent approaches for automated scoring rely on supervised learning, i.e., training classifiers or fine-tuning language models on a small number of responses with human-provided score…
Descriptors: Scoring, Computer Assisted Testing, Mathematics Instruction, Mathematics Tests
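The supervised-learning approach this abstract refers to can be made concrete with a minimal sketch: a text classifier is fit on a small set of responses with human-provided scores and then applied to new responses. This is an illustrative stand-in (TF-IDF features with logistic regression in scikit-learn), not the models or data used in the cited study; the example responses and scores below are invented.

```python
# Minimal sketch of supervised short-answer scoring: train a classifier on a small
# set of responses with human-provided scores, then score unseen responses.
# Illustrative only; not the pipeline from the cited study.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled data: student responses and rubric scores (0 = incorrect, 1 = correct).
responses = [
    "you add the numerators and keep the denominator",
    "multiply both fractions",
    "find a common denominator first, then add",
    "subtract the smaller number",
]
scores = [1, 0, 1, 0]

# Word n-gram TF-IDF features feed a simple logistic-regression scorer.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),
    LogisticRegression(max_iter=1000),
)
model.fit(responses, scores)

# Score an unseen response; predict_proba gives a confidence that it is correct.
new_response = ["add the tops after making the bottoms the same"]
print(model.predict(new_response), model.predict_proba(new_response))
```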
Aaron McVay – ProQuest LLC, 2021
As assessments move toward computerized testing and continuous test availability, the need for rapid assembly of forms is increasing. The objective of this study was to investigate variability in assembled forms through the lens of first- and second-order equity properties of equating, by examining three factors and their interactions. Two…
Descriptors: Automation, Computer Assisted Testing, Test Items, Reaction Time
Peer reviewed
Direct link
Katrin Klingbeil; Fabian Rösken; Bärbel Barzel; Florian Schacht; Kaye Stacey; Vicki Steinle; Daniel Thurm – ZDM: Mathematics Education, 2024
Assessing students' (mis)conceptions is a challenging task for teachers as well as for researchers. While individual assessment, for example through interviews, can provide deep insights into students' thinking, this is very time-consuming and therefore not feasible for whole classes or even larger settings. For those settings, automatically…
Descriptors: Multiple Choice Tests, Formative Evaluation, Mathematics Tests, Misconceptions
Peer reviewed
PDF on ERIC Download full text
Ghio, Fernanda Belén; Bruzzone, Manuel; Rojas-Torres, Luis; Cupani, Marcos – European Journal of Science and Mathematics Education, 2022
In the last decades, the development of computerized adaptive testing (CAT) has allowed more precise measurements with a smaller number of items. In this study, we develop an item bank (IB) to generate the adaptive algorithm and simulate the functioning of CAT to assess the domains of mathematical knowledge in Argentinian university students…
Descriptors: Test Items, Item Banks, Adaptive Testing, Mathematics Tests
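As a rough illustration of the CAT mechanics this abstract describes, the sketch below draws items from a bank by maximum Fisher information under a 2PL IRT model and re-estimates ability after each response. The item parameters, fixed test length, and EAP grid are assumptions made for illustration, not the item bank or algorithm from the cited study.

```python
# Minimal CAT simulation: maximum-information item selection under a 2PL IRT model
# with EAP ability estimation. All parameters below are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical item bank: 2PL discrimination (a) and difficulty (b) parameters.
a = rng.uniform(0.8, 2.0, size=50)
b = rng.normal(0.0, 1.0, size=50)

def p_correct(theta, a_i, b_i):
    """2PL probability of a correct response at ability theta."""
    return 1.0 / (1.0 + np.exp(-a_i * (theta - b_i)))

def item_information(theta):
    """Fisher information of every item in the bank at ability theta."""
    p = p_correct(theta, a, b)
    return a**2 * p * (1.0 - p)

def eap_estimate(administered, responses, grid=np.linspace(-4, 4, 161)):
    """Expected a posteriori ability estimate with a standard normal prior."""
    prior = np.exp(-0.5 * grid**2)
    likelihood = np.ones_like(grid)
    for j, u in zip(administered, responses):
        p = p_correct(grid, a[j], b[j])
        likelihood *= p if u == 1 else (1.0 - p)
    posterior = prior * likelihood
    return float(np.sum(grid * posterior) / np.sum(posterior))

true_theta = 0.7              # simulated examinee ability
administered, responses = [], []
theta_hat = 0.0               # start from the prior mean

for _ in range(10):           # fixed test length of 10 items
    info = item_information(theta_hat)
    if administered:
        info[administered] = -np.inf      # never re-administer an item
    j = int(np.argmax(info))
    u = int(rng.random() < p_correct(true_theta, a[j], b[j]))  # simulated response
    administered.append(j)
    responses.append(u)
    theta_hat = eap_estimate(administered, responses)

print(f"Estimated ability after 10 items: {theta_hat:.2f} (true value {true_theta})")
```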
Peer reviewed
PDF on ERIC Download full text
Endang Susantini; Yurizka Melia Sari; Prima Vidya Asteria; Muhammad Ilyas Marzuqi – Journal of Education and Learning (EduLearn), 2025
Assessing preservice teachers' higher order thinking skills (HOTS) in science and mathematics is essential. Teachers' HOTS ability is closely related to their ability to create HOTS-type science and mathematics problems. Among various types of HOTS, one is Bloomian HOTS. To help preservice teachers create problems in those subjects, an Android…
Descriptors: Content Validity, Mathematics Instruction, Decision Making, Thinking Skills
Peer reviewed
Direct link
Kosh, Audra E. – Journal of Applied Testing Technology, 2021
In recent years, Automatic Item Generation (AIG) has increasingly shifted from theoretical research to operational implementation, a shift that raises some unforeseen practical challenges. Specifically, generating high-quality answer choices presents several challenges, such as ensuring that answer choices blend together well for all possible item…
Descriptors: Test Items, Multiple Choice Tests, Decision Making, Test Construction
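The distractor-quality challenge described here can be made concrete with a toy template-based generator: the stem and key are computed from an item model, candidate distractors come from rule-based common-error operations, and candidates that collide with the key or with each other are discarded. The template and error rules below are hypothetical examples, not taken from the article.

```python
# Toy template-based automatic item generation (AIG) for a subtraction item model.
# Illustrative only; the item model and error rules are invented for this sketch.
import random

rng = random.Random(0)  # fixed seed so the sketch is reproducible

def generate_item():
    """Fill a subtraction item model and build answer choices from common-error rules."""
    minuend = rng.randint(25, 99)
    subtrahend = rng.randint(10, 24)
    stem = f"What is {minuend} - {subtrahend}?"
    key = minuend - subtrahend

    # Hypothetical rule-based distractors modeling common student errors.
    candidates = [
        minuend + subtrahend,   # added instead of subtracted
        key + 10,               # place-value / borrowing slip
        key - 1,                # off-by-one error
    ]

    # Keep only distractors that are positive, distinct from the key, and distinct
    # from each other, so the generated options stay plausible and do not collide.
    distractors = []
    for c in candidates:
        if c > 0 and c != key and c not in distractors:
            distractors.append(c)

    options = [key] + distractors[:3]
    rng.shuffle(options)
    return stem, options, key

stem, options, key = generate_item()
print(stem, options, "| key:", key)
```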
Peer reviewed
PDF on ERIC Download full text
Congning Ni; Bhashithe Abeysinghe; Juanita Hicks – International Electronic Journal of Elementary Education, 2025
The National Assessment of Educational Progress (NAEP), often referred to as The Nation's Report Card, offers a window into the state of the U.S. K-12 education system. Since 2017, NAEP has transitioned to digital assessments, opening new research opportunities that were previously impossible. Process data tracks students' interactions with the…
Descriptors: Reaction Time, Multiple Choice Tests, Behavior Change, National Competency Tests
Crisp, Victoria; Shaw, Stuart – Research Matters, 2020
For assessment contexts where both a paper-based test and an on-screen assessment are available as alternatives, it is still common for the paper-based test to be prepared first with questions later transferred into an on-screen testing platform. One challenge with this is that some questions cannot be transferred. One solution might be for…
Descriptors: Computer Assisted Testing, Test Items, Test Construction, Mathematics Tests
Peer reviewed
Direct link
Clements, Douglas H.; Banse, Holland; Sarama, Julie; Tatsuoka, Curtis; Joswick, Candace; Hudyma, Aaron; Van Dine, Douglas W.; Tatsuoka, Kikumi K. – Mathematical Thinking and Learning: An International Journal, 2022
Researchers often develop instruments using correctness scores (and a variety of theories and techniques, such as Item Response Theory) for validation and scoring. Less frequently, observations of children's strategies are incorporated into the design, development, and application of assessments. We conducted individual interviews of 833…
Descriptors: Item Response Theory, Computer Assisted Testing, Test Items, Mathematics Tests