Showing all 15 results
Peer reviewed | Direct link
Plasencia, Javier – Biochemistry and Molecular Biology Education, 2023
Multiple studies have shown that testing contributes to learning at all educational levels. In this observational classroom study, we report the use of a learning tool developed for a Genetics and Molecular Biology course at the college level. An interactive set of practice exams that included 136 multiple choice questions (MCQ) or matching…
Descriptors: Molecular Biology, Genetics, Science Tests, College Science
Peer reviewed | Direct link
Keith Cochran; Clayton Cohn; Peter Hastings; Noriko Tomuro; Simon Hughes – International Journal of Artificial Intelligence in Education, 2024
To succeed in the information age, students need to learn to communicate their understanding of complex topics effectively. This is reflected in both educational standards and standardized tests. To improve their writing ability for highly structured domains like scientific explanations, students need feedback that accurately reflects the…
Descriptors: Science Process Skills, Scientific Literacy, Scientific Concepts, Concept Formation
Peer reviewed | Direct link
Kempegowda, Swetha Nagarahalli; Ramachandra, Shobha Chikkavaddaragudi; Arun, Brunda; Devaraju, Abhijith; Shivashankar, Kusuma Kasapura; Raghunathachar, Sahana Kabbathy; Bettadapura, Anjalidevi Shankarrao; Puttalingaiah, Sujatha; Devegowda, Devananda; Vishwanath, Prashant; Nataraj, Suma Maduvanahalli; Prashant, Akila – Biochemistry and Molecular Biology Education, 2023
Online assessments are needed during the ongoing pandemic to continue educational activities while ensuring safety. After conducting the online practical assessment (OPrA) in Biochemistry, we analyzed the students' responses. The blueprint of the OPrA was prepared by the faculty with reference to the various levels and domains of Bloom's…
Descriptors: Biochemistry, Science Instruction, Science Tests, Feedback (Response)
Maarten T. P. Beerepoot – Journal of Chemical Education, 2023
Digital automated assessment is a valuable and time-efficient tool for educators to provide immediate and objective feedback to learners. Automated assessment, however, puts high demands on the quality of the questions, alignment with the intended learning outcomes, and the quality of the feedback provided to the learners. We here describe the…
Descriptors: Formative Evaluation, Summative Evaluation, Chemistry, Science Instruction
Hong Jiao, Editor; Robert W. Lissitz, Editor – IAP - Information Age Publishing, Inc., 2024
With the exponential increase of digital assessment, different types of data in addition to item responses become available in the measurement process. One of the salient features in digital assessment is that process data can be easily collected. This non-conventional structured or unstructured data source may bring new perspectives to better…
Descriptors: Artificial Intelligence, Natural Language Processing, Psychometrics, Computer Assisted Testing
Peer reviewed | PDF full text available on ERIC
Parker, Mark A. J.; Hedgeland, Holly; Braithwaite, Nicholas; Jordan, Sally – European Journal of Science and Mathematics Education, 2022
The study investigated student reaction to the Alternative Mechanics Survey (AMS), a modified Force Concept Inventory, which used automatically marked free-response questions and offered limited feedback to students after their answers had been submitted. Eight participants were observed while completing the AMS and were interviewed to gain…
Descriptors: Student Attitudes, Feedback (Response), Learning Processes, Physics
Peer reviewed | Direct link
Kay, Alison E.; Hardy, Judy; Galloway, Ross K. – British Journal of Educational Technology, 2020
This study explores the relationship between engagement with an online, free-to-use question-generation application (PeerWise) and student achievement. Using PeerWise, students can create and answer multiple-choice questions and can provide feedback to the question authors on question quality. This provides further scope for students to engage in…
Descriptors: Computer Assisted Testing, Multiple Choice Tests, Academic Achievement, Feedback (Response)
Peer reviewed | PDF full text available on ERIC
Rosin, Triin; Vaino, Katrin; Soobard, Regina; Rannikmäe, Miia – Science Education International, 2021
Competence-based, science e-testing (CBSeT) is a novel external assessment tool which provides feedback to science teachers about their students' competence-based skills, thereby giving relevant assessment information which teachers could potentially use in their teaching practice. The aim of this study was to determine the extent to which…
Descriptors: Foreign Countries, Science Teachers, Beliefs, Science Tests
Peer reviewed | PDF full text available on ERIC
Carpenter, Tara S.; Beall, Lisa Carter; Hodges, Linda C. – Journal of Teaching and Learning with Technology, 2020
An exam wrapper is a reflective exercise completed after an exam, designed to help foster students' metacognition and self-regulation. Typical exam wrappers ask students how they studied, what they missed and why, and how they would change their study behaviors to improve on the next exam. In small classes, exam wrappers can be administered as…
Descriptors: Metacognition, Feedback (Response), Integrated Learning Systems, Measures (Individuals)
Peer reviewed | Direct link
Lee, Hee-Sun; Pallant, Amy; Pryputniewicz, Sarah; Lord, Trudi; Mulholland, Matthew; Liu, Ou Lydia – Science Education, 2019
This paper describes HASbot, an automated text scoring and real-time feedback system designed to support student revision of scientific arguments. Students submit open-ended text responses to explain how their data support claims and how the limitations of their data affect the uncertainty of their explanations. HASbot automatically scores these…
Descriptors: Middle School Students, High School Students, Student Evaluation, Science Education
Peer reviewed | Direct link
Mao, Liyang; Liu, Ou Lydia; Roohr, Katrina; Belur, Vinetha; Mulholland, Matthew; Lee, Hee-Sun; Pallant, Amy – Educational Assessment, 2018
Scientific argumentation is one of the core practices for teachers to implement in science classrooms. We developed a computer-based formative assessment to support students' construction and revision of scientific arguments. The assessment is built upon automated scoring of students' arguments and provides feedback to students and teachers.…
Descriptors: Computer Assisted Testing, Science Tests, Scoring, Automation
Peer reviewed | Direct link
Hope, Sheila A.; Polwart, Anthony – Bioscience Education, 2012
The National Union of Students (NUS) National Student Experience Report identified examination feedback as an area where students had particular concerns. This finding was echoed in the authors' institution and triggered an action research project to investigate ways of improving students' perceptions of pre- and post-exam feedback. We report the…
Descriptors: Formative Evaluation, Student Evaluation, Science Tests, Computer Assisted Testing
Peer reviewed | Direct link
Voelkel, Susanne – Research in Learning Technology, 2013
The aim of this action research project was to improve student learning by encouraging more "time on task" and to improve self-assessment and feedback through the introduction of weekly online tests in a Year 2 lecture module in biological sciences. Initially, voluntary online tests were offered to students, and those who participated…
Descriptors: Formative Evaluation, Summative Evaluation, Computer Assisted Testing, Feedback (Response)
Peer reviewed | PDF full text available on ERIC
Schultz, Madeleine – Journal of Learning Design, 2011
This paper reports on the development of a tool that generates randomised, non-multiple choice assessment within the BlackBoard Learning Management System interface. An accepted weakness of multiple-choice assessment is that it cannot elicit learning outcomes from upper levels of Biggs' SOLO taxonomy. However, written assessment items require…
Descriptors: Foreign Countries, Feedback (Response), Student Evaluation, Large Group Instruction
Peer reviewed | Direct link
Alexander, Cara J.; Crescini, Weronika M.; Juskewitch, Justin E.; Lachman, Nirusha; Pawlina, Wojciech – Anatomical Sciences Education, 2009
The goals of our study were to determine the predictive value and usability of an audience response system (ARS) as a knowledge assessment tool in an undergraduate medical curriculum. Over a three year period (2006-2008), data were collected from first year didactic blocks in Genetics/Histology and Anatomy/Radiology (n = 42-50 per class). During…
Descriptors: Feedback (Response), Medical Education, Audience Response, Genetics