Showing 1 to 15 of 46 results
Peer reviewed
Direct link
Jeremy L. Hsu; Noelle Clark; Kate Hill; Melissa Rowland-Goldsmith – CBE - Life Sciences Education, 2023
Nearly all undergraduate biology courses rely on quizzes and exams. Despite their prevalence, very little work has been done to explore how the framing of assessment questions may influence student performance and affect. Here, we conduct a quasi-random experimental study where students in different sections of the same course were given…
Descriptors: Undergraduate Students, Biology, Science Education, Test Construction
Peer reviewed
Direct link
Yangqiuting Li; Chandralekha Singh – Physical Review Physics Education Research, 2025
Research-based multiple-choice questions implemented in class with peer instruction have been shown to be an effective tool for improving students' engagement and learning outcomes. Moreover, multiple-choice questions that are carefully sequenced to build on each other can be particularly helpful for students to develop a systematic understanding…
Descriptors: Physics, Science Instruction, Science Tests, Multiple Choice Tests
Peer reviewed
PDF on ERIC Download full text
Wicaksono, Azizul Ghofar Candra; Korom, Erzsébet – Participatory Educational Research, 2022
The accuracy of learning results relies on evaluation and assessment. Learning goals, including problem-solving ability, must be aligned with valid, standardized measurement tools. A study exploring the nature of problem solving, its framework, and its assessment in the Indonesian context will make contributions to problem solving…
Descriptors: Problem Solving, Educational Research, Test Construction, Test Validity
Peer reviewed
Direct link
Chen, Qingwei; Zhu, Guangtian; Liu, Qiaoyi; Han, Jing; Fu, Zhao; Bao, Lei – Physical Review Physics Education Research, 2020
Problem-solving categorization tasks have been well studied and used as an effective tool for assessment of student knowledge structure. In this study, a traditional free-response categorization test has been modified into a multiple-choice format, and the effectiveness of this new assessment is evaluated. Through randomized testing with Chinese…
Descriptors: Foreign Countries, Test Construction, Multiple Choice Tests, Problem Solving
Peer reviewed
PDF on ERIC Download full text
Klender, Sara; Ferriby, Andrew; Notebaert, Andrew – HAPS Educator, 2019
Multiple-choice questions (MCQs) are commonly used on histology examinations. There are many guidelines for how to properly write MCQs, and many of them recommend avoiding negatively worded stems. The current study aims to investigate differences between positively and negatively worded stems in a medical histology course by comparing the item…
Descriptors: Multiple Choice Tests, Science Tests, Biology, Test Construction
Peer reviewed
Direct link
Loudon, Catherine; Macias-Muñoz, Aide – Advances in Physiology Education, 2018
Different versions of multiple-choice exams were administered to an undergraduate class in human physiology as part of normal testing in the classroom. The goal was to evaluate whether the number of options (possible answers) per question influenced the effectiveness of this assessment. Three exams (each with three versions) were given to each of…
Descriptors: Multiple Choice Tests, Test Construction, Test Items, Science Tests
Peer reviewed
PDF on ERIC Download full text
Joseph, Dane Christian – Journal of Effective Teaching in Higher Education, 2019
Multiple-choice testing is a staple within the U.S. higher education system. From classroom assessments to standardized entrance exams such as the GRE, GMAT, or LSAT, test developers utilize a variety of validated and heuristic-driven item-writing guidelines. One such guideline that has been given recent attention is to randomize the position of…
Descriptors: Test Construction, Multiple Choice Tests, Guessing (Tests), Test Wiseness
Peer reviewed
PDF on ERIC Download full text
Crowther, Gregory J.; Wiggins, Benjamin L.; Jenkins, Lekelia D. – HAPS Educator, 2020
Many undergraduate biology instructors incorporate active learning exercises into their lessons while continuing to assess students with traditional exams. To better align practice and exams, we present an approach to question-asking that emphasizes templates instead of specific questions. Students and instructors can use these Test Question…
Descriptors: Science Tests, Active Learning, Biology, Undergraduate Students
Peer reviewed
Direct link
Lindner, Marlit A.; Schult, Johannes; Mayer, Richard E. – Journal of Educational Psychology, 2022
This classroom experiment investigates the effects of adding representational pictures to multiple-choice and constructed-response test items to understand the role of the response format for the multimedia effect in testing. Participants were 575 fifth- and sixth-graders who answered 28 science test items--seven items in each of four experimental…
Descriptors: Elementary School Students, Grade 5, Grade 6, Multimedia Materials
Peer reviewed
PDF on ERIC Download full text
Wright, Christian D.; Huang, Austin L.; Cooper, Katelyn M.; Brownell, Sara E. – International Journal for the Scholarship of Teaching and Learning, 2018
College instructors in the United States usually make their own decisions about how to design course exams. Even though summative course exams are well known to be important to student success, we know little about the decision making of instructors when designing course exams. To probe how instructors design exams for introductory biology, we…
Descriptors: College Faculty, Science Teachers, Science Tests, Teacher Made Tests
Haladyna, Thomas M. – IDEA Center, Inc., 2018
Writing multiple-choice test items to measure student learning in higher education is a challenge. Based on extensive scholarly research and experience, the author describes various item formats, offers guidelines for creating these items, and provides many examples of both good and bad test items. He also suggests some shortcuts for developing…
Descriptors: Test Construction, Multiple Choice Tests, Test Items, Higher Education
Peer reviewed
Direct link
Boone, William J. – CBE - Life Sciences Education, 2016
This essay describes Rasch analysis psychometric techniques and how such techniques can be used by life sciences education researchers to guide the development and use of surveys and tests. Specifically, Rasch techniques can be used to document and evaluate the measurement functioning of such instruments. Rasch techniques also allow researchers to…
Descriptors: Item Response Theory, Psychometrics, Science Education, Educational Research
Martin, Michael O., Ed.; von Davier, Matthias, Ed.; Mullis, Ina V. S., Ed. – International Association for the Evaluation of Educational Achievement, 2020
The chapters in this online volume comprise the TIMSS & PIRLS International Study Center's technical report of the methods and procedures used to develop, implement, and report the results of TIMSS 2019. There were various technical challenges because TIMSS 2019 was the initial phase of the transition to eTIMSS, with approximately half the…
Descriptors: Foreign Countries, Elementary Secondary Education, Achievement Tests, International Assessment
Castle, Courtney – ProQuest LLC, 2018
The Next Generation Science Standards propose a multidimensional model of science learning, comprised of Core Disciplinary Ideas, Science and Engineering Practices, and Crosscutting Concepts (NGSS Lead States, 2013). Accordingly, there is a need for student assessment aligned with the new standards. Creating assessments that validly and reliably…
Descriptors: Science Education, Student Evaluation, Science Tests, Test Construction
Peer reviewed
PDF on ERIC Download full text
Frey, Bruce B.; Ellis, James D.; Bulgreen, Janis A.; Hare, Jana Craig; Ault, Marilyn – Electronic Journal of Science Education, 2015
"Scientific argumentation," defined as the ability to develop and analyze scientific claims, support claims with evidence from investigations of the natural world, and explain and evaluate the reasoning that connects the evidence to the claim, is a critical component of current science standards and is consistent with "Common Core…
Descriptors: Test Construction, Science Tests, Persuasive Discourse, Science Process Skills