Showing 1 to 15 of 35 results
Peer reviewed
Jeremy L. Hsu; Noelle Clark; Kate Hill; Melissa Rowland-Goldsmith – CBE - Life Sciences Education, 2023
Nearly all undergraduate biology courses rely on quizzes and exams. Despite their prevalence, very little work has been done to explore how the framing of assessment questions may influence student performance and affect. Here, we conduct a quasi-random experimental study where students in different sections of the same course were given…
Descriptors: Undergraduate Students, Biology, Science Education, Test Construction
Peer reviewed
Crowther, Gregory J.; Knight, Thomas A. – Advances in Physiology Education, 2023
The past ~15 years have seen increasing interest in defining disciplinary core concepts. Within the field of physiology, Michael, McFarland, Modell, and colleagues have published studies that defined physiology core concepts and have elaborated many of these as detailed conceptual frameworks. With such helpful definitions now in…
Descriptors: Test Format, Physiology, Higher Education, Concept Teaching
Peer reviewed
Spratto, Elisabeth M.; Bandalos, Deborah L. – Journal of Experimental Education, 2020
Research suggests that certain characteristics of survey items may impact participants' responses. In this study we investigated the impact of several of these characteristics: vague wording, question-versus-statement phrasing, and full-versus-partial labeling of response options. We manipulated survey items per these characteristics and randomly…
Descriptors: Attitude Measures, Test Format, Test Construction, Factor Analysis
Peer reviewed
Davis-Berg, Elizabeth C.; Minbiole, Julie – School Science Review, 2020
Completion rates were compared for long-form questions where a large blank answer space is provided and for long-form questions where the answer space has bullet-point prompts corresponding to the parts of the question. It was found that students were more likely to complete a question when bullet points were provided in the answer space.…
Descriptors: Test Format, Test Construction, Academic Achievement, Educational Testing
Peer reviewed
Yangqiuting Li; Chandralekha Singh – Physical Review Physics Education Research, 2025
Research-based multiple-choice questions implemented in class with peer instruction have been shown to be an effective tool for improving students' engagement and learning outcomes. Moreover, multiple-choice questions that are carefully sequenced to build on each other can be particularly helpful for students to develop a systematic understanding…
Descriptors: Physics, Science Instruction, Science Tests, Multiple Choice Tests
Peer reviewed
Wicaksono, Azizul Ghofar Candra; Korom, Erzsébet – Participatory Educational Research, 2022
The accuracy of learning results relies on evaluation and assessment. Learning goals, including problem-solving ability, must be aligned with valid, standardized measurement tools. A study exploring the nature of problem solving, its frameworks, and its assessment in the Indonesian context will contribute to problem solving…
Descriptors: Problem Solving, Educational Research, Test Construction, Test Validity
Peer reviewed
Walsh, Cole; Quinn, Katherine N.; Wieman, C.; Holmes, N. G. – Physical Review Physics Education Research, 2019
Introductory physics lab instruction is undergoing a transformation, with increasing emphasis on developing experimentation and critical thinking skills. These changes present a need for standardized assessment instruments to determine the degree to which students develop these skills through instructional labs. In this article, we present the…
Descriptors: Critical Thinking, Physics, Cognitive Tests, Science Experiments
Peer reviewed
Yulianto, Ahmad; Pudjitriherwanti, Anastasia; Kusumah, Chevy; Oktavia, Dies – International Journal of Language Testing, 2023
The increasing use of computer-based modes in language testing raises concern over their similarities with and differences from the paper-based format. The present study aimed to delineate discrepancies between the TOEFL PBT and CBT. For that objective, a quantitative method was employed to probe into score equivalence, the performance of male-female…
Descriptors: Computer Assisted Testing, Test Format, Comparative Analysis, Scores
Peer reviewed
Viskotová, Lenka; Hampel, David – Mathematics Teaching Research Journal, 2022
Computer-aided assessment is an important tool that reduces the workload of teachers and increases the efficiency of their work. The multiple-choice test is considered one of the most common forms of computer-aided testing, and its application for mid-term testing has indisputable advantages. For the purposes of a high-quality and responsible…
Descriptors: Undergraduate Students, Mathematics Tests, Computer Assisted Testing, Faculty Workload
Peer reviewed
Kaharu, Sarintan N.; Mansyur, Jusman – Pegem Journal of Education and Instruction, 2021
This study aims to develop a test that can be used to explore mental models and representation patterns of objects in liquids. The test, developed by adapting Reeves's Development Model, was carried out in several stages, namely: determining the orientation and test segments; initial survey; preparation of the initial draft; try-out;…
Descriptors: Test Construction, Schemata (Cognition), Scientific Concepts, Water
Peer reviewed
O'Grady, Stefan – Language Teaching Research, 2023
The current study explores the impact of varying multiple-choice question preview and presentation formats in a test of second language listening proficiency targeting different levels of text comprehension. In a between-participant design, participants completed a 30-item test of listening comprehension featuring implicit and explicit information…
Descriptors: Language Tests, Multiple Choice Tests, Scores, Second Language Learning
Peer reviewed
Loudon, Catherine; Macias-Muñoz, Aide – Advances in Physiology Education, 2018
Different versions of multiple-choice exams were administered to an undergraduate class in human physiology as part of normal testing in the classroom. The goal was to evaluate whether the number of options (possible answers) per question influenced the effectiveness of this assessment. Three exams (each with three versions) were given to each of…
Descriptors: Multiple Choice Tests, Test Construction, Test Items, Science Tests
Peer reviewed
Joseph, Dane Christian – Journal of Effective Teaching in Higher Education, 2019
Multiple-choice testing is a staple within the U.S. higher education system. From classroom assessments to standardized entrance exams such as the GRE, GMAT, or LSAT, test developers utilize a variety of validated and heuristic driven item-writing guidelines. One such guideline that has been given recent attention is to randomize the position of…
Descriptors: Test Construction, Multiple Choice Tests, Guessing (Tests), Test Wiseness
Miller, Faith G.; Riley-Tillman, T. Chris; Chafouleas, Sandra M.; Schardt, Alyssa A. – Assessment for Effective Intervention, 2017
The purpose of this study was to investigate the impact of two different Direct Behavior Rating--Single Item Scale (DBR-SIS) formats on rating accuracy. A total of 119 undergraduate students participated in one of two study conditions, each utilizing a different DBR-SIS scale format: one that included percentage of time anchors on the DBR-SIS…
Descriptors: Behavior Rating Scales, Test Format, Accuracy, Undergraduate Students
Peer reviewed
Kunnan, Antony John; Qin, Coral Yiwei; Zhao, Cecilia Guanfang – Language Assessment Quarterly, 2022
A new computer-assisted test of academic English for use at an Asian university was commissioned by administrators. The test was designed to serve both placement and diagnostic purposes. The authors and their team conceptualized, developed, and administered a scenario-based assessment delivered online, with independent and integrated…
Descriptors: Computer Assisted Testing, English (Second Language), Second Language Learning, Vignettes