Showing 1 to 15 of 43 results
Peer reviewed
Su, Kun; Henson, Robert A. – Journal of Educational and Behavioral Statistics, 2023
This article provides a process for carefully evaluating whether a content domain is suitable for diagnostic classification models (DCMs), optimized steps for constructing a test blueprint to which DCMs can be applied, and a real-life example illustrating the process. The content domains were carefully evaluated using a set of…
Descriptors: Classification, Models, Science Tests, Physics
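A core artifact in any DCM blueprint is the Q-matrix linking items to the attributes they measure. As a minimal sketch only, with invented item and attribute names (the article's actual domains and blueprint steps are not reproduced here), such a fragment might be encoded as:

```python
import pandas as pd

# Hypothetical Q-matrix for a short physics diagnostic: rows are items,
# columns are attributes; 1 means the item requires that attribute.
# All names are invented for illustration, not taken from the article.
q_matrix = pd.DataFrame(
    {
        "newton_first_law": [1, 0, 1, 0],
        "newton_second_law": [0, 1, 1, 0],
        "free_body_diagrams": [0, 0, 1, 1],
    },
    index=["item_1", "item_2", "item_3", "item_4"],
)

# One simple blueprint heuristic: each attribute should be measured by at
# least one item in isolation, which helps identifiability in DCM fitting.
single_attribute = q_matrix.sum(axis=1) == 1
print(q_matrix)
print("Single-attribute items:", list(q_matrix.index[single_attribute]))
```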
Peer reviewed
PDF on ERIC
Paul J. Walter; Edward Nuhfer; Crisel Suarez – Numeracy, 2021
We introduce an approach for making a quantitative comparison of the item response curves (IRCs) of any two populations on a multiple-choice test instrument. In this study, we employ simulated and actual data. We apply our approach to a dataset of 12,187 participants on the 25-item Science Literacy Concept Inventory (SLCI), which includes ample…
Descriptors: Item Analysis, Multiple Choice Tests, Simulation, Data Analysis
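A common way to estimate an empirical IRC, which may or may not match the authors' exact procedure, is to bin examinees by total score and compute the proportion answering an item correctly within each bin. The sketch below does this for two simulated populations; the SLCI data themselves are not used:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, n_items, shift=0.0):
    """Simulated 0/1 response matrix; `shift` nudges one group's ability."""
    ability = rng.normal(shift, 1.0, size=(n, 1))
    difficulty = rng.normal(0.0, 1.0, size=(1, n_items))
    p = 1.0 / (1.0 + np.exp(-(ability - difficulty)))
    return (rng.random((n, n_items)) < p).astype(int)

def empirical_irc(responses, item, n_bins=5):
    """Proportion correct on `item` within quantile bins of total score."""
    total = responses.sum(axis=1)
    edges = np.quantile(total, np.linspace(0, 1, n_bins + 1))[1:-1]
    idx = np.digitize(total, edges)
    return np.array([responses[idx == b, item].mean() for b in range(n_bins)])

group_a = simulate(2000, 25)
group_b = simulate(2000, 25, shift=0.3)
# Comparing the two curves bin by bin is the core of an IRC comparison.
print("group A IRC, item 0:", empirical_irc(group_a, 0).round(2))
print("group B IRC, item 0:", empirical_irc(group_b, 0).round(2))
```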
Yanan Feng – ProQuest LLC, 2021
This dissertation investigates effect size measures for differential item functioning (DIF) detection in the context of cognitive diagnostic models (CDMs). A variety of DIF detection techniques have been developed for CDMs; however, most of these procedures focus on null hypothesis significance testing. Few…
Descriptors: Effect Size, Item Response Theory, Cognitive Measurement, Models
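For context, one widely used descriptive DIF effect size in classical settings, not necessarily among those examined in this dissertation, is the Dorans-Kulick standardized p-difference: a focal-group-weighted average of between-group differences in proportion correct at matched score levels. A rough sketch on simulated data:

```python
import numpy as np

def std_p_dif(item, total, group, focal=1):
    """Standardized p-difference: weight each matched score level by the
    focal group's frequency. `item` is 0/1 correctness, `total` the
    matching score, `group` a 0/1 indicator. Negative values indicate a
    focal-group disadvantage."""
    num = den = 0.0
    for s in np.unique(total):
        foc = (total == s) & (group == focal)
        ref = (total == s) & (group != focal)
        if foc.any() and ref.any():
            w = foc.sum()
            num += w * (item[foc].mean() - item[ref].mean())
            den += w
    return num / den if den else float("nan")

rng = np.random.default_rng(1)
n = 4000
group = rng.integers(0, 2, n)
total = rng.integers(0, 26, n)
# Inject mild uniform DIF: the focal group is ~5 points less likely to succeed.
p = 0.3 + 0.02 * total - 0.05 * group
item = (rng.random(n) < p).astype(int)
print("STD P-DIF:", round(std_p_dif(item, total, group), 3))  # about -0.05
```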
Peer reviewed
Le Hebel, Florence; Montpied, Pascale; Tiberghien, Andrée; Fontanieu, Valérie – International Journal of Science Education, 2017
Understanding what makes a question difficult is a crucial concern in assessment. To study the difficulty of test questions, we focus on the case of PISA, which assesses the degree to which 15-year-old students have acquired knowledge and skills essential for full participation in society. Our aim is to identify PISA science item…
Descriptors: Achievement Tests, Foreign Countries, International Assessment, Secondary School Students
Peer reviewed
Traxler, Adrienne; Henderson, Rachel; Stewart, John; Stewart, Gay; Papak, Alexis; Lindell, Rebecca – Physical Review Physics Education Research, 2018
Research on the test structure of the Force Concept Inventory (FCI) has largely ignored gender, and research on FCI gender effects (often reported as "gender gaps") has seldom interrogated the structure of the test. These rarely crossed streams of research leave open the possibility that the FCI may not be structurally valid across…
Descriptors: Physics, Science Instruction, Sex Fairness, Gender Differences
Peer reviewed
Martinková, Patricia; Drabinová, Adéla; Liaw, Yuan-Ling; Sanders, Elizabeth A.; McFarland, Jenny L.; Price, Rebecca M. – CBE - Life Sciences Education, 2017
We provide a tutorial on differential item functioning (DIF) analysis, an analytic method useful for identifying potentially biased items in assessments. After explaining a number of methodological approaches, we test for gender bias in two scenarios that demonstrate why DIF analysis is crucial for developing assessments, particularly because…
Descriptors: Test Bias, Test Items, Gender Bias, Science Tests
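One standard DIF screen that a tutorial like this may cover is logistic regression: regress item correctness on the matching score, then test whether adding group membership and a group-by-score interaction improves fit. The sketch below uses simulated data, not the assessments analyzed in the article:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(2)
n = 3000
df = pd.DataFrame({
    "total": rng.integers(0, 26, n),
    "group": rng.integers(0, 2, n),
})
# Simulate uniform DIF: same slope on the matching score, but a shifted
# intercept (lower success probability) for group 1.
logit = -3.0 + 0.25 * df["total"] - 0.6 * df["group"]
df["correct"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

base = smf.logit("correct ~ total", df).fit(disp=0)
full = smf.logit("correct ~ total + group + total:group", df).fit(disp=0)

# Likelihood-ratio test with 2 df (group main effect + interaction);
# a small p-value flags the item for DIF.
lr = 2 * (full.llf - base.llf)
print("LR stat:", round(lr, 2), "p =", round(stats.chi2.sf(lr, df=2), 4))
```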
Peer reviewed
Ene, Emanuela; Ackerson, Bruce J. – Physical Review Physics Education Research, 2018
We describe the construction, validation, and testing of a concept inventory for an "Introduction to Physics of Semiconductors" course offered by the department of physics to undergraduate engineering students. By design, this inventory addresses both content knowledge and the ability to interpret content via different cognitive…
Descriptors: Physics, Science Instruction, Small Classes, College Science
Peer reviewed
Undersander, Molly A.; Lund, Travis J.; Langdon, Laurie S.; Stains, Marilyne – Chemistry Education Research and Practice, 2017
The design of assessment tools is critical for accurately evaluating students' understanding of chemistry. Although extensive research has been conducted on various aspects of assessment tool design, few studies in chemistry have focused on how the order in which questions are presented to students affects the measurement of students'…
Descriptors: Test Construction, Scientific Concepts, Concept Formation, Science Education
Peer reviewed
Hoe, Kai Yee; Subramaniam, R. – Chemistry Education Research and Practice, 2016
This study presents an analysis of alternative conceptions (ACs) of acid-base chemistry harbored by grade 9 students in Singapore. The ACs were obtained through the development and validation of a 4-tier diagnostic instrument. It is among the very few studies in the science education literature that have focused on examining results based also on…
Descriptors: Foreign Countries, Secondary School Students, Secondary School Science, Chemistry
Peer reviewed
McColgan, Michele W.; Finn, Rose A.; Broder, Darren L.; Hassel, George E. – Physical Review Physics Education Research, 2017
We present the Electricity and Magnetism Conceptual Assessment (EMCA), a new assessment aligned with second-semester introductory physics courses. Topics covered include electrostatics, electric fields, circuits, magnetism, and induction. We have two motives for writing a new assessment. First, we find other assessments such as the Brief…
Descriptors: Energy, Magnets, Scientific Concepts, Student Evaluation
Peer reviewed
PDF on ERIC
Schiffl, Iris – Universal Journal of Educational Research, 2016
Science standards have been a topic in educational research in Austria for about ten years. Starting in 2005, competency structure models have been developed for junior and senior classes of different school types. After these models were evaluated, prototypic tasks were created to illustrate for teachers what the models mean. At the moment,…
Descriptors: Science Education, Standards, Foreign Countries, Models
Peer reviewed
Saß, Steffani; Schütte, Kerstin – Journal of Psychoeducational Assessment, 2016
Solving test items might require abilities other than the construct the test was designed to assess. Item and student characteristics, such as item format or reading comprehension, can affect the test result. This experiment is based on cognitive theories of text and picture comprehension. It examines whether integration aids, which…
Descriptors: Reading Difficulties, Science Tests, Test Items, Visual Aids
Peer reviewed
Carpenter, Shana K.; Rahman, Shuhebur; Lund, Terry J. S.; Armstrong, Patrick I.; Lamm, Monica H.; Reason, Robert D.; Coffman, Clark R. – CBE - Life Sciences Education, 2017
Retrieval practice has been shown to produce significant enhancements in student learning of course information, but the extent to which students make use of retrieval to learn information on their own is unclear. In the current study, students in a large introductory biology course were provided with optional online review questions that could be…
Descriptors: Introductory Courses, Biology, Science Instruction, Science Achievement
Peer reviewed
Todd, Amber; Romine, William L.; Cook Whitt, Katahdin – Science Education, 2017
We describe the development, validation, and use of the "Learning Progression-Based Assessment of Modern Genetics" (LPA-MG) in a high school biology context. Items were constructed based on a current learning progression framework for genetics (Shea & Duncan, 2013; Todd & Kenyon, 2015). The 34-item instrument, which was tied to…
Descriptors: Genetics, Science Instruction, High School Students, Evaluation Methods
Peer reviewed
El Masri, Yasmine H.; Ferrara, Steve; Foltz, Peter W.; Baird, Jo-Anne – Curriculum Journal, 2017
Predicting item difficulty is highly important in education for both teachers and item writers. Despite the identification of a large number of explanatory variables, predicting item difficulty remains a challenge in educational assessment, with empirical attempts rarely explaining more than 25% of the variance. This paper analyses 216 science items of key stage…
Descriptors: Predictor Variables, Test Items, Difficulty Level, Test Construction
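The sub-25% variance-explained ceiling can be made concrete with a plain regression of item difficulty on candidate item features. Everything below is synthetic, including the three invented predictors; it illustrates only the modelling setup, not the paper's analysis of the 216 items:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_items = 216
# Invented explanatory variables of the kind tried in difficulty modelling:
# stem word count, rated abstractness, cognitive-demand band.
X = np.column_stack([
    rng.normal(60, 20, n_items),   # stem word count
    rng.uniform(1, 5, n_items),    # rated abstractness
    rng.integers(1, 4, n_items),   # cognitive demand band
])
# True difficulty depends only weakly on the features, plus heavy noise,
# mimicking the low variance-explained typical in the literature.
y = 0.01 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 0.8, n_items)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
print("held-out R^2:", round(r2_score(y_te, model.predict(X_te)), 2))
```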