Showing all 11 results
Peer reviewed
Sohee Kim; Ki Lynn Cole – International Journal of Testing, 2025
This study conducted a comprehensive comparison of Item Response Theory (IRT) linking methods applied to a bifactor model, examining their performance on both multiple choice (MC) and mixed format tests within the common item nonequivalent group design framework. Four distinct multidimensional IRT linking approaches were explored, consisting of…
Descriptors: Item Response Theory, Comparative Analysis, Models, Item Analysis
Peer reviewed
Yang Du; Susu Zhang – Journal of Educational and Behavioral Statistics, 2025
Item compromise has long posed challenges in educational measurement, jeopardizing both test validity and test security of continuous tests. Detecting compromised items is therefore crucial to address this concern. The present literature on compromised item detection reveals two notable gaps: First, the majority of existing methods are based upon…
Descriptors: Item Response Theory, Item Analysis, Bayesian Statistics, Educational Assessment
Peer reviewed
PDF on ERIC
Öztürk-Gübes, Nese; Kelecioglu, Hülya – Educational Sciences: Theory and Practice, 2016
The purpose of this study was to examine the impact of dimensionality, common-item set format, and different scale linking methods on preserving equity property with mixed-format test equating. Item response theory (IRT) true-score equating (TSE) and IRT observed-score equating (OSE) methods were used under common-item nonequivalent groups design.…
Descriptors: Test Format, Item Response Theory, True Scores, Equated Scores
Peer reviewed
Albano, Anthony D. – Journal of Educational Measurement, 2013
In many testing programs it is assumed that the context or position in which an item is administered does not have a differential effect on examinee responses to the item. Violations of this assumption may bias item response theory estimates of item and person parameters. This study examines the potentially biasing effects of item position. A…
Descriptors: Test Items, Item Response Theory, Test Format, Questioning Techniques
Peer reviewed
Kirschner, Sophie; Borowski, Andreas; Fischer, Hans E.; Gess-Newsome, Julie; von Aufschnaiter, Claudia – International Journal of Science Education, 2016
Teachers' professional knowledge is assumed to be a key variable for effective teaching. As teacher education has the goal to enhance professional knowledge of current and future teachers, this knowledge should be described and assessed. Nevertheless, only a limited number of studies quantitatively measures physics teachers' professional…
Descriptors: Evaluation Methods, Tests, Test Format, Science Instruction
Peer reviewed
Camilli, Gregory – Educational Research and Evaluation, 2013
In the attempt to identify or prevent unfair tests, both quantitative analyses and logical evaluation are often used. For the most part, fairness evaluation is a pragmatic attempt at determining whether procedural or substantive due process has been accorded to either a group of test takers or an individual. In both the individual and comparative…
Descriptors: Alternative Assessment, Test Bias, Test Content, Test Format
Morsy, Leila; Kieffer, Michael; Snow, Catherine – Carnegie Corporation of New York, 2010
Although millions of dollars and weeks of instructional time are spent nationally on testing students, educators often have little information on how to choose appropriate assessments of adolescent reading for informing instruction. This guide is designed to meet that need by drawing together evidence about nine of the most commonly used,…
Descriptors: Reading Comprehension, Reading Tests, Evaluation Methods, Adolescents
Hamzah, Hanizah; Ariffin, Siti Rahayah; Yassin, Ruhizan Mohd – Journal of Science and Mathematics Education in Southeast Asia, 2006
This study explored the differential performance of mathematics test items used to test secondary school girls and boys in the national examination. The main purpose was to find out whether item type is the reason for girls' overachievement in the Malaysian mathematics national examination. To investigate seven types of items, Differential…
Descriptors: Test Items, Test Format, Females, Overachievement
Peer reviewed
Wang, Wen-Chung; Wilson, Mark – Educational and Psychological Measurement, 2005
This study presents a procedure for detecting differential item functioning (DIF) for dichotomous and polytomous items in testlet-based tests, whereby DIF is taken into account by adding DIF parameters into the Rasch testlet model. Simulations were conducted to assess recovery of the DIF and other parameters. Two independent variables, test type…
Descriptors: Test Format, Test Bias, Item Response Theory, Item Analysis
Peer reviewed
Ory, John C. – Educational and Psychological Measurement, 1982
In two studies, selections of evaluation form items were negatively worded and presented before or after overall student ratings. Ratings of courses and instructors were not significantly affected by wording. Differences in the global assessment of the courses are discussed. (Author/CM)
Descriptors: Course Evaluation, Evaluation Methods, Higher Education, Item Analysis
Austin, James T.; Mahlman, Robert A. – 2000
The process of assessment in career and technical education (CTE) is changing significantly under the influence of forces such as emphasis on assessment for individual and program accountability; emphasis on the investigation of consequences of assessment; emergence of item response theory, which supports computer adaptive testing; and pressure…
Descriptors: Career Education, Computer Assisted Testing, Computer Oriented Programs, Computer Uses in Education