Showing 1 to 15 of 48 results
Peer reviewed
Baghaei, Purya; Christensen, Karl Bang – Language Testing, 2023
C-tests are gap-filling tests mainly used as rough and economical measures of second-language proficiency for placement and research purposes. A C-test usually consists of several short independent passages where the second half of every other word is deleted. Owing to their interdependent structure, C-test items violate the local independence…
Descriptors: Item Response Theory, Language Tests, Language Proficiency, Second Language Learning
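As background for the local independence assumption this abstract refers to: a unidimensional IRT model assumes that, conditional on the latent trait \theta, item responses are independent,

P(X_1 = x_1, \ldots, X_n = x_n \mid \theta) = \prod_{i=1}^{n} P(X_i = x_i \mid \theta).

Gaps drawn from the same C-test passage typically share passage-specific variance beyond \theta, which is why this assumption is at issue for C-test items.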
Peer reviewed
Mimi Ismail; Ahmed Al-Badri; Said Al-Senaidi – Journal of Education and e-Learning Research, 2025
This study aimed to reveal differences in individuals' abilities, their standard errors, and the psychometric properties of the test across the two test administration methods (electronic and paper). The descriptive approach was used to achieve the study's objectives. The study sample consisted of 74 male and female students at the…
Descriptors: Achievement Tests, Computer Assisted Testing, Psychometrics, Item Response Theory
Peer reviewed
H. Cigdem Bulut; Okan Bulut; Ashley Clelland – Field Methods, 2025
In this study, we explored psychometric network analysis (PNA) as an alternative method for identifying item wording effects in self-report instruments. We examined the functioning of negatively worded items in the network structures of two math-related scales from the 2019 Trends in International Mathematics and Science Study (TIMSS); Students…
Descriptors: Psychometrics, Network Analysis, Identification, Test Items
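For orientation, edges in a psychometric network are commonly estimated as partial correlations derived from the precision matrix K (the inverse of the item covariance matrix); a standard expression, not specific to this study, is

\omega_{ij} = -\frac{\kappa_{ij}}{\sqrt{\kappa_{ii}\,\kappa_{jj}}},

where \omega_{ij} is the edge weight between items i and j after conditioning on all other items, often with regularization (e.g., the graphical lasso) applied to sparsify the network.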
Peer reviewed
Qi Huang; Daniel M. Bolt; Weicong Lyu – Large-scale Assessments in Education, 2024
Large scale international assessments depend on invariance of measurement across countries. An important consideration when observing cross-national differential item functioning (DIF) is whether the DIF actually reflects a source of bias, or might instead be a methodological artifact reflecting item response theory (IRT) model misspecification.…
Descriptors: Test Items, Item Response Theory, Test Bias, Test Validity
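As a generic illustration of the DIF notion discussed in this abstract (not the authors' specific model), a 2PL item response function with country-specific parameters can be written as

P(X_{ij} = 1 \mid \theta_i, g) = \frac{1}{1 + \exp\left[-a_j^{(g)}\left(\theta_i - b_j^{(g)}\right)\right]},

where DIF is said to be present when a_j^{(g)} or b_j^{(g)} differs across groups g after the latent scales have been linked; the abstract's point is that such differences may reflect model misspecification rather than genuine bias.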
Peer reviewed
Joo, Seang-Hwane; Khorramdel, Lale; Yamamoto, Kentaro; Shin, Hyo Jeong; Robin, Frederic – Educational Measurement: Issues and Practice, 2021
In the Programme for International Student Assessment (PISA), item response theory (IRT) scaling is used to examine the psychometric properties of items and scales and to provide comparable test scores across participating countries and over time. To balance the comparability of IRT item parameter estimations across countries with the best possible…
Descriptors: Foreign Countries, International Assessment, Achievement Tests, Secondary School Students
Peer reviewed
Chengyu Cui; Chun Wang; Gongjun Xu – Grantee Submission, 2024
Multidimensional item response theory (MIRT) models have generated increasing interest in the psychometrics literature. Efficient approaches for estimating MIRT models with dichotomous responses have been developed, but constructing an equally efficient and robust algorithm for polytomous models has received limited attention. To address this gap,…
Descriptors: Item Response Theory, Accuracy, Simulation, Psychometrics
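For orientation, a standard multidimensional 2PL for dichotomous responses, the case this abstract contrasts with the polytomous one, models the probability of a correct response as a function of a vector of latent traits:

P(X_{ij} = 1 \mid \boldsymbol{\theta}_i) = \frac{1}{1 + \exp\left[-(\mathbf{a}_j^{\top}\boldsymbol{\theta}_i + d_j)\right]},

where \mathbf{a}_j is the vector of item discriminations and d_j the item intercept; polytomous extensions such as the multidimensional graded response model add ordered category thresholds, which is where the estimation challenge noted above arises.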
Peer reviewed
Guven Demir, Elif; Öksuz, Yücel – Participatory Educational Research, 2022
This research aimed to investigate animation-based achievement tests in terms of item format, psychometric features, students' performance, and gender. The study sample consisted of 52 fifth-grade students in Samsun, Turkey, in 2017-2018. The measures used in the research were open-ended (OE), animation-based open-ended (AOE), multiple-choice (MC), and…
Descriptors: Animation, Achievement Tests, Test Items, Psychometrics
Peer reviewed
Cascella, Clelia; Giberti, Chiara; Bolondi, Giorgio – Education Sciences, 2021
This study is aimed at exploring how different formulations of the same mathematical item may influence students' answers, and whether or not boys and girls are equally affected by differences in presentation. An experimental design was employed: the same stem-items (i.e., items with the same mathematical content and question intent) were…
Descriptors: Mathematics Achievement, Mathematics Tests, Achievement Tests, Scores
Peer reviewed
Ezechukwu, Roseline Ifeoma; Oguguo, Basil Chinecherem E.; Ene, Catherine U.; Ugorji, Clifford O. – World Journal of Education, 2020
This study determined the psychometric properties of the Economics Achievement Test (EAT) using Item Response Theory (IRT). Two popular IRT models, namely the one-parameter logistic (1PL) and two-parameter logistic (2PL) models, were utilized. The researchers adopted an instrumentation research design. Four research questions and two hypotheses were…
Descriptors: Economics Education, Economics, Achievement Tests, Psychometrics
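For reference, the two models named in the abstract take the following standard logistic forms, with \theta the examinee ability, b_j the difficulty of item j, and a_j its discrimination:

1PL: P(X_j = 1 \mid \theta) = \frac{1}{1 + \exp[-(\theta - b_j)]}
2PL: P(X_j = 1 \mid \theta) = \frac{1}{1 + \exp[-a_j(\theta - b_j)]}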
Peer reviewed
Tsaousis, Ioannis; Sideridis, Georgios; Al-Saawi, Fahad – International Journal of Testing, 2018
The aim of the present study was to examine Differential Distractor Functioning (DDF) as a means of improving the quality of a measure through understanding biased responses across groups. A DDF analysis could shed light on the potential sources of construct-irrelevant variance by examining whether the differential selection of incorrect choices…
Descriptors: Foreign Countries, College Entrance Examinations, Test Bias, Chemistry
Peer reviewed
Yildirim, Hüseyin H. – Educational Assessment, Evaluation and Accountability, 2021
From a sociocognitive perspective, item parameters in a test represent regularities in examinees' item responses. These regularities originate from shared experiences among individuals interacting with their environment. Theories explaining the relationship between culture and cognition also acknowledge these shared experiences as the…
Descriptors: Educational Assessment, Test Items, Item Response Theory, Psychometrics
Peer reviewed
Jerrim, John; Parker, Philip; Choi, Alvaro; Chmielewski, Anna Katyn; Sälzer, Christine; Shure, Nikki – Educational Measurement: Issues and Practice, 2018
The Programme for International Student Assessment (PISA) is an important international study of 15-year-olds' knowledge and skills. New results are released every three years and have a substantial impact upon education policy. Yet, despite its influence, the methodology underpinning PISA has received significant criticism. Much of this criticism has…
Descriptors: Educational Assessment, Comparative Education, Achievement Tests, Foreign Countries
Peer reviewed
Culpepper, Steven Andrew – Journal of Educational and Behavioral Statistics, 2017
In the absence of clear incentives, achievement tests may be subject to the effect of slipping where item response functions have upper asymptotes below one. Slipping reduces score precision for higher latent scores and distorts test developers' understandings of item and test information. A multidimensional four-parameter normal ogive model was…
Descriptors: Measurement, Achievement Tests, Item Response Theory, National Competency Tests
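The "slipping" effect described in this abstract corresponds to an upper asymptote below one in a four-parameter item response function; in its logistic form (the study itself works with a multidimensional normal ogive analogue), this is commonly written as

P(X_j = 1 \mid \theta) = c_j + (d_j - c_j)\,\frac{1}{1 + \exp[-a_j(\theta - b_j)]}, \qquad 0 \le c_j < d_j \le 1,

where c_j is the lower (guessing) asymptote and slipping is present whenever the upper asymptote d_j is below one.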
Peer reviewed
Fishbein, Bethany; Martin, Michael O.; Mullis, Ina V. S.; Foy, Pierre – Large-scale Assessments in Education, 2018
Background: TIMSS 2019 is the first assessment in the TIMSS transition to a computer-based assessment system, called eTIMSS. The TIMSS 2019 Item Equivalence Study was conducted in advance of the field test in 2017 to examine the potential for mode effects on the psychometric behavior of the TIMSS mathematics and science trend items induced by the…
Descriptors: Mathematics Achievement, Science Achievement, Mathematics Tests, Elementary Secondary Education
Peer reviewed
Huggins-Manley, Anne Corinne – Educational and Psychological Measurement, 2017
This study defines subpopulation item parameter drift (SIPD) as a change in item parameters over time that is dependent on subpopulations of examinees, and hypothesizes that the presence of SIPD in anchor items is associated with bias and/or lack of invariance in three psychometric outcomes. Results show that SIPD in anchor items is associated…
Descriptors: Psychometrics, Test Items, Item Response Theory, Hypothesis Testing
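A compact restatement of the definition given in this abstract, using illustrative notation: with b_j^{(t,g)} the difficulty of anchor item j at time t in subpopulation g, ordinary item parameter drift means the parameter changes over time, whereas SIPD means the change itself differs across subpopulations,

b_j^{(t_2, g)} - b_j^{(t_1, g)} \ne b_j^{(t_2, g')} - b_j^{(t_1, g')} \quad \text{for some } g \ne g'.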