Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 1 |
Since 2016 (last 10 years) | 5 |
Since 2006 (last 20 years) | 5 |
Descriptor
Item Response Theory | 5 |
Monte Carlo Methods | 5 |
Predictor Variables | 5 |
Comparative Analysis | 3 |
Bayesian Statistics | 2 |
Correlation | 2 |
Difficulty Level | 2 |
Maximum Likelihood Statistics | 2 |
Models | 2 |
Simulation | 2 |
Statistical Analysis | 2 |
Source
Educational and Psychological Measurement | 2 |
Applied Measurement in Education | 1 |
ETS Research Report Series | 1 |
International Journal of Testing | 1 |
Author
Algina, James | 1 |
Ames, Allison J. | 1 |
Aydin, Burak | 1 |
Cao, Mengyang | 1 |
Fifield, Steve | 1 |
Ford, Danielle | 1 |
Glutting, Joseph | 1 |
Koziol, Natalie A. | 1 |
Leite, Walter L. | 1 |
Myers, Aaron J. | 1 |
Nandakumar, Ratna | 1 |
Publication Type
Journal Articles | 5 |
Reports - Research | 5 |
Education Level
Grade 8 | 1 |
Assessments and Surveys
National Longitudinal Study… | 1 |
Trends in International… | 1 |
Ames, Allison J.; Myers, Aaron J. – Educational and Psychological Measurement, 2021
Contamination of responses due to extreme and midpoint response style can confound the interpretation of scores, threatening the validity of inferences made from survey responses. This study incorporated person-level covariates in the multidimensional item response tree model to explain heterogeneity in response style. We include an empirical…
Descriptors: Response Style (Tests), Item Response Theory, Longitudinal Studies, Adolescents
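As a point of reference for the modeling approach described in this record, below is a minimal LaTeX sketch of an item response tree (IRTree) node with a person-level covariate regression on the response-style trait; the 2PL node form and the symbols (a, b, gamma, x_j) are illustrative assumptions, not the authors' exact parameterization.

```latex
% Illustrative IRTree node model (assumed 2PL form) with a latent
% regression of the response-style trait on person covariates x_j.
P(Y_{ijd} = 1 \mid \theta_{jd}) =
  \frac{\exp\{a_{id}(\theta_{jd} - b_{id})\}}
       {1 + \exp\{a_{id}(\theta_{jd} - b_{id})\}},
\qquad
\theta_{jd} = \boldsymbol{\gamma}_d^{\top} \mathbf{x}_j + \zeta_{jd},
\quad \zeta_{jd} \sim N(0, \sigma_d^2)
```

Here d indexes the tree node (for example, a midpoint or extreme-response decision), so the covariate effects gamma_d are what explain between-person heterogeneity in that response style.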
Cao, Mengyang; Song, Q. Chelsea; Tay, Louis – International Journal of Testing, 2018
There is a growing use of noncognitive assessments around the world, and recent research has posited an ideal point response process underlying such measures. A critical issue is whether the typical use of dominance approaches (e.g., average scores, factor analysis, and Samejima's graded response model) in scoring such measures is adequate.…
Descriptors: Comparative Analysis, Item Response Theory, Factor Analysis, Models
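For orientation, Samejima's graded response model, one of the dominance approaches named in the abstract, can be written in its standard form as follows; how the study applied it is not shown here.

```latex
% Samejima's graded response model for item i with categories k = 0..m_i.
P(X_{ij} \ge k \mid \theta_j) = \frac{1}{1 + \exp\{-a_i(\theta_j - b_{ik})\}},
\qquad k = 1, \dots, m_i,
\\
P(X_{ij} = k \mid \theta_j) = P(X_{ij} \ge k \mid \theta_j)
                            - P(X_{ij} \ge k + 1 \mid \theta_j)
```

with P(X >= 0) = 1 and P(X >= m_i + 1) = 0. Under a dominance process the boundary probabilities increase monotonically in theta; an ideal point process instead peaks where theta is near the item's location, which is why the choice of scoring approach matters.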
Koziol, Natalie A. – Applied Measurement in Education, 2016
Testlets, or groups of related items, are commonly included in educational assessments due to their many logistical and conceptual advantages. Despite their advantages, testlets introduce complications into the theory and practice of educational measurement. Responses to items within a testlet tend to be correlated even after controlling for…
Descriptors: Classification, Accuracy, Comparative Analysis, Models
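Local dependence among items sharing a testlet is commonly absorbed by a person-by-testlet random effect. A minimal sketch of a standard testlet response model is given below; this is a generic formulation, not necessarily the specific models the article compares.

```latex
% Testlet response model: gamma_{j,d(i)} is person j's random effect
% for the testlet d(i) that contains item i.
\operatorname{logit} P(Y_{ij} = 1 \mid \theta_j, \gamma_{j d(i)}) =
  a_i \bigl( \theta_j - b_i - \gamma_{j d(i)} \bigr),
\qquad \gamma_{jd} \sim N(0, \sigma_d^2)
```

Setting the testlet variance to zero recovers the ordinary 2PL, so sigma_d^2 quantifies how much residual within-testlet correlation remains after controlling for the primary trait.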
Aydin, Burak; Leite, Walter L.; Algina, James – Educational and Psychological Measurement, 2016
We investigated methods of including covariates in two-level models for cluster randomized trials to increase power to detect the treatment effect. We compared multilevel models that included either an observed cluster mean or a latent cluster mean as a covariate, as well as the effect of including Level 1 deviation scores in the model. A Monte…
Descriptors: Error of Measurement, Predictor Variables, Randomized Controlled Trials, Experimental Groups
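Below is a minimal, hypothetical Monte Carlo sketch of the kind of comparison the abstract describes: simulate a cluster randomized trial, then fit a two-level model that includes the observed cluster mean of a covariate and the Level 1 deviation scores. The sample sizes, effect sizes, and use of statsmodels are illustrative assumptions, not the authors' simulation design.

```python
# Hypothetical sketch: one replication of a cluster randomized trial,
# analyzed with a two-level model containing an observed cluster-mean
# covariate and Level 1 deviation scores (illustrative values only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

def simulate_trial(n_clusters=40, n_per_cluster=20, treat_effect=0.3, icc=0.1):
    cluster = np.repeat(np.arange(n_clusters), n_per_cluster)
    treat = np.repeat(rng.permutation([0, 1] * (n_clusters // 2)), n_per_cluster)
    x = rng.normal(size=cluster.size)                    # Level 1 covariate
    u = rng.normal(scale=np.sqrt(icc), size=n_clusters)  # cluster random effects
    e = rng.normal(scale=np.sqrt(1 - icc), size=cluster.size)
    y = treat_effect * treat + 0.5 * x + u[cluster] + e
    return pd.DataFrame({"y": y, "treat": treat, "x": x, "cluster": cluster})

df = simulate_trial()
# Observed cluster mean and Level 1 deviation score for the covariate
df["x_mean"] = df.groupby("cluster")["x"].transform("mean")
df["x_dev"] = df["x"] - df["x_mean"]

# Two-level model: treatment effect adjusted for the cluster-mean and
# within-cluster deviation components of the covariate
fit = smf.mixedlm("y ~ treat + x_mean + x_dev", df, groups=df["cluster"]).fit()
print(fit.summary())
```

Repeating such a simulation many times and recording how often the treatment effect is detected yields the power comparison; a latent cluster mean would instead be estimated within the model (for example, in a multilevel SEM) rather than computed from the sample as above.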
Qian, Xiaoyu; Nandakumar, Ratna; Glutting, Joseph; Ford, Danielle; Fifield, Steve – ETS Research Report Series, 2017
In this study, we investigated gender and minority achievement gaps on 8th-grade science items employing a multilevel item response methodology. Both gaps were wider on physics and earth science items than on biology and chemistry items. Larger gender gaps were found on items with specific topics favoring male students than on other items, for…
Descriptors: Item Analysis, Gender Differences, Achievement Gap, Grade 8
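One common way to set up a multilevel item response model for group gaps is sketched below in LaTeX; the Rasch-type parameterization and the group indicators are assumptions for illustration and may differ from the article's specification.

```latex
% Rasch-type multilevel item response model with a latent regression of
% proficiency on group indicators (e.g., gender g_j, minority status m_j).
\operatorname{logit} P(Y_{ij} = 1 \mid \theta_j) = \theta_j - b_i,
\qquad
\theta_j = \gamma_0 + \gamma_1 g_j + \gamma_2 m_j + \zeta_j,
\quad \zeta_j \sim N(0, \sigma^2)
```

Adding an item-by-group term such as delta_i g_j to the logit lets the gap vary across items, which is how item-level differences (for example, physics versus biology items) can be examined.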