Showing all 4 results
Yu Wang – ProQuest LLC, 2024
The multiple-choice (MC) item format has been widely used in educational assessments across diverse content domains. MC items purportedly allow for collecting richer diagnostic information. The effectiveness and economy of administering MC items may have further contributed to their popularity beyond educational assessment. The MC item format…
Descriptors: Multiple Choice Tests, Cognitive Tests, Cognitive Measurement, Educational Diagnosis
Lotfi Simon Kerzabi – ProQuest LLC, 2021
Monte Carlo methods are an accepted methodology for generating critical values for a maximum test. The same methods also apply to evaluating the robustness of the newly created test. A table of critical values was created, and the robustness of the new maximum test was evaluated for five different distributions. Robustness…
Descriptors: Data, Monte Carlo Methods, Testing, Evaluation Research
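The Monte Carlo approach the abstract describes can be sketched briefly: simulate the null distribution of the maximum test statistic many times and take an upper quantile as the critical value. The sketch below is illustrative only; the abstract does not specify which component tests the dissertation's maximum test combines, so a pooled t statistic and a standardized Wilcoxon rank-sum statistic are assumed here as an example pair.

```python
import numpy as np

def mc_critical_value(n, alpha=0.05, reps=10000, seed=0):
    """Monte Carlo critical value for a maximum test taking the larger
    of |t| (pooled two-sample t) and |z| (standardized Wilcoxon
    rank-sum). Component tests are illustrative assumptions, not
    taken from the dissertation."""
    rng = np.random.default_rng(seed)
    stats = np.empty(reps)
    for r in range(reps):
        # two equal-size samples under the null (same distribution)
        x = rng.standard_normal(n)
        y = rng.standard_normal(n)
        # pooled two-sample t statistic
        sp = np.sqrt((x.var(ddof=1) + y.var(ddof=1)) / 2)
        t = (x.mean() - y.mean()) / (sp * np.sqrt(2 / n))
        # Wilcoxon rank-sum statistic for x, standardized under H0
        ranks = np.argsort(np.argsort(np.concatenate([x, y]))) + 1
        w = ranks[:n].sum()
        mu = n * (2 * n + 1) / 2
        sigma = np.sqrt(n * n * (2 * n + 1) / 12)
        z = (w - mu) / sigma
        stats[r] = max(abs(t), abs(z))
    # critical value = upper-alpha quantile of the simulated maxima
    return np.quantile(stats, 1 - alpha)

cv = mc_critical_value(n=20)
```

Robustness can then be assessed the same way: repeat the simulation under non-normal distributions and check how far the empirical rejection rate drifts from the nominal alpha.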
Zebing Wu – ProQuest LLC, 2024
Response style, a common aberrancy in non-cognitive assessments in psychological fields, is problematic because it yields inaccurate estimates of item and person parameters, which in turn leads to serious reliability, validity, and fairness issues (Baumgartner & Steenkamp, 2001; Bolt & Johnson, 2009; Bolt & Newton, 2011). Response style refers to…
Descriptors: Response Style (Tests), Accuracy, Preferences, Psychological Testing
Jinjin Huang – ProQuest LLC, 2020
Measurement invariance is crucial for an effective and valid measure of a construct. Invariance holds when the latent trait varies consistently across subgroups; in other words, the mean differences among subgroups are only due to true latent ability differences. Differential item functioning (DIF) occurs when measurement invariance is violated.…
Descriptors: Robustness (Statistics), Item Response Theory, Test Items, Item Analysis
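A common way to flag the DIF this abstract describes is the Mantel-Haenszel procedure: stratify examinees by total score and compare the odds of answering the studied item correctly across groups. The sketch below is a minimal illustration of that standard check, not the dissertation's own method; all variable names are hypothetical.

```python
import numpy as np

def mantel_haenszel_or(resp_ref, resp_foc, total_ref, total_foc):
    """Mantel-Haenszel common odds ratio for one studied item,
    stratifying on total test score.

    resp_*  : 0/1 correct responses to the studied item
    total_* : matching total scores for each examinee

    A ratio near 1.0 suggests no DIF; values far from 1.0 flag it.
    """
    num = den = 0.0
    for k in np.unique(np.concatenate([total_ref, total_foc])):
        r = resp_ref[total_ref == k]   # reference group at score k
        f = resp_foc[total_foc == k]   # focal group at score k
        nk = len(r) + len(f)
        if len(r) == 0 or len(f) == 0:
            continue                   # stratum has only one group
        a = r.sum()                    # reference correct
        b = len(r) - a                 # reference incorrect
        c = f.sum()                    # focal correct
        d = len(f) - c                 # focal incorrect
        num += a * d / nk
        den += b * c / nk
    return num / den
```

With identical response patterns in both groups the ratio is exactly 1.0, which is the no-DIF baseline against which real data are compared.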