Leighton, Elizabeth A. – ProQuest LLC, 2022
The use of unidimensional scales that contain both positively and negatively worded items is common in both the educational and psychological fields. However, dimensionality investigations of these instruments often lead to a rejection of the theorized unidimensional model in favor of multidimensional structures, leaving researchers at odds for…
Descriptors: Test Items, Language Usage, Models, Statistical Analysis
He, Dan – ProQuest LLC, 2023
This dissertation examines the effectiveness of machine learning algorithms and feature engineering techniques for analyzing process data and predicting test performance. The study compares three classification approaches and identifies item-specific process features that are highly predictive of student performance. The findings suggest that…
Descriptors: Artificial Intelligence, Data Analysis, Algorithms, Classification
Chang, Kuo-Feng – ProQuest LLC, 2022
This dissertation was designed to foster a deeper understanding of population invariance in the context of composite-score equating and provide practitioners with guidelines for addressing score equity concerns at the composite score level. The purpose of this dissertation was threefold. The first was to compare different composite equating…
Descriptors: Test Items, Equated Scores, Methods, Design
Hess, Jessica – ProQuest LLC, 2023
This study was conducted to further research into the impact of student-group item parameter drift (SIPD), referred to as subpopulation item parameter drift in previous research, on ability estimates and proficiency classification accuracy when it occurs in the discrimination parameter of a 2-PL item response theory (IRT) model. Using Monte…
Descriptors: Test Items, Groups, Ability, Item Response Theory
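The 2-PL model named in the abstract gives the probability of a correct response as a logistic function of examinee ability. A minimal sketch (all numeric values are hypothetical, not from the dissertation) shows how drift in the discrimination parameter alone changes that probability for one student group:

```python
import math

def p_correct_2pl(theta, a, b):
    """2PL IRT item response function: probability that an examinee of
    ability theta answers correctly, given discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Same item, same ability, but the discrimination parameter has drifted
# for one student group (hypothetical values):
p_reference = p_correct_2pl(theta=0.5, a=1.2, b=0.0)  # ≈ 0.646
p_drifted = p_correct_2pl(theta=0.5, a=0.8, b=0.0)    # ≈ 0.599
```

Ability estimates based on the original calibration would then be distorted for the drifted group, which is the kind of effect a simulation study of SIPD can quantify.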
Lin Ma – ProQuest LLC, 2024
This dissertation presents an innovative approach to examining keying method, wording method, and construct validity in psychometric instruments. Employing a mixed-methods explanatory sequential design, the study examined and validated the effects of keying and wording in two psychometric assessments. Those two self-report psychometric…
Descriptors: Evaluation, Psychometrics, Measures (Individuals), Instrumentation
Mingjia Ma – ProQuest LLC, 2023
Response time is an important research topic in the field of psychometrics. This dissertation explores response time properties across several item characteristics and examinee characteristics, as well as the interactions between response time and response outcomes, using data from a statewide mathematics assessment in two grades.…
Descriptors: Reaction Time, Mathematics Tests, Standardized Tests, State Standards
Paige Haley – ProQuest LLC, 2023
As the research on feigning has grown, the number and quality of performance validity tests (PVTs) has increased as well. However, while several PVTs have been developed from assessments commonly used as part of neuropsychological batteries, there has been less exploration for PVTs scored from items in cognitive screeners. The Montreal Cognitive…
Descriptors: Cognitive Measurement, Performance, Test Validity, Psychological Testing
Matthew John Davidson – ProQuest LLC, 2022
Digitally-based assessments create opportunities for collecting moment to moment information about how students are responding to assessment items. This information, called log or process data, has long been regarded as a vast and valuable source of data about student performance. Despite repeated assurances of its vastness and value, process data…
Descriptors: Data Use, Psychometrics, Item Response Theory, Test Items
Jing Ma – ProQuest LLC, 2024
This study investigated the impact of scoring polytomous items later on measurement precision, classification accuracy, and test security in mixed-format adaptive testing. Utilizing the shadow test approach, a simulation study was conducted across various test designs, test lengths, and numbers and locations of polytomous items. Results showed that while…
Descriptors: Scoring, Adaptive Testing, Test Items, Classification
Jackson, Kayla – ProQuest LLC, 2023
Prior research highlights the benefits of multimode surveys and best practices for item-by-item (IBI) and matrix-type survey items. Some researchers have explored whether mode differences for online and paper surveys persist for these survey item types. However, no studies discuss measurement invariance when both item types and online modes are…
Descriptors: Test Items, Surveys, Error of Measurement, Item Response Theory
Kane, Jesse F. – ProQuest LLC, 2023
The idea of student engagement as a predictor of student success was first introduced by Alexander Astin (1974; 1984), who studied student involvement. The connection between student involvement and student success has led to a focus on student engagement and how we measure it, to ensure that institutions are doing all they can to improve outcomes. Nothing has…
Descriptors: Learner Engagement, College Freshmen, College Seniors, Student Surveys
Lanrong Li – ProQuest LLC, 2021
When developing a test, it is essential to ensure that the test is free of items with differential item functioning (DIF). DIF occurs when examinees of equal ability, but from different examinee subgroups, have different chances of getting the item correct. According to the multidimensional perspective, DIF occurs because the test measures more…
Descriptors: Test Bias, Test Items, Meta Analysis, Effect Size
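The abstract's definition of DIF (equal-ability examinees from different subgroups with different chances of a correct response) is commonly quantified with the Mantel-Haenszel common odds ratio across ability strata. A minimal sketch with made-up counts (this is a standard DIF statistic, not necessarily the effect-size measure used in the dissertation):

```python
def mantel_haenszel_or(strata):
    """Mantel-Haenszel common odds ratio across ability strata.
    Each stratum: (ref_correct, ref_incorrect, focal_correct, focal_incorrect).
    Values near 1.0 suggest no DIF; larger values favor the reference group."""
    num = 0.0
    den = 0.0
    for r1, r0, f1, f0 in strata:
        n = r1 + r0 + f1 + f0
        num += r1 * f0 / n  # reference correct, focal incorrect
        den += r0 * f1 / n  # reference incorrect, focal correct
    return num / den

# Hypothetical response counts for two ability strata:
strata = [(40, 10, 30, 20), (30, 20, 20, 30)]
alpha_mh = mantel_haenszel_or(strata)  # ≈ 2.43: reference group advantaged
```

Stratifying on total score is what holds ability constant, so a ratio far from 1.0 flags group-dependent item behavior rather than a true ability difference.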
Montserrat Beatriz Valdivia Medinaceli – ProQuest LLC, 2023
My dissertation examines three current challenges of international large-scale assessments (ILSAs) associated with the transition from linear testing to an adaptive testing design. ILSAs are important for making comparisons among populations and informing countries about the quality of their educational systems. ILSAs' results inform policymakers…
Descriptors: International Assessment, Achievement Tests, Adaptive Testing, Test Items
Patterson, Christopher R. – ProQuest LLC, 2023
Typical approaches to test and item development are rooted in the "Standards for Educational and Psychological Testing." Culturally responsive and antiracist assessment practices are two newer approaches that challenge the typical process noted in the "Standards," incorporating critical race theory and cultural responsiveness into…
Descriptors: College Students, Student Attitudes, Culturally Relevant Education, Test Items
Jiajing Huang – ProQuest LLC, 2022
The nonequivalent-groups anchor-test (NEAT) data-collection design is commonly used in large-scale assessments. Under this design, different test groups take different test forms. Each test form has its own unique items and all test forms share a set of common items. If item response theory (IRT) models are applied to analyze the test data, the…
Descriptors: Item Response Theory, Test Format, Test Items, Test Construction
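Under the NEAT design described in the abstract, the common items are what allow separately calibrated forms to be placed on one IRT scale. A minimal sketch of mean/mean linking, one standard method for computing the linking constants (parameter values are hypothetical, and the dissertation may use a different linking approach):

```python
def mean_mean_linking(a_new, b_new, a_ref, b_ref):
    """Mean/mean IRT scale linking from common-item parameter estimates.
    Returns constants (A, B) placing the new form on the reference scale:
    theta_ref = A * theta_new + B, b_ref = A * b_new + B, a_ref = a_new / A."""
    mean = lambda xs: sum(xs) / len(xs)
    A = mean(a_new) / mean(a_ref)
    B = mean(b_ref) - A * mean(b_new)
    return A, B

# Hypothetical common-item estimates from two separate calibrations:
a_new, b_new = [1.0, 1.2, 0.8], [0.2, -0.1, 0.5]
a_ref, b_ref = [0.9, 1.1, 0.7], [0.4, 0.1, 0.7]
A, B = mean_mean_linking(a_new, b_new, a_ref, b_ref)
```

Once A and B are in hand, every unique item on the new form can be rescaled the same way, which is what makes cross-form score comparisons under NEAT meaningful.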