Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 1
Since 2016 (last 10 years): 4
Since 2006 (last 20 years): 6
Source
International Journal of Testing: 6
Author
Arikan, Serkan: 1
Bulut, Hatice Cigdem: 1
Bulut, Okan: 1
Chang, Rong: 1
Finch, W. Holmes: 1
Frazier, Paul: 1
French, Brian F.: 1
Guo, Fanmin: 1
Han, Kyung T.: 1
Hernández Finch, Maria E.: 1
Kieftenbeld, Vincent: 1
Publication Type
Journal Articles: 6
Reports - Research: 5
Reports - Descriptive: 1
Education Level
Elementary Education: 2
Grade 4: 2
Intermediate Grades: 2
Secondary Education: 2
Higher Education: 1
Location
Finland: 1
United States: 1
Assessments and Surveys
Progress in International Reading Literacy Study: 2
Program for International Student Assessment: 1
Wechsler Adult Intelligence Scale: 1
Bulut, Hatice Cigdem; Bulut, Okan; Arikan, Serkan – International Journal of Testing, 2023
This study examined group differences in online reading comprehension (ORC) using student data from the 2016 administration of the Progress in International Reading Literacy Study (ePIRLS). An explanatory item response modeling approach was used to explore the effects of item properties (i.e., item format, text complexity, and cognitive…
Descriptors: International Assessment, Achievement Tests, Grade 4, Foreign Countries
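For readers unfamiliar with the approach named in this abstract, the sketch below illustrates one rough, simplified form of explanatory item response modeling: item difficulty is decomposed into effects of item properties, with a standardized total score standing in for person proficiency. The data, the two item properties, and the total-score proxy are all assumptions for illustration, not the authors' ePIRLS analysis:

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_persons, n_items = 500, 20

# Hypothetical item properties (placeholders, not the ePIRLS variables)
item_format = rng.integers(0, 2, n_items)          # 0 = multiple choice, 1 = constructed response
text_complexity = rng.normal(0.0, 1.0, n_items)    # standardized complexity score

# Simulate Rasch-like responses in which item difficulty depends on the properties
theta = rng.normal(0.0, 1.0, n_persons)                      # person proficiency
difficulty = 0.8 * item_format + 0.5 * text_complexity
p = 1.0 / (1.0 + np.exp(-(theta[:, None] - difficulty[None, :])))
y = rng.binomial(1, p)

# Long format: one row per person-item response; a standardized total score
# crudely replaces person parameters to keep the sketch short
total = y.sum(axis=1)
long = pd.DataFrame({
    "correct": y.ravel(),
    "item_format": np.tile(item_format, n_persons),
    "text_complexity": np.tile(text_complexity, n_persons),
    "proficiency_proxy": np.repeat((total - total.mean()) / total.std(), n_items),
})

X = sm.add_constant(long[["item_format", "text_complexity", "proficiency_proxy"]])
fit = sm.GLM(long["correct"], X, family=sm.families.Binomial()).fit()

# Negative coefficients for the item properties indicate that constructed-response
# format and higher text complexity make a correct response less likely (i.e., harder)
print(fit.params)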
Smith, Patriann; Frazier, Paul; Lee, Jaehoon; Chang, Rong – International Journal of Testing, 2018
Previous research has primarily addressed the effects of language on the Program for International Student Assessment (PISA) mathematics and science assessments. More recent research has focused on the effects of language on PISA reading comprehension and literacy assessments among student populations in specific Organization for Economic Cooperation…
Descriptors: Achievement Tests, Foreign Countries, International Assessment, Secondary School Students
Finch, W. Holmes; Hernández Finch, Maria E.; French, Brian F. – International Journal of Testing, 2016
Differential item functioning (DIF) assessment is key in score validation. When DIF is present, scores may not accurately reflect the construct of interest for some groups of examinees, leading to incorrect conclusions from the scores. Given rising immigration and the increased reliance of educational policymakers on cross-national assessments…
Descriptors: Test Bias, Scores, Native Language, Language Usage
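One common DIF screening statistic (not necessarily the procedure used in this study) is the Mantel-Haenszel common odds ratio, which compares two groups' performance on an item after matching examinees on a total score; values near 1 suggest little DIF. The sketch below uses simulated data and a crude matching score, both assumptions for illustration:

import numpy as np
import pandas as pd

def mantel_haenszel_odds_ratio(item, group, matching_score):
    """item: 0/1 responses; group: 0 = reference, 1 = focal; matching_score: e.g. total score."""
    data = pd.DataFrame({"item": item, "group": group, "score": matching_score})
    num, den = 0.0, 0.0
    for _, stratum in data.groupby("score"):
        ref = stratum[stratum["group"] == 0]["item"]
        foc = stratum[stratum["group"] == 1]["item"]
        n = len(stratum)
        a, b = (ref == 1).sum(), (ref == 0).sum()   # reference correct / incorrect
        c, d = (foc == 1).sum(), (foc == 0).sum()   # focal correct / incorrect
        num += a * d / n
        den += b * c / n
    return num / den if den > 0 else np.nan          # ~1.0 suggests no DIF on this item

# Toy data: the studied item is slightly harder for the focal group at equal ability
rng = np.random.default_rng(1)
n = 1000
group = rng.integers(0, 2, n)
ability = rng.normal(0.0, 1.0, n)
item = rng.binomial(1, 1.0 / (1.0 + np.exp(-(ability - 0.4 * group))))
matching = (ability[None, :] > rng.normal(0.0, 1.0, (20, n))).sum(axis=0)  # crude total score
print(mantel_haenszel_odds_ratio(item, group, matching))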
Shermis, Mark D.; Mao, Liyang; Mulholland, Matthew; Kieftenbeld, Vincent – International Journal of Testing, 2017
This study uses the feature sets employed by two automated scoring engines to determine whether a "linguistic profile" could be formulated that would help identify items that are likely to exhibit differential item functioning (DIF) based on linguistic features. Sixteen items were administered to 1,200 students, for whom demographic information…
Descriptors: Computer Assisted Testing, Scoring, Hypothesis Testing, Essays
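The idea of a "linguistic profile" for DIF-prone items can be illustrated, very loosely, by relating item-level linguistic features to a DIF flag with a simple classifier. The features, the item count, and the data below are invented for illustration and are not the engines or features from this study:

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n_items = 300  # far more items than the study's 16, purely so the toy model can be fit

# Hypothetical linguistic features per item: word count, rare-word rate, parse-tree depth
features = np.column_stack([
    rng.normal(150, 40, n_items),
    rng.beta(2, 20, n_items),
    rng.normal(4.0, 1.0, n_items),
])
# Hypothetical DIF flag loosely driven by the rare-word rate
dif_flag = (rng.random(n_items) < 0.15 + 1.5 * features[:, 1]).astype(int)

# Standardize so coefficient magnitudes are comparable across features
z = (features - features.mean(axis=0)) / features.std(axis=0)
clf = LogisticRegression(max_iter=1000).fit(z, dif_flag)
print(dict(zip(["word_count", "rare_word_rate", "tree_depth"], clf.coef_[0])))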
Roivainen, Eka – International Journal of Testing, 2013
To study the concept of a national IQ profile, we compared U.S. and Finnish WAIS, WAIS-R, and WAIS-III nonverbal and working memory subtest norms. The U.S. standardization samples had consistently higher scores on the Coding and Digit Span subtests, while the Finnish samples had higher scores on the Block Design subtest. No stable cross-national…
Descriptors: Intelligence Tests, Profiles, Cultural Influences, Nonverbal Tests
Talento-Miller, Eileen; Guo, Fanmin; Han, Kyung T. – International Journal of Testing, 2013
When power tests include a time limit, it is important to assess the possibility of speededness for examinees. Past research on differential speededness has examined gender and ethnic subgroups in the United States on paper-and-pencil tests. When considering the needs of a global audience, research regarding different native language speakers is…
Descriptors: Adaptive Testing, Computer Assisted Testing, English, Scores
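A crude way to examine the differential speededness discussed here (not the analysis in this article) is to compare, across native-language groups, the share of examinees who fail to reach the final items before time runs out. The groups, the "last item reached" values, and the cutoff below are all invented for illustration:

import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n_examinees, n_items = 500, 40
df = pd.DataFrame({
    "native_language": rng.choice(["English", "Other"], size=n_examinees),
    # hypothetical 1-based index of the last item each examinee reached
    "last_item_reached": rng.integers(25, n_items + 1, size=n_examinees),
})

# One simple operationalization: "speeded" = did not reach the final five items
df["speeded"] = df["last_item_reached"] <= n_items - 5
print(df.groupby("native_language")["speeded"].mean())  # share of speeded examinees per group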