Publication Date
In 2025: 0
Since 2024: 1
Since 2021 (last 5 years): 10
Since 2016 (last 10 years): 16
Since 2006 (last 20 years): 16
Source
International Journal of…: 16
Author
Akcan, Rabia: 1
Altintas, Ozge: 1
Altintas, Özge: 1
Atalay Kabasakal, Kubra: 1
Ayse Bilicioglu Gunes: 1
Basman, Munevver: 1
Bayram Bicak: 1
Brooks, Gordon: 1
Celen, Umit: 1
Diaz, Emily: 1
Dogan, Nuri: 1
Publication Type
Journal Articles: 16
Reports - Research: 16
Speeches/Meeting Papers: 1
Education Level
Secondary Education: 7
Higher Education: 5
Postsecondary Education: 5
High Schools: 3
Junior High Schools: 3
Middle Schools: 3
Elementary Education: 2
Grade 8: 2
Elementary Secondary Education: 1
Grade 10: 1
Grade 9: 1
Assessments and Surveys
Program for International…: 3
Trends in International…: 1
Ebru Dogruöz; Hülya Kelecioglu – International Journal of Assessment Tools in Education, 2024
In this research, multistage adaptive tests (MST) were compared according to sample size, panel pattern and module length for top-down and bottom-up test assembly methods. Within the scope of the research, data from PISA 2015 were used and simulation studies were conducted according to the parameters estimated from these data. Analysis results for…
Descriptors: Adaptive Testing, Test Construction, Foreign Countries, Achievement Tests
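The entry above compares multistage adaptive test (MST) designs. A minimal sketch of the core idea — a routing module whose number-correct score sends an examinee to an easier or harder second-stage module — is shown below; the cut score and module length are illustrative assumptions, not the study's actual panel designs.

```python
# Two-stage MST routing sketch. The 5-item routing module, the cut
# score of 3, and the module labels are hypothetical choices for
# illustration only.

def route(routing_score, cut=3):
    """Return the second-stage module label given a number-correct score."""
    return "hard" if routing_score >= cut else "easy"

def administer(routing_responses):
    """Score the routing module (0/1 responses) and pick the next module."""
    stage1 = sum(routing_responses)
    return stage1, route(stage1)

print(administer([1, 1, 0, 1, 1]))  # (4, 'hard')
print(administer([0, 1, 0, 0, 1]))  # (2, 'easy')
```

Real MST assembly (top-down vs. bottom-up, as in the study) additionally constrains content and information targets at the panel or module level; this sketch only shows the routing mechanics.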
Celen, Umit – International Journal of Assessment Tools in Education, 2021
This study examined the calculation methods of the P121 and P10 scores used in teacher appointments. The statistics for the Public Personnel Selection Examination (PPSE) subtests used by the Measurement, Selection and Placement Center (MSPC) in 2018, 2019, and 2020 were accessed from the institution's website. The parameters not published on…
Descriptors: Teacher Placement, Scores, Teacher Competency Testing, Foreign Countries
Ayse Bilicioglu Gunes; Bayram Bicak – International Journal of Assessment Tools in Education, 2023
The main purpose of this study is to examine the Type I error and statistical power ratios of Differential Item Functioning (DIF) techniques based on different theories under different conditions. For this purpose, a simulation study was conducted by using Mantel-Haenszel (MH), Logistic Regression (LR), Lord's [chi-squared], and Raju's Areas…
Descriptors: Test Items, Item Response Theory, Error of Measurement, Test Bias
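Of the DIF techniques named above, the Mantel-Haenszel (MH) procedure is the most compact to illustrate: examinees are stratified by ability, and a common odds ratio is pooled across the 2x2 (group x correct/incorrect) tables. A minimal sketch, with made-up counts rather than data from the study:

```python
# Mantel-Haenszel common odds-ratio estimate for one item.
# Each stratum is a 2x2 table (A, B, C, D): A/B = reference group
# correct/incorrect, C/D = focal group correct/incorrect.
# The counts below are illustrative, not from the simulation study.

def mh_odds_ratio(strata):
    """Pooled MH odds ratio across ability strata; ~1.0 means no DIF."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Two strata where both groups perform identically -> odds ratio 1.0
tables = [(40, 10, 20, 5), (30, 30, 15, 15)]
print(round(mh_odds_ratio(tables), 3))  # 1.0
```

The full MH test also computes a chi-square statistic (often with a continuity correction) against the null of a common odds ratio of 1; the studies listed here compare that test's error rates with IRT-based alternatives such as Lord's chi-square and Raju's area measures.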
Basman, Munevver – International Journal of Assessment Tools in Education, 2023
Ensuring the validity of a test requires checking that all items yield similar results across different groups of individuals. However, differential item functioning (DIF) occurs when individuals with equal ability levels from different groups score differently on the same test item. Based on Item Response Theory and Classic Test…
Descriptors: Test Bias, Test Items, Test Validity, Item Response Theory
Fatma Betül Kurnaz; Hüseyin Yildiz – International Journal of Assessment Tools in Education, 2023
Investigating the existence of items with differential item functioning (DIF) may provide more accurate comparisons of group differences in studies that aim to compare scores obtained in a test by groups with different characteristics. In the present study, a scale measuring critical thinking motivation that was adapted to the Turkish culture was…
Descriptors: Test Bias, Foreign Countries, High School Graduates, College Graduates
Saatcioglu, Fatima Munevver – International Journal of Assessment Tools in Education, 2022
The aim of this study is to investigate the presence of DIF over the gender variable with the latent class modeling approach. The data were collected from 953 students who participated in the PISA 2018 8th-grade financial literacy assessment in the USA. Latent Class Analysis (LCA) approach was used to identify the latent classes, and the data fit…
Descriptors: International Assessment, Achievement Tests, Secondary School Students, Gender Differences
Altintas, Ozge; Wallin, Gabriel – International Journal of Assessment Tools in Education, 2021
Educational assessment tests are designed to measure the same psychological constructs over extended periods. This feature is important considering that test results are often used for admittance to university programs. To ensure fair assessments, especially for those whose results weigh heavily in selection decisions, it is necessary to collect…
Descriptors: College Admission, College Entrance Examinations, Test Bias, Equated Scores
Diaz, Emily; Brooks, Gordon; Johanson, George – International Journal of Assessment Tools in Education, 2021
This Monte Carlo study assessed Type I error in differential item functioning analyses using Lord's chi-square (LC), Likelihood Ratio Test (LRT), and Mantel-Haenszel (MH) procedure. Two research interests were investigated: item response theory (IRT) model specification in LC and the LRT and continuity correction in the MH procedure. This study…
Descriptors: Test Bias, Item Response Theory, Statistical Analysis, Comparative Analysis
Uysal, Ibrahim; Dogan, Nuri – International Journal of Assessment Tools in Education, 2021
Scoring constructed-response items can be highly difficult, time-consuming, and costly in practice. Improvements in computer technology have enabled automated scoring of constructed-response items. However, the application of automated scoring without an investigation of test equating can lead to serious problems. The goal of this study was to…
Descriptors: Computer Assisted Testing, Scoring, Item Response Theory, Test Format
Soysal, Sumeyra; Yilmaz Kogar, Esin – International Journal of Assessment Tools in Education, 2021
This study investigated whether item position effects lead to DIF when different test booklets are used. To do this, the Lord's chi-square and Raju's unsigned area methods with the 3PL model were used, both with and without item purification. When the performance of the methods was compared, it was revealed that…
Descriptors: Item Response Theory, Test Bias, Test Items, Comparative Analysis
Lee, Hyung Rock; Lee, Sunbok; Sung, Jaeyun – International Journal of Assessment Tools in Education, 2019
Applying single-level statistical models to multilevel data typically produces underestimated standard errors, which may result in misleading conclusions. This study examined the impact of ignoring multilevel data structure on the estimation of item parameters and their standard errors of the Rasch, two-, and three-parameter logistic models in…
Descriptors: Item Response Theory, Computation, Error of Measurement, Test Bias
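The Rasch, two-, and three-parameter logistic models compared in the entry above differ only in which item parameters are free. A small sketch of the shared item response function, with illustrative parameter values rather than estimates from the study:

```python
import math

# Item response functions for the 1PL/2PL/3PL family.
# Parameter values in the calls below are illustrative.

def p_3pl(theta, a=1.0, b=0.0, c=0.0):
    """Probability of a correct response under the 3PL model.
    a = discrimination, b = difficulty, c = pseudo-guessing.
    c=0 gives the 2PL; a=1 and c=0 gives the Rasch/1PL."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

print(round(p_3pl(0.0), 2))                        # Rasch at theta == b -> 0.5
print(round(p_3pl(0.0, a=1.5, b=0.0, c=0.2), 2))   # guessing floor raises it to 0.6
```

When responses are clustered (students within schools), fitting these models single-level leaves that dependence unmodeled, which is the standard-error underestimation the study examines.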
Gungor, Metehan; Atalay Kabasakal, Kubra – International Journal of Assessment Tools in Education, 2020
Measurement invariance analyses are carried out in order to find evidence for the structural validity of the measurement tools used in the field of educational sciences and psychology. The purpose of this research is to examine the measurement invariance of Science Motivation and Self-Efficacy Model constructed by Instrumental Motivation to Learn…
Descriptors: Achievement Tests, Foreign Countries, Secondary School Students, Gender Differences
Altintas, Özge; Kutlu, Ömer – International Journal of Assessment Tools in Education, 2019
This study aims to determine whether items in the Ankara University Examination for Foreign Students Basic Learning Skills Test function differently according to country and gender using the Recursive Partitioning Analysis in the Rasch Model. The variables used in the recursive partitioning of the data are country and gender. The population of the…
Descriptors: Foreign Countries, Higher Education, Test Bias, Foreign Students
Selvi, Hüseyin; Özdemir Alici, Devrim – International Journal of Assessment Tools in Education, 2018
This study aims to investigate the impact of different missing data handling methods on the detection of Differential Item Functioning (the Mantel-Haenszel and Standardization methods based on Classical Test Theory, and the Likelihood Ratio Test method based on Item Response Theory). In this regard, on the data acquired from 1046…
Descriptors: Test Bias, Test Theory, Item Response Theory, Multiple Choice Tests
Retnawati, Heri – International Journal of Assessment Tools in Education, 2018
The study aimed to identify the load, the type, and the significance of differential item functioning (DIF) in constructed-response items using the partial credit model (PCM). The data in the study were the students' instruments and the students' responses to the PISA-like test items that had been completed by 386 ninth grade students and 460…
Descriptors: Test Bias, Test Items, Responses, Grade 9
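The partial credit model used in the entry above assigns each score category of a polytomous item a probability built from cumulative step comparisons. A minimal sketch of the PCM category probabilities, with made-up step difficulties:

```python
import math

# Partial credit model (PCM) category probabilities for one item.
# Step difficulties below are illustrative, not the study's estimates.

def pcm_probs(theta, steps):
    """P(X = k) for k = 0..m, where `steps` holds the m step difficulties.
    Numerator exponents are cumulative sums of (theta - step_j)."""
    exps = [0.0]
    total = 0.0
    for d in steps:
        total += theta - d
        exps.append(total)
    denom = sum(math.exp(e) for e in exps)
    return [math.exp(e) / denom for e in exps]

probs = pcm_probs(0.0, [-1.0, 1.0])  # a 3-category item at theta = 0
print([round(p, 3) for p in probs])
```

DIF analysis under the PCM, as in the study, asks whether these step parameters differ across groups after matching on ability.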