Showing 1 to 15 of 55 results
Peer reviewed
PDF on ERIC Download full text
Dogan, Nuri; Hambleton, Ronald K.; Yurtcu, Meltem; Yavuz, Sinan – Cypriot Journal of Educational Sciences, 2018
Validity is one of the psychometric properties of achievement tests. One way to examine validity is through item bias studies, which are based on differential item functioning (DIF) analyses and the opinions of field experts. In this study, field experts were asked to estimate the DIF levels of the items to compare the estimations…
Descriptors: Test Bias, Comparative Analysis, Predictor Variables, Statistical Analysis
Peer reviewed
Direct link
Lundqvist, Lars-Olov; Lindner, Helen – Journal of Autism and Developmental Disorders, 2017
The Autism-Spectrum Quotient (AQ) is among the most widely used scales assessing autistic traits in the general population. However, some aspects of the AQ are questionable. To test its scale properties, the AQ was translated into Swedish, and data were collected from 349 adults, 130 with autism spectrum disorder (ASD) and 219 without ASD, and…
Descriptors: Autism, Pervasive Developmental Disorders, Adults, Comparative Analysis
Peer reviewed
PDF on ERIC Download full text
Turkan, Azmi; Cetin, Bayram – Journal of Education and Practice, 2017
Validity and reliability are among the most crucial characteristics of a test. One step toward ensuring that a test is valid and reliable is to examine bias in its items. The purpose of this study was to examine bias in the 2012 Placement Test items in Turkey in terms of the gender variable, using the Rasch model. The sample of this study was…
Descriptors: Item Response Theory, Gender Differences, Test Bias, Test Items
Peer reviewed
PDF on ERIC Download full text
Demir, Ergul – Eurasian Journal of Educational Research, 2018
Purpose: The answer-copying tendency has the potential to provide prior distributions for statistical techniques used to detect suspicious answer patterns. The aim of this study is to develop a valid and reliable scale for observing university students' tendency to copy answers. It also aims to provide…
Descriptors: College Students, Cheating, Test Construction, Student Behavior
Peer reviewed
Direct link
Oliveri, María Elena; von Davier, Alina A. – International Journal of Testing, 2016
In this study, we propose that the unique needs and characteristics of linguistic minorities should be considered throughout the test development process. Unlike most measurement invariance investigations in the assessment of linguistic minorities, which typically are conducted after test administration, we propose strategies that focus on the…
Descriptors: Psychometrics, Linguistics, Test Construction, Testing
Peer reviewed
PDF on ERIC Download full text
Rios, Joseph A.; Sparks, Jesse R.; Zhang, Mo; Liu, Ou Lydia – ETS Research Report Series, 2017
Proficiency with written communication (WC) is critical for success in college and careers. As a result, institutions face a growing challenge to accurately evaluate their students' writing skills to obtain data that can support demands of accreditation, accountability, or curricular improvement. Many current standardized measures, however, lack…
Descriptors: Test Construction, Test Validity, Writing Tests, College Outcomes Assessment
Peer reviewed
Direct link
Mao, Liyang; Liu, Ou Lydia; Roohr, Katrina; Belur, Vinetha; Mulholland, Matthew; Lee, Hee-Sun; Pallant, Amy – Educational Assessment, 2018
Scientific argumentation is one of the core practices for teachers to implement in science classrooms. We developed a computer-based formative assessment to support students' construction and revision of scientific arguments. The assessment is built upon automated scoring of students' arguments and provides feedback to students and teachers.…
Descriptors: Computer Assisted Testing, Science Tests, Scoring, Automation
Peer reviewed
Direct link
Alhaythami, Hassan; Karpinski, Aryn; Kirschner, Paul; Bolden, Edward – Journal of Interactive Learning Research, 2017
This study examined the psychometric properties of a social-networking site (SNS) activities scale (SNSAS) using Rasch Analysis. Items were also examined with Rasch Principal Components Analysis (PCA) and Differential Item Functioning (DIF) across groups of university students (i.e., males and females from the United States [US] and Europe; N =…
Descriptors: Social Media, Social Networks, Likert Scales, Psychometrics
Peer reviewed
Direct link
Cockcroft, Kate; Bloch, Lauren; Moolla, Azra – Education as Change, 2016
This study investigated whether measures of verbal working memory are less sensitive to children's socioeconomic background than traditional vocabulary measures. Participants were 120 school beginners, divided into high and low socioeconomic groups. The groups contained equal numbers of English first-language and second-language speakers. All were…
Descriptors: Foreign Countries, Short Term Memory, Vocabulary, English (Second Language)
Peer reviewed
PDF on ERIC Download full text
Dorans, Neil J. – ETS Research Report Series, 2013
Quantitative fairness procedures have been developed and modified by ETS staff over the past several decades. ETS has been a leader in fairness assessment, and its efforts are reviewed in this report. The first section deals with differential prediction and differential validity procedures that examine whether test scores predict a criterion, such…
Descriptors: Test Bias, Statistical Analysis, Test Validity, Scores
Peer reviewed
Direct link
McFarland, Jenny L.; Price, Rebecca M.; Wenderoth, Mary Pat; Martinková, Patrícia; Cliff, William; Michael, Joel; Modell, Harold; Wright, Ann – CBE - Life Sciences Education, 2017
We present the Homeostasis Concept Inventory (HCI), a 20-item multiple-choice instrument that assesses how well undergraduates understand this critical physiological concept. We used an iterative process to develop a set of questions based on elements in the Homeostasis Concept Framework. This process involved faculty experts and undergraduate…
Descriptors: Scientific Concepts, Multiple Choice Tests, Science Tests, Test Construction
Peer reviewed
Direct link
Rios, Joseph A.; Sireci, Stephen G. – International Journal of Testing, 2014
The International Test Commission's "Guidelines for Translating and Adapting Tests" (2010) provide important guidance on developing and evaluating tests for use across languages. These guidelines are widely applauded, but the degree to which they are followed in practice is unknown. The objective of this study was to perform a…
Descriptors: Guidelines, Translation, Adaptive Testing, Second Languages
Peer reviewed
Direct link
Fletcher, Richard B.; Crocker, Peter – Measurement in Physical Education and Exercise Science, 2014
The present study investigated the Social Physique Anxiety Scale's factor structure and item properties using confirmatory factor analysis and item response theory. An additional aim was to identify differences in response patterns between groups (gender). A large sample of high school students aged 11-15 years (N = 1,529) consisting of n =…
Descriptors: Anxiety, Human Body, Factor Analysis, Item Response Theory
Peer reviewed
Direct link
Pedrosa, Ignacio; Suárez-Álvarez, Javier; Lozano, Luis M.; Muñiz, José; García-Cueto, Eduardo – Journal of Psychoeducational Assessment, 2014
Adolescence is a critical period of life during which significant psychosocial adjustment occurs and in which emotional intelligence plays an essential role. This article provides validity evidence for the Trait Meta-Mood Scale-24 (TMMS-24) scores based on an item response theory (IRT) approach. A sample of 2,693 Spanish adolescents (M = 16.52…
Descriptors: Foreign Countries, Adolescents, Secondary School Students, Emotional Intelligence
Peer reviewed
PDF on ERIC Download full text
Zwick, Rebecca – ETS Research Report Series, 2012
Differential item functioning (DIF) analysis is a key component in the evaluation of the fairness and validity of educational tests. The goal of this project was to review the status of ETS DIF analysis procedures, focusing on three aspects: (a) the nature and stringency of the statistical rules used to flag items, (b) the minimum sample size…
Descriptors: Test Bias, Sample Size, Bayesian Statistics, Evaluation Methods