Showing 1 to 15 of 25 results
Peer reviewed
Jeon, Minjeong; De Boeck, Paul; van der Linden, Wim – Journal of Educational and Behavioral Statistics, 2017
We present a novel application of a generalized item response tree model to investigate test takers' answer change behavior. The model allows us to simultaneously model the observed patterns of the initial and final responses after an answer change as a function of a set of latent traits and item parameters. The proposed application is illustrated…
Descriptors: Item Response Theory, Behavior, Mathematics Tests, Change
Peer reviewed
Elosua, Paula; De Boeck, Paul – Language, Culture and Curriculum, 2020
Fair educational assessment in linguistically diverse contexts poses new challenges that call for evaluating the impact of context-related language factors on student performance. Based on data from the Basque Autonomous Community in Spain, this research analyses the effect of different factors on a mathematics achievement test. Using…
Descriptors: Educational Assessment, Mathematics Tests, Mathematics Achievement, Foreign Countries
Peer reviewed
Debeer, Dries; Janssen, Rianne; De Boeck, Paul – Journal of Educational Measurement, 2017
When dealing with missing responses, two types of omissions can be discerned: items can be skipped or not reached by the test taker. When the occurrence of these omissions is related to the proficiency process, the missingness is nonignorable. The purpose of this article is to present a tree-based IRT framework for modeling responses and omissions…
Descriptors: Item Response Theory, Test Items, Responses, Testing Problems
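As an illustration only (symbols assumed here, not the authors' notation), a tree-based IRT model of this kind can place two sequential Rasch nodes on each item, one governing whether a response is given at all and one governing accuracy given a response:
\[
\Pr(R_{pi}=1) = \frac{\exp(\eta_p - \delta_i)}{1 + \exp(\eta_p - \delta_i)}, \qquad
\Pr(Y_{pi}=1 \mid R_{pi}=1) = \frac{\exp(\theta_p - \beta_i)}{1 + \exp(\theta_p - \beta_i)},
\]
where $R_{pi}$ indicates that person $p$ responds to item $i$ rather than omitting it and $Y_{pi}$ indicates a correct response; letting the response propensity $\eta_p$ and the proficiency $\theta_p$ correlate is what allows nonignorable missingness to be captured.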
Peer reviewed
Halpin, Peter F.; Dolan, Conor V.; Grasman, Raoul P. P. P.; De Boeck, Paul – Psychometrika, 2011
The relationship between linear factor models and latent profile models is addressed within the context of maximum likelihood estimation based on the joint distribution of the manifest variables. Although the two models are well known to imply equivalent covariance decompositions, in general they do not yield equivalent estimates of the…
Descriptors: Models, Maximum Likelihood Statistics, Computation, Goodness of Fit
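For orientation only (standard notation assumed, not taken from the article), the equivalent covariance decompositions at issue are, for a linear factor model and a latent profile model with $K$ classes and a common diagonal within-class covariance $\Theta$,
\[
\Sigma_{\mathrm{FA}} = \Lambda \Psi \Lambda^{\top} + \Theta, \qquad
\Sigma_{\mathrm{LPM}} = \sum_{k=1}^{K} \pi_k (\mu_k - \bar{\mu})(\mu_k - \bar{\mu})^{\top} + \Theta,
\quad \text{with } \bar{\mu} = \sum_{k=1}^{K} \pi_k \mu_k,
\]
where $\Lambda$ holds the loadings, $\Psi$ the factor covariance, and $\pi_k, \mu_k$ the class proportions and profile means; equality of the implied covariance structures does not by itself imply equal maximum likelihood estimates, which is the question the article takes up.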
Peer reviewed
De Boeck, Paul; Cho, Sun-Joo; Wilson, Mark – Applied Psychological Measurement, 2011
The models used in this article are secondary dimension mixture models with the potential to explain differential item functioning (DIF) between latent classes, called latent DIF. The focus is on models with a secondary dimension that is at the same time specific to the DIF latent class and linked to an item property. A description of the models…
Descriptors: Test Bias, Models, Statistical Analysis, Computation
Peer reviewed
Frederickx, Sofie; Tuerlinckx, Francis; De Boeck, Paul; Magis, David – Journal of Educational Measurement, 2010
In this paper we present a new methodology for detecting differential item functioning (DIF). We introduce a DIF model, called the random item mixture (RIM), that is based on a Rasch model with random item difficulties (besides the common random person abilities). In addition, a mixture model is assumed for the item difficulties such that the…
Descriptors: Test Bias, Models, Test Items, Difficulty Level
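As a rough sketch of the stated ingredients (notation assumed, not the authors'), a Rasch model with random person abilities, random item difficulties, and a mixture on the item difficulties can be written as
\[
\Pr(Y_{pi}=1 \mid \theta_p, \beta_i) = \frac{\exp(\theta_p - \beta_i)}{1 + \exp(\theta_p - \beta_i)}, \qquad
\theta_p \sim N(0, \sigma_\theta^2), \qquad
\beta_i \mid g_i = k \sim N(\mu_k, \sigma_k^2),
\]
where $g_i$ is the latent class membership of item $i$; probabilistic assignment of items to these classes is the mechanism through which the approach can flag DIF items.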
Peer reviewed
Gonzalez, Jorge; De Boeck, Paul; Tuerlinckx, Francis – Psychological Methods, 2008
Structural equation models are commonly used to analyze 2-mode data sets, in which a set of objects is measured on a set of variables. The underlying structure within the object mode is evaluated using latent variables, which are measured by indicators coming from the variable mode. Additionally, when the objects are measured under different…
Descriptors: Structural Equation Models, Data Analysis, Evaluation Methods, Models
Peer reviewed
Ip, Edward H.; Smits, Dirk J. M.; De Boeck, Paul – Applied Psychological Measurement, 2009
The article proposes a family of item-response models that allow the separate and independent specification of three orthogonal components: item attribute, person covariate, and local item dependence. Special interest lies in extending the linear logistic test model, which is commonly used to measure item attributes, to tests with embedded item…
Descriptors: Item Response Theory, Models, Computation, Methods
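For reference, in its standard textbook form (not the article's extension), the linear logistic test model mentioned here decomposes item difficulty into basic parameters weighted by a known design matrix:
\[
\Pr(Y_{pi}=1 \mid \theta_p) = \frac{\exp\bigl(\theta_p - \sum_{k=1}^{K} q_{ik}\,\eta_k\bigr)}{1 + \exp\bigl(\theta_p - \sum_{k=1}^{K} q_{ik}\,\eta_k\bigr)},
\]
where $q_{ik}$ records the involvement of attribute $k$ in item $i$ and $\eta_k$ is that attribute's contribution to difficulty; the extension described in the entry adds a separately specified local item dependence component on top of this attribute structure.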
Peer reviewed
Goegebeur, Yuri; De Boeck, Paul; Wollack, James A.; Cohen, Allan S. – Psychometrika, 2008
An item response theory model for dealing with test speededness is proposed. The model consists of two random processes, a problem solving process and a random guessing process, with the random guessing gradually taking over from the problem solving process. The involved change point and change rate are considered random parameters in order to…
Descriptors: Problem Solving, Item Response Theory, Models, Case Studies
Peer reviewed
del Pino, Guido; San Martin, Ernesto; Gonzalez, Jorge; De Boeck, Paul – Psychometrika, 2008
This paper analyzes the sum score based (SSB) formulation of the Rasch model, where items and sum scores of persons are considered as factors in a logit model. After reviewing the evolution leading to the equality between their maximum likelihood estimates, the SSB model is then discussed from the point of view of pseudo-likelihood and of…
Descriptors: Computation, Models, Scores, Evaluation Methods
Peer reviewed
De Boeck, Paul – Psychometrika, 2008
It is common practice in IRT to consider items as fixed and persons as random. Both continuous and categorical person parameters are most often random variables, whereas for items only continuous parameters are used, and they are commonly of the fixed type, although exceptions occur. It is shown in the present article that random item parameters…
Descriptors: Test Items, Goodness of Fit, Item Response Theory, Models
Peer reviewed
Kahraman, Nilufer; De Boeck, Paul; Janssen, Rianne – International Journal of Testing, 2009
This study introduces an approach for modeling multidimensional response data with construct-relevant group and domain factors. The item level parameter estimation process is extended to incorporate the refined effects of test dimension and group factors. Differences in item performances over groups are evaluated, distinguishing two levels of…
Descriptors: Test Bias, Test Items, Groups, Interaction
Peer reviewed
Rijmen, Frank; Vansteelandt, Kristof; De Boeck, Paul – Psychometrika, 2008
The increasing use of diary methods calls for the development of appropriate statistical methods. For the resulting panel data, latent Markov models can be used to model both individual differences and temporal dynamics. The computational burden associated with these models can be overcome by exploiting the conditional independence relations…
Descriptors: Markov Processes, Patients, Regression (Statistics), Probability
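A generic latent Markov likelihood of the kind referred to (notation assumed) factorizes the observed series for one respondent through a discrete latent state sequence:
\[
\Pr(y_1,\dots,y_T) = \sum_{z_1,\dots,z_T} \pi_{z_1}\, p(y_1 \mid z_1) \prod_{t=2}^{T} p(z_t \mid z_{t-1})\, p(y_t \mid z_t),
\]
where $z_t$ is the latent state at occasion $t$, $\pi$ gives the initial-state probabilities, and $p(z_t \mid z_{t-1})$ the transition probabilities; these conditional independence relations are what the authors exploit to keep the computations tractable.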
Peer reviewed
Braeken, Johan; Tuerlinckx, Francis; De Boeck, Paul – Psychometrika, 2007
Most item response theory models are not robust to violations of conditional independence. However, several modeling approaches (e.g., conditioning on other responses, additional random effects) exist that try to incorporate local item dependencies, but they have some drawbacks such as the nonreproducibility of marginal probabilities and resulting…
Descriptors: Probability, Item Response Theory, Test Items, Psychometrics
Peer reviewed
Hoskens, Machteld; De Boeck, Paul – Applied Psychological Measurement, 2001
Presents a framework for modeling componential data using item response theory models for polytomous items. The framework models response accuracies on complex cognitive tasks that are decomposed in terms of more basic elements. Illustrates the proposed multidimensional model. (SLD)
Descriptors: Item Response Theory, Models
Pages: 1  |  2