Zehner, Fabian; Eichmann, Beate; Deribo, Tobias; Harrison, Scott; Bengs, Daniel; Andersen, Nico; Hahnel, Carolin – Journal of Educational Data Mining, 2021
The NAEP EDM Competition required participants to predict efficient test-taking behavior based on log data. This paper describes our top-down approach for engineering features by means of psychometric modeling, aiming at machine learning for the predictive classification task. For feature engineering, we employed, among others, the Log-Normal…
Descriptors: National Competency Tests, Engineering Education, Data Collection, Data Analysis

Oranje, Andreas; Kolstad, Andrew – Journal of Educational and Behavioral Statistics, 2019
The design and psychometric methodology of the National Assessment of Educational Progress (NAEP) is constantly evolving to meet the changing interests and demands stemming from a rapidly shifting educational landscape. NAEP has been built on strong research foundations that include conducting extensive evaluations and comparisons before new…
Descriptors: National Competency Tests, Psychometrics, Statistical Analysis, Computation

Culpepper, Steven Andrew – Journal of Educational and Behavioral Statistics, 2017
In the absence of clear incentives, achievement tests may be subject to the effect of slipping where item response functions have upper asymptotes below one. Slipping reduces score precision for higher latent scores and distorts test developers' understandings of item and test information. A multidimensional four-parameter normal ogive model was…
Descriptors: Measurement, Achievement Tests, Item Response Theory, National Competency Tests

Braun, Henry; von Davier, Matthias – Large-scale Assessments in Education, 2017
Background: Economists are making increasing use of measures of student achievement obtained through large-scale survey assessments such as NAEP, TIMSS, and PISA. The construction of these measures, employing plausible value (PV) methodology, is quite different from that of the more familiar test scores associated with assessments such as the SAT…
Descriptors: Scores, Test Use, Measurement, Psychometrics

Mislevy, Robert J. – Educational Measurement: Issues and Practice, 2012
This article presents the author's observations on Neil Dorans's NCME Career Award Address: "The Contestant Perspective on Taking Tests: Emanations from the Statue within." He calls attention to some points that Dr. Dorans made in his address, and offers his thoughts in response.
Descriptors: Testing, Test Reliability, Psychometrics, Scores

Muraki, Eiji – Applied Psychological Measurement, 1993
The concept of information functions developed for dichotomous item response models is adapted for the partial credit model, and the information function is used to investigate collapsing and recoding categories of polytomously scored items from the National Assessment of Educational Progress. (SLD)
Descriptors: Equations (Mathematics), Item Response Theory, National Surveys, Psychometrics

Rudner, Lawrence M.; And Others – Applied Measurement in Education, 1996
An analysis of data from the 1990 National Assessment of Educational Progress Trial State Assessment suggests that person-fit statistics may not provide additional information about results of psychometrically strong achievement tests. More research is needed before person-fit statistics can be used routinely in analysis of item response data.…
Descriptors: Achievement Tests, Individual Differences, Item Response Theory, Psychometrics

White, Sheida; Clement, John – National Center for Education Statistics, 2001
This working paper summarizes the results of an expert panel review of the Lexile Framework (LF). The review was conducted by five panel members through readings, the preparation of brief individual reports, and participation in a meeting held on April 26, 2001 in Washington, D.C. The list of panel members and invited observers, along with brief…
Descriptors: Readability Formulas, Linguistic Theory, Construct Validity, Semantics

Sheehan, Kathleen; Mislevy, Robert J. – Journal of Educational Measurement, 1990
The 63 items on skills in acquiring and using information from written documents contained in the Survey of Young Adult Literacy in the 1985 National Assessment of Educational Progress are analyzed. The analyses are based on a qualitative cognitive model and an item-response theory model. (TJH)
Descriptors: Adult Literacy, Cognitive Processes, Diagnostic Tests, Elementary Secondary Education