Phillips, Gary W. – American Institutes for Research, 2014
This paper describes a statistical linking between the 2011 National Assessment of Educational Progress (NAEP) in Grade 4 reading and the 2011 Progress in International Reading Literacy Study (PIRLS) in Grade 4 reading. The primary purpose of the linking study is to obtain a statistical comparison between NAEP (a national assessment) and PIRLS (an…
Descriptors: National Competency Tests, Reading Achievement, Comparative Analysis, Measures (Individuals)
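The linking study above maps scores from one assessment onto the scale of another. As a purely illustrative sketch (not the methodology of the Phillips study, and using synthetic scores with hypothetical scale parameters), a simple mean-sigma linear linking matches the mean and standard deviation of the two score distributions:

```python
import numpy as np

# Illustrative sketch only: mean-sigma linear linking between two score
# scales, in the spirit of (but not identical to) a NAEP-PIRLS linking.
# All scores below are synthetic; the scale parameters are hypothetical.
rng = np.random.default_rng(0)
naep = rng.normal(220, 35, size=1000)    # hypothetical NAEP-like scores
pirls = rng.normal(540, 70, size=1000)   # hypothetical PIRLS-like scores

# Linear linking: choose slope a and intercept b so that transformed
# NAEP scores have the same mean and SD as the PIRLS scores.
a = pirls.std(ddof=1) / naep.std(ddof=1)
b = pirls.mean() - a * naep.mean()
naep_on_pirls = a * naep + b
```

After the transformation, `naep_on_pirls` has the same mean and standard deviation as the PIRLS sample by construction; an operational linking would involve far more (common populations, sampling design, error estimation).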
Sachse, Karoline A.; Roppelt, Alexander; Haag, Nicole – Journal of Educational Measurement, 2016
Trend estimation in international comparative large-scale assessments relies on measurement invariance between countries. However, cross-national differential item functioning (DIF) has been repeatedly documented. We ran a simulation study using national item parameters, which required trends to be computed separately for each country, to compare…
Descriptors: Comparative Analysis, Measurement, Test Bias, Simulation
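Cross-national DIF, as studied above, means an item behaves differently in two countries even for test-takers of equal ability. A minimal synthetic sketch (not the authors' simulation design; all parameters are invented) plants uniform DIF in one Rasch item and recovers it from the gap in item p-values:

```python
import numpy as np

rng = np.random.default_rng(1)

def rasch_prob(theta, b):
    """P(correct) under the Rasch model for abilities theta and difficulties b."""
    return 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))

n, k = 2000, 10
b = np.linspace(-1.5, 1.5, k)      # item difficulties shared by both countries
dif = np.zeros(k)
dif[3] = 0.5                       # item 3 is harder in country B: uniform DIF

theta_a = rng.normal(0, 1, n)      # equal ability distributions by design
theta_b = rng.normal(0, 1, n)
resp_a = rng.random((n, k)) < rasch_prob(theta_a, b)
resp_b = rng.random((n, k)) < rasch_prob(theta_b, b + dif)

# With matched ability distributions, the largest gap in observed
# proportions correct flags the DIF item.
gap = resp_a.mean(axis=0) - resp_b.mean(axis=0)
dif_item = int(np.argmax(gap))
```

Real DIF analyses condition on estimated ability (e.g. Mantel-Haenszel or IRT-based methods) rather than relying on equal ability distributions, which is what makes cross-national trend estimation hard in practice.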
Innes, Richard G. – Journal of School Choice, 2012
This article provides examples of how serious misconceptions can result when only "all student" scores from the National Assessment of Educational Progress (NAEP) are used for simplistic state-to-state comparisons. Suggestions for better treatment are presented. The article also compares Kentucky's eighth grade EXPLORE testing to NAEP…
Descriptors: National Competency Tests, Scoring, Misconceptions, Academic Achievement
Braun, Henry; Qian, Jiahe – ETS Research Report Series, 2008
This report describes the derivation and evaluation of a method for comparing the performance standards for public school students set by different states. It is based on an approach proposed by McLaughlin and associates, which constituted an innovative attempt to resolve the confusion and concern that occurs when very different proportions of…
Descriptors: State Standards, Comparative Analysis, Public Schools, National Competency Tests
Li, Deping; Oranje, Andreas – ETS Research Report Series, 2007
Two versions of a general method for approximating standard error of regression effect estimates within an IRT-based latent regression model are compared. The general method is based on Binder's (1983) approach, accounting for complex samples and finite populations by Taylor series linearization. In contrast, the current National Assessment of…
Descriptors: Error of Measurement, Regression (Statistics), Trend Analysis, National Competency Tests
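The report above concerns Taylor series linearization for standard errors in complex samples. As a toy sketch of the general idea (a weighted mean treated as a ratio estimator, with made-up data and a simple i.i.d. variance formula, not Binder's full treatment for latent regression), linearization replaces the nonlinear statistic by its influence values and takes their variance; a delete-one jackknife serves as a cross-check:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
y = rng.normal(10, 2, n)
w = rng.uniform(0.5, 1.5, n)       # hypothetical survey weights

# Weighted mean as a ratio estimator: R = sum(w*y) / sum(w)
R = np.sum(w * y) / np.sum(w)

# Taylor linearization: the influence value of observation i is
# z_i = w_i * (y_i - R) / sum(w); Var(R) is estimated from the z_i.
z = w * (y - R) / np.sum(w)
se_lin = np.sqrt(n / (n - 1) * np.sum((z - z.mean()) ** 2))

# Delete-one jackknife as an independent check on the linearized SE.
R_jack = np.array([(np.sum(w * y) - w[i] * y[i]) / (np.sum(w) - w[i])
                   for i in range(n)])
se_jack = np.sqrt((n - 1) / n * np.sum((R_jack - R_jack.mean()) ** 2))
```

For a smooth statistic like this ratio, the two estimates agree closely; the appeal of linearization in operational settings is that it extends to stratified, clustered designs where naive formulas fail.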
Li, Deping; Oranje, Andreas – ETS Research Report Series, 2006
A hierarchical latent regression model is suggested to estimate nested and nonnested relationships in complex samples such as found in the National Assessment of Educational Progress (NAEP). The proposed model aims at improving both parameters and variance estimates via a two-level hierarchical linear model. This model falls naturally within the…
Descriptors: Hierarchical Linear Modeling, Computation, Measurement, Regression (Statistics)
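The two-level structure described above (students nested in schools, say) partitions variance into within-group and between-group components. A minimal sketch of that idea (synthetic data, simple method-of-moments estimates from a one-way ANOVA, not the latent regression machinery of the report) recovers the two variance components:

```python
import numpy as np

rng = np.random.default_rng(3)
J, n_j = 40, 25                         # 40 groups ("schools"), 25 units each
u = rng.normal(0, 0.6, J)               # group random intercepts, var = 0.36
y = np.concatenate([u_j + rng.normal(0, 1.0, n_j) for u_j in u])
g = np.repeat(np.arange(J), n_j)

# One-way ANOVA mean squares.
group_means = np.array([y[g == j].mean() for j in range(J)])
msw = sum(((y[g == j] - group_means[j]) ** 2).sum()
          for j in range(J)) / (J * (n_j - 1))
msb = n_j * ((group_means - y.mean()) ** 2).sum() / (J - 1)

# E[MSW] = sigma2_within; E[MSB] = sigma2_within + n_j * sigma2_between.
sigma2_within = msw
sigma2_between = (msb - msw) / n_j
```

Estimates should land near the true values (1.0 within, 0.36 between). Full hierarchical models go further, stabilizing both the regression effects and these variance estimates, which is the improvement the report targets for NAEP-style samples.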