Showing all 6 results
Peer reviewed
Sainan Xu; Jing Lu; Jiwei Zhang; Chun Wang; Gongjun Xu – Grantee Submission, 2024
With the growing attention on large-scale educational testing and assessment, the ability to process substantial volumes of response data becomes crucial. Current estimation methods within item response theory (IRT), despite their high precision, often pose considerable computational burdens with large-scale data, leading to reduced computational…
Descriptors: Educational Assessment, Bayesian Statistics, Statistical Inference, Item Response Theory
André A. Rupp; Laura Pinsonneault – National Center for the Improvement of Educational Assessment, 2025
State education agencies are sitting on rich repositories of quantitative and qualitative assessment data. This document is designed to provide a conceptual framework and implementation guidance that can help agency leadership leverage and interrogate student performance data in systematic ways for reporting, outreach, and planning purposes. The…
Descriptors: Evaluation Methods, Educational Assessment, Achievement Tests, College Entrance Examinations
Peer reviewed
Verhelst, Norman D. – Scandinavian Journal of Educational Research, 2012
When using IRT models in educational achievement testing, the model is as a rule too simple to capture all the relevant dimensions in the test. It is argued that a simple model may nevertheless be useful but that it can be complemented with additional analyses. Such an analysis, called profile analysis, is proposed and applied to the reading data of…
Descriptors: Multidimensional Scaling, Profiles, Item Response Theory, Achievement Tests
Donovan, Jenny; Hutton, Penny; Lennon, Melissa; O'Connor, Gayl; Morrissey, Noni – Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2008
In July 2001, the Ministerial Council on Education, Employment, Training and Youth Affairs (MCEETYA) agreed to the development of assessment instruments and key performance measures for reporting on student skills, knowledge and understandings in primary science. It directed the newly established Performance Measurement and Reporting Taskforce…
Descriptors: Foreign Countries, Scientific Literacy, Media Literacy, Scientific Concepts
Donovan, Jenny; Lennon, Melissa; O'Connor, Gayl; Morrissey, Noni – Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2008
In 2003 the first nationally-comparable science assessment was designed, developed and carried out under the auspices of the national council of education ministers, the Ministerial Council on Education, Employment, Training and Youth Affairs (MCEETYA). In 2006 a second science assessment was conducted and, for the first time nationally, the…
Descriptors: Foreign Countries, Scientific Literacy, Science Achievement, Comparative Analysis
Wu, Margaret; Donovan, Jenny; Hutton, Penny; Lennon, Melissa – Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2008
In July 2001, the Ministerial Council on Education, Employment, Training and Youth Affairs (MCEETYA) agreed to the development of assessment instruments and key performance measures for reporting on student skills, knowledge and understandings in primary science. It directed the newly established Performance Measurement and Reporting Taskforce…
Descriptors: Foreign Countries, Scientific Literacy, Science Achievement, Comparative Analysis