Gombert, Sebastian; Di Mitri, Daniele; Karademir, Onur; Kubsch, Marcus; Kolbe, Hannah; Tautz, Simon; Grimm, Adrian; Bohm, Isabell; Neumann, Knut; Drachsler, Hendrik – Journal of Computer Assisted Learning, 2023
Background: Formative assessments are needed to enable monitoring how student knowledge develops throughout a unit. Constructed response items which require learners to formulate their own free-text responses are well suited for testing their active knowledge. However, assessing such constructed responses in an automated fashion is a complex task…
Descriptors: Coding, Energy, Scientific Concepts, Formative Evaluation
Hubbard, Jane; Russo, James; Livy, Sharyn – Mathematics Education Research Group of Australasia, 2022
Making accurate judgements and interpretations about student growth and progress in mathematics can be problematic when using open-ended assessments. This study reports on the development of a class-based assessment instrument and marking key designed to assess Year 2 students' mathematics competence to reflect their learning of mathematics…
Descriptors: Mathematics Skills, Mathematics Instruction, Grading, Mathematics Tests
Nelson, Gena; Powell, Sarah R. – Assessment for Effective Intervention, 2018
Though proficiency with computation is highly emphasized in national mathematics standards, students with mathematics difficulty (MD) continue to struggle with computation. To learn more about the differences in computation error patterns between typically achieving students and students with MD, we assessed 478 third-grade students on a measure…
Descriptors: Computation, Mathematics Instruction, Learning Problems, Mathematics Skills
Nelson, Gena; Powell, Sarah R. – Grantee Submission, 2017
Though proficiency with computation is highly emphasized in national mathematics standards, students with mathematics difficulty (MD) continue to struggle with computation. To learn more about the differences in computation error patterns between typically achieving students and students with MD, we assessed 478 3rd-grade students on a measure of…
Descriptors: Computation, Mathematics Instruction, Learning Problems, Mathematics Skills
Predicting Item Difficulty of Science National Curriculum Tests: The Case of Key Stage 2 Assessments
El Masri, Yasmine H.; Ferrara, Steve; Foltz, Peter W.; Baird, Jo-Anne – Curriculum Journal, 2017
Predicting item difficulty is highly important in education for both teachers and item writers. Despite identifying a large number of explanatory variables, predicting item difficulty remains a challenge in educational assessment with empirical attempts rarely exceeding 25% of variance explained. This paper analyses 216 science items of key stage…
Descriptors: Predictor Variables, Test Items, Difficulty Level, Test Construction
Schoen, Robert C.; LaVenia, Mark; Champagne, Zachary M.; Farina, Kristy; Tazaz, Amanda M. – Grantee Submission, 2017
The following report describes an assessment instrument called the Mathematics Performance and Cognition (MPAC) interview. The MPAC interview was designed to measure two outcomes of interest. It was designed to measure first and second graders' mathematics achievement in number, operations, and equality, and it was also designed to gather…
Descriptors: Interviews, Test Construction, Psychometrics, Elementary School Mathematics
Schoen, Robert C.; LaVenia, Mark; Champagne, Zachary M.; Farina, Kristy – Grantee Submission, 2017
This report provides an overview of the development, implementation, and psychometric properties of a student mathematics interview designed to assess first- and second-grade student achievement and thinking processes. The student interview was conducted with 622 first- or second-grade students in 22 schools located in two public school districts…
Descriptors: Interviews, Test Construction, Psychometrics, Elementary School Mathematics
Ong, Yoke Mooi; Williams, Julian; Lamprianou, Iasonas – International Journal of Testing, 2015
The purpose of this article is to explore crossing differential item functioning (DIF) in a test drawn from a national examination of mathematics for 11-year-old pupils in England. An empirical dataset was analyzed to explore DIF by gender in a mathematics assessment. A two-step process involving the logistic regression (LR) procedure for…
Descriptors: Mathematics Tests, Gender Differences, Test Bias, Test Items
Williams, Lunetta M.; Hall, Katrina W.; Hedrick, Wanda B.; Lamkin, Marcia; Abendroth, Jennifer – Journal of Language and Literacy Education, 2013
The purpose of the present study was to develop an instrument to measure reading during in-school independent reading (ISIR). Procedures to establish validity and reliability of the instrument included videotaping and observing students during ISIR, gathering feedback from literacy experts, establishing interrater reliability, crosschecking…
Descriptors: Test Construction, Test Validity, Test Reliability, Video Technology
Yepes-Baraya, Mario; Tatsuoka, Kikumi; Allen, Nancy L.; O'Sullivan, Christine; Liang, Jo-Lin; Hui, Xuefei – 1998
In the context of Phase Four of the National Assessment of Educational Progress (NAEP) Science Attribute Study, this report includes a discussion of item attributes, an overview of the item attributes used in the study, some psychometric characteristics of the blocks analyzed, a general description of the rule-space methodology, the results…
Descriptors: Coding, Elementary School Students, Grade 4, Intermediate Grades
Yepes-Baraya, Mario – 1997
The study described in this paper is part of an effort to improve understanding of the science assessment of the National Assessment of Educational Progress (NAEP). It involved the coding of all the items in the 1996 NAEP science assessments, which included 45 blocks (15 each for grades 4, 8, and 12) and over 500 items. Each of the approximately…
Descriptors: Coding, Elementary School Students, Grade 4, Intermediate Grades