Showing all 14 results
Peer reviewed
Retnawati, Heri – International Journal of Assessment Tools in Education, 2018
The study aimed to identify the load, the type, and the significance of differential item functioning (DIF) in constructed-response items using the partial credit model (PCM). The data in the study were the students' instruments and the students' responses to the PISA-like test items that had been completed by 386 ninth grade students and 460…
Descriptors: Test Bias, Test Items, Responses, Grade 9
Peer reviewed
Anjar Putro Utomo; Erlia Narulita; Kinya Shimizu – Journal of Baltic Science Education, 2018
The aim of this research was to assess the classification of grade 8 TIMSS science test items based on higher order thinking skills (HOTS) and to determine whether those classified science test items can serve as an assessment tool in science class. Sixteen sample HOTS test items were chosen from 37 reasoning items of TIMSS 1999, 2003, and 2011;…
Descriptors: Foreign Countries, Achievement Tests, Elementary Secondary Education, International Assessment
Herrmann-Abell, Cari F.; DeBoer, George E. – Grantee Submission, 2018
This study tests a hypothesized learning progression for the concept of energy. It looks at 14 specific ideas under the categories of (i) Energy Forms and Transformations; (ii) Energy Transfer; (iii) Energy Dissipation and Degradation; and (iv) Energy Conservation. It then examines students' growth of understanding within each of these ideas at…
Descriptors: Energy, Science Instruction, Concept Formation, Energy Conservation
Peer reviewed
Tengberg, Michael – Language Assessment Quarterly, 2018
Reading comprehension is often treated as a multidimensional construct. In many reading tests, items are distributed over reading process categories to represent the subskills expected to constitute comprehension. This study explores (a) the extent to which specified subskills of reading comprehension tests are conceptually conceivable to…
Descriptors: Reading Tests, Reading Comprehension, Scores, Test Results
Liu, Junhui; Brown, Terran; Chen, Jianshen; Ali, Usama; Hou, Likun; Costanzo, Kate – Partnership for Assessment of Readiness for College and Careers, 2016
The Partnership for Assessment of Readiness for College and Careers (PARCC) is a state-led consortium working to develop next-generation assessments that measure student progress toward college and career readiness more accurately than previous assessments. The PARCC assessments include both English Language Arts/Literacy (ELA/L) and…
Descriptors: Testing, Achievement Tests, Test Items, Test Bias
Peer reviewed
Long, Caroline; Wendt, Heike – African Journal of Research in Mathematics, Science and Technology Education, 2017
South Africa participated in TIMSS from 1995 to 2015. Over these two decades, some positive changes have been reported in the aggregated mathematics performance patterns of South African learners. This paper focuses on the achievement patterns of South Africa's high-performing Grade 9 learners (n = 3378) in comparison with similar subsamples of…
Descriptors: Foreign Countries, Comparative Analysis, Multiplication, Comparative Education
Peer reviewed
Schwichow, Martin; Christoph, Simon; Boone, William J.; Härtig, Hendrik – International Journal of Science Education, 2016
The so-called control-of-variables strategy (CVS) incorporates the important scientific reasoning skills of designing controlled experiments and interpreting experimental outcomes. As CVS is a prominent component of science standards, appropriate assessment instruments are required to measure these scientific reasoning skills and to evaluate the…
Descriptors: Thinking Skills, Science Instruction, Science Experiments, Science Tests
Steedle, Jeffrey; McBride, Malena; Johnson, Marc; Keng, Leslie – Partnership for Assessment of Readiness for College and Careers, 2016
The first operational administration of the Partnership for Assessment of Readiness for College and Careers (PARCC) took place during the 2014-2015 school year. In addition to the traditional paper-and-pencil format, the assessments were available for administration on a variety of electronic devices, including desktop computers, laptop computers,…
Descriptors: Computer Assisted Testing, Difficulty Level, Test Items, Scores
Peer reviewed
Hadenfeldt, Jan C.; Bernholt, Sascha; Liu, Xiufeng; Neumann, Knut; Parchmann, Ilka – Journal of Chemical Education, 2013
Helping students develop a sound understanding of scientific concepts can be a major challenge. Lately, learning progressions have received increasing attention as a means to support students in developing understanding of core scientific concepts. At the center of a learning progression is a sequence of developmental levels reflecting an…
Descriptors: Elementary School Science, Secondary School Science, Science Instruction, Chemistry
Peer reviewed
Bernholt, Sascha; Parchmann, Ilka – Chemistry Education Research and Practice, 2011
Current reforms in the education policy of various countries are intended to produce a paradigm shift in the educational system towards an outcome orientation. After implementing educational standards as normative objectives, the development of test procedures that adequately reflect these targets and standards is a central problem. This paper…
Descriptors: Science Achievement, Chemistry, Knowledge Level, Science Instruction
Peer reviewed
Miller, Tess; Chahine, Saad; Childs, Ruth A. – Practical Assessment, Research & Evaluation, 2010
This study illustrates the use of differential item functioning (DIF) and differential step functioning (DSF) analyses to detect differences in item difficulty that are related to experiences of examinees, such as their teachers' instructional practices, that are relevant to the knowledge, skill, or ability the test is intended to measure. This…
Descriptors: Test Bias, Difficulty Level, Test Items, Mathematics Tests
Peer reviewed
Wolf, Mikyung Kim; Kim, Jinok; Kao, Jenny – Applied Measurement in Education, 2012
Glossaries and the reading aloud of test items are accommodations commonly allowed in many states' policies for English language learner (ELL) students on large-scale mathematics assessments. However, little research is available regarding the effects of these accommodations on ELL students' performance. Further, no research exists that examines how…
Descriptors: Testing Accommodations, Glossaries, Reading Aloud to Others, Validity
Peer reviewed
Harsch, Claudia; Rupp, Andre Alexander – Language Assessment Quarterly, 2011
The "Common European Framework of Reference" (CEFR; Council of Europe, 2001) provides a competency model that is increasingly used as a point of reference to compare language examinations. Nevertheless, aligning examinations to the CEFR proficiency levels remains a challenge. In this article, we propose a new, level-centered approach to…
Descriptors: Language Tests, Writing Tests, Test Construction, Test Items
Peer reviewed
Poon, Kin-Keung; Leung, Chi-Keung – International Journal of Mathematical Education in Science and Technology, 2010
The purpose of the study reported herein was to identify the common mistakes made by junior secondary students in Hong Kong when learning algebra and to compare teachers' perceptions of students' ability with the results of an algebra test. An algebra test was developed and administered to a sample of students (aged between 13 and 14 years). From…
Descriptors: Instructional Design, Test Results, Correlation, Academic Ability