Showing 1 to 15 of 34 results
Peer reviewed
Kyung-Mi O. – Language Testing in Asia, 2024
This study examines the efficacy of artificial intelligence (AI) in creating parallel test items compared to human-made ones. Two test forms were developed: one consisting of 20 existing human-made items and another with 20 new items generated with ChatGPT assistance. Expert reviews confirmed the content parallelism of the two test forms.…
Descriptors: Comparative Analysis, Artificial Intelligence, Computer Software, Test Items
Peer reviewed
Goran Trajkovski; Heather Hayes – Digital Education and Learning, 2025
This book explores the transformative role of artificial intelligence in educational assessment, catering to researchers, educators, administrators, policymakers, and technologists involved in shaping the future of education. It delves into the foundations of AI-assisted assessment, innovative question types and formats, data analysis techniques,…
Descriptors: Artificial Intelligence, Educational Assessment, Computer Uses in Education, Test Format
Peer reviewed
Wind, Stefanie A.; Guo, Wenjing – Educational Assessment, 2021
Scoring procedures for the constructed-response (CR) items in large-scale mixed-format educational assessments often involve checks for rater agreement or rater reliability. Although these analyses are important, researchers have documented rater effects that persist despite rater training and that are not always detected in rater agreement and…
Descriptors: Scoring, Responses, Test Items, Test Format
Peer reviewed
Yi-Hsuan Lee; Yue Jia – Applied Measurement in Education, 2024
Test-taking experience is a consequence of the interaction between students and assessment properties. We define a new notion, rapid-pacing behavior, to reflect two types of test-taking experience -- disengagement and speededness. To identify rapid-pacing behavior, we extend existing methods to develop response-time thresholds for individual items…
Descriptors: Adaptive Testing, Reaction Time, Item Response Theory, Test Format
Peer reviewed
Wang, Yu; Chiu, Chia-Yi; Köhn, Hans Friedrich – Journal of Educational and Behavioral Statistics, 2023
The multiple-choice (MC) item format has been widely used in educational assessments across diverse content domains. MC items purportedly allow for collecting richer diagnostic information. The effectiveness and economy of administering MC items may have further contributed to their popularity not just in educational assessment. The MC item format…
Descriptors: Multiple Choice Tests, Nonparametric Statistics, Test Format, Educational Assessment
Care, Esther; Vista, Alvin; Kim, Helyn – UNESCO Bangkok, 2019
UNESCO's Asia-Pacific Regional Bureau for Education has been working on education quality under the name of 'transversal competencies' (TVC) since 2013. Many of these competencies have been included in national education policy and curricula of countries in the region, but now the importance accorded them is increasingly gaining attention. As…
Descriptors: Foreign Countries, Educational Quality, 21st Century Skills, Competence
Lance M. Kruse – ProQuest LLC, 2019
This study explores six item-reduction methodologies used to shorten an existing complex problem-solving non-objective test by evaluating how each shortened form performs across three sources of validity evidence (i.e., test content, internal structure, and relationships with other variables). Two concerns prompted the development of the present…
Descriptors: Educational Assessment, Comparative Analysis, Test Format, Test Length
National Assessment Governing Board, 2019
Since 1973, the National Assessment of Educational Progress (NAEP) has gathered information about student achievement in mathematics. The NAEP assessment in mathematics has two components that differ in purpose. One assessment measures long-term trends in achievement among 9-, 13-, and 17-year-old students by using the same basic design each time.…
Descriptors: National Competency Tests, Mathematics Achievement, Grade 4, Grade 8
Martin, Michael O., Ed.; von Davier, Matthias, Ed.; Mullis, Ina V. S., Ed. – International Association for the Evaluation of Educational Achievement, 2020
The chapters in this online volume comprise the TIMSS & PIRLS International Study Center's technical report of the methods and procedures used to develop, implement, and report the results of TIMSS 2019. There were various technical challenges because TIMSS 2019 was the initial phase of the transition to eTIMSS, with approximately half the…
Descriptors: Foreign Countries, Elementary Secondary Education, Achievement Tests, International Assessment
National Assessment Governing Board, 2014
Since 1973, the National Assessment of Educational Progress (NAEP) has gathered information about student achievement in mathematics. Results of these periodic assessments, produced in print and web-based formats, provide valuable information to a wide variety of audiences. They inform citizens about the nature of students' comprehension of the…
Descriptors: National Competency Tests, Mathematics Achievement, Mathematics Skills, Grade 4
Louisiana Department of Education, 2012
"Louisiana Believes" embraces the principle that all children can achieve at high levels, as evidenced in Louisiana's recent adoption of the Common Core State Standards (CCSS). "Louisiana Believes" also promotes the idea that Louisiana's educators should be empowered to make decisions to support the success of their students.…
Descriptors: Student Evaluation, Achievement Tests, Educational Assessment, Testing Accommodations
Hagge, Sarah Lynn – ProQuest LLC, 2010
Mixed-format tests containing both multiple-choice and constructed-response items are widely used on educational tests. Such tests combine the broad content coverage and efficient scoring of multiple-choice items with the assessment of higher-order thinking skills thought to be provided by constructed-response items. However, the combination of…
Descriptors: Test Format, True Scores, Equated Scores, Psychometrics
Peer reviewed
Frey, Andreas; Hartig, Johannes; Rupp, Andre A. – Educational Measurement: Issues and Practice, 2009
In most large-scale assessments of student achievement, several broad content domains are tested. Because more items are needed to cover the content domains than can be presented in the limited testing time to each individual student, multiple test forms or booklets are utilized to distribute the items to the students. The construction of an…
Descriptors: Measures (Individuals), Test Construction, Theory Practice Relationship, Design
Johanson, George; Motlomelo, Samuel – 1998
Many textbooks in educational measurement and classroom assessment have chapters devoted to specific item formats. There may be attempts to relate one item format to another, but the chapters and item formats are largely seen as distinct entities with only loose and uncertain connections. This paper synthesizes these discussions. An item format…
Descriptors: Educational Assessment, Essay Tests, Measurement Techniques, Objective Tests