Publication Date
In 2025 (1)
Since 2024 (8)
Since 2021, last 5 years (20)
Since 2016, last 10 years (45)
Since 2006, last 20 years (189)
Descriptor
Educational Assessment (969)
Test Construction (969)
Elementary Secondary Education (392)
Student Evaluation (279)
Evaluation Methods (248)
Test Use (228)
Performance Based Assessment (221)
Testing Programs (159)
Test Items (154)
Academic Achievement (144)
Test Validity (144)
Audience
Practitioners (103)
Teachers (78)
Policymakers (26)
Researchers (25)
Administrators (20)
Students (15)
Community (5)
Parents (5)
Location
Canada (23)
Australia (18)
United States (15)
United Kingdom (13)
Texas (10)
California (9)
Kentucky (9)
United Kingdom (England) (9)
Ohio (8)
Connecticut (6)
Netherlands (5)
What Works Clearinghouse Rating
Does not meet standards (1)
Moses, Tim – Journal of Educational Measurement, 2022
One result of recent changes in testing is that previously established linking frameworks may not adequately address challenges in current linking situations. Test linking through equating, concordance, vertical scaling, or battery scaling may not represent linkings for the scores of tests developed to measure constructs differently for different…
Descriptors: Measures (Individuals), Educational Assessment, Test Construction, Comparative Analysis
Daniel Koretz – Journal of Educational and Behavioral Statistics, 2024
A critically important balance in educational measurement between practical concerns and matters of technique has atrophied in recent decades, and as a result, some important issues in the field have not been adequately addressed. I start with the work of E. F. Lindquist, who exemplified the balance that is now wanting. Lindquist was arguably the…
Descriptors: Educational Assessment, Evaluation Methods, Achievement Tests, Educational History
Scott F. Marion, Editor; James W. Pellegrino, Editor; Amy I. Berman, Editor – National Academy of Education, 2024
High-quality assessments are crucial to many aspects of the educational process. They can help policymakers monitor long-term educational trends, assist state educational agencies (SEAs) and local educational agencies (LEAs) in allocating resources and professional development opportunities, provide insights to teachers about how well students…
Descriptors: Educational Assessment, Educational Policy, Equal Education, Test Validity
Suto, Irenka; Williamson, Joanna; Ireland, Jo; Macinska, Sylwia – Research Papers in Education, 2023
Errors that occasionally manifest in examination papers and other educational assessment instruments can threaten reliability and validity. For example, a multiple choice question could have two correct response options, or a geography question containing an inaccurate map could be unanswerable. In this paper we explore this oft-neglected element…
Descriptors: Error Patterns, International Assessment, Test Construction, Failure
Kyung-Mi O. – Language Testing in Asia, 2024
This study examines the efficacy of artificial intelligence (AI) in creating parallel test items compared to human-made ones. Two test forms were developed: one consisting of 20 existing human-made items and another with 20 new items generated with ChatGPT assistance. Expert reviews confirmed the content parallelism of the two test forms.…
Descriptors: Comparative Analysis, Artificial Intelligence, Computer Software, Test Items
Fiji Phuti; Setlhomo Koloi-Keaikitse; Gaelebale Nnunu Tsheko; Seth Oppong – SAGE Open, 2023
There are concerns that soft skills assessment has been conceptualized within the Western context and may not reflect the indigenous African worldview. Without soft skills assessments contextualized in African cultural cosmology, African conceptions of abilities cannot be adequately assessed. The purpose of this study was to…
Descriptors: Foreign Countries, Soft Skills, Indigenous Knowledge, African Culture
Hatice Merve Imir; K. Büsra Kaynak-Ekici; Z. Fulya Temel – Infant and Child Development, 2024
This study examines metacognitive monitoring in Turkish preschoolers aged 48–66 months, which is crucial for their learning and development. A specialised paired-association task was designed to assess higher-order thinking skills in this age group. Data from 160 children (52.5% girls, 47.5% boys; mean age 57.6 months, standard deviation 4.8) were…
Descriptors: Test Construction, Test Validity, Metacognition, Educational Assessment
Gitomer, Drew H.; Iwatani, Emi – Educational Assessment, 2022
The education measurement community has centered the idea of test fairness in both theory and practice. Yet, racial justice advocates in education research and practice (the racial justice community) have consistently argued that assessments are hardly fair and play a critical and outsized role in contributing to racial and social inequities in…
Descriptors: Culture Fair Tests, Equal Education, Justice, Educational Assessment
Abdullah Abdul Wahab Alsayar – ProQuest LLC, 2021
Testlets bring several advantages in the development and administration of tests, such as (1) the construction of meaningful test items, (2) the avoidance of exposure to non-relevant context, (3) the improvement of testing efficiency, and (4) the progression of testlet items requiring higher thinking skills. Thus, the inclusion of testlets in educational…
Descriptors: Test Construction, Testing, Test Items, Efficiency
Joshua B. Gilbert; Luke W. Miratrix; Mridul Joshi; Benjamin W. Domingue – Journal of Educational and Behavioral Statistics, 2025
Analyzing heterogeneous treatment effects (HTEs) plays a crucial role in understanding the impacts of educational interventions. A standard practice for HTE analysis is to examine interactions between treatment status and preintervention participant characteristics, such as pretest scores, to identify how different groups respond to treatment.…
Descriptors: Causal Models, Item Response Theory, Statistical Inference, Psychometrics
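The "standard practice" this abstract describes — examining interactions between treatment status and pre-intervention characteristics such as pretest scores — can be sketched as a simple interaction regression. This is an illustrative example on simulated data, not code from the cited paper; all variable names and effect sizes are invented for demonstration.

```python
# Illustrative HTE check (simulated data): regress the outcome on treatment,
# pretest, and their interaction; the interaction coefficient estimates how
# the treatment effect varies with pretest score.
import numpy as np

rng = np.random.default_rng(0)
n = 500
pretest = rng.normal(0.0, 1.0, n)        # pre-intervention score
treat = rng.integers(0, 2, n)            # random treatment assignment (0/1)

# Simulated outcome: main effect 0.5, interaction 0.3 (effect grows with pretest)
posttest = (0.5 * treat + 0.8 * pretest
            + 0.3 * treat * pretest + rng.normal(0.0, 1.0, n))

# Design matrix: intercept, treatment, pretest, treatment x pretest
X = np.column_stack([np.ones(n), treat, pretest, treat * pretest])
beta, *_ = np.linalg.lstsq(X, posttest, rcond=None)

print(beta)  # beta[3] is the estimated interaction (HTE) coefficient
```

With randomized assignment, a nonzero interaction coefficient suggests the treatment effect differs across pretest levels, which is the kind of heterogeneity the abstract refers to.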
Maddox, Bryan – OECD Publishing, 2023
The digital transition in educational testing has introduced many new opportunities for technology to enhance large-scale assessments. These include the potential to collect and use log data on test-taker response processes routinely, and on a large scale. Process data has long been recognised as a valuable source of validation evidence in…
Descriptors: Measurement, Inferences, Test Reliability, Computer Assisted Testing
Joshua B. Gilbert; Luke W. Miratrix; Mridul Joshi; Benjamin W. Domingue – Annenberg Institute for School Reform at Brown University, 2024
Analyzing heterogeneous treatment effects (HTE) plays a crucial role in understanding the impacts of educational interventions. A standard practice for HTE analysis is to examine interactions between treatment status and pre-intervention participant characteristics, such as pretest scores, to identify how different groups respond to treatment.…
Descriptors: Causal Models, Item Response Theory, Statistical Inference, Psychometrics
Sondergeld, Toni A.; Johnson, Carla C. – School Science and Mathematics, 2019
In response to the call for more rigorously validated educational assessments, this study used an iterative multimethod validation process to develop and validate outcomes from the 21st Century Skills Assessment global rating scale. Qualitative and quantitative data sources were used to inform four types of validity evidence: content, response…
Descriptors: 21st Century Skills, Test Construction, Test Validity, Educational Assessment
Rotou, Ourania; Rupp, André A. – ETS Research Report Series, 2020
This research report provides a description of the processes of evaluating the "deployability" of automated scoring (AS) systems from the perspective of large-scale educational assessments in operational settings. It discusses a comprehensive psychometric evaluation that entails analyses that take into consideration the specific purpose…
Descriptors: Computer Assisted Testing, Scoring, Educational Assessment, Psychometrics
Dadey, Nathan; Gong, Brian – Smarter Balanced Assessment Consortium, 2023
This document is written primarily for policy makers and state department of education staff who are considering through-year assessments, as well as consultants and contractors state departments rely on. The document identifies essential things to consider when designing or evaluating a through-year assessment program. The paper is organized into…
Descriptors: Student Evaluation, Progress Monitoring, Summative Evaluation, Standardized Tests