Publication Date
| Date Range | Results |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 200 |
| Since 2022 (last 5 years) | 1070 |
| Since 2017 (last 10 years) | 2580 |
| Since 2007 (last 20 years) | 4941 |
Audience
| Audience | Results |
| --- | --- |
| Practitioners | 653 |
| Teachers | 563 |
| Researchers | 250 |
| Students | 201 |
| Administrators | 81 |
| Policymakers | 22 |
| Parents | 17 |
| Counselors | 8 |
| Community | 7 |
| Support Staff | 3 |
| Media Staff | 1 |
Location
| Location | Results |
| --- | --- |
| Turkey | 225 |
| Canada | 223 |
| Australia | 155 |
| Germany | 116 |
| United States | 99 |
| China | 90 |
| Florida | 86 |
| Indonesia | 82 |
| Taiwan | 78 |
| United Kingdom | 73 |
| California | 65 |
What Works Clearinghouse Rating
| Rating | Results |
| --- | --- |
| Meets WWC Standards without Reservations | 4 |
| Meets WWC Standards with or without Reservations | 4 |
| Does not meet standards | 1 |
Seyedeh Azadeh Ghiasian; Fatemeh Hemmati; Seyyed Mohammad Alavi; Afsar Rouhi – International Journal of Language Testing, 2025
A critical component of cognitive diagnostic models (CDMs) is a Q-matrix that stipulates associations between items of a test and their required attributes. The present study aims to develop and empirically validate a Q-matrix for the listening comprehension section of the International English Language Testing System (IELTS). To this end, a…
Descriptors: Test Items, Listening Comprehension Tests, English (Second Language), Language Tests
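For readers unfamiliar with the term in the abstract above: a Q-matrix is a binary items-by-attributes table, with a 1 wherever an item requires a given attribute. The sketch below is a minimal illustration with a hypothetical four-item listening test and three made-up attributes; it is not the Q-matrix developed in the cited study.

```python
# Illustrative only -- not the Q-matrix developed in the cited study.
# A Q-matrix is a binary items-by-attributes table: entry (j, k) = 1 means
# item j requires attribute k, 0 means it does not.
import numpy as np

attributes = ["listening for detail", "inference", "paraphrase recognition"]  # hypothetical
q_matrix = np.array([
    [1, 0, 0],   # item 1 requires only listening for detail
    [1, 1, 0],   # item 2 requires detail + inference
    [0, 1, 1],   # item 3 requires inference + paraphrase recognition
    [1, 0, 1],   # item 4 requires detail + paraphrase recognition
])

# Under a conjunctive CDM such as the DINA model, an examinee with attribute
# profile alpha is expected to answer item j correctly only if they have
# mastered every attribute the Q-matrix requires for that item.
alpha = np.array([1, 1, 0])                      # masters the first two attributes
expected_correct = np.all(q_matrix <= alpha, axis=1).astype(int)
print(expected_correct)                          # -> [1 1 0 0]
```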
Endang Susantini; Yurizka Melia Sari; Prima Vidya Asteria; Muhammad Ilyas Marzuqi – Journal of Education and Learning (EduLearn), 2025
Assessing preservice teachers' higher order thinking skills (HOTS) in science and mathematics is essential. Teachers' HOTS ability is closely related to their ability to create HOTS-type science and mathematics problems. Among the various types of HOTS, one is Bloomian HOTS. To help preservice teachers create problems in those subjects, an Android…
Descriptors: Content Validity, Mathematics Instruction, Decision Making, Thinking Skills
Test of Understanding of Electric Field, Force, and Flux: A Reliable Multiple-Choice Assessment Tool
Eder Hernandez; Esmeralda Campos; Pablo Barniol; Genaro Zavala – Physical Review Physics Education Research, 2025
This study presents the development and validation of a novel multiple-choice test designed to assess university students' conceptual understanding of electric field, force, and flux. The test of understanding of electric field, force, and flux was constructed based on the results of previous studies using a phenomenographic approach to classify…
Descriptors: Physics, Scientific Concepts, Science Tests, Multiple Choice Tests
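Concept inventories of this kind are typically screened for internal-consistency reliability. As a generic illustration (not the authors' analysis), the Kuder-Richardson 20 coefficient for a dichotomously scored multiple-choice test can be computed as follows, here with toy data.

```python
# Illustrative KR-20 (Kuder-Richardson 20) computation for a dichotomously
# scored multiple-choice test; not taken from the cited study's analysis.
import numpy as np

# rows = examinees, columns = items; 1 = correct, 0 = incorrect (toy data)
scores = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
])

k = scores.shape[1]                          # number of items
p = scores.mean(axis=0)                      # item difficulties (proportion correct)
q = 1 - p
total_var = scores.sum(axis=1).var(ddof=1)   # variance of examinees' total scores
kr20 = (k / (k - 1)) * (1 - (p * q).sum() / total_var)
print(round(kr20, 3))
```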
Güntay Tasçi – Science Insights Education Frontiers, 2024
The present study has aimed to develop and validate a protein concept inventory (PCI) consisting of 25 multiple-choice (MC) questions to assess students' understanding of protein, which is a fundamental concept across different biology disciplines. The development process of the PCI involved a literature review to identify protein-related content,…
Descriptors: Science Instruction, Science Tests, Multiple Choice Tests, Biology
Harun Bayer; Fazilet Gül Ince Araci; Gülsah Gürkan – International Journal of Technology in Education and Science, 2024
The rapid advancement of artificial intelligence technologies, their pervasive use across fields, and the growing understanding of the benefits they bring have led actors in the education sector to pursue research in this area. In particular, the use of artificial intelligence tools has become more prevalent in the education sector due to the…
Descriptors: Artificial Intelligence, Computer Software, Computational Linguistics, Technology Uses in Education
Shan Lin; Jian Wang – Journal of Baltic Science Education, 2024
Scientific thinking constitutes a vital component of scientific competencies, crucial for citizens to adapt to the evolving societal landscape. To cultivate students' scientific thinking, teachers should possess an adequate professional knowledge foundation, which encompasses pedagogical content knowledge (PCK). Assessing teachers' PCK of…
Descriptors: Secondary School Teachers, Teacher Attitudes, Biology, Pedagogical Content Knowledge
Zebing Wu – ProQuest LLC, 2024
Response style, a common aberrancy in non-cognitive assessments in psychological fields, is problematic because it yields inaccurate estimates of item and person parameters, which in turn raises serious reliability, validity, and fairness issues (Baumgartner & Steenkamp, 2001; Bolt & Johnson, 2009; Bolt & Newton, 2011). Response style refers to…
Descriptors: Response Style (Tests), Accuracy, Preferences, Psychological Testing
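As a purely illustrative aside, the toy simulation below shows how an extreme response style distorts category use on a five-point scale even when the underlying trait distribution is identical; the mechanism used here (narrowed middle thresholds) is just one simple way to mimic the effect and is not taken from the dissertation above.

```python
# Illustrative only: a toy example of how an extreme response style (ERS)
# distorts Likert-type data, independent of the trait being measured.
import numpy as np

rng = np.random.default_rng(0)
true_trait = rng.normal(size=1000)             # latent trait for 1000 respondents

def likert(trait, ers=False):
    """Map a latent value to a 1-5 rating; ERS respondents avoid the middle."""
    cuts = np.array([-1.5, -0.5, 0.5, 1.5])
    if ers:
        cuts = cuts * 0.5      # narrower middle categories -> more 1s and 5s
    return np.digitize(trait, cuts) + 1

ordinary = likert(true_trait)
extreme = likert(true_trait, ers=True)

# Same underlying trait distribution, very different category usage:
print(np.bincount(ordinary, minlength=6)[1:])  # counts of ratings 1..5
print(np.bincount(extreme, minlength=6)[1:])
```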
Kim, Kyung Yong; Lim, Euijin; Lee, Won-Chan – International Journal of Testing, 2019
For passage-based tests, items that belong to a common passage often violate the local independence assumption of unidimensional item response theory (UIRT). In this case, ignoring local item dependence (LID) and estimating item parameters using a UIRT model could be problematic because doing so might result in inaccurate parameter estimates,…
Descriptors: Item Response Theory, Equated Scores, Test Items, Models
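A common diagnostic for the local item dependence described above is Yen's Q3 statistic: the correlation between item residuals after conditioning on ability. The sketch below computes Q3 under a 2PL model with made-up parameters; it is a generic illustration, not the equating procedure examined in the article.

```python
# Illustrative check for local item dependence using Yen's Q3 statistic,
# sketched under a 2PL model with made-up parameters.
import numpy as np

def p_2pl(theta, a, b):
    """Probability of a correct response under the two-parameter logistic model."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# toy data: ability values, item parameters, and simulated responses
rng = np.random.default_rng(1)
theta = rng.normal(size=500)
a = np.array([1.2, 0.9, 1.1, 1.0])
b = np.array([-0.5, 0.0, 0.3, 0.8])
prob = p_2pl(theta[:, None], a, b)
responses = (rng.random(prob.shape) < prob).astype(int)

# Q3 = correlation of residuals (observed - expected) between item pairs;
# pairs with Q3 well above the rest share variance beyond theta, as items
# attached to the same reading or listening passage typically do.
residuals = responses - prob
q3 = np.corrcoef(residuals, rowvar=False)
print(np.round(q3, 2))
```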
Raykov, Tenko; Dimitrov, Dimiter M.; Marcoulides, George A.; Harrison, Michael – Educational and Psychological Measurement, 2019
This note highlights and illustrates the links between item response theory and classical test theory in the context of polytomous items. An item response modeling procedure is discussed that can be used for point and interval estimation of the individual true score on any item in a measuring instrument or item set following the popular and widely…
Descriptors: Correlation, Item Response Theory, Test Items, Scores
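The IRT-CTT link described in the note rests on the fact that, once an item response model is assumed, the true score on a polytomous item is simply its expected score at a given ability level. The sketch below evaluates that expectation under Samejima's graded response model with invented parameters; it does not reproduce the estimation procedure of the cited article.

```python
# Illustrative link between polytomous IRT and the classical true score:
# under an IRT model the true score on an item is its expected score at theta,
# E[X | theta] = sum_k k * P(X = k | theta).  The graded response model (GRM)
# parameters below are made up and are not from the cited article.
import numpy as np

def grm_category_probs(theta, a, thresholds):
    """Category probabilities for one item under Samejima's graded response model."""
    # cumulative probabilities P(X >= k) for k = 1..K-1
    cum = 1.0 / (1.0 + np.exp(-a * (theta - np.asarray(thresholds))))
    upper = np.concatenate(([1.0], cum))      # P(X >= 0) = 1
    lower = np.concatenate((cum, [0.0]))      # P(X >= K) = 0
    return upper - lower                      # P(X = k), k = 0..K-1

theta = 0.4
a, thresholds = 1.3, [-1.0, 0.0, 1.2]         # 4 ordered categories (0..3)
probs = grm_category_probs(theta, a, thresholds)
true_score = np.sum(np.arange(len(probs)) * probs)
print(np.round(probs, 3), round(true_score, 3))
```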
Barrett, Michelle D.; van der Linden, Wim J. – Journal of Educational and Behavioral Statistics, 2019
Parameter linking in item response theory is generally necessary to adjust for differences between the true values for the same item and ability parameters due to the use of different identifiability restrictions in different calibrations. The research reported in this article explores a precision-weighted (PW) approach to the problem of…
Descriptors: Item Response Theory, Computation, Error of Measurement, Test Items
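The article's precision-weighted estimator is not reproduced here, but the underlying statistical device, inverse-variance weighting, can be shown in a few lines: estimates with smaller standard errors receive proportionally more weight when combined.

```python
# The general idea behind precision weighting (inverse-variance weighting),
# sketched as a standalone illustration with hypothetical numbers; the cited
# article's actual PW linking estimator may differ in detail.
import numpy as np

# Suppose several anchor items each give an estimate of the same linking shift
# between two calibrations, with different standard errors.
estimates = np.array([0.42, 0.35, 0.50, 0.38])   # hypothetical shift estimates
std_errors = np.array([0.10, 0.05, 0.20, 0.08])  # hypothetical standard errors

weights = 1.0 / std_errors**2                    # precision = 1 / variance
pw_estimate = np.sum(weights * estimates) / np.sum(weights)
pw_se = np.sqrt(1.0 / np.sum(weights))           # SE of the combined estimate

print(round(pw_estimate, 3), round(pw_se, 3))
```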
Stanke, Luke; Bulut, Okan – International Journal of Assessment Tools in Education, 2019
Item response theory is a widely used framework for the design, scoring, and scaling of measurement instruments. Item response models are typically used for dichotomously scored questions that have only two score points (e.g., multiple-choice items). However, given the increasing use of instruments that include questions with multiple response…
Descriptors: Item Response Theory, Test Items, Responses, College Freshmen
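One standard family of models for questions with more than two score points is the (generalized) partial credit model. The sketch below, with made-up step parameters, shows how category probabilities and the expected item score follow from such a model; it is illustrative only and does not describe the instrument analyzed in the article.

```python
# Illustrative generalized partial credit model, a common IRT model for items
# with multiple response categories; parameters are made up, not from the study.
import numpy as np

def gpcm_probs(theta, a, deltas):
    """Category probabilities under the generalized partial credit model.

    deltas are the step parameters for moving from category k-1 to k (k = 1..K-1).
    """
    steps = a * (theta - np.asarray(deltas))
    numerators = np.exp(np.concatenate(([0.0], np.cumsum(steps))))
    return numerators / numerators.sum()

theta = 0.0
probs = gpcm_probs(theta, a=1.0, deltas=[-1.2, -0.1, 0.9])   # 4 categories (0..3)
print(np.round(probs, 3), "expected score:", round(np.dot(np.arange(4), probs), 3))
```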
Pentimonti, J.; Petscher, Y.; Stanley, C. – National Center on Improving Literacy, 2019
When evaluating the quality of any screening tool, it is important to determine whether or not the assessment is biased against different groups of students. We want to ensure that students do not receive higher or lower scores on an assessment for reasons other than the primary skill or trait that is being tested.
Descriptors: Screening Tests, Test Bias, Culture Fair Tests, Student Characteristics
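Bias of this sort is commonly investigated through differential item functioning (DIF) analysis. The sketch below computes a minimal Mantel-Haenszel common odds ratio for one item, stratifying examinees by total score and using made-up counts; it is a generic illustration rather than the procedure used in the brief above.

```python
# Minimal Mantel-Haenszel DIF check for a single item, with invented counts.
import numpy as np

# strata[k] = [[ref_correct, ref_incorrect], [focal_correct, focal_incorrect]],
# one 2 x 2 table per total-score stratum (the matching variable).
strata = np.array([
    [[30, 20], [20, 30]],
    [[40, 10], [32, 18]],
    [[45,  5], [40, 10]],
], dtype=float)

num = 0.0   # sum of ref_correct * focal_incorrect / N over strata
den = 0.0   # sum of ref_incorrect * focal_correct / N over strata
for (a, b), (c, d) in strata:
    n = a + b + c + d
    num += a * d / n
    den += b * c / n

mh_odds_ratio = num / den   # near 1 means no DIF; far from 1 flags potential bias
print(round(mh_odds_ratio, 2))
```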
Wang, Yu; Chiu, Chia-Yi; Köhn, Hans Friedrich – Journal of Educational and Behavioral Statistics, 2023
The multiple-choice (MC) item format has been widely used in educational assessments across diverse content domains. MC items purportedly allow for collecting richer diagnostic information. The effectiveness and economy of administering MC items may have further contributed to their popularity not just in educational assessment. The MC item format…
Descriptors: Multiple Choice Tests, Nonparametric Statistics, Test Format, Educational Assessment
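The basic idea behind nonparametric classification in cognitive diagnosis can be shown for dichotomously scored items: assign each examinee the attribute profile whose ideal response pattern is closest, in Hamming distance, to the observed responses. The cited article extends this style of approach to multiple-choice items; the sketch below covers only the simpler dichotomous case with toy inputs.

```python
# Toy nonparametric classification for a cognitive diagnostic setting:
# pick the attribute profile whose ideal responses best match the observed ones.
import itertools
import numpy as np

q_matrix = np.array([        # 4 items x 2 attributes (hypothetical)
    [1, 0],
    [0, 1],
    [1, 1],
    [1, 0],
])

# Ideal (conjunctive, DINA-type) response pattern for every attribute profile
profiles = np.array(list(itertools.product([0, 1], repeat=q_matrix.shape[1])))
ideal = np.array([np.all(q_matrix <= alpha, axis=1).astype(int) for alpha in profiles])

observed = np.array([1, 0, 0, 1])                     # one examinee's responses
distances = np.sum(np.abs(ideal - observed), axis=1)  # Hamming distances
best = profiles[np.argmin(distances)]
print(best)                                           # estimated attribute profile
```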
Chen, Chia-Wen; Andersson, Björn; Zhu, Jinxin – Journal of Educational Measurement, 2023
The certainty of response index (CRI) measures respondents' confidence level when answering an item. In conjunction with the answers to the items, previous studies have used descriptive statistics and arbitrary thresholds to identify student knowledge profiles with the CRIs. Whereas this approach overlooked the measurement error of the observed…
Descriptors: Item Response Theory, Factor Analysis, Psychometrics, Test Items
Gregory J. Crowther; Usha Sankar; Leena S. Knight; Deborah L. Myers; Kevin T. Patton; Lekelia D. Jenkins; Thomas A. Knight – Journal of Microbiology & Biology Education, 2023
The biology education literature includes compelling assertions that unfamiliar problems are especially useful for revealing students' true understanding of biology. However, there is only limited evidence that such novel problems have different cognitive requirements than more familiar problems. Here, we sought additional evidence by using…
Descriptors: Science Instruction, Artificial Intelligence, Scoring, Molecular Structure

