Showing 3,061 to 3,075 of 9,530 results
National Assessment Governing Board, 2017
The National Assessment of Educational Progress (NAEP) is the only continuing and nationally representative measure of trends in academic achievement of U.S. elementary and secondary school students in various subjects. For more than four decades, NAEP assessments have been conducted periodically in reading, mathematics, science, writing, U.S.…
Descriptors: Mathematics Achievement, Multiple Choice Tests, National Competency Tests, Educational Trends
Peer reviewed
PDF on ERIC
Dorans, Neil J. – ETS Research Report Series, 2013
Quantitative fairness procedures have been developed and modified by ETS staff over the past several decades. ETS has been a leader in fairness assessment, and its efforts are reviewed in this report. The first section deals with differential prediction and differential validity procedures that examine whether test scores predict a criterion, such…
Descriptors: Test Bias, Statistical Analysis, Test Validity, Scores
Peer reviewed
Direct link
Vedul-Kjelsås, Vigdis; Stensdotter, Ann-Katrin; Sigmundsson, Hermundur – Scandinavian Journal of Educational Research, 2013
Using the Movement Assessment Battery for Children (MABC), the present study investigated possible gender differences in several tasks of motor competence in children. The sample included 67 Norwegian sixth-grade children (girls n = 29; boys n = 39). Boys' performance exceeded that of girls in ball skills and in one of the balance skills. No differences were…
Descriptors: Foreign Countries, Gender Differences, Physical Activities, Psychomotor Skills
Peer reviewed
Direct link
Parish, Jane A.; Karisch, Brandi B. – Journal of Extension, 2013
Item analysis can serve as a useful tool in improving multiple-choice questions used in Extension programming. It can identify gaps between instruction and assessment. An item analysis of Mississippi Master Cattle Producer program multiple-choice examination responses was performed to determine the difficulty of individual examinations, assess the…
Descriptors: Test Items, Item Analysis, Multiple Choice Tests, Extension Education
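The Parish and Karisch abstract describes item analysis of multiple-choice examination responses. As a minimal sketch of the classical indices such an analysis typically computes, the code below calculates an item difficulty index (the proportion answering correctly) and a point-biserial discrimination index (the correlation between a 0/1 item score and the total score). The response matrix is invented illustrative data, not from the Mississippi Master Cattle Producer program.

```python
def item_difficulty(responses):
    """Proportion of examinees answering the item correctly (0/1 scored)."""
    return sum(responses) / len(responses)

def point_biserial(item_scores, total_scores):
    """Point-biserial correlation between a 0/1 item score and total score:
    r_pb = (M_correct - M_total) / SD_total * sqrt(p / (1 - p))."""
    n = len(item_scores)
    mean_t = sum(total_scores) / n
    sd_t = (sum((t - mean_t) ** 2 for t in total_scores) / n) ** 0.5
    correct = [t for i, t in zip(item_scores, total_scores) if i == 1]
    p = len(correct) / n
    if p in (0.0, 1.0) or sd_t == 0:
        return 0.0  # index undefined when everyone (or no one) is correct
    mean_correct = sum(correct) / len(correct)
    return (mean_correct - mean_t) / sd_t * (p / (1 - p)) ** 0.5

# Rows = examinees, columns = items (invented data).
matrix = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
]
totals = [sum(row) for row in matrix]
for j in range(4):
    col = [row[j] for row in matrix]
    print(f"item {j + 1}: p = {item_difficulty(col):.2f}, "
          f"r_pb = {point_biserial(col, totals):.2f}")
```

A very high or very low difficulty index, or a near-zero discrimination index, flags an item for review, which is the kind of gap between instruction and assessment the abstract mentions.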
Peer reviewed
Direct link
Mao, Xiuzhen; Xin, Tao – Applied Psychological Measurement, 2013
The Monte Carlo approach, previously implemented in traditional computerized adaptive testing (CAT), is applied here to cognitive diagnostic CAT to test its ability to address multiple content constraints. The performance of the Monte Carlo approach is compared with the performance of the modified maximum global…
Descriptors: Monte Carlo Methods, Cognitive Tests, Diagnostic Tests, Computer Assisted Testing
Peer reviewed
Direct link
Hsu, Chia-Ling; Wang, Wen-Chung; Chen, Shu-Ying – Applied Psychological Measurement, 2013
Interest in developing computerized adaptive testing (CAT) under cognitive diagnosis models (CDMs) has increased recently. CAT algorithms that use a fixed-length termination rule frequently lead to different degrees of measurement precision for different examinees. Fixed precision, in which the examinees receive the same degree of measurement…
Descriptors: Computer Assisted Testing, Adaptive Testing, Cognitive Tests, Diagnostic Tests
Peer reviewed
Direct link
Sun, Jianan; Xin, Tao; Zhang, Shumei; de la Torre, Jimmy – Applied Psychological Measurement, 2013
This article proposes a generalized distance discriminating method for tests with polytomous responses (GDD-P). The new method is the polytomous extension of an item response theory (IRT)-based cognitive diagnostic method that can identify examinees' ideal response patterns (IRPs) based on a generalized distance index. The similarities between…
Descriptors: Item Response Theory, Cognitive Tests, Diagnostic Tests, Matrices
Peer reviewed
Direct link
Hansen, Mary A.; Lyon, Steven R.; Heh, Peter; Zigmond, Naomi – Applied Measurement in Education, 2013
Large-scale assessment programs, including alternate assessments based on alternate achievement standards (AA-AAS), must provide evidence of technical quality and validity. This study provides information about the technical quality of one AA-AAS by evaluating the standard setting for the science component. The assessment was designed to have…
Descriptors: Alternative Assessment, Science Tests, Standard Setting, Test Validity
Peer reviewed
Direct link
Svetina, Dubravka – Educational and Psychological Measurement, 2013
The purpose of this study was to investigate the effect of complex structure on dimensionality assessment in noncompensatory multidimensional item response models using dimensionality assessment procedures based on DETECT (dimensionality evaluation to enumerate contributing traits) and NOHARM (normal ogive harmonic analysis robust method). Five…
Descriptors: Item Response Theory, Statistical Analysis, Computation, Test Length
Maxwell, Lesli A. – Education Week, 2013
As test designers work to craft the new, common assessments set to debut in most of the nation's public schools in the 2014-15 school year, their goal is to provide all English-language learners (ELLs), regardless of their language-proficiency levels, the same opportunities to demonstrate their content knowledge and skills as their peers who are…
Descriptors: Testing Accommodations, English Language Learners, Educational Assessment, Consortia
Peer reviewed
Direct link
Runnels, Judith – Language Testing in Asia, 2013
Differential item functioning (DIF) occurs when a test item favors or hinders a subgroup of the test-taking population that shares a particular characteristic. DIF analyses are statistical procedures used to determine the extent to which an item's content affects item endorsement across sub-groups of test-takers. If DIF is found for many items on the test, the…
Descriptors: Test Items, Test Bias, Item Response Theory, College Freshmen
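The Runnels abstract defines DIF analyses in general terms. One common such procedure, sketched here only as an illustration and not necessarily the one used in that study, is the Mantel-Haenszel common odds ratio: examinees are stratified by total score, and within each stratum the correct/incorrect counts of a reference and a focal group are compared. All data and names below are invented.

```python
from collections import defaultdict

def mantel_haenszel_odds_ratio(records):
    """records: iterable of (group, total_score, item_correct) tuples,
    with group in {"ref", "focal"} and item_correct in {0, 1}.
    Returns alpha_MH = sum_k(a_k*d_k/n_k) / sum_k(b_k*c_k/n_k);
    values near 1.0 indicate little or no DIF on this item."""
    strata = defaultdict(lambda: {"ref": [0, 0], "focal": [0, 0]})
    for group, score, correct in records:
        strata[score][group][correct] += 1
    num = den = 0.0
    for cell in strata.values():
        a = cell["ref"][1]    # reference group, correct
        b = cell["ref"][0]    # reference group, incorrect
        c = cell["focal"][1]  # focal group, correct
        d = cell["focal"][0]  # focal group, incorrect
        n = a + b + c + d
        if n == 0:
            continue
        num += a * d / n
        den += b * c / n
    return num / den if den else float("inf")
```

Stratifying by total score is what lets the statistic separate genuine DIF from an overall ability difference between the groups.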
Xiang, Rui – ProQuest LLC, 2013
A key issue in cognitive diagnostic models (CDMs) is the correct identification of the Q-matrix, which indicates the relationship between attributes and test items. Previous CDMs typically assumed a known Q-matrix provided by domain experts, such as those who developed the questions. However, misspecifications of the Q-matrix have been discovered in the past…
Descriptors: Diagnostic Tests, Cognitive Processes, Matrices, Test Items
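The Xiang abstract describes the Q-matrix as the link between attributes and test items. As a toy illustration of that idea (the attribute names and entries below are invented, not from the dissertation), each row marks which cognitive attributes an item requires, and under a conjunctive model an examinee's ideal response pattern answers an item correctly only if every required attribute is mastered.

```python
# Columns of the Q-matrix (hypothetical attribute names for illustration).
attributes = ["addition", "subtraction", "carrying"]

# Rows = items; a 1 means the item requires that attribute.
q_matrix = [
    [1, 0, 0],  # item 1: addition only
    [1, 0, 1],  # item 2: addition with carrying
    [0, 1, 0],  # item 3: subtraction only
]

# Mastery profile of one examinee: masters addition and carrying.
mastery = [1, 0, 1]

# Conjunctive ideal response pattern: correct iff every required
# attribute is mastered (mastery >= requirement, column by column).
ideal = [int(all(m >= q for q, m in zip(row, mastery))) for row in q_matrix]
print(ideal)  # prints [1, 1, 0]
```

A misspecified entry (say, omitting the carrying requirement of item 2) changes the ideal patterns the model predicts, which is why Q-matrix misspecification matters for diagnosis.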
Marsh, Kimberly R. – ProQuest LLC, 2013
As the demand for accountability and transparency in higher education has increased, so too has the call for direct assessment of student learning outcomes. Accompanying this increase in knowledge-based, cognitive assessments administered in higher education contexts is a growing emphasis on assessing various noncognitive aspects of student…
Descriptors: Mixed Methods Research, Educational Assessment, Test Items, Individual Characteristics
Peer reviewed
Direct link
Basaraba, Deni; Yovanoff, Paul; Alonzo, Julie; Tindal, Gerald – Reading and Writing: An Interdisciplinary Journal, 2013
Although the recent identification of the five critical components of early literacy has been a catalyst for modifications to the content of reading instruction materials and to the tools used to examine students' acquisition of early literacy skills, these skills have not received equal attention from test developers and publishers.…
Descriptors: Reading Comprehension, Emergent Literacy, Reading Tests, Test Items
Peer reviewed
Direct link
Chon, Kyong Hee; Lee, Won-Chan; Ansley, Timothy N. – Applied Measurement in Education, 2013
Empirical information regarding performance of model-fit procedures has been a persistent need in measurement practice. Statistical procedures for evaluating item fit were applied to real test examples that consist of both dichotomously and polytomously scored items. The item fit statistics used in this study included PARSCALE's G²,…
Descriptors: Test Format, Test Items, Item Analysis, Goodness of Fit