Showing all 11 results
Al-Jarf, Reima – Online Submission, 2023
This article aims to give a comprehensive guide to planning and designing vocabulary tests, which includes identifying the skills to be covered by the test; outlining the course content covered; preparing a table of specifications that shows the skill, content topics, and number of questions allocated to each; and preparing the test instructions. The…
Descriptors: Vocabulary Development, Learning Processes, Test Construction, Course Content
Bronson Hui – ProQuest LLC, 2021
Vocabulary researchers have started expanding their assessment toolbox by incorporating timed tasks and psycholinguistic instruments (e.g., priming tasks) to gain insights into lexical development (e.g., Elgort, 2011; Godfroid, 2020b; Nakata & Elgort, 2020; Vandenberghe et al., 2021). These time-sensitive and implicit word measures differ…
Descriptors: Measures (Individuals), Construct Validity, Decision Making, Vocabulary Development
Peer reviewed
PDF on ERIC
Karagöl, Efecan – Journal of Language and Linguistic Studies, 2020
Turkish and Foreign Languages Research and Application Center (TÖMER) is one of the important institutions for learning Turkish as a foreign language. In these institutions, proficiency tests are administered at the end of each level. However, test practices vary from one TÖMER center to another, as there is no shared program in teaching Turkish as a…
Descriptors: Language Tests, Turkish, Language Proficiency, Second Language Learning
Peer reviewed
PDF on ERIC
Polat, Murat – Novitas-ROYAL (Research on Youth and Language), 2020
Classroom practices, materials, and teaching methods in language classes have changed considerably in recent decades and continue to evolve; however, the techniques commonly used to test students' foreign language skills have not changed much, despite the recent awareness of Bloom's taxonomy. Testing units at schools rely mostly on multiple choice…
Descriptors: Multiple Choice Tests, Test Format, Test Items, Difficulty Level
Peer reviewed
Direct link
Soureshjani, Kamal Heidari – Language Testing in Asia, 2011
Among the several factors affecting test-takers' performance on a test is the sequence of the test items. The present study served as an attempt to shed light on the effect of test item sequence on Iranian EFL learners' performance on a test of grammar. To achieve this purpose, 70 language learners of English at Pooyesh Institute in Shiraz (the…
Descriptors: Test Format, Test Items, Difficulty Level, Grammar
Abedi, Jamal; Leon, Seth; Kao, Jenny; Bayley, Robert; Ewers, Nancy; Herman, Joan; Mundhenk, Kimberly – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2011
The purpose of this study was to examine the characteristics of reading test items that may differentially impede the performance of students with disabilities. The findings suggest that certain revisions can be made to current assessments to make them more accessible for students with disabilities. Features such as words per page,…
Descriptors: Test Items, Reading Tests, Disabilities, Student Evaluation
Peer reviewed
Direct link
He, Wei; Wolfe, Edward W. – International Journal of Testing, 2010
This article reports the results of a study of potential sources of item nonequivalence between English and Chinese language versions of a cognitive development test for preschool-aged children. Items were flagged for potential nonequivalence through statistical and judgment-based procedures, and the relationship between flag status and item…
Descriptors: Preschool Children, Mandarin Chinese, Cognitive Development, Item Analysis
Peer reviewed
Direct link
Allalouf, Avi; Abramzon, Andrea – Language Assessment Quarterly, 2008
Differential item functioning (DIF) analysis can be used to great advantage in second language (L2) assessments. This study examined the differences in performance on L2 test items between groups from different first language backgrounds and suggested ways of improving L2 assessments. The study examined DIF on L2 (Hebrew) test items for two…
Descriptors: Test Items, Test Format, Second Language Learning, Test Construction
Peer reviewed
Direct link
David, Gergely – Language Testing, 2007
Some educational contexts almost mandate the application of multiple-choice (MC) testing techniques, even though they are deplored by many practitioners in the field. In such contexts especially, research into how well these types of items perform and how their performance may be characterised is both appropriate and desirable. The focus of this paper…
Descriptors: Student Evaluation, Grammar, Language Tests, Test Items
Peer reviewed
Plake, Barbara S.; Huntley, Renee M. – Educational and Psychological Measurement, 1984
Two studies examined the effect of making the correct answer of a multiple choice test item grammatically consistent with the item stem. American College Testing Assessment experimental items were constructed to investigate grammatical consistency for plural-singular and vowel-consonant agreement. Results suggest…
Descriptors: Grammar, Higher Education, Item Analysis, Multiple Choice Tests
McMorris, Robert F.; And Others – 1983
Two 50-item multiple-choice forms of a grammar test were developed, differing only in the inclusion of humor in 20 items of one form. One hundred twenty-six (126) eighth graders received the test plus alternate forms of a questionnaire. Humor inclusion did not affect grammar scores on matched humorous/nonhumorous items or on common post-treatment…
Descriptors: Grade 8, Grammar, Humor, Junior High Schools