Showing all 8 results
Peer reviewed
Yves Bestgen – Applied Linguistics, 2024
Measuring lexical diversity in texts that have different lengths is problematic because length has a significant effect on the number of types a text contains, thus hampering any comparison. Treffers-Daller et al. (2018) recommended a simple solution, namely counting the number of types in a section of a given length that was extracted from the…
Descriptors: Language Variation, Second Language Learning, Essays, Writing Evaluation
Peer reviewed
Olesya Kisselev; Mihail Kopotev; Anton Vakhranev – Modern Language Journal, 2025
Lexical proficiency in a second language (L2) has long been effectively assessed through the measurement of various lexical indices, or textual characteristics that act as observable indicators of such conceptual categories as lexical richness, diversity, sophistication, and fluency. While many studies have established links between these lexical…
Descriptors: Language Proficiency, Russian, Second Language Learning, Second Language Instruction
Peer reviewed
Jia, Wenfeng; Zhang, Peixin – Language Testing in Asia, 2023
It is widely believed that raters' cognition is an important aspect of writing assessment, as it has both logical and temporal priority over scores. A critical review of previous research in this area shows that raters' cognition boils down to two fundamental issues: building text images and strategies for articulating scores.…
Descriptors: Problem Solving, Cognitive Processes, Writing Evaluation, Evaluators
Peer reviewed
Deane, Paul; Song, Yi; van Rijn, Peter; O'Reilly, Tenaha; Fowles, Mary; Bennett, Randy; Sabatini, John; Zhang, Mo – Reading and Writing: An Interdisciplinary Journal, 2019
This paper presents a theoretical and empirical case for the value of scenario-based assessment (SBA) in the measurement of students' written argumentation skills. First, we frame the problem in terms of creating a reasonably efficient method of evaluating written argumentation skills, including for students at relatively low levels of competency.…
Descriptors: Vignettes, Writing Skills, Persuasive Discourse, Writing Evaluation
Peer reviewed
Breland, Hunter; Lee, Yong-Won; Muraki, Eiji – Educational and Psychological Measurement, 2005
Eighty-three Test of English as a Foreign Language (TOEFL) writing prompts administered via computer-based testing between July 1998 and August 2000 were examined for differences attributable to the response mode (handwriting or word processing) chosen by examinees. Differences were examined statistically using polytomous logistic regression. A…
Descriptors: Evaluation Methods, Word Processing, Handwriting, Effect Size
Peer reviewed
East, Martin – Assessing Writing, 2006
Writing assessment essentially juxtaposes two elements: how "good writing" is to be defined, and how "good measurement" of that writing is to be carried out. The timed test is often used in large-scale L2 writing assessments because it is considered to provide reliable measurement. It is, however, highly inauthentic. One way of enhancing…
Descriptors: Writing Evaluation, Writing Tests, Timed Tests, Dictionaries
Breland, Hunter M.; And Others – 1995
Brief, impromptu essays written for the 1990 administration of the College Board's English Composition Test (ECT) were randomly sampled for four groups of examinees. These essays were subjected to further holistic ratings beyond those conducted for the ECT, and analytical ratings were also obtained. The holistic scores were correlated with the…
Descriptors: Cohesion (Written Composition), English, Essays, Evaluation Methods
Weasenforth, Donald L. – 1993
This study examined 412 college students' essay performance on two prompt types, a traditional prose essay and a type incorporating graphics, modeled on those from the English Language Challenge Examination (ELCE) developed for the University of Southern California (USC). The majority of the participants were international students at USC. Each…
Descriptors: College Students, English (Second Language), Essays, Evaluation Methods