Serviss, Tricia – Assessing Writing, 2012
Drawing upon archival materials, I describe the history, design, and assessment of literacy tests from early 20th-century New York State. Practitioners working with these early standardized writing tests grappled with tensions created by public Nativist sentiment, the legislation of "literacy," and calls to score the tests in…
Descriptors: Literacy, Writing Tests, Standardized Tests, Scoring
Camp, Heather – Assessing Writing, 2012
This article reviews key developmental theories that have been adopted by writing development researchers over the last fifty years. It describes how researchers have translated these theories into definitions of writing development capable of influencing curricular design and interpretations of student writing and explores the implications for…
Descriptors: Writing (Composition), Writing Evaluation, Researchers, Theories
Bridgeman, Brent; Trapani, Catherine; Bivens-Tatum, Jennifer – Assessing Writing, 2011
Writing task variants can increase test security in high-stakes essay assessments by substantially increasing the pool of available writing stimuli and by making the specific writing task less predictable. A given prompt (parent) may be used as the basis for one or more different variants. Six variant types based on argument essay prompts from a…
Descriptors: Writing Evaluation, Writing Tests, Tests, Writing Instruction
Colombini, Crystal Broch; McBride, Maureen – Assessing Writing, 2012
Composition assessment scholars have exhibited uneasiness with the language of norming grounded in distaste for the psychometric assumption that achievement of consensus in a communal assessment setting is desirable even at the cost of individual pedagogical values. Responding to the problems of a "reliability" defined by homogenous agreement,…
Descriptors: Writing Evaluation, Conflict, Test Norms, Reliability
Ramineni, Chaitanya; Williamson, David M. – Assessing Writing, 2013
In this paper, we provide an overview of psychometric procedures and guidelines Educational Testing Service (ETS) uses to evaluate automated essay scoring for operational use. We briefly describe the e-rater system, the procedures and criteria used to evaluate e-rater, implications for a range of potential uses of e-rater, and directions for…
Descriptors: Educational Testing, Guidelines, Scoring, Psychometrics
Slomp, David H. – Assessing Writing, 2012
This article discusses three sets of challenges involved in the assessment of writing from a developmental perspective. These challenges include defining a workable theory of development, developing a suitable construct, and overcoming limitations in technocentric approaches to writing assessment. In North America in recent years, a burgeoning…
Descriptors: Writing (Composition), Writing Evaluation, Writing Tests, Writing Ability
Wardle, Elizabeth; Roozen, Kevin – Assessing Writing, 2012
This article offers one potential response to Yancey's (1999) call for a fourth wave of writing assessment able to capture writing development in all of its complexity. Based on an ecological perspective of literate development that situates students' growth as writers across multiple engagements with writing, including those outside of school,…
Descriptors: Writing Evaluation, Writing Tests, Ecology, Writing Instruction
Johnson, David; VanBrackle, Lewis – Assessing Writing, 2012
Raters of Georgia's (USA) state-mandated college-level writing exam, which is intended to ensure a minimal university-level writing competency, are trained to grade holistically when assessing these exams. A guiding principle in holistic grading is to not focus exclusively on any one aspect of writing but rather to give equal weight to style,…
Descriptors: Writing Evaluation, Linguistics, Writing Tests, English (Second Language)
Deane, Paul – Assessing Writing, 2013
This paper examines the construct measured by automated essay scoring (AES) systems. AES systems measure features of the text structure, linguistic structure, and conventional print form of essays; as such, the systems primarily measure text production skills. In the current state-of-the-art, AES provide little direct evidence about such matters…
Descriptors: Scoring, Essays, Text Structure, Writing (Composition)
Ramineni, Chaitanya – Assessing Writing, 2013
In this paper, I describe the design and evaluation of automated essay scoring (AES) models for an institution's writing placement program. Information was gathered on admitted student writing performance at a science and technology research university in the northeastern United States. Under timed conditions, first-year students (N = 879) were…
Descriptors: Validity, Comparative Analysis, Internet, Student Placement
Knoch, Ute – Assessing Writing, 2011
Rating scales act as the de facto test construct in a writing assessment, although inevitably as a simplification of the construct (North, 2003). However, it is often not reported how rating scales are constructed. Unless the underlying framework of a rating scale takes some account of linguistic theory and research in the definition of…
Descriptors: Writing Evaluation, Writing Tests, Rating Scales, Linguistic Theory
Klobucar, Andrew; Elliot, Norbert; Deess, Perry; Rudniy, Oleksandr; Joshi, Kamal – Assessing Writing, 2013
This study investigated the use of automated essay scoring (AES) to identify at-risk students enrolled in a first-year university writing course. An application of AES, the "Criterion"[R] Online Writing Evaluation Service was evaluated through a methodology focusing on construct modelling, response processes, disaggregation, extrapolation,…
Descriptors: Writing Evaluation, Scoring, Writing Instruction, Essays
Condon, William – Assessing Writing, 2013
Automated Essay Scoring (AES) has garnered a great deal of attention from the rhetoric and composition/writing studies community since the Educational Testing Service began using e-rater[R] and the "Criterion"[R] Online Writing Evaluation Service as products in scoring writing tests, and most of the responses have been negative. While the…
Descriptors: Measurement, Psychometrics, Evaluation Methods, Educational Testing
Weigle, Sara Cushing – Assessing Writing, 2013
This article presents considerations for using automated scoring systems to evaluate second language writing. A distinction is made between English language learners in English-medium educational systems and those studying English in their own countries for a variety of purposes, and between learning-to-write and writing-to-learn in a second…
Descriptors: Scoring, Second Language Learning, Second Languages, English Language Learners
DiPardo, Anne; Storms, Barbara A.; Selland, Makenzie – Assessing Writing, 2011
This paper describes the process by which a rubric development team affiliated with the National Writing Project negotiated difficulties and dilemmas concerning an analytic scoring category initially termed Voice and later renamed Stance. Although these labels reference an aspect of student writing that many teachers value, the challenge of…
Descriptors: Student Evaluation, Scoring Rubrics, Scoring, Educational Assessment