Publication Date
In 2025: 1
Since 2024: 2
Since 2021 (last 5 years): 2
Since 2016 (last 10 years): 2
Since 2006 (last 20 years): 43
Education Level
Elementary Secondary Education: 49
Secondary Education: 17
Elementary Education: 11
Grade 8: 9
High Schools: 6
Higher Education: 6
Middle Schools: 6
Grade 6: 5
Postsecondary Education: 5
Grade 5: 4
Grade 12: 3
Audience
Teachers: 11
Administrators: 2
Policymakers: 1
Researchers: 1
Location
Canada: 6
United States: 3
New Zealand: 2
Utah: 2
Alabama: 1
California: 1
China: 1
Delaware: 1
Georgia: 1
Hong Kong: 1
Kentucky: 1
Laws, Policies, & Programs
No Child Left Behind Act 2001: 1
Assessments and Surveys
National Assessment of…: 3
Delaware Student Testing…: 1
SAT (College Admission Test): 1
Yue Huang; Joshua Wilson – Journal of Computer Assisted Learning, 2025
Background: Automated writing evaluation (AWE) systems, used as formative assessment tools in writing classrooms, are promising for enhancing instruction and improving student performance. Although meta-analytic evidence supports AWE's effectiveness in various contexts, research on its effectiveness in the U.S. K-12 setting has lagged behind its…
Descriptors: Writing Evaluation, Writing Skills, Writing Tests, Writing Instruction
Yi Gui – ProQuest LLC, 2024
This study explores using transfer learning in machine learning for natural language processing (NLP) to create generic automated essay scoring (AES) models, providing instant online scoring for statewide writing assessments in K-12 education. The goal is to develop an instant online scorer that is generalizable to any prompt, addressing the…
Descriptors: Writing Tests, Natural Language Processing, Writing Evaluation, Scoring
T.H.E. Journal, 2013
The West Virginia Department of Education's auto-grading initiative dates back to 2004, a time when school districts were making their first forays into automation. The Charleston-based WVDE had instituted a statewide writing assessment in 1984 for students in fourth, seventh, and tenth grades and was looking to expand that program without having…
Descriptors: Automation, Grading, Scoring, Computer Uses in Education
Condon, William – Assessing Writing, 2013
Automated Essay Scoring (AES) has garnered a great deal of attention from the rhetoric and composition/writing studies community since the Educational Testing Service began using e-rater® and the Criterion® Online Writing Evaluation Service as products in scoring writing tests, and most of the responses have been negative. While the…
Descriptors: Measurement, Psychometrics, Evaluation Methods, Educational Testing
Olinghouse, Natalie G.; Zheng, Jinjie; Morlock, Larissa – Reading & Writing Quarterly, 2012
This study evaluated large-scale state writing assessments for the inclusion of motivational characteristics in the writing task and written prompt. We identified 6 motivational variables from the authentic activity literature: time allocation, audience specification, audience intimacy, definition of task, allowance for multiple perspectives, and…
Descriptors: Writing Evaluation, Writing Tests, Writing Achievement, Audiences
Huang, Jinyan – Assessing Writing, 2012
Using generalizability (G-) theory, this study examined the accuracy and validity of the writing scores assigned to secondary school ESL students in the provincial English examinations in Canada. The major research question that guided this study was: Are there any differences between the accuracy and construct validity of the analytic scores…
Descriptors: Foreign Countries, Generalizability Theory, Writing Evaluation, Writing Tests
Gallagher, Chris W. – College Composition and Communication, 2011
I use Burkean analysis to show how neoliberalism undermines faculty assessment expertise and underwrites testing industry expertise in the current assessment scene. Contending that we cannot extricate ourselves from our limited agency in this scene until we abandon the familiar "stakeholder" theory of power, I propose a rewriting of the…
Descriptors: Writing Evaluation, Writing Tests, College Faculty, Political Attitudes
Liao, Chen-Huei; Kuo, Bor-Chen; Pai, Kai-Chih – Turkish Online Journal of Educational Technology - TOJET, 2012
Automated scoring by means of Latent Semantic Analysis (LSA) has recently been introduced to improve on traditional human scoring. The purposes of the present study were to develop an LSA-based assessment system to evaluate children's Chinese sentence construction skills and to examine the effectiveness of the LSA-based automated scoring function…
Descriptors: Foreign Countries, Program Effectiveness, Scoring, Personality
McCurry, Doug – Assessing Writing, 2010
This article considers the claim that machine scoring of writing test responses agrees with human readers as much as humans agree with other humans. These claims about the reliability of machine scoring of writing are usually based on specific and constrained writing tasks, and there is reason for asking whether machine scoring of writing requires…
Descriptors: Writing Tests, Scoring, Interrater Reliability, Computer Assisted Testing
Deane, Paul – Educational Testing Service, 2011
This paper presents a socio-cognitive framework for connecting writing pedagogy and writing assessment with modern social and cognitive theories of writing. It focuses on providing a general framework that highlights the connections between writing competency and other literacy skills; identifies key connections between literacy instruction,…
Descriptors: Writing (Composition), Writing Evaluation, Writing Tests, Cognitive Ability
Parr, Judy M.; Timperley, Helen S. – Assessing Writing, 2010
Traditionally, feedback on writing is written on drafts or given orally in roving or more formal conferences, and it is considered a significant part of instruction. This paper locates written response within an assessment for learning framework in the writing classroom. Within this framework, quality of response was defined in terms of providing…
Descriptors: Feedback (Response), Pedagogical Content Knowledge, Writing Evaluation, Writing Instruction
Peterson, Shelley Stagg; McClay, Jill – Assessing Writing, 2010
This paper reports on the feedback and assessment practices of Canadian grades 4-8 teachers; the data are drawn from a national study of the teaching of writing at the middle grades in all ten Canadian provinces and two (of three) territories. Respondents were 216 grades 4-8 teachers from rural and urban schools. Data sources were audio-recorded…
Descriptors: Feedback (Response), Urban Schools, Writing Instruction, Elementary School Teachers
Xu, Yun; Wu, Zunmin – Assessing Writing, 2012
This paper reports on a qualitative research study of the test-taking strategies employed in completing two picture-prompt writing tasks, Situational Writing and Interpretational Writing, in the Beijing Matriculation English Test. Think-aloud and retrospective interview protocols were collected from twelve Chinese students representing two key…
Descriptors: Foreign Countries, High School Students, Secondary School Teachers, Test Wiseness
National Assessment Governing Board, 2010
The purpose of the 2011 NAEP (National Assessment of Educational Progress) Writing Framework is to describe how the new NAEP Writing Assessment is designed to measure students' writing at grades 4, 8, and 12. As the ongoing national indicator of the academic achievement of students in the United States, NAEP regularly collects information on…
Descriptors: Writing Achievement, Writing Skills, Writing Evaluation, National Competency Tests
Dunn, David E. – ProQuest LLC, 2011
Many national reports indicate that more attention needs to be paid to writing and the teaching of writing in schools. The purpose of this quantitative study was, first, to examine the structure of the DWA and, second, to use DWA scores to examine the relationship between ELL status and writing proficiency. Five major research…
Descriptors: Ethnicity, Writing Evaluation, Socioeconomic Status, Income