Publication Date
  In 2025: 1
  Since 2024: 2
  Since 2021 (last 5 years): 2
  Since 2016 (last 10 years): 2
  Since 2006 (last 20 years): 21

Descriptor
  Computer Assisted Testing: 21
  Writing Evaluation: 21
  Scoring: 12
  Writing Tests: 11
  Essay Tests: 9
  Student Evaluation: 8
  Computer Software: 7
  Educational Technology: 7
  Educational Testing: 7
  Essays: 7
  Educational Assessment: 6

Author
  Alexander, R. Curby: 1
  Allen, Nancy: 1
  Attali, Yigal: 1
  Bennett, Randy Elliott: 1
  Burke, Jennifer N.: 1
  Cizek, Gregory J.: 1
  Condon, William: 1
  Deane, Paul: 1
  Deng, Hui: 1
  Dikli, Semire: 1
  Dooley, Scott: 1

Education Level
  Elementary Secondary Education: 21
  Higher Education: 6
  Postsecondary Education: 6
  Secondary Education: 6
  Elementary Education: 5
  High Schools: 4
  Grade 8: 3
  Middle Schools: 3
  Grade 12: 2
  Grade 4: 2
  Grade 6: 2

Audience
  Administrators: 1
  Teachers: 1

Location
  China: 1
  Taiwan: 1
  United Kingdom: 1
  United States: 1
  Utah: 1

Assessments and Surveys
  National Assessment of…: 3
  SAT (College Admission Test): 1

What Works Clearinghouse Rating
  Does not meet standards: 1
Yue Huang; Joshua Wilson – Journal of Computer Assisted Learning, 2025
Background: Automated writing evaluation (AWE) systems, used as formative assessment tools in writing classrooms, are promising for enhancing instruction and improving student performance. Although meta-analytic evidence supports AWE's effectiveness in various contexts, research on its effectiveness in the U.S. K-12 setting has lagged behind its…
Descriptors: Writing Evaluation, Writing Skills, Writing Tests, Writing Instruction
Yi Gui – ProQuest LLC, 2024
This study explores using transfer learning in machine learning for natural language processing (NLP) to create generic automated essay scoring (AES) models, providing instant online scoring for statewide writing assessments in K-12 education. The goal is to develop an instant online scorer that is generalizable to any prompt, addressing the…
Descriptors: Writing Tests, Natural Language Processing, Writing Evaluation, Scoring
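The entry above describes building prompt-generic AES models via transfer learning. As a rough illustration of the general approach (not the study's actual model), the sketch below fine-tunes a pretrained transformer with a single regression output to predict essay scores; the model name, score scale, and sample data are all assumptions.

```python
# Hedged sketch of transfer-learning AES: fine-tune a pretrained transformer
# to regress essay scores. Model choice, scale, and data are illustrative.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=1,                # one continuous output = predicted score
    problem_type="regression",   # MSE loss on the single logit
)

essays = ["First sample essay text ...", "Second sample essay text ..."]
scores = torch.tensor([3.0, 5.0])  # hypothetical human scores on a 1-6 scale

batch = tokenizer(essays, truncation=True, padding=True, return_tensors="pt")
outputs = model(**batch, labels=scores)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs.loss.backward()   # one fine-tuning step; a real run would loop
optimizer.step()
```

Because the pretrained encoder already carries general language knowledge, comparatively little labeled data is needed per prompt, which is what makes a prompt-generic scorer plausible.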
Massey, Chris L.; Gambrell, Linda B. – Literacy Research and Instruction, 2014
Literacy educators and researchers have long recognized the importance of increasing students' writing proficiency across age and grade levels. With the release of the Common Core State Standards (CCSS), a new and greater emphasis is being placed on writing in the K-12 curriculum. Educators, as well as the authors of the CCSS, agree that…
Descriptors: Writing Evaluation, State Standards, Instructional Effectiveness, Writing Ability
Ramineni, Chaitanya; Williamson, David M. – Assessing Writing, 2013
In this paper, we provide an overview of psychometric procedures and guidelines Educational Testing Service (ETS) uses to evaluate automated essay scoring for operational use. We briefly describe the e-rater system, the procedures and criteria used to evaluate e-rater, implications for a range of potential uses of e-rater, and directions for…
Descriptors: Educational Testing, Guidelines, Scoring, Psychometrics
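Evaluations of the kind this paper describes typically compare automated scores with human ratings using statistics such as quadratic weighted kappa, Pearson correlation, and exact/adjacent agreement. A minimal sketch follows, with made-up scores and an illustrative 0.70 threshold (a frequently cited rule of thumb, not ETS's official criteria):

```python
# Agreement statistics commonly used to evaluate automated essay scoring
# against human raters. Scores and the 0.70 threshold are assumptions.
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

human   = np.array([4, 3, 5, 2, 4, 3, 5, 4])  # hypothetical human ratings (1-6)
machine = np.array([4, 3, 4, 2, 5, 3, 5, 4])  # hypothetical automated scores

qwk = cohen_kappa_score(human, machine, weights="quadratic")
r, _ = pearsonr(human, machine)
exact = np.mean(human == machine)
adjacent = np.mean(np.abs(human - machine) <= 1)

print(f"QWK={qwk:.3f} r={r:.3f} exact={exact:.2%} adjacent={adjacent:.2%}")
if qwk < 0.70:  # illustrative cutoff only
    print("Agreement below the illustrative 0.70 threshold")
```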
Deane, Paul – Assessing Writing, 2013
This paper examines the construct measured by automated essay scoring (AES) systems. AES systems measure features of the text structure, linguistic structure, and conventional print form of essays; as such, the systems primarily measure text production skills. In the current state of the art, AES systems provide little direct evidence about such matters…
Descriptors: Scoring, Essays, Text Structure, Writing (Composition)
Condon, William – Assessing Writing, 2013
Automated Essay Scoring (AES) has garnered a great deal of attention from the rhetoric and composition/writing studies community since the Educational Testing Service began using e-rater® and the Criterion® Online Writing Evaluation Service as products in scoring writing tests, and most of the responses have been negative. While the…
Descriptors: Measurement, Psychometrics, Evaluation Methods, Educational Testing
Edwards, Virginia B., Ed. – Education Week, 2014
Figuring out how to use digital tools to transform testing requires a willingness to invest in new technologies and the patience to experiment with novel approaches, a commitment to ongoing professional development and reliable technical support, and an openness to learn from mistakes. Whatever bumpy ride this technological journey takes, experts…
Descriptors: Elementary Secondary Education, Technological Advancement, Testing, Computer Assisted Testing
Liao, Chen-Huei; Kuo, Bor-Chen; Pai, Kai-Chih – Turkish Online Journal of Educational Technology - TOJET, 2012
Automated scoring by means of Latent Semantic Analysis (LSA) has recently been introduced to improve on the traditional human scoring system. The purposes of the present study were to develop an LSA-based assessment system to evaluate children's Chinese sentence construction skills and to examine the effectiveness of LSA-based automated scoring function…
Descriptors: Foreign Countries, Program Effectiveness, Scoring, Personality
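The LSA approach mentioned above generally works by projecting texts into a low-dimensional latent semantic space and scoring a response by its similarity to reference texts. A minimal sketch under assumed data (English stand-ins for the study's Chinese sentences; the corpus, dimensionality, and scoring rule are illustrative only):

```python
# Generic LSA scoring sketch: TF-IDF -> truncated SVD -> cosine similarity
# to benchmark sentences judged well-formed by humans. All data assumed.
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.pipeline import make_pipeline

references = [  # benchmark sentences already rated by human scorers
    "The cat sat quietly on the warm mat.",
    "A dog ran quickly across the green field.",
    "The children played happily in the small park.",
]
lsa = make_pipeline(TfidfVectorizer(), TruncatedSVD(n_components=2))
ref_vecs = lsa.fit_transform(references)

resp_vec = lsa.transform(["A cat rested on the mat."])
score = cosine_similarity(resp_vec, ref_vecs).max()  # closest benchmark
print(f"LSA similarity score: {score:.3f}")
```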
McCurry, Doug – Assessing Writing, 2010
This article considers the claim that machine scoring of writing test responses agrees with human readers as much as humans agree with other humans. These claims about the reliability of machine scoring of writing are usually based on specific and constrained writing tasks, and there is reason to ask whether machine scoring of writing requires…
Descriptors: Writing Tests, Scoring, Interrater Reliability, Computer Assisted Testing
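The claim McCurry interrogates is quantitative: machine-human agreement is said to match human-human agreement. A toy comparison of the two rates, with hypothetical scores, shows what such a check involves:

```python
# Compare human-human agreement with machine-human agreement on the same
# essays. All scores are hypothetical; exact agreement is used for brevity.
import numpy as np

rater_a = np.array([3, 4, 2, 5, 3, 4, 2, 4])
rater_b = np.array([3, 4, 3, 5, 3, 3, 2, 4])
machine = np.array([3, 4, 2, 4, 3, 4, 3, 4])

def exact_agreement(x, y):
    """Proportion of essays given identical scores by two raters."""
    return float(np.mean(x == y))

hh = exact_agreement(rater_a, rater_b)
mh = (exact_agreement(machine, rater_a) + exact_agreement(machine, rater_b)) / 2
print(f"human-human: {hh:.2%}  machine-human: {mh:.2%}")
```

The article's caution is that such comparisons usually come from constrained tasks, so parity observed there may not generalize to broader writing tasks.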
Ferster, Bill; Hammond, Thomas C.; Alexander, R. Curby; Lyman, Hunt – Journal of Interactive Learning Research, 2012
The hurried pace of the modern classroom does not permit formative feedback on writing assignments at the frequency or quality recommended by the research literature. One solution for increasing individual feedback to students is to incorporate some form of computer-generated assessment. This study explores the use of automated assessment of…
Descriptors: Feedback (Response), Scripts, Formative Evaluation, Essays
National Assessment Governing Board, 2010
The purpose of the 2011 NAEP (National Assessment of Educational Progress) Writing Framework is to describe how the new NAEP Writing Assessment is designed to measure students' writing at grades 4, 8, and 12. As the ongoing national indicator of the academic achievement of students in the United States, NAEP regularly collects information on…
Descriptors: Writing Achievement, Writing Skills, Writing Evaluation, National Competency Tests
Neal, Michael – Teachers College Press, 2010
Writing and the teaching of writing are changing at a rapid pace. How can educators understand writing assessment "as" and "with" technology in the 21st-century classroom? Michael Neal contends that new technologies are neither the problem nor the solution. Instead, educators need to tap into digital resources only inasmuch as they promote writing…
Descriptors: Writing Evaluation, Holistic Evaluation, Writing Tests, Educational Technology
National Center for Education Statistics, 2012
This report presents results of the 2011 National Assessment of Educational Progress (NAEP) in writing at grades 8 and 12. In this new national writing assessment sample, 24,100 eighth-graders and 28,100 twelfth-graders engaged with writing tasks and composed their responses on computer. The assessment tasks reflected writing situations common to…
Descriptors: National Competency Tests, Writing Tests, Grade 8, Grade 12
Landauer, Thomas K.; Lochbaum, Karen E.; Dooley, Scott – Theory Into Practice, 2009
Advances in assessment technologies are affording teachers and students new ways to efficiently assess and track achievement while also better promoting learning. WriteToLearn is one such technology, a Web-based tool that integrates practice and assessment in reading comprehension with writing about what is learned. Based on the principle of…
Descriptors: Feedback (Response), Reading Comprehension, Formative Evaluation, Educational Technology
Johnson, Martin; Nadas, Rita – Learning, Media and Technology, 2009
Within large-scale educational assessment agencies in the UK, there has been a shift towards assessors marking digitally scanned copies rather than the original paper scripts that were traditionally used. This project uses extended essay examination scripts to consider whether the mode in which an essay is read potentially influences the…
Descriptors: Reading Comprehension, Educational Assessment, Internet, Essay Tests