Publication Date
In 2025: 0
Since 2024: 1
Since 2021 (last 5 years): 1
Since 2016 (last 10 years): 3
Since 2006 (last 20 years): 14
Descriptor
Computer Assisted Testing: 14
Essays: 14
Scoring: 9
Essay Tests: 7
Writing Evaluation: 7
Writing Tests: 6
Evaluation Methods: 5
Standardized Tests: 5
Student Evaluation: 5
Automation: 4
Comparative Analysis: 4
Author
Darling-Hammond, Linda: 2
Alexander, R. Curby: 1
Burke, Jennifer N.: 1
Cizek, Gregory J.: 1
Condon, William: 1
Deane, Paul: 1
Deng, Hui: 1
DiVesta, Francis J.: 1
Dikli, Semire: 1
Enriquez Carrasco, Emilia: 1
Ferster, Bill: 1
Publication Type
Journal Articles: 6
Reports - Research: 5
Reports - Descriptive: 4
Reports - Evaluative: 3
Books: 1
Collected Works - General: 1
Dissertations/Theses -…: 1
Guides - Non-Classroom: 1
Speeches/Meeting Papers: 1
Education Level
Elementary Secondary Education: 14
Higher Education: 8
Postsecondary Education: 7
High Schools: 5
Secondary Education: 5
Elementary Education: 2
Middle Schools: 2
Early Childhood Education: 1
Grade 10: 1
Grade 11: 1
Grade 6: 1
Audience
Administrators: 1
Teachers: 1
Location
Australia: 2
Connecticut: 2
New Hampshire: 2
New York: 2
Rhode Island: 2
United Kingdom (England): 2
Vermont: 2
Singapore: 1
Spain: 1
West Virginia: 1
Laws, Policies, & Programs
Every Student Succeeds Act…: 2
No Child Left Behind Act 2001: 1
Assessments and Surveys
National Assessment of…: 3
New York State Regents…: 2
Graduate Record Examinations: 1
SAT (College Admission Test): 1
Test of English as a Foreign…: 1
What Works Clearinghouse Rating
Does not meet standards: 1
Yi Gui – ProQuest LLC, 2024
This study explores using transfer learning in machine learning for natural language processing (NLP) to create generic automated essay scoring (AES) models, providing instant online scoring for statewide writing assessments in K-12 education. The goal is to develop an instant online scorer that is generalizable to any prompt, addressing the…
Descriptors: Writing Tests, Natural Language Processing, Writing Evaluation, Scoring
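A minimal sketch of the transfer-learning idea described above, assuming a pretrained transformer fine-tuned with a single regression head so one model can score essays regardless of prompt; the backbone, tokenizer settings, and scoring function are illustrative assumptions, not the study's implementation.
```python
# Illustrative sketch only: fine-tune a pretrained transformer as a generic essay scorer.
# The backbone, max length, and regression head are assumptions, not the study's setup.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # assumed backbone for illustration

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME,
    num_labels=1,                 # one continuous output = holistic essay score
    problem_type="regression",    # transfer learning: pretrained weights + new regression head
)
model.eval()

def score_essay(text: str) -> float:
    """Predict a holistic score for a single essay (meaningful only after fine-tuning)."""
    inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits.squeeze().item()

print(score_essay("Sample essay text for a statewide writing prompt."))
```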
Hixson, Nate; Rhudy, Vaughn – West Virginia Department of Education, 2012
To provide an opportunity for teachers to better understand the automated scoring process used by the state of West Virginia on our annual West Virginia Educational Standards Test 2 (WESTEST 2) Online Writing Assessment, the West Virginia Department of Education (WVDE) Office of Assessment and Accountability and the Office of Research conduct an…
Descriptors: Writing Tests, Computer Assisted Testing, Automation, Scoring
Deane, Paul – Assessing Writing, 2013
This paper examines the construct measured by automated essay scoring (AES) systems. AES systems measure features of the text structure, linguistic structure, and conventional print form of essays; as such, the systems primarily measure text production skills. In the current state of the art, AES systems provide little direct evidence about such matters…
Descriptors: Scoring, Essays, Text Structure, Writing (Composition)
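As a rough, hedged illustration of the text-production features Deane describes AES systems measuring (text structure, linguistic structure, and print conventions), a surface-feature extractor might look like the sketch below; the specific features are assumptions chosen for illustration, not any engine's actual feature set.
```python
# Illustrative surface-feature extractor; the features are assumptions, not e-rater's actual set.
import re

def surface_features(essay: str) -> dict:
    """Compute simple text-production features from a raw essay string."""
    words = re.findall(r"[A-Za-z']+", essay)
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    return {
        "word_count": len(words),
        "sentence_count": len(sentences),
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        "avg_word_length": sum(len(w) for w in words) / max(len(words), 1),
        "type_token_ratio": len({w.lower() for w in words}) / max(len(words), 1),
    }

print(surface_features("This is a short essay. It has exactly two sentences."))
```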
Darling-Hammond, Linda – Learning Policy Institute, 2017
After passage of the Every Student Succeeds Act (ESSA) in 2015, states assumed greater responsibility for designing their own accountability and assessment systems. ESSA requires states to measure "higher order thinking skills and understanding" and encourages the use of open-ended performance assessments, which are essential for…
Descriptors: Performance Based Assessment, Accountability, Portfolios (Background Materials), Task Analysis
Condon, William – Assessing Writing, 2013
Automated Essay Scoring (AES) has garnered a great deal of attention from the rhetoric and composition/writing studies community since the Educational Testing Service began using e-rater[R] and the "Criterion"[R] Online Writing Evaluation Service as products in scoring writing tests, and most of the responses have been negative. While the…
Descriptors: Measurement, Psychometrics, Evaluation Methods, Educational Testing
Darling-Hammond, Linda – Council of Chief State School Officers, 2017
The Every Student Succeeds Act (ESSA) opened up new possibilities for how student and school success are defined and supported in American public education. States have greater responsibility for designing and building their assessment and accountability systems. These new opportunities to develop performance assessments are critically important…
Descriptors: Performance Based Assessment, Accountability, Portfolios (Background Materials), Task Analysis
Kite, Mary E., Ed. – Society for the Teaching of Psychology, 2012
This book compiles several essays about effective evaluation of teaching. Contents of this publication include: (1) Conducting Research on Student Evaluations of Teaching (William E. Addison and Jeffrey R. Stowell); (2) Choosing an Instrument for Student Evaluation of Instruction (Jared W. Keeley); (3) Formative Teaching Evaluations: Is Student…
Descriptors: Feedback (Response), Student Evaluation of Teacher Performance, Online Courses, Teacher Effectiveness
Ferster, Bill; Hammond, Thomas C.; Alexander, R. Curby; Lyman, Hunt – Journal of Interactive Learning Research, 2012
The hurried pace of the modern classroom does not permit formative feedback on writing assignments at the frequency or quality recommended by the research literature. One solution for increasing individual feedback to students is to incorporate some form of computer-generated assessment. This study explores the use of automated assessment of…
Descriptors: Feedback (Response), Scripts, Formative Evaluation, Essays
Garcia Laborda, Jesus; Magal Royo, Teresa; Enriquez Carrasco, Emilia – Online Submission, 2010
This paper presents the results of writing processing among 260 high school senior students, their degree of satisfaction using the new trial version of the Computer Based University Entrance Examination in Spain and their degree of motivation towards written online test tasks. Currently, this is one of the closing studies to verify whether…
Descriptors: Foreign Countries, Curriculum Development, High Stakes Tests, Student Motivation
Quinlan, Thomas; Higgins, Derrick; Wolff, Susanne – Educational Testing Service, 2009
This report evaluates the construct coverage of the e-rater[R] scoring engine. The matter of construct coverage depends on whether one defines writing skill in terms of process or product. Originally, the e-rater engine consisted of a large set of components with a proven ability to predict human holistic scores. By organizing these capabilities…
Descriptors: Guides, Writing Skills, Factor Analysis, Writing Tests
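Schematically, the report's description of the engine (a set of automatically computed components combined to predict human holistic scores) amounts to a weighted linear model. The sketch below shows only that general shape, with hypothetical feature names and synthetic numbers; it is not the e-rater model.
```python
# Synthetic illustration of combining component scores into a predicted holistic score.
import numpy as np

# Rows = essays; columns = hypothetical component scores (e.g., organization, development, conventions).
X = np.array([
    [3.0, 2.5, 4.0],
    [4.5, 4.0, 3.5],
    [2.0, 1.5, 2.5],
    [5.0, 4.5, 5.0],
])
human_scores = np.array([3.0, 4.0, 2.0, 5.0])  # synthetic rater-assigned holistic scores

# Fit least-squares weights with an intercept column, then predict.
X1 = np.hstack([X, np.ones((X.shape[0], 1))])
weights, *_ = np.linalg.lstsq(X1, human_scores, rcond=None)
print("weights:", np.round(weights, 3))
print("predicted scores:", np.round(X1 @ weights, 2))
```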
Shermis, Mark D.; DiVesta, Francis J. – Rowman & Littlefield Publishers, Inc., 2011
"Classroom Assessment in Action" clarifies the multi-faceted roles of measurement and assessment and their applications in a classroom setting. Comprehensive in scope, Shermis and Di Vesta explain basic measurement concepts and show students how to interpret the results of standardized tests. From these basic concepts, the authors then…
Descriptors: Student Evaluation, Standardized Tests, Scores, Measurement
Kobrin, Jennifer L.; Deng, Hui; Shaw, Emily J. – Journal of Applied Testing Technology, 2007
This study was designed to address two frequent criticisms of the SAT essay--that essay length is the best predictor of scores, and that there is an advantage in using more "sophisticated" examples as opposed to personal experience. The study was based on 2,820 essays from the first three administrations of the new SAT. Each essay was…
Descriptors: Testing Programs, Computer Assisted Testing, Construct Validity, Writing Skills
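The essay-length criticism the study addresses reduces, in its simplest form, to the correlation between word count and assigned score. The sketch below shows that check on synthetic numbers; it is not the study's analysis and implies nothing about its findings.
```python
# Synthetic check of how strongly essay length correlates with holistic score.
import numpy as np

word_counts = np.array([120, 210, 305, 150, 400, 260, 180, 340])  # synthetic essay lengths
scores = np.array([2, 3, 4, 2, 5, 4, 3, 5])                       # synthetic holistic scores

r = np.corrcoef(word_counts, scores)[0, 1]
print(f"Pearson correlation between length and score: r = {r:.2f}")
```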
Dikli, Semire – Journal of Technology, Learning, and Assessment, 2006
Automated Essay Scoring (AES) is defined as the computer technology that evaluates and scores the written prose (Shermis & Barrera, 2002; Shermis & Burstein, 2003; Shermis, Raymat, & Barrera, 2003). AES systems are mainly used to overcome time, cost, reliability, and generalizability issues in writing assessment (Bereiter, 2003; Burstein,…
Descriptors: Scoring, Writing Evaluation, Writing Tests, Standardized Tests
Burke, Jennifer N.; Cizek, Gregory J. – Assessing Writing, 2006
This study was conducted to gather evidence regarding effects of the mode of writing (handwritten vs. word-processed) on compositional quality in a sample of sixth grade students. Questionnaire data and essay scores were gathered to examine the effect of composition mode on essay scores of students of differing computer skill levels. The study was…
Descriptors: Computer Assisted Testing, High Stakes Tests, Writing Processes, Grade 6
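At its core, the comparison described above is a test of whether mean essay scores differ between a handwritten and a word-processed group. The sketch below shows that basic analysis shape with an independent-samples t-test on synthetic scores; the actual study also conditioned on students' computer skill level, which this omits.
```python
# Synthetic two-group comparison of essay scores by composition mode.
from scipy import stats

handwritten = [3, 2, 4, 3, 3, 2, 4, 3]     # synthetic holistic scores
word_processed = [4, 3, 4, 5, 3, 4, 4, 5]  # synthetic holistic scores

result = stats.ttest_ind(word_processed, handwritten)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```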