Showing all 6 results
Deane, Paul – Educational Testing Service, 2022
Writing is a critical 21st-century skill. Today's knowledge economy places a premium on collaboration and written communication, which means that the skilled writer enters the job market at a significant advantage (Aschliman, 2016; Brandt, 2005). And yet students typically enter the job market with weak writing skills. Only 27% of 12th-grade…
Descriptors: Writing Evaluation, Writing Instruction, Instructional Improvement, Automation
Deane, Paul; Quinlan, Thomas; Kostin, Irene – Educational Testing Service, 2011
ETS has recently instituted the Cognitively Based Assessment of, for, and as Learning (CBAL) research initiative to create a new generation of assessments designed from the ground up to enhance learning. It is intended as a general approach, covering multiple subject areas including reading, writing, and math. This paper is concerned with the…
Descriptors: Automation, Scoring, Educational Assessment, Writing Tests
Attali, Yigal – Educational Testing Service, 2011
The e-rater® automated essay scoring system is used operationally in the scoring of TOEFL iBT® independent essays. Previous research has found support for a 3-factor structure of the e-rater features. This 3-factor structure has an attractive hierarchical linguistic interpretation with a word choice factor, a grammatical convention within a…
Descriptors: Essay Tests, Language Tests, Test Scoring Machines, Automation
Sheehan, Kathleen M.; Kostin, Irene; Futagi, Yoko; Flor, Michael – Educational Testing Service, 2010
The Common Core Standards call for students to be exposed to a much greater level of text complexity than has been the norm in schools for the past 40 years. Textbook publishers, teachers, and assessment developers are being asked to refocus materials and methods to ensure that students are challenged to read texts at steadily increasing…
Descriptors: Automation, Content Analysis, Difficulty Level, Readability Formulas
Attali, Yigal – Educational Testing Service, 2011
This paper proposes an alternative content measure for essay scoring, based on the "difference" in the relative frequency of a word in high-scored versus low-scored essays. The "differential word use" (DWU) measure is the average of these differences across all words in the essay. A positive value indicates the essay is using…
Descriptors: Scoring, Essay Tests, Word Frequency, Content Analysis
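The abstract above describes the differential word use (DWU) measure only at a high level. The following Python sketch illustrates one plausible reading of that description: the difference in a word's relative frequency between high-scored and low-scored training essays, averaged over the words of the essay being scored. The tokenization, corpus format, and function names are illustrative assumptions, not the paper's implementation.

```python
# A minimal sketch of a DWU-style measure, assuming whitespace tokenization
# and toy high-/low-scored essay sets; Attali (2011) may weight and
# normalize differently.
from collections import Counter

def relative_frequencies(essays):
    """Relative frequency of each word across a set of essays."""
    counts = Counter(word for essay in essays for word in essay.lower().split())
    total = sum(counts.values())
    return {word: count / total for word, count in counts.items()}

def dwu_score(essay, high_scored_essays, low_scored_essays):
    """Average, over the words of one essay, of the difference between each
    word's relative frequency in high-scored versus low-scored essays."""
    high = relative_frequencies(high_scored_essays)
    low = relative_frequencies(low_scored_essays)
    words = essay.lower().split()
    if not words:
        return 0.0
    diffs = [high.get(w, 0.0) - low.get(w, 0.0) for w in words]
    return sum(diffs) / len(diffs)

# A positive value suggests vocabulary more typical of high-scored essays,
# consistent with the interpretation given in the abstract.
```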
Quinlan, Thomas; Higgins, Derrick; Wolff, Susanne – Educational Testing Service, 2009
This report evaluates the construct coverage of the e-rater® scoring engine. The matter of construct coverage depends on whether one defines writing skill in terms of process or product. Originally, the e-rater engine consisted of a large set of components with a proven ability to predict human holistic scores. By organizing these capabilities…
Descriptors: Guides, Writing Skills, Factor Analysis, Writing Tests