Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 0 |
Since 2016 (last 10 years) | 0 |
Since 2006 (last 20 years) | 4 |
Descriptor
Computer Assisted Testing | 5 |
Scoring | 5 |
Educational Assessment | 3 |
Writing Tests | 3 |
Educational Testing | 2 |
Elementary Secondary Education | 2 |
Essays | 2 |
Language Tests | 2 |
Psychometrics | 2 |
Academic Achievement | 1 |
Accuracy | 1 |
Source
Educational Testing Service | 5 |
Author
Bennett, Randy Elliot | 1 |
Davey, Tim | 1 |
Haberman, Shelby J. | 1 |
Herbert, Erin | 1 |
Higgins, Derrick | 1 |
Quinlan, Thomas | 1 |
Rizavi, Saba | 1 |
Way, Walter D. | 1 |
Wolff, Susanne | 1 |
Publication Type
Reports - Descriptive | 2 |
Reports - Research | 2 |
Reports - Evaluative | 1 |
Education Level
Elementary Secondary Education | 3 |
Higher Education | 2 |
Postsecondary Education | 2 |
Elementary Education | 1 |
Grade 7 | 1 |
Grade 8 | 1 |
Junior High Schools | 1 |
Middle Schools | 1 |
Location
China | 1 |
Assessments and Surveys
Test of English as a Foreign Language | 2 |
Graduate Record Examinations | 1 |
Bennett, Randy Elliot – Educational Testing Service, 2011
CBAL, an acronym for Cognitively Based Assessment of, for, and as Learning, is a research initiative intended to create a model for an innovative K-12 assessment system that provides summative information for policy makers, as well as formative information for classroom instructional purposes. This paper summarizes empirical results from 16 CBAL…
Descriptors: Educational Assessment, Elementary Secondary Education, Summative Evaluation, Formative Evaluation
Haberman, Shelby J. – Educational Testing Service, 2011
Alternative approaches are discussed for use of e-rater[R] to score the TOEFL iBT[R] Writing test. These approaches involve alternative criteria. In the first approach, the predicted variable is the expected rater score of the examinee's 2 essays. In the second approach, the predicted variable is the expected rater score of 2 essay responses by the…
Descriptors: Writing Tests, Scoring, Essays, Language Tests
Quinlan, Thomas; Higgins, Derrick; Wolff, Susanne – Educational Testing Service, 2009
This report evaluates the construct coverage of the e-rater[R] scoring engine. The matter of construct coverage depends on whether one defines writing skill in terms of process or product. Originally, the e-rater engine consisted of a large set of components with a proven ability to predict human holistic scores. By organizing these capabilities…
Descriptors: Guides, Writing Skills, Factor Analysis, Writing Tests
Rizavi, Saba; Way, Walter D.; Davey, Tim; Herbert, Erin – Educational Testing Service, 2004
Item parameter estimates vary for many reasons, including estimation error, characteristics of the examinee samples, and context effects (e.g., item location effects, section location effects). Although we expect variation based on theory, there is reason to believe that observed variation in item parameter estimates exceeds what…
Descriptors: Adaptive Testing, Test Items, Computation, Context Effect
Educational Testing Service, 2006
Innovations, ETS's corporate magazine, provides information on educational assessment for educators, school leaders, researchers and policymakers around the world. Each issue of Innovations focuses on a particular theme in assessment. This issue reports on how new technologies in classrooms around the world are enhancing teaching, learning and…
Descriptors: Foreign Countries, Educational Assessment, Writing Evaluation, Periodicals