Publication Date
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 2 |
| Since 2017 (last 10 years) | 5 |
| Since 2007 (last 20 years) | 7 |
Descriptor
| Computer Assisted Testing | 12 |
| Writing (Composition) | 12 |
| Writing Tests | 12 |
| Scores | 6 |
| Writing Evaluation | 6 |
| College Students | 4 |
| Essays | 4 |
| Scoring | 4 |
| Automation | 3 |
| Comparative Analysis | 3 |
| Elementary School Students | 3 |
Author
| Bonett, John | 1 |
| Cannon, Joanna E. | 1 |
| Chen, Dandan | 1 |
| Condon, William | 1 |
| Davey, Tim | 1 |
| Hebert, Michael | 1 |
| James, Cindy L. | 1 |
| Kenworthy, Roger | 1 |
| Larkin, Kevin C. | 1 |
| Livingston, Samuel A. | 1 |
| Mazzeo, John | 1 |
Publication Type
| Journal Articles | 9 |
| Reports - Research | 7 |
| Reports - Evaluative | 3 |
| Tests/Questionnaires | 3 |
| Books | 1 |
| Dissertations/Theses -… | 1 |
| Guides - Classroom - Teacher | 1 |
Education Level
| Higher Education | 6 |
| Postsecondary Education | 6 |
| Elementary Education | 3 |
| Secondary Education | 3 |
| Elementary Secondary Education | 2 |
| High Schools | 1 |
Audience
| Administrators | 1 |
| Teachers | 1 |
Assessments and Surveys
| College Level Examination… | 1 |
| Praxis Series | 1 |
Mercer, Sterett H.; Cannon, Joanna E. – Grantee Submission, 2022
We evaluated the validity of an automated approach to learning progress assessment (aLPA) for English written expression. Participants (n = 105) were students in Grades 2-12 who had parent-identified learning difficulties and received academic tutoring through a community-based organization. Participants completed narrative writing samples in the…
Descriptors: Elementary School Students, Secondary School Students, Learning Problems, Learning Disabilities
Wilson, Joshua; Chen, Dandan; Sandbank, Micheal P.; Hebert, Michael – Journal of Educational Psychology, 2019
The present study examined issues pertaining to the reliability of writing assessment in the elementary grades among samples of struggling and nonstruggling writers. It also extended nascent research on the reliability and practical applications of automated essay scoring (AES) systems in Response to Intervention frameworks…
Descriptors: Computer Assisted Testing, Automation, Scores, Writing Tests
Huang, Yue – ProQuest LLC, 2023
Automated writing evaluation (AWE) is a cutting-edge, technology-based intervention designed to help teachers meet the challenges of writing classrooms and improve students' writing proficiency. The rapid development of AWE systems, along with the encouragement of technology use in the U.S. K-12 education system by the Common Core State Standards…
Descriptors: Computer Assisted Testing, Writing Tests, Automation, Writing Evaluation
Owada, Kazuharu – Journal of Pan-Pacific Association of Applied Linguistics, 2017
There are some English verbs that can be used both intransitively and transitively. Verbs such as "break," "close," and "melt" can appear in intransitive active, transitive active, and passive constructions. Although native English speakers know in what kind of context a target verb is used in a certain construction,…
Descriptors: Foreign Countries, Undergraduate Students, Second Language Learning, English (Second Language)
Pham, Duc Huu – International Journal of Virtual and Personal Learning Environments, 2019
To help EFL learners use nominals and clauses when practicing the productive skills of academic writing in English writing tests, experiments were conducted using tasks similar to those of the internet-based test of English as a foreign language to determine nominal- and clause-level information during sentence and paragraph…
Descriptors: English (Second Language), Writing Skills, Second Language Learning, Second Language Instruction
Condon, William – Assessing Writing, 2013
Automated Essay Scoring (AES) has garnered a great deal of attention from the rhetoric and composition/writing studies community since the Educational Testing Service began using e-rater® and the Criterion® Online Writing Evaluation Service as products in scoring writing tests, and most of the responses have been negative. While the…
Descriptors: Measurement, Psychometrics, Evaluation Methods, Educational Testing
Neal, Michael – Teachers College Press, 2010
Writing and the teaching of writing are changing at a rapid pace. How can educators understand writing assessment "as" and "with" technology in the 21st-century classroom? Michael Neal contends that new technologies are neither the problem nor the solution. Instead, educators need to tap into digital resources only insofar as they promote writing…
Descriptors: Writing Evaluation, Holistic Evaluation, Writing Tests, Educational Technology
Davey, Tim; And Others – Journal of Educational Measurement, 1997
The development and scoring of a recently introduced computer-based writing skills test is described. The test asks the examinee to edit a writing passage presented on a computer screen. Scoring difficulties are addressed through the combined use of option weighting and the sequential probability ratio test. (SLD)
Descriptors: Computer Assisted Testing, Educational Innovation, Probability, Scoring
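The sequential probability ratio test named in the Davey et al. abstract is Wald's classic sequential stopping rule, widely used in computerized classification testing. The sketch below is a generic illustration only, not the authors' implementation; the function name, thresholds, and sample data are all hypothetical. It shows how per-observation log-likelihood ratios accumulate until one of two decision boundaries is crossed:

```python
import math

def sprt(log_lrs, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test over a stream of
    per-observation log-likelihood ratios log[P(x|H1)/P(x|H0)].
    Stops as soon as the running sum crosses a decision boundary."""
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    total = 0.0
    for i, llr in enumerate(log_lrs, start=1):
        total += llr
        if total >= upper:
            return "accept H1", i       # stopped early, i items used
        if total <= lower:
            return "accept H0", i
    return "undecided", len(log_lrs)    # data exhausted before a decision

# Hypothetical item-level evidence: positive values favor the "pass"
# model (H1), negative values favor the "fail" model (H0).
decision, items_used = sprt([0.8, -0.2, 0.9, 1.1, 0.7])
print(decision, items_used)  # -> accept H1 5
```

The appeal of this rule in a testing context is that it decides after as few items as the evidence allows while bounding both error rates (alpha and beta) in advance.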
James, Cindy L. – Assessing Writing, 2006
How do scores from writing samples generated by computerized essay scorers compare to those generated by "untrained" human scorers, and what combination of scores, if any, is more accurate at placing students in composition courses? This study endeavored to answer this two-part question by evaluating the correspondence between writing sample…
Descriptors: Writing (Composition), Predictive Validity, Scoring, Validity
Mazzeo, John; And Others – 1991
Two studies investigated the comparability of scores from paper-and-pencil and computer-administered versions of the College-Level Examination Program (CLEP) General Examinations in mathematics and English composition. The first study used a prototype computer-administered version of each examination, with 94 students for mathematics and 116 for…
Descriptors: College Entrance Examinations, College Students, Comparative Testing, Computer Assisted Testing
Yu, Lei; Livingston, Samuel A.; Larkin, Kevin C.; Bonett, John – ETS Research Report Series, 2004
This study compared essay scores from paper-based and computer-based versions of a writing test for prospective teachers. Scores for essays in the paper-based version averaged nearly half a standard deviation higher than those in the computer-based version, after applying a statistical control for demographic differences between the groups of…
Descriptors: Essays, Writing (Composition), Computer Assisted Testing, Technology Uses in Education
Kenworthy, Roger – TESL-EJ, 2006
This preliminary study examines the effects of additional time and different media on the overall quality of English language learners' written assessment tests. Sixteen intermediate-level students (L1 Cantonese), enrolled at a satellite campus of an American university in Asia, handwrote a 45-minute timed placement test in…
Descriptors: Writing (Composition), Writing Tests, Timed Tests, Second Language Learning

