Showing all 9 results
Peer reviewed
Direct link
Almusharraf, Norah; Alotaibi, Hind – Technology, Knowledge and Learning, 2023
Evaluating written texts is believed to be a time-consuming process that can lack consistency and objectivity. Automated essay scoring (AES) can provide solutions to some of the limitations of human scoring. This research aimed to evaluate the performance of one AES system, Grammarly, in comparison to human raters. Both approaches' performances…
Descriptors: Writing Evaluation, Writing Tests, Essay Tests, Essays
Peer reviewed
Direct link
Reinertsen, Nathanael – English in Australia, 2018
The difference between how humans read and how Automated Essay Scoring (AES) systems process written language means that some student responses are comprehensible to human markers but cannot be parsed by AES systems. This paper examines several pieces of student writing that were marked by trained human markers, but…
Descriptors: Qualitative Research, Writing Evaluation, Essay Tests, Computer Assisted Testing
Peer reviewed
PDF on ERIC
Gaibani, Ahmed – Advances in Language and Literary Studies, 2015
The purpose of this study was to examine the use of English articles, as well as the errors made by students at Omar Al-Mukhtar University. The research objectives were to identify the types and sources of errors made by Libyan EFL undergraduates at Omar Al-Mukhtar University in their use of the indefinite article in their written…
Descriptors: Undergraduate Students, English (Second Language), Error Patterns, Error Analysis (Language)
Peer reviewed
PDF on ERIC
Bailey, Daniel; Lee, Andrea Rakushin – TESOL International Journal, 2020
Different genres of writing entail various levels of syntactic and lexical complexity, and how this complexity influences the results of Automatic Writing Evaluation (AWE) programs like Grammarly in second language (L2) writing is unknown. This study explored the use of Grammarly in the L2 writing context by comparing error frequency, error types…
Descriptors: Grammar, Computer Assisted Instruction, Error Correction, Feedback (Response)
Peer reviewed
Direct link
Mao, Liyang; Liu, Ou Lydia; Roohr, Katrina; Belur, Vinetha; Mulholland, Matthew; Lee, Hee-Sun; Pallant, Amy – Educational Assessment, 2018
Scientific argumentation is one of the core practices for teachers to implement in science classrooms. We developed a computer-based formative assessment to support students' construction and revision of scientific arguments. The assessment is built upon automated scoring of students' arguments and provides feedback to students and teachers.…
Descriptors: Computer Assisted Testing, Science Tests, Scoring, Automation
Peer reviewed
PDF on ERIC
Blanchard, Daniel; Tetreault, Joel; Higgins, Derrick; Cahill, Aoife; Chodorow, Martin – ETS Research Report Series, 2013
This report presents work on the development of a new corpus of non-native English writing that will be useful for native language identification, as well as for grammatical error detection and correction and automatic essay scoring. The corpus is described in detail.
Descriptors: Language Tests, Second Language Learning, English (Second Language), Writing Tests
Obanya, P. A. I. – Audio-Visual Language Journal, 1974
This paper is the result of a study of ninety-two essays on three different topics by fifth-form pupils in the city of Ibadan. (Author)
Descriptors: Achievement Tests, Contrastive Linguistics, Error Patterns, Essay Tests
Peer reviewed
Twigg, Helen Parramore – Teaching English in the Two-Year College, 1981
Examines students' amusing responses to essay test questions, while maintaining that such responses can still give a teacher a better indication of what students are learning in the classroom than can objective tests. (HTH)
Descriptors: Error Patterns, Essay Tests, Higher Education, Objective Tests
Peer reviewed
Bridgeman, Brent; Morgan, Rick – Journal of Educational Psychology, 1996
Students from 38 colleges who scored high on the essay portion of an Advanced Placement examination but low on the multiple-choice portion were compared with students showing the opposite pattern. The pattern was not related to college grades but was related to performance on other tests. (SLD)
Descriptors: Academic Achievement, Advanced Placement, College Students, Error Patterns