Showing all 14 results
Peer reviewed
Jiang, Yang; Zhang, Mo; Hao, Jiangang; Deane, Paul; Li, Chen – Journal of Educational Measurement, 2024
The emergence of sophisticated AI tools such as ChatGPT, coupled with the transition to remote delivery of educational assessments in the COVID-19 era, has led to increasing concerns about academic integrity and test security. Using AI tools, test takers can produce high-quality texts effortlessly and use them to game assessments. It is thus…
Descriptors: Integrity, Artificial Intelligence, Technology Uses in Education, Ethics
Peer reviewed
Choi, Ikkyu; Deane, Paul – Language Assessment Quarterly, 2021
Keystroke logs provide a comprehensive record of observable writing processes. Previous studies examining the keystroke logs of young L1 English writers performing experimental writing tasks have identified writing process features predictive of the quality of responses. In contrast, large-scale studies on the dynamic and temporal nature of L2…
Descriptors: Writing Processes, Writing Evaluation, Computer Assisted Testing, Learning Analytics
Peer reviewed
Conijn, Rianne; Martinez-Maldonado, Roberto; Knight, Simon; Buckingham Shum, Simon; Van Waes, Luuk; van Zaanen, Menno – Computer Assisted Language Learning, 2022
Current writing support tools tend to focus on assessing final or intermediate products, rather than the writing process. However, sensing technologies, such as keystroke logging, can enable provision of automated feedback during, and on aspects of, the writing process. Despite this potential, little is known about the critical indicators that can…
Descriptors: Automation, Feedback (Response), Writing Evaluation, Learning Analytics
Peer reviewed
Zhang, Mo; Bennett, Randy E.; Deane, Paul; van Rijn, Peter W. – Educational Measurement: Issues and Practice, 2019
This study compared gender groups on the processes used in writing essays in an online assessment. Middle-school students from four grades responded to essay tasks in two persuasive subgenres, argumentation and policy recommendation. Writing processes were inferred from four indicators extracted from students' keystroke logs. In comparison to males, on…
Descriptors: Gender Differences, Essays, Computer Assisted Testing, Persuasive Discourse
Peer reviewed
Tsai, Shu-Chiao – Computer Assisted Language Learning, 2019
This study investigates the impact of using Google Translate (GT) on extemporaneous English-language first drafts in three different tasks assigned to Chinese sophomore, junior, and senior English as a Foreign Language (EFL) students majoring in English. Students wrote first in Chinese (Step 1), then drafted corresponding texts in English (Step…
Descriptors: English (Second Language), Second Language Learning, Second Language Instruction, Computer Software
Peer reviewed
Zou, Xiao-Ling; Chen, Yan-Min – Technology, Pedagogy and Education, 2016
The effects of computer and paper test media on the writing scores and cognitive writing processes of EFL test-takers with different levels of computer familiarity were comprehensively explored from the learners' perspective as well as on the basis of related theories and practice. The results indicate significant differences in test scores among the…
Descriptors: English (Second Language), Second Language Learning, Second Language Instruction, Test Format
Peer reviewed
Barkaoui, Khaled – Language Testing, 2014
A major concern with computer-based (CB) tests of second-language (L2) writing is that performance on such tests may be influenced by test-taker keyboarding skills. Poor keyboarding skills may force test-takers to focus their attention and cognitive resources on motor activities (i.e., keyboarding) and, consequently, other processes and aspects of…
Descriptors: Language Tests, Computer Assisted Testing, English (Second Language), Second Language Learning
Peer reviewed
PDF on ERIC
Barkaoui, Khaled – ETS Research Report Series, 2015
This study aimed to describe the writing activities that test takers engage in when responding to the writing tasks in the "TOEFL iBT"® test and to examine the effects of task type and test-taker English language proficiency (ELP) and keyboarding skills on the frequency and distribution of these activities. Each of 22 test…
Descriptors: Second Language Learning, Language Tests, English (Second Language), Writing Instruction
Peer reviewed
PDF on ERIC
Deane, Paul – ETS Research Report Series, 2014
This paper explores automated methods for measuring features of student writing and determining their relationship to writing quality and other features of literacy, such as reading test scores. In particular, it uses the "e-rater"™ automatic essay scoring system to measure "product" features (measurable traits of the final…
Descriptors: Writing Processes, Writing Evaluation, Student Evaluation, Writing Skills
National Assessment Governing Board, 2010
The purpose of the 2011 NAEP (National Assessment of Educational Progress) Writing Framework is to describe how the new NAEP Writing Assessment is designed to measure students' writing at grades 4, 8, and 12. As the ongoing national indicator of the academic achievement of students in the United States, NAEP regularly collects information on…
Descriptors: Writing Achievement, Writing Skills, Writing Evaluation, National Competency Tests
Peer reviewed
Crossley, Scott A.; McNamara, Danielle S. – Journal of Second Language Writing, 2009
The purpose of this paper is to provide a detailed analysis of how lexical differences related to cohesion and connectionist models can distinguish first language (L1) writers of English from second language (L2) writers of English. Key to this analysis is the use of the computational tool Coh-Metrix, which measures cohesion and text difficulty at…
Descriptors: Second Language Learning, Discriminant Analysis, English (Second Language), Educational Technology
Peer reviewed
Li, Jiang – Assessing Writing, 2006
The present study investigated the influence of word processing on the writing of students of English as a second language (ESL) and on writing assessment as well. Twenty-one adult Mandarin-Chinese speakers with advanced English proficiency living in Toronto participated in the study. Each participant completed two comparable writing tasks under…
Descriptors: Writing Evaluation, Protocol Analysis, Writing Tests, Evaluation Methods
Peer reviewed
Burke, Jennifer N.; Cizek, Gregory J. – Assessing Writing, 2006
This study was conducted to gather evidence regarding effects of the mode of writing (handwritten vs. word-processed) on compositional quality in a sample of sixth grade students. Questionnaire data and essay scores were gathered to examine the effect of composition mode on essay scores of students of differing computer skill levels. The study was…
Descriptors: Computer Assisted Testing, High Stakes Tests, Writing Processes, Grade 6
Peer reviewed
Lee, H. K. – Assessing Writing, 2004
This study aimed to comprehensively investigate the impact of a word-processor on an ESL writing assessment, covering comparison of inter-rater reliability, the quality of written products, the writing process across different testing occasions using different writing media, and students' perception of a computer-delivered test. Writing samples of…
Descriptors: Writing Evaluation, Student Attitudes, Writing Tests, Testing