Showing all 11 results
Peer reviewed
PDF on ERIC
Choi, Ikkyu; Hao, Jiangang; Deane, Paul; Zhang, Mo – ETS Research Report Series, 2021
"Biometrics" are physical or behavioral human characteristics that can be used to identify a person. It is widely known that keystroke or typing dynamics for short, fixed texts (e.g., passwords) could serve as a behavioral biometric. In this study, we investigate whether keystroke data from essay responses can lead to a reliable…
Descriptors: Accuracy, High Stakes Tests, Writing Tests, Benchmarking
Peer reviewed
Direct link
Foxworth, Lauren L.; Hashey, Andrew; Sukhram, Diana P. – Reading & Writing Quarterly, 2019
In an age when students are increasingly expected to demonstrate technology-based writing proficiency, fluency challenges with word-processing programs can pose a barrier to successful writing when students are asked to compose using these tools. The current study was designed to determine whether differences existed in typing fluency and digital…
Descriptors: Writing Skills, Students with Disabilities, Learning Disabilities, Word Processing
White, Sheida; Kim, Young Yee; Chen, Jing; Liu, Fei – National Center for Education Statistics, 2015
This study examined whether or not fourth-graders could fully demonstrate their writing skills on the computer and factors associated with their performance on the National Assessment of Educational Progress (NAEP) computer-based writing assessment. The results suggest that high-performing fourth-graders (those who scored in the upper 20 percent…
Descriptors: National Competency Tests, Computer Assisted Testing, Writing Tests, Grade 4
Peer reviewed
PDF on ERIC
Sumande, Caroline T.; Castolo, Carmencita L.; Comendador, Benilda Eleanor V. – Turkish Online Journal of Distance Education, 2016
The study addressed two questions: what is the ICT level of confidence of the course specialists handling Open University classes, and to what extent do course specialists integrate ICT applications such as word processing, electronic spreadsheets, presentation software, YouTube, etc. in their OUS classes? The instruments were administered to…
Descriptors: Foreign Countries, Information Technology, Self Esteem, Specialists
Peer reviewed
Direct link
Koo, Malcolm; Norman, Cameron D.; Chang, Hsiao-Mei – International Electronic Journal of Health Education, 2012
The eight-item eHealth Literacy Scale (eHEALS) is a previously validated scale developed to assess consumers' combined knowledge, comfort, and perceived skills at finding, evaluating, and applying electronic health information to health problems. In the present study, a Chinese version of the eHEALS was developed and its psychometric properties…
Descriptors: College Students, Reliability, Measures (Individuals), Factor Analysis
Manalo, Jonathan R.; Wolfe, Edward W. – 2000
Recently, the Test of English as a Foreign Language (TOEFL) changed by including a writing section that gives examinees the option of composing their responses in either computer or handwritten format. Unfortunately, this may introduce several potential sources of error that might reduce the reliability and validity of the scores. The seriousness…
Descriptors: Computer Assisted Testing, Essay Tests, Evaluators, Handwriting
Manalo, Jonathan R.; Wolfe, Edward W. – 2000
Recently, the Test of English as a Foreign Language (TOEFL) changed by including a direct writing assessment where examinees choose between computer and handwritten composition formats. Unfortunately, examinees may have differential access to and comfort with computers; as a result, scores across these formats may not be comparable. Analysis of…
Descriptors: Adults, Computer Assisted Testing, Essay Tests, Handwriting
Peer reviewed
PDF on ERIC
Breland, Hunter; Lee, Yong-Won; Muraki, Eiji – ETS Research Report Series, 2004
Eighty-three Test of English as a Foreign Language™ (TOEFL®) CBT writing prompts that were administered between July 1998 and August 2000 were examined in order to identify differences in scores that could be attributed to the response mode chosen by examinees (handwritten or word processed). Differences were examined statistically using…
Descriptors: English (Second Language), Language Tests, Second Language Learning, Cues
Peer reviewed
Direct link
Burke, Jennifer N.; Cizek, Gregory J. – Assessing Writing, 2006
This study was conducted to gather evidence regarding effects of the mode of writing (handwritten vs. word-processed) on compositional quality in a sample of sixth grade students. Questionnaire data and essay scores were gathered to examine the effect of composition mode on essay scores of students of differing computer skill levels. The study was…
Descriptors: Computer Assisted Testing, High Stakes Tests, Writing Processes, Grade 6
Peer reviewed
Hunt, Nicoll; Hughes, Janet; Rowe, Glenn – British Journal of Educational Technology, 2002
Describes the development of a tool, FACT (Formative Automated Computer Testing), to formatively assess information technology skills of college students in the United Kingdom. Topics include word processing competency; tests designed by tutors and delivered via a network; and results of an evaluation that showed students preferred automated…
Descriptors: Competence, Computer Assisted Testing, Computer Networks, Evaluation Methods
Peer reviewed
PDF on ERIC
Wolfe, Edward W.; Manalo, Jonathan R. – ETS Research Report Series, 2005
This study examined scores from 133,906 operationally scored Test of English as a Foreign Language™ (TOEFL®) essays to determine whether the choice of composition medium has any impact on score quality for subgroups of test-takers. Results of analyses demonstrate that (a) scores assigned to word-processed essays are slightly more reliable than…
Descriptors: English (Second Language), Language Tests, Second Language Learning, Scores