Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 1
Since 2006 (last 20 years): 13
Descriptor
Computer Software Evaluation: 13
Essays: 13
Writing Evaluation: 8
Computer Assisted Testing: 7
Comparative Analysis: 6
Foreign Countries: 6
Educational Technology: 5
Essay Tests: 5
Scoring: 5
Automation: 4
Computer Software: 4
Author
Alexander, R. Curby: 1
Attali, Yigal: 1
Baier, Herbert: 1
Burrows, Steven: 1
Burstein, Jill: 1
Deess, Perry: 1
Elliot, Norbert: 1
Enriquez Carrasco, Emilia: 1
Ferster, Bill: 1
Garcia Laborda, Jesus: 1
Garcia, Veronica: 1
Publication Type
Journal Articles: 12
Reports - Research: 8
Reports - Evaluative: 3
Reports - Descriptive: 2
Information Analyses: 1
Speeches/Meeting Papers: 1
Education Level
Higher Education: 7
Postsecondary Education: 7
Elementary Secondary Education: 3
Secondary Education: 2
Elementary Education: 1
Grade 3: 1
Grade 4: 1
Grade 5: 1
Grade 7: 1
High Schools: 1
Junior High Schools: 1
Assessments and Surveys
Graduate Management Admission Test: 1
Mochizuki, Toshio; Nishimori, Toshihisa; Tsubakimoto, Mio; Oura, Hiroki; Sato, Tomomi; Johansson, Henrik; Nakahara, Jun; Yamauchi, Yuhei – Educational Technology Research and Development, 2019
This paper describes the development of a software program that supports argumentative reading and writing, especially for novice students. The software helps readers create a graphic organizer from the text as a knowledge map while they are reading and use their prior knowledge to build their own opinion as new information while they think about…
Descriptors: Persuasive Discourse, Undergraduate Students, Instructional Materials, Concept Mapping
Ramineni, Chaitanya – Assessing Writing, 2013
In this paper, I describe the design and evaluation of automated essay scoring (AES) models for an institution's writing placement program. Information was gathered on admitted student writing performance at a science and technology research university in the northeastern United States. Under timed conditions, first-year students (N = 879) were…
Descriptors: Validity, Comparative Analysis, Internet, Student Placement
Klobucar, Andrew; Elliot, Norbert; Deess, Perry; Rudniy, Oleksandr; Joshi, Kamal – Assessing Writing, 2013
This study investigated the use of automated essay scoring (AES) to identify at-risk students enrolled in a first-year university writing course. An application of AES, the Criterion® Online Writing Evaluation Service, was evaluated through a methodology focusing on construct modelling, response processes, disaggregation, extrapolation,…
Descriptors: Writing Evaluation, Scoring, Writing Instruction, Essays
Wang, Y.; Harrington, M.; White, P. – Journal of Computer Assisted Learning, 2012
This paper introduces "CTutor", an automated writing evaluation (AWE) tool for detecting breakdowns in local coherence, and reports on a study that applies it to the writing of Chinese L2 English learners. The program is based on Centering theory (CT), a theory of local coherence and salience. The principles of CT are first introduced and…
Descriptors: Foreign Countries, Educational Technology, Expertise, Feedback (Response)
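The coherence-breakdown detection described in the entry above can be pictured with a very rough sketch: flag adjacent sentence pairs that share no content words. This is only an illustration of the general idea, not CTutor itself; it ignores the forward- and backward-looking centers and transition types that Centering theory actually tracks, and the stopword list, function names, and example text are invented for the demo.

```python
import re

# Small, invented stopword list purely for this demo.
STOPWORDS = {"the", "a", "an", "and", "or", "but", "of", "to", "in", "on",
             "is", "are", "was", "were", "it", "this", "that", "with", "then"}

def content_words(sentence):
    """Lower-cased alphabetic tokens minus stopwords and very short words."""
    return {w for w in re.findall(r"[a-z']+", sentence.lower())
            if w not in STOPWORDS and len(w) > 2}

def coherence_breaks(text):
    """Indices i where sentence i and sentence i+1 share no content words."""
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    return [i for i in range(len(sentences) - 1)
            if not content_words(sentences[i]) & content_words(sentences[i + 1])]

print(coherence_breaks(
    "The essay introduces the topic. The topic is then developed with examples. "
    "Weather in spring can be unpredictable."
))  # -> [1]: the third sentence shares nothing with the second
```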
On the Reliability and Validity of Human and LSA-Based Evaluations of Complex Student-Authored Texts
Seifried, Eva; Lenhard, Wolfgang; Baier, Herbert; Spinath, Birgit – Journal of Educational Computing Research, 2012
This study investigates the potential of a software tool based on Latent Semantic Analysis (LSA; Landauer, McNamara, Dennis, & Kintsch, 2007) to automatically evaluate complex German texts. A sample of N = 94 German university students provided written answers to questions that involved a high amount of analytical reasoning and evaluation.…
Descriptors: Foreign Countries, Computer Software, Computer Software Evaluation, Computer Uses in Education
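As a rough illustration of LSA-based scoring of free-text answers like that described in the entry above, the sketch below projects a TF-IDF term-document matrix into a low-rank space with truncated SVD and scores each answer by its cosine similarity to a reference answer. It is a minimal stand-in assuming scikit-learn is available; it is not the tool evaluated by Seifried et al., and the function name, example answers, and reference text are invented.

```python
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def lsa_similarity(student_answers, reference, n_components=2):
    """Cosine similarity of each answer to a reference answer in LSA space."""
    docs = [reference] + list(student_answers)
    tfidf = TfidfVectorizer().fit_transform(docs)    # term-document matrix
    k = min(n_components, tfidf.shape[1] - 1)        # keep the SVD rank valid
    reduced = TruncatedSVD(n_components=k).fit_transform(tfidf)
    return cosine_similarity(reduced[1:], reduced[:1]).ravel()

scores = lsa_similarity(
    ["Learning depends on prior knowledge and feedback.",
     "The weather was sunny yesterday."],
    reference="Prior knowledge and feedback shape how students learn.",
)
print(scores)  # one similarity value per student answer
```

In practice an LSA scorer of this kind is trained on a much larger corpus so that the latent space captures stable semantic dimensions rather than the quirks of a handful of documents.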
Ferster, Bill; Hammond, Thomas C.; Alexander, R. Curby; Lyman, Hunt – Journal of Interactive Learning Research, 2012
The hurried pace of the modern classroom does not permit formative feedback on writing assignments at the frequency or quality recommended by the research literature. One solution for increasing individual feedback to students is to incorporate some form of computer-generated assessment. This study explores the use of automated assessment of…
Descriptors: Feedback (Response), Scripts, Formative Evaluation, Essays
Liu, Yuan-Chen; Lee, Wan-Chun; Huang, Tzu-Hua; Hsieh, Hsiao-Mei – Turkish Online Journal of Educational Technology - TOJET, 2012
This research investigates students' performance while writing Chinese essays using an interactive online writing system. Participants include students from two seventh-grade classes of a junior high school in Taoyuan County, Taiwan. The experimental group uses the conditioned writing interactive online system, while the control group receives…
Descriptors: Foreign Countries, Writing Instruction, Control Groups, Experimental Groups
Burrows, Steven; Shortis, Mark – Australasian Journal of Educational Technology, 2011
Online marking and feedback systems are critical for providing timely and accurate feedback to students and maintaining the integrity of results in large class teaching. Previous investigations have involved much in-house development, and more consideration is needed for deploying or customising off-the-shelf solutions. Furthermore, keeping up to…
Descriptors: Foreign Countries, Integrated Learning Systems, Feedback (Response), Evaluation Criteria
Garcia Laborda, Jesus; Magal Royo, Teresa; Enriquez Carrasco, Emilia – Online Submission, 2010
This paper presents results on writing processing among 260 high school senior students, their degree of satisfaction with the new trial version of the Computer Based University Entrance Examination in Spain, and their degree of motivation towards written online test tasks. Currently, this is one of the closing studies to verify whether…
Descriptors: Foreign Countries, Curriculum Development, High Stakes Tests, Student Motivation
Kim, Seong-in; Hameed, Ibrahim A. – Art Therapy: Journal of the American Art Therapy Association, 2009
For mental health professionals, art assessment is a useful tool for patient evaluation and diagnosis. Consideration of various color-related elements is important in art assessment. This correlational study introduces the concept of variety of color as a new color-related element of an artwork. This term represents a comprehensive use of color,…
Descriptors: Mental Health Workers, Essays, Scoring, Visual Stimuli
Stoner, Mark R. – Communication Education, 2007
This essay offers an analysis of PowerPoint apart from the histrionics of the "'Tis and 'Taint" arguments about its value, and proposes a program of research to advance our understanding of PowerPoint as an inscriptional system. To that end, the study begins with a discussion of PowerPoint as an inscriptional system that employs both…
Descriptors: Theory Practice Relationship, Design Requirements, Computer Software Evaluation, Instructional Material Evaluation
Rudner, Lawrence M.; Garcia, Veronica; Welch, Catherine – Journal of Technology, Learning, and Assessment, 2006
This report provides a two-part evaluation of the IntelliMetric℠ automated essay scoring system based on its performance scoring essays from the Analytic Writing Assessment of the Graduate Management Admission Test™ (GMAT™). The IntelliMetric system's performance is first compared to that of individual human raters, a Bayesian system…
Descriptors: Writing Evaluation, Writing Tests, Scoring, Essays
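Human-machine comparisons of the kind described in the entry above are typically reported as exact and adjacent agreement and as quadratic weighted kappa. The sketch below computes those statistics for two score vectors; the 1-6 scale and the example ratings are illustrative inventions, not GMAT data, and the functions are not claimed to be the statistics actually reported by Rudner et al.

```python
from itertools import product

def exact_and_adjacent_agreement(human, machine):
    """Share of essays scored identically, and within one point, by both raters."""
    n = len(human)
    exact = sum(h == m for h, m in zip(human, machine)) / n
    adjacent = sum(abs(h - m) <= 1 for h, m in zip(human, machine)) / n
    return exact, adjacent

def quadratic_weighted_kappa(human, machine, min_score=1, max_score=6):
    """Chance-corrected agreement that penalises large disagreements quadratically."""
    scores = range(min_score, max_score + 1)
    n = len(human)
    observed = {(i, j): 0 for i, j in product(scores, scores)}
    for h, m in zip(human, machine):
        observed[(h, m)] += 1
    hist_h = {s: human.count(s) for s in scores}
    hist_m = {s: machine.count(s) for s in scores}
    num = den = 0.0
    for i, j in product(scores, scores):
        w = (i - j) ** 2 / (max_score - min_score) ** 2   # quadratic weight
        num += w * observed[(i, j)] / n                    # observed disagreement
        den += w * hist_h[i] * hist_m[j] / (n * n)         # expected by chance
    return 1.0 - num / den

human = [4, 5, 3, 6, 4, 2, 5, 4]     # invented scores on a 1-6 scale
machine = [4, 5, 4, 5, 4, 2, 5, 3]
print(exact_and_adjacent_agreement(human, machine))   # (0.625, 1.0)
print(round(quadratic_weighted_kappa(human, machine), 3))
```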
Attali, Yigal; Burstein, Jill – Journal of Technology, Learning, and Assessment, 2006
E-rater® has been used by the Educational Testing Service for automated essay scoring since 1999. This paper describes a new version of e-rater (V.2) that is different from other automated essay scoring systems in several important respects. The main innovations of e-rater V.2 are a small, intuitive, and meaningful set of features used for…
Descriptors: Educational Testing, Test Scoring Machines, Scoring, Writing Evaluation
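To make the idea of "a small, intuitive, and meaningful set of features" concrete, the sketch below extracts a few transparent surface features and fits a least-squares model to human scores. The feature choices, function names, and numpy-based fitting are illustrative assumptions only; they are not e-rater's actual feature set or weighting scheme.

```python
import re
import numpy as np

def surface_features(essay):
    """Three transparent features: word count, sentence count, type-token ratio."""
    words = re.findall(r"[A-Za-z']+", essay.lower())
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    return np.array([
        len(words),
        len(sentences),
        len(set(words)) / max(len(words), 1),
    ])

def fit_linear_scorer(essays, human_scores):
    """Least-squares weights (plus intercept) mapping features to human scores."""
    X = np.array([surface_features(e) for e in essays])
    X = np.hstack([X, np.ones((len(essays), 1))])          # intercept column
    weights, *_ = np.linalg.lstsq(X, np.asarray(human_scores, float), rcond=None)
    return weights

def predict_score(weights, essay):
    """Apply the fitted weights to a new essay."""
    return float(np.append(surface_features(essay), 1.0) @ weights)
```

In a real scoring system such weights would be estimated on a large human-scored corpus and combined with far richer linguistic features; the point here is only that a compact, interpretable feature set can drive a simple linear scoring model.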