Publication Date
In 2025 (0)
Since 2024 (0)
Since 2021, last 5 years (0)
Since 2016, last 10 years (0)
Since 2006, last 20 years (4)
Descriptor
Computer Assisted Testing (4)
Scoring (4)
Scoring Formulas (4)
Automation (2)
Evaluation Methods (2)
Scoring Rubrics (2)
Writing Evaluation (2)
Benchmarking (1)
Comparative Analysis (1)
Computer System Design (1)
Database Design (1)
Author
Attali, Yigal (1)
Ben-Simon, Anat (1)
Bennett, Randy Elliott (1)
Higgins, Derrick (1)
Jancarík, Antonín (1)
Kostelecká, Yvona (1)
Williamson, David M. (1)
Xi, Xiaoming (1)
Zechner, Klaus (1)

Publication Type
Journal Articles (4)
Reports - Research (4)
Tests/Questionnaires (1)

Education Level
Elementary Education (1)
Elementary Secondary Education (1)
Grade 8 (1)
Location
Czech Republic (1)
Jancarík, Antonín; Kostelecká, Yvona – Electronic Journal of e-Learning, 2015
Electronic testing has become a regular part of online courses. Most learning management systems offer a wide range of tools that can be used in electronic tests. With respect to time demands, the most efficient tools are those that allow automatic assessment. The presented paper focuses on one of these tools: matching questions in which one…
Descriptors: Online Courses, Computer Assisted Testing, Test Items, Scoring Formulas
Xi, Xiaoming; Higgins, Derrick; Zechner, Klaus; Williamson, David M. – ETS Research Report Series, 2008
This report presents the results of a research and development effort for SpeechRater℠ Version 1.0 (v1.0), an automated scoring system for the spontaneous speech of English language learners used operationally in the Test of English as a Foreign Language™ (TOEFL®) Practice Online assessment (TPO). The report includes a summary of the validity…
Descriptors: Speech, Scoring, Scoring Rubrics, Scoring Formulas
Attali, Yigal – ETS Research Report Series, 2007
Because there is no commonly accepted view of what makes for good writing, automated essay scoring (AES) ideally should be able to accommodate different theoretical positions, certainly at the level of state standards but also perhaps among teachers at the classroom level. This paper presents a practical approach and an interactive computer…
Descriptors: Computer Assisted Testing, Automation, Essay Tests, Scoring
Ben-Simon, Anat; Bennett, Randy Elliott – Journal of Technology, Learning, and Assessment, 2007
This study evaluated a "substantively driven" method for scoring NAEP writing assessments automatically. The study used variations of an existing commercial program, e-rater®, to compare the performance of three approaches to automated essay scoring: a "brute-empirical" approach in which variables are selected and weighted solely according to…
Descriptors: Writing Evaluation, Writing Tests, Scoring, Essays