Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 2
Since 2016 (last 10 years): 2
Since 2006 (last 20 years): 3
Descriptor
Alternative Assessment: 6
Computer Assisted Testing: 6
Scoring: 6
Automation: 2
Design: 2
Educational Assessment: 2
Educational Technology: 2
Foreign Countries: 2
Item Response Theory: 2
Test Items: 2
Action Research: 1
Source
Canadian Journal of Learning…: 1
Florida Educational Research…: 1
Grantee Submission: 1
Journal of Educational…: 1
Author
Berthiaume, Kelly: 1
Casabianca, Jodi M.: 1
Chao, Szu-Fu: 1
Choi, Ikkyu: 1
Donoghue, John R.: 1
Dumas, Denis: 1
French, Ann: 1
Godwin, Janet: 1
Liu, Xiufeng: 1
Newhouse, C. Paul: 1
Organisciak, Peter: 1
Publication Type
Reports - Research: 5
Journal Articles: 2
Speeches/Meeting Papers: 2
Collected Works - Proceedings: 1
Tests/Questionnaires: 1
Casabianca, Jodi M.; Donoghue, John R.; Shin, Hyo Jeong; Chao, Szu-Fu; Choi, Ikkyu – Journal of Educational Measurement, 2023
Using item-response theory to model rater effects provides an alternative to standard performance metrics for rater monitoring and diagnosis. To fit such models, the ratings data must be sufficiently connected to allow rater effects to be estimated. Due to popular rating designs used in large-scale testing scenarios,…
Descriptors: Item Response Theory, Alternative Assessment, Evaluators, Research Problems
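A quick way to see the connectivity requirement this abstract raises: treat raters as nodes, join two raters whenever they score the same response, and check whether the resulting graph forms a single connected component. The sketch below is a minimal illustration of that idea under a hypothetical double-scoring design, not the authors' rater-effect model.

```python
from collections import defaultdict

def rater_components(ratings):
    """Group raters into connected components.

    ratings: iterable of (rater_id, response_id) pairs.
    Two raters are linked when they scored at least one common response;
    rater effects are only directly comparable within a connected component.
    """
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    by_response = defaultdict(list)
    for rater, response in ratings:
        by_response[response].append(rater)
        find(rater)  # register every rater, even those never unioned

    for raters in by_response.values():
        for other in raters[1:]:
            union(raters[0], other)

    groups = defaultdict(set)
    for rater in parent:
        groups[find(rater)].add(rater)
    return list(groups.values())

# Hypothetical design: raters A-D, responses 1-4.
toy = [("A", 1), ("B", 1), ("B", 2), ("C", 2), ("D", 3), ("D", 4)]
print(rater_components(toy))  # two components: {A, B, C} and {D} -> design is not fully connected
```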
Organisciak, Peter; Acar, Selcuk; Dumas, Denis; Berthiaume, Kelly – Grantee Submission, 2023
Automated scoring for divergent thinking (DT) seeks to overcome a key obstacle to creativity measurement: the effort, cost, and reliability of scoring open-ended tests. For a common test of DT, the Alternate Uses Task (AUT), the primary automated approach casts the problem as a semantic distance between a prompt and the resulting idea in a text…
Descriptors: Automation, Computer Assisted Testing, Scoring, Creative Thinking
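The semantic-distance idea described in this abstract can be illustrated in a few lines: embed the prompt and the response as vectors, then score originality as one minus their cosine similarity. The sketch below assumes a placeholder `embed()` function standing in for a real word- or sentence-embedding model; it shows the general approach, not the authors' scoring system.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder embedding: a real implementation would call a word- or
    sentence-embedding model; this stub hashes the text into a random seed
    so the example runs without external dependencies."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.normal(size=50)

def semantic_distance(prompt: str, response: str) -> float:
    """1 - cosine similarity between prompt and response embeddings.
    Larger values = responses farther from the prompt = more original."""
    a, b = embed(prompt), embed(response)
    cos = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return 1.0 - cos

# Alternate Uses Task example: uses for a "brick".
# (Scores are only meaningful with a real embedding model.)
for idea in ["build a wall", "use as a paperweight", "grind into pigment"]:
    print(idea, round(semantic_distance("brick", idea), 3))
```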
Newhouse, C. Paul; Tarricone, Pina – Canadian Journal of Learning and Technology, 2014
High-stakes external assessment for practical courses is fraught with problems affecting the manageability, validity, and reliability of scoring. Alternative approaches to assessment using digital technologies have the potential to address these problems. This paper describes a study that investigated the use of these technologies to create and…
Descriptors: High Stakes Tests, Student Evaluation, Evaluation Methods, Scoring
Liu, Xiufeng – 1994
Problems of validity and reliability in concept mapping are addressed by using item-response theory (IRT) models for scoring. In this study, the overall structure of students' concept maps is defined by the number of links, the number of hierarchies, the number of cross-links, and the number of examples. The study was conducted with 92 students…
Descriptors: Alternative Assessment, Computer Assisted Testing, Concept Mapping, Correlation
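A toy illustration of the structural features this abstract lists (links, hierarchies, cross-links, examples): represent a concept map as labeled edges plus a set of example nodes and tally each feature. The hierarchy count below is approximated as the depth of the map beneath its root concept, which is one common convention; the data structure and the toy map are assumptions, not the study's coding scheme.

```python
from collections import defaultdict, deque

def concept_map_features(edges, cross_links, examples, root):
    """Count the structural features used to score a concept map.

    edges:       list of (parent, child) propositions in the hierarchy
    cross_links: list of (concept_a, concept_b) links across branches
    examples:    set of nodes that are concrete examples, not concepts
    root:        the superordinate concept the map is built from
    """
    children = defaultdict(list)
    for parent, child in edges:
        children[parent].append(child)

    # depth of the hierarchy below the root (breadth-first traversal)
    depth, frontier, seen = 0, deque([(root, 0)]), {root}
    while frontier:
        node, level = frontier.popleft()
        depth = max(depth, level)
        for child in children[node]:
            if child not in seen:
                seen.add(child)
                frontier.append((child, level + 1))

    return {
        "links": len(edges),
        "hierarchies": depth,
        "cross_links": len(cross_links),
        "examples": len(examples),
    }

# Hypothetical map about "matter".
edges = [("matter", "solid"), ("matter", "liquid"), ("solid", "ice"), ("liquid", "water")]
print(concept_map_features(edges, cross_links=[("ice", "water")],
                           examples={"ice", "water"}, root="matter"))
# {'links': 4, 'hierarchies': 2, 'cross_links': 1, 'examples': 2}
```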
French, Ann; Godwin, Janet – 1996
The development of innovative test item types that use multimedia technology to improve item authenticity and interaction, and that allow for objective scoring through partial-credit methodologies, was studied. Science test items were developed for community college developmental students using "Authorware 3.0," an instructional compact disc. The…
Descriptors: Alternative Assessment, College Students, Community Colleges, Computer Assisted Testing
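Partial-credit scoring of the kind this abstract mentions can be as simple as awarding a share of an item's points for each correct component of the response. The rubric below is a generic sketch for a multiple-response item; the point values and penalty rule are assumptions, not the scoring methodology used in the study.

```python
def partial_credit(selected, correct, distractors, max_points=4):
    """Score a multiple-response item with partial credit.

    Each correct option selected earns an equal share of max_points;
    each distractor selected costs one share, floored at zero.
    """
    share = max_points / len(correct)
    earned = share * len(selected & correct) - share * len(selected & distractors)
    return max(0.0, round(earned, 2))

correct = {"A", "C", "D"}
distractors = {"B", "E"}
print(partial_credit({"A", "C"}, correct, distractors))             # 2.67
print(partial_credit({"A", "B", "C", "D"}, correct, distractors))   # 2.67
```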
Vernetson, Theresa, Ed. – Florida Educational Research Council Research Bulletin, 1993
This edition of the "Research Bulletin" is a compilation of papers presented at the annual William F. Breivogel Conference in 1993. The conference theme was alternative and portfolio assessment. Papers were grouped into three areas: assessment in general, portfolio assessment, and alternative assessments and curriculum questions. The selected papers…
Descriptors: Alternative Assessment, Computer Assisted Testing, Curriculum Development, Educational Assessment