Publication Date
In 2025 | 0
Since 2024 | 0
Since 2021 (last 5 years) | 0
Since 2016 (last 10 years) | 1
Since 2006 (last 20 years) | 3
Descriptor
Interrater Reliability | 4
Undergraduate Students | 4
Writing Assignments | 4
Information Literacy | 2
Peer Evaluation | 2
Performance Based Assessment | 2
Research Methodology | 2
Scoring Rubrics | 2
Student Evaluation | 2
Academic Libraries | 1
Access to Information | 1
Source
College & Research Libraries | 1
Journal of Computer Assisted… | 1
Journal of Education for… | 1
portal: Libraries and the… | 1
Author
Dance, Betty | 1
Davis, Erin | 1
Diller, Karen R. | 1
Fagerheim, Britt | 1
Hedrich, Anne | 1
Holliday, Wendy | 1
Joordens, S. | 1
Lohmann, Sam | 1
Lundstrom, Kacy | 1
Marcoulides, George A. | 1
Martin, Pamela | 1
Publication Type
Journal Articles | 4
Reports - Research | 3
Reports - Evaluative | 1
Tests/Questionnaires | 1
Education Level
Higher Education | 3
Postsecondary Education | 3
Location
Utah | 1
Washington | 1
Lohmann, Sam; Diller, Karen R.; Phelps, Sue F. – portal: Libraries and the Academy, 2019
This case study discusses an assessment project in which a rubric was used to evaluate information literacy (IL) skills as reflected in undergraduate students' research papers. Subsequent analysis sought relationships between the students' IL skills and their contact with the library through various channels. The project proved far longer and more…
Descriptors: Performance Based Assessment, Information Literacy, Undergraduate Students, Research Papers (Students)
Holliday, Wendy; Dance, Betty; Davis, Erin; Fagerheim, Britt; Hedrich, Anne; Lundstrom, Kacy; Martin, Pamela – College & Research Libraries, 2015
This paper outlines the process and results of an authentic assessment of student work using a revised version of the AAC&U's Information Literacy VALUE rubric. This rigorous assessment, which included the scoring of nearly 900 student papers from four different stages across the undergraduate curriculum, revealed much about the process of…
Descriptors: Information Literacy, Performance Based Assessment, Undergraduate Students, Student Evaluation
Pare, D. E.; Joordens, S. – Journal of Computer Assisted Learning, 2008
As class sizes increase, methods of assessments shift from costly traditional approaches (e.g. expert-graded writing assignments) to more economic and logistically feasible methods (e.g. multiple-choice testing, computer-automated scoring, or peer assessment). While each method of assessment has its merits, it is peer assessment in particular,…
Descriptors: Writing Assignments, Undergraduate Students, Teaching Assistants, Peer Evaluation

Marcoulides, George A.; Simkin, Mark G. – Journal of Education for Business, 1995
Each paper written by 60 sophomores in computer classes received 3 peer evaluations using a structured evaluation process. Overall, students were able to grade efficiently and consistently in terms of overall score and selected criteria (subject matter, content, and mechanics). (SK)
Descriptors: Higher Education, Interrater Reliability, Peer Evaluation, Undergraduate Students
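None of the abstracts above states how the consistency of the peer or rubric scores was quantified. As a rough, hypothetical illustration of the kind of interrater reliability analysis the descriptors refer to, the sketch below computes ICC(2,1) and the mean pairwise Pearson correlation for three peer raters scoring the same set of papers; the scores, the rater count, and the choice of statistics are assumptions for this example only, not details drawn from the cited studies.

```python
import numpy as np

# Hypothetical rubric scores: rows = papers, columns = three peer raters.
scores = np.array([
    [78, 82, 75],
    [65, 60, 68],
    [90, 88, 93],
    [72, 70, 74],
    [55, 61, 58],
    [84, 80, 86],
], dtype=float)

n, k = scores.shape                 # number of papers, number of raters
grand = scores.mean()
row_means = scores.mean(axis=1)     # per-paper means
col_means = scores.mean(axis=0)     # per-rater means

# Two-way ANOVA sums of squares (Shrout & Fleiss notation).
ss_total = ((scores - grand) ** 2).sum()
ss_rows = k * ((row_means - grand) ** 2).sum()
ss_cols = n * ((col_means - grand) ** 2).sum()
ss_err = ss_total - ss_rows - ss_cols

msr = ss_rows / (n - 1)             # between-papers mean square
msc = ss_cols / (k - 1)             # between-raters mean square
mse = ss_err / ((n - 1) * (k - 1))  # residual mean square

# ICC(2,1): two-way random effects, absolute agreement, single rater.
icc_2_1 = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Mean pairwise Pearson correlation as a simpler consistency check.
r = np.corrcoef(scores.T)
mean_pairwise_r = r[np.triu_indices(k, 1)].mean()

print(f"ICC(2,1) = {icc_2_1:.3f}")
print(f"mean pairwise Pearson r = {mean_pairwise_r:.3f}")
```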