Showing all 11 results
Peer reviewed
Zlatkin-Troitschanskaia, Olga; Shavelson, Richard J.; Schmidt, Susanne; Beck, Klaus – British Journal of Educational Psychology, 2019
Background: A holistic approach to performance assessment recognizes the theoretical complexity of multifaceted critical thinking (CT), a key objective of higher education. However, issues related to reliability, interpretation, and use arise with this approach. Aims and Method: Therefore, we take an analytic approach to scoring students' written…
Descriptors: Holistic Approach, Performance Based Assessment, Critical Thinking, College Students
Peer reviewed
Lopez, Enrique J.; Shavelson, Richard J.; Nandagopal, Kiruthiga; Szu, Evan; Penn, John – Journal of Chemical Education, 2014
Problem solving is a highly valued skill in chemistry. Courses within this discipline place a substantial emphasis on problem-solving performance and tend to weigh such performance heavily in assessments of learning. Researchers have dedicated considerable effort to investigating individual factors that influence problem-solving performance. The…
Descriptors: Organic Chemistry, Performance Factors, Academic Achievement, Problem Solving
Peer reviewed
Solano-Flores, Guillermo; Jovanovic, Jasna; Shavelson, Richard J.; Bachman, Marilyn – International Journal of Science Education, 1999
Describes the construction of assessments on inclines and friction from blueprints--or shells--which address the tasks of planning, hands-on, analysis, and application. Examines the reliability and validity of the assessments as well as the comparability of the assessments generated with the shell. Contains 48 references. (Author/WRM)
Descriptors: Elementary Education, Evaluation Methods, Performance Based Assessment, Physical Sciences
Peer reviewed
Ayala, Carlos Cuauhtemoc; Shavelson, Richard J.; Yin, Yue; Schultz, Susan E. – Educational Assessment, 2002
Studied reasoning dimensions underlying science achievement in a test made of items from three national and international examinations and items from only one of the tests (National Education Longitudinal Study of 1988; NELS:88) and in performance test results for 35 students from the larger study. Findings provide tentative support for three…
Descriptors: High School Students, High Schools, National Surveys, Performance Based Assessment
Peer reviewed
Ruiz-Primo, Maria Araceli; Shavelson, Richard J. – Journal of Research in Science Teaching, 1996
Addresses the rhetoric of performance assessment with research on important claims about science performance assessments. Discusses findings related to concepts and terminology, sensitivity to task and method, higher-order thinking skills, cost, impact on teaching and understanding, and professional development. Presents a conceptual framework to…
Descriptors: Elementary Secondary Education, Evaluation, Performance Based Assessment, Process Education
Peer reviewed
Ruiz-Primo, Maria Araceli; Shavelson, Richard J. – Journal of Research in Science Teaching, 1996
Examines the validity of claims that concept maps measure an important aspect of students' knowledge structures. Provides a working definition of concept mapping and a brief theoretical background, characterizes concept maps as a potential assessment in science, reviews empirical evidence on the reliability and validity of various concept mapping…
Descriptors: Cognitive Structures, Concept Formation, Elementary Secondary Education, Evaluation
Peer reviewed
Stecher, Brian M.; Klein, Stephen P.; Solano-Flores, Guillermo; McCaffrey, Dan; Robyn, Abby; Shavelson, Richard J.; Haertel, Edward – Applied Measurement in Education, 2000
Studied content domain, format, and level of inquiry as factors contributing to the large variation in student performance across open-ended measures. Results for more than 1,200 eighth graders do not support the hypothesis that tasks similar in content, format, and level of inquiry would correlate higher with each other than with measures…
Descriptors: Correlation, Inquiry, Junior High School Students, Junior High Schools
Peer reviewed
Klein, Stephen P.; Stecher, Brian M.; Shavelson, Richard J.; McCaffrey, Daniel; Ormseth, Tor; Bell, Robert M.; Comfort, Kathy; Othman, Abdul R. – Applied Measurement in Education, 1998
Two studies involving 368 elementary and high school students and 29 readers were conducted to investigate reader consistency, score reliability, and reader time requirements of three hands-on science performance tasks. Holistic scores were as reliable as analytic scores, and there was a high correlation between them after they were disattenuated…
Descriptors: Elementary School Students, Elementary Secondary Education, Hands on Science, High School Students
Stecher, Brian M.; Klein, Stephen P.; Solano-Flores, Guillermo; McCaffrey, Dan; Robyn, Abby; Shavelson, Richard J.; Haertel, Edward – 1998
This study investigated three factors that may contribute to the large variation in student performance across open-ended measures. These factors are content domain, format (whether the task required only pencil and paper or involved a hands-on manipulation of equipment), and level of inquiry (whether the task guided the student toward the…
Descriptors: Correlation, Grade 8, Junior High School Students, Junior High Schools
Rosenquist, Anders; Shavelson, Richard J.; Ruiz-Primo, Maria Araceli – 2000
Inconsistencies in scores from computer-simulated and "hands-on" science performance assessments have led to questions about the exchangeability of these two methods in spite of the highly touted potential of computer-simulated performance assessment. This investigation considered possible explanations for students' inconsistent performances: (1)…
Descriptors: Computer Assisted Testing, Computer Simulation, Elementary School Students, Hands on Science
Peer reviewed
Shavelson, Richard J.; And Others – Educational Researcher, 1992
Investigates the validity and reliability of performance assessments using data from over 300 fifth and sixth graders. Results demonstrate the gap between the reality of measurement through performance assessments and the political rhetoric that would institute these assessments in a national examination system in the immediate future. (SLD)
Descriptors: Accountability, Computer Assisted Testing, Educational Assessment, Educational Change