Showing all 7 results
Peer reviewed
Shavelson, Richard J. – Educational Psychologist, 2013
E. L. Thorndike contributed significantly to the field of educational and psychological testing as well as more broadly to psychological studies in education. This article follows in his testing legacy. I address the escalating demand, across societal sectors, to measure individual and group competencies. In formulating an approach to measuring…
Descriptors: Competence, Psychology, Psychological Testing, Psychological Studies
Peer reviewed
Yin, Yue; Shavelson, Richard J. – Applied Measurement in Education, 2008
In the first part of this article, the use of Generalizability (G) theory in examining the dependability of concept map assessment scores and designing a concept map assessment for a particular practical application is discussed. In the second part, the application of G theory is demonstrated by comparing the technical qualities of two frequently…
Descriptors: Generalizability Theory, Concept Mapping, Validity, Reliability
Peer reviewed
Cronbach, Lee J.; Shavelson, Richard J. – Educational and Psychological Measurement, 2004
In 1997, noting that the 50th anniversary of the publication of "Coefficient Alpha and the Internal Structure of Tests" was fast approaching, Lee Cronbach planned what have become the notes published here. His aim was to point out the ways in which his views on coefficient alpha had evolved, doubting now that the coefficient was the best way of…
Descriptors: Generalizability Theory, Reliability, Statistical Analysis
Yin, Yue; Shavelson, Richard J. – Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2004
In the first part of this paper we discuss the feasibility of using Generalizability (G) Theory to examine the dependability of concept map assessments and to design a concept map assessment for a particular practical application. In the second part, we apply G theory to compare the technical qualities of two frequently used mapping techniques:…
Descriptors: Formative Evaluation, Generalizability Theory, Concept Mapping, Comparative Analysis
Peer reviewed
Shavelson, Richard J.; Solano-Flores, Guillermo; Ruiz-Primo, Maria Araceli – Evaluation and Program Planning, 1998
Research on developing technology for large-scale performance assessments in science is reported briefly, and a conceptual framework is presented for defining, generating, and evaluating science performance assessments. Types of tasks are described, and the technical qualities of performance assessments are discussed in the context of…
Descriptors: Educational Technology, Generalizability Theory, Models, Performance Based Assessment
Peer reviewed
Shavelson, Richard J.; And Others – Journal of Educational Measurement, 1993
Evidence is presented on the generalizability and convergent validity of performance assessments using data from six studies of student achievement that sampled a wide range of measurement facets and methods. Results at individual and school levels indicate that task-sampling variability is the major source of measurement error. (SLD)
Descriptors: Academic Achievement, Educational Assessment, Error of Measurement, Generalizability Theory
Shavelson, Richard J.; And Others – 1993
In this paper, performance assessments are cast within a sampling framework. A performance assessment score is viewed as a sample of student performance drawn from a complex universe defined by a combination of all possible tasks, occasions, raters, and measurement methods. Using generalizability theory, the authors present evidence bearing on the…
Descriptors: Academic Achievement, Educational Assessment, Error of Measurement, Evaluators