Hao, Jiangang; Mislevy, Robert J. – ETS Research Report Series, 2018
Virtual performance assessments (VPAs), such as game- and simulation-based assessments, provide promising ways for assessing complex and integrated skills. However, the high cost, long development cycle, and complex scoring process significantly hinder the adoption of VPAs, particularly in large-scale assessments with tight deadlines and limited…
Descriptors: Performance Based Assessment, Computer Assisted Testing, Test Construction, Evidence
Behrens, John T.; Mislevy, Robert J.; DiCerbo, Kristen E.; Levy, Roy – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2010
The world in which learning and assessment must take place is rapidly changing. The digital revolution has created a vast space of interconnected information, communication, and interaction. Functioning effectively in this environment requires so-called 21st century skills such as technological fluency, complex problem solving, and the ability to…
Descriptors: Evidence, Student Evaluation, Educational Assessment, Influence of Technology
Mislevy, Robert J.; Behrens, John T.; Bennett, Randy E.; Demark, Sarah F.; Frezzo, Dennis C.; Levy, Roy; Robinson, Daniel H.; Rutstein, Daisy Wise; Shute, Valerie J.; Stanley, Ken; Winters, Fielding I. – Journal of Technology, Learning, and Assessment, 2010
People use external knowledge representations (KRs) to identify, depict, transform, store, share, and archive information. Learning how to work with KRs is central to becoming proficient in virtually every discipline. As such, KRs play central roles in curriculum, instruction, and assessment. We describe five key roles of KRs in assessment: (1)…
Descriptors: Student Evaluation, Educational Technology, Computer Networks, Knowledge Representation
Mislevy, Robert J.; Behrens, John T.; Dicerbo, Kristen E.; Levy, Roy – Journal of Educational Data Mining, 2012
"Evidence-centered design" (ECD) is a comprehensive framework for describing the conceptual, computational and inferential elements of educational assessment. It emphasizes the importance of articulating inferences one wants to make and the evidence needed to support those inferences. At first blush, ECD and "educational data…
Descriptors: Educational Assessment, Psychometrics, Evidence, Computer Games
Mislevy, Robert J.; Behrens, John T.; Bennett, Randy E.; Demark, Sarah F.; Frezzo, Dennis C.; Levy, Roy; Robinson, Daniel H.; Rutstein, Daisy Wise; Shute, Valerie J.; Stanley, Ken; Winters, Fielding I. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2007
People use external knowledge representations (EKRs) to identify, depict, transform, store, share, and archive information. Learning how to work with EKRs is central to becoming proficient in virtually every discipline. As such, EKRs play central roles in curriculum, instruction, and assessment. Five key roles of EKRs in educational assessment are…
Descriptors: Educational Assessment, Computer Networks, Test Construction, Computer Assisted Testing
Almond, Russell G.; Mislevy, Robert J. – Applied Psychological Measurement, 1999
Considers computerized adaptive testing from the perspective of graphical modeling (GM). GM provides methods for making inferences about multifaceted skills and knowledge and for extracting data from complex performances. Provides examples from language-proficiency assessment. (SLD)
Descriptors: Adaptive Testing, Computer Assisted Testing, Language Proficiency, Language Tests
Mislevy, Robert J.; Almond, Russell G. – 1997
This paper synthesizes ideas from the fields of graphical modeling and education testing, particularly item response theory (IRT) applied to computerized adaptive testing (CAT). Graphical modeling can offer IRT a language for describing multifaceted skills and knowledge, and disentangling evidence from complex performances. IRT-CAT can offer…
Descriptors: Adaptive Testing, Computer Assisted Testing, Educational Testing, Higher Education
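
The graphical-model inference described in the two preceding abstracts can be illustrated with a minimal sketch (illustrative only, not the authors' model): one binary proficiency variable with item responses as child nodes, updated by Bayes' rule. All probabilities below are made-up example values.

```python
# Minimal sketch of graphical-model inference for adaptive testing:
# a single binary skill node S with item responses as child nodes.
# All probabilities are illustrative, not taken from the papers.

def posterior_skill(prior, items, responses):
    """Return P(S = skilled | responses) by Bayes' rule.

    items: list of (p_correct_if_skilled, p_correct_if_unskilled) pairs
    responses: list of 0/1 observed scores, aligned with items
    """
    like1 = like0 = 1.0
    for (p1, p0), x in zip(items, responses):
        like1 *= p1 if x else 1.0 - p1
        like0 *= p0 if x else 1.0 - p0
    joint1 = prior * like1
    joint0 = (1.0 - prior) * like0
    return joint1 / (joint1 + joint0)

# Two hypothetical items: (P(correct | skilled), P(correct | unskilled)).
items = [(0.8, 0.1), (0.9, 0.3)]
print(posterior_skill(0.5, items, [1, 1]))  # ~0.96: both correct
print(posterior_skill(0.5, items, [0, 0]))  # ~0.03: both incorrect
```

In an adaptive test, the next item would be chosen to be maximally informative about the current posterior; the papers develop this machinery for multivariate skill models rather than the single node shown here.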
Mislevy, Robert J. – Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2004
In this paper we provide a rationale and approach for articulating a conceptual framework and corresponding development resources to guide the design of science inquiry assessments. Important here is attention to how and why research on cognition and learning, advances in technological capability, and development of sophisticated methods and…
Descriptors: Science, Test Construction, Student Evaluation, Science Tests
Williamson, David M.; Bauer, Malcolm; Steinberg, Linda S.; Mislevy, Robert J.; Behrens, John T.; DeMark, Sarah F. – International Journal of Testing, 2004
In computer-based interactive environments meant to support learning, students must bring a wide range of relevant knowledge, skills, and abilities to bear jointly as they solve meaningful problems in a learning domain. To function effectively as an assessment, a computer system must additionally be able to evoke and interpret observable evidence…
Descriptors: Computer Assisted Testing, Psychometrics, Task Analysis, Performance Based Assessment
Levy, Roy; Mislevy, Robert J. – International Journal of Testing, 2004
The challenges of modeling students' performance in computer-based interactive assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance. This article describes a Bayesian approach to modeling and estimating cognitive models…
Descriptors: Computer Assisted Testing, Markov Processes, Computer Networks, Bayesian Statistics
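
As a hedged sketch of what Bayesian estimation with Markov chain Monte Carlo looks like in this setting (not the article's actual model), the following samples one examinee's ability under a Rasch model with known item difficulties; all data values are invented for illustration.

```python
# Minimal sketch: Metropolis sampling of an examinee's ability theta
# under a Rasch model with known item difficulties b. Illustrative
# only; the article's cognitive models are richer than this.
import math
import random

def log_posterior(theta, responses, difficulties):
    """log p(theta | x) up to a constant: N(0,1) prior + Rasch likelihood."""
    lp = -0.5 * theta ** 2  # standard-normal prior on theta
    for x, b in zip(responses, difficulties):
        p = 1.0 / (1.0 + math.exp(-(theta - b)))  # P(correct | theta, b)
        lp += math.log(p if x else 1.0 - p)
    return lp

def metropolis(responses, difficulties, n_iter=5000, step=0.5):
    theta = 0.0
    lp = log_posterior(theta, responses, difficulties)
    draws = []
    for _ in range(n_iter):
        prop = theta + random.gauss(0.0, step)      # random-walk proposal
        lp_prop = log_posterior(prop, responses, difficulties)
        if random.random() < math.exp(min(0.0, lp_prop - lp)):
            theta, lp = prop, lp_prop               # accept the move
        draws.append(theta)
    return draws

draws = metropolis(responses=[1, 1, 0, 1], difficulties=[-1.0, 0.0, 0.5, 1.0])
burned = draws[1000:]  # discard burn-in
print(sum(burned) / len(burned))  # posterior mean of theta
```

The article's models involve multiple latent variables and conditional dependencies among observations; the single-parameter chain above only illustrates the estimation mechanics.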
Mislevy, Robert J.; Steinberg, Linda S.; Breyer, F. Jay; Almond, Russell G.; Johnson, Lynn – 1998
To function effectively as a learning environment, a simulation system must present learners with situations in which they use relevant knowledge, skills, and abilities. To function effectively as an assessment, such a system must additionally be able to evoke and interpret observable evidence about targeted knowledge in a manner that is…
Descriptors: Cognitive Processes, Cognitive Tests, Computer Assisted Testing, Computer Simulation
Williamson, David M.; Bauer, Malcolm; Steinberg, Linda S.; Mislevy, Robert J.; Behrens, John T. – 2003
In computer-based simulations meant to support learning, students must bring a wide range of relevant knowledge, skills, and abilities to bear jointly as they solve meaningful problems in a learning domain. To function efficiently as an assessment, a simulation system must also be able to evoke and interpret observable evidence about targeted…
Descriptors: College Students, Computer Assisted Testing, Computer Networks, Computer Simulation
Sheehan, Kathleen; Mislevy, Robert J. – 1994
The operating characteristics of 114 mathematics pretest items from the Praxis I: Computer Based Test were analyzed in terms of item attributes and test developers' judgments of item difficulty. Item operating characteristics were defined as the difficulty, discrimination, and asymptote parameters of a three parameter logistic item response theory…
Descriptors: Basic Skills, Computer Assisted Testing, Difficulty Level, Educational Assessment
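
For reference, the three-parameter logistic (3PL) model the abstract names gives the probability that an examinee of ability \(\theta\) answers item \(i\) correctly in terms of the discrimination \(a_i\), difficulty \(b_i\), and lower-asymptote (guessing) parameter \(c_i\):

\[
P_i(\theta) = c_i + (1 - c_i)\,\frac{1}{1 + e^{-D a_i(\theta - b_i)}}
\]

where \(D\) is a scaling constant, conventionally 1.7 (or 1). The study relates item attributes and test developers' difficulty judgments to these estimated parameters.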
