Showing 1 to 15 of 34 results
Peer reviewed
PDF on ERIC: Download full text
Hao, Jiangang; Mislevy, Robert J. – ETS Research Report Series, 2018
Virtual performance assessments (VPAs), such as game- and simulation-based assessments, provide promising ways for assessing complex and integrated skills. However, the high cost, long development cycle, and complex scoring process significantly hinder the adoption of VPAs, particularly in large-scale assessments with tight deadlines and limited…
Descriptors: Performance Based Assessment, Computer Assisted Testing, Test Construction, Evidence
Peer reviewed
Direct link
Oliveri, María Elena; Lawless, René; Mislevy, Robert J. – International Journal of Testing, 2019
Collaborative problem solving (CPS) ranks among the top five most critical skills necessary for college graduates to meet workforce demands (Hart Research Associates, 2015). It is also deemed a critical skill for educational success (Beaver, 2013). It thus deserves more prominence in the suite of courses and subjects assessed in K-16. Such…
Descriptors: Cooperation, Problem Solving, Evidence Based Practice, 21st Century Skills
Peer reviewed
Direct link
Mislevy, Robert J.; Oliveri, Maria Elena – Educational Measurement: Issues and Practice, 2019
In this digital ITEMS module, Dr. Robert [Bob] Mislevy and Dr. Maria Elena Oliveri introduce and illustrate a sociocognitive perspective on educational measurement, which focuses on a variety of design and implementation considerations for creating fair and valid assessments for learners from diverse populations with diverse sociocultural…
Descriptors: Educational Testing, Reliability, Test Validity, Test Reliability
Peer reviewed
Direct link
Mislevy, Robert J. – Journal of Educational Measurement, 2016
Validity is the sine qua non of properties of educational assessment. While a theory of validity and a practical framework for validation have emerged over the past decades, most of the discussion has addressed familiar forms of assessment and psychological framings. Advances in digital technologies and in cognitive and social psychology have…
Descriptors: Test Validity, Technology, Cognitive Psychology, Social Psychology
Peer reviewed
Direct link
Mislevy, Robert J. – Measurement: Interdisciplinary Research and Perspectives, 2013
Measurement is a semantic frame, a constellation of relationships and concepts that correspond to recurring patterns in human activity, highlighting typical roles, processes, and viewpoints (e.g., the "commercial event") but not others. One uses semantic frames to reason about unique and complex situations--sometimes intuitively, sometimes…
Descriptors: Educational Assessment, Measurement, Feedback (Response), Evidence
Peer reviewed
Direct link
Mislevy, Robert J. – Measurement: Interdisciplinary Research and Perspectives, 2010
In "Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century," Bejar and Graf (2010) propose extensions to the duplex design for large-scale assessment presented in Bock and Mislevy (1988). Examining the range of people who use assessment results--from students, teachers, administrators, curriculum designers,…
Descriptors: Measurement, Test Construction, Educational Testing, Data Collection
Peer reviewed
Direct link
Mislevy, Robert J.; Haertel, Geneva; Cheng, Britte H.; Ructtinger, Liliana; DeBarger, Angela; Murray, Elizabeth; Rose, David; Gravel, Jenna; Colker, Alexis M.; Rutstein, Daisy; Vendlinski, Terry – Educational Research and Evaluation, 2013
Standardizing aspects of assessments has long been recognized as a tactic to help make evaluations of examinees fair. It reduces variation in irrelevant aspects of testing procedures that could advantage some examinees and disadvantage others. However, recent attention to making assessment accessible to a more diverse population of students…
Descriptors: Testing Accommodations, Access to Education, Testing, Psychometrics
Peer reviewed
PDF on ERIC: Download full text
Mislevy, Robert J.; Behrens, John T.; Bennett, Randy E.; Demark, Sarah F.; Frezzo, Dennis C.; Levy, Roy; Robinson, Daniel H.; Rutstein, Daisy Wise; Shute, Valerie J.; Stanley, Ken; Winters, Fielding I. – Journal of Technology, Learning, and Assessment, 2010
People use external knowledge representations (KRs) to identify, depict, transform, store, share, and archive information. Learning how to work with KRs is central to becoming proficient in virtually every discipline. As such, KRs play central roles in curriculum, instruction, and assessment. We describe five key roles of KRs in assessment: (1)…
Descriptors: Student Evaluation, Educational Technology, Computer Networks, Knowledge Representation
Peer reviewed
PDF on ERIC: Download full text
Hansen, Eric G.; Mislevy, Robert J. – ETS Research Report Series, 2008
There is a great need to help test designers determine how to make tests that are accessible to individuals with disabilities. This report takes "design patterns", which were developed at SRI for assessment design, and uses them to clarify issues related to accessibility features for individuals with disabilities--such as low-vision and…
Descriptors: Disabilities, Adaptive Testing, Testing Accommodations, Test Construction
Peer reviewed
Direct link
Mislevy, Robert J. – Educational Researcher, 2007
Lissitz and Samuelsen (2007) argue that the unitary conception of validity for educational assessments is too broad to guide applied work. They call for attention to considerations and procedures that focus on "test development and analysis of the test itself" and propose that those activities be collectively termed "content validity." The author…
Descriptors: Content Validity, Test Validity, Test Construction, Student Evaluation
Peer reviewed
Direct link
Mislevy, Robert J. – Research Papers in Education, 2010
An educational assessment embodies an argument from a handful of observations of what students say, do or make in a handful of particular circumstances, to what they know or can do in what kinds of situations more broadly. This article discusses ways in which research into the nature and development of expertise can help assessment designers…
Descriptors: Educational Assessment, Test Construction, Expertise, Research
Peer reviewed
Direct link
Mislevy, Robert J.; Haertel, Geneva D. – Educational Measurement: Issues and Practice, 2006
Evidence-centered assessment design (ECD) provides language, concepts, and knowledge representations for designing and delivering educational assessments, all organized around the evidentiary argument an assessment is meant to embody. This article describes ECD in terms of layers for analyzing domains, laying out arguments, creating schemas for…
Descriptors: Educational Testing, Test Construction, Evaluation Methods, Computer Simulation
Williamson, David M.; Bauer, Malcolm; Mislevy, Robert J.; Behrens, John T. – 2003
The current pace of technological advance has provided an unprecedented opportunity to use innovative simulated tasks in computerized assessment. A primary challenge for the successful use of innovation in assessment rests with the application of sound principles of design to produce a valid assessment. An additional challenge is to maximize the…
Descriptors: Educational Innovation, Research Design, Test Construction
Mislevy, Robert J.; Behrens, John T.; Bennett, Randy E.; Demark, Sarah F.; Frezzo, Dennis C.; Levy, Roy; Robinson, Daniel H.; Rutstein, Daisy Wise; Shute, Valerie J.; Stanley, Ken; Winters, Fielding I. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2007
People use external knowledge representations (EKRs) to identify, depict, transform, store, share, and archive information. Learning how to work with EKRs is central to becoming proficient in virtually every discipline. As such, EKRs play central roles in curriculum, instruction, and assessment. Five key roles of EKRs in educational assessment are…
Descriptors: Educational Assessment, Computer Networks, Test Construction, Computer Assisted Testing
Mislevy, Robert J.; Steinberg, Linda S.; Almond, Russell G. – 1999
Tasks are the most visible element in an educational assessment. Their purpose, however, is to provide evidence about targets of inference that cannot be directly seen at all: what examinees know and can do, more broadly conceived than can be observed in the context of any particular set of tasks. This paper concerns issues in an assessment design…
Descriptors: Educational Assessment, Evaluation Methods, Higher Education, Models