Descriptor
Performance Based Assessment (3)
Performance Tests (3)
Testing Problems (3)
Elementary Education (2)
Multiple Choice Tests (2)
Test Reliability (2)
Test Validity (2)
Computer Assisted Testing (1)
Computer Simulation (1)
Cost Effectiveness (1)
Costs (1)
Source
Educational Leadership (1)
Principal (1)
Publication Type
Reports - Evaluative (3)
Journal Articles (2)
Location
California (1)
Assessments and Surveys
Armed Services Vocational… (1)

Shavelson, Richard J.; Baxter, Gail P. – Educational Leadership, 1992
A recent study compared hands-on scientific inquiry assessment to assessments involving lab notebooks, computer simulations, short-answer paper-and-pencil problems, and multiple-choice questions. Creating high-quality performance assessments is a costly, time-consuming process requiring considerable scientific and technological know-how. Improved…
Descriptors: Computer Simulation, Costs, Elementary Education, Experiential Learning
Jones, Marshall B. – 1991
The microcomputer has increased interest in performance testing, which samples what a person can do rather than what he or she knows. Conventional psychometric theory is based on knowledge tests, but in performance testing the unit of analysis is a trial, and it is unreasonable to assume that mean performance and interim correlations are…
Descriptors: Computer Assisted Testing, Higher Education, Military Personnel, Performance Based Assessment
Bracey, Gerald W. – Principal, 1993
Describes recent efforts of the Center for Research on Evaluation, Standards, and Student Testing (CRESST) to evaluate authentic assessment methods, such as portfolios and performance tests. When comparing the merits of authentic versus multiple-choice testing, it is wise to consider validity, reliability, consequences, fairness, generalization,…
Descriptors: Cost Effectiveness, Efficiency, Elementary Education, Evaluation Criteria