Publication Date

| Period | Records |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 225 |
| Since 2022 (last 5 years) | 1358 |
| Since 2017 (last 10 years) | 2816 |
| Since 2007 (last 20 years) | 4806 |
Descriptor

| Descriptor | Records |
| --- | --- |
| Computer Assisted Testing | 7203 |
| Foreign Countries | 2049 |
| Test Construction | 1110 |
| Student Evaluation | 1062 |
| Evaluation Methods | 1061 |
| Test Items | 1057 |
| Adaptive Testing | 1052 |
| Educational Technology | 904 |
| Comparative Analysis | 835 |
| Scores | 830 |
| Higher Education | 823 |
Audience

| Audience | Records |
| --- | --- |
| Practitioners | 182 |
| Researchers | 146 |
| Teachers | 122 |
| Policymakers | 40 |
| Administrators | 36 |
| Students | 15 |
| Counselors | 9 |
| Parents | 4 |
| Media Staff | 3 |
| Support Staff | 3 |
Location

| Location | Records |
| --- | --- |
| Australia | 169 |
| United Kingdom | 153 |
| Turkey | 126 |
| China | 117 |
| Germany | 108 |
| Canada | 106 |
| Spain | 94 |
| Taiwan | 89 |
| Netherlands | 73 |
| Iran | 71 |
| United States | 68 |
What Works Clearinghouse Rating

| Rating | Records |
| --- | --- |
| Meets WWC Standards without Reservations | 4 |
| Meets WWC Standards with or without Reservations | 4 |
| Does not meet standards | 5 |
Thompson, Bruce; Levitov, Justin E. – Collegiate Microcomputer, 1985
Discusses features of a microcomputer program, SCOREIT, used at Loyola University in New Orleans and at several high schools to score and analyze test results. Benefits and dimensions of the program's automated test and item analysis are outlined, and several examples illustrating test and item analyses by SCOREIT are presented. (MBR)
Descriptors: Computer Assisted Testing, Computer Software, Difficulty Level, Higher Education
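The automated test and item analysis the SCOREIT abstract describes typically reduces to classical item statistics: each item's difficulty (the proportion of examinees answering it correctly) and its discrimination (how well the item separates high- from low-scoring examinees). SCOREIT's own code is not reproduced in the record; the following is a minimal illustrative sketch of those two statistics in Python, using a corrected point-biserial as the discrimination index.

```python
# Illustrative sketch of classical item analysis: item difficulty (proportion
# correct) and a corrected point-biserial discrimination. Not SCOREIT's code.
from statistics import mean, pstdev


def item_analysis(responses):
    """responses: list of per-student lists of 0/1 item scores."""
    n_items = len(responses[0])
    totals = [sum(student) for student in responses]
    results = []
    for i in range(n_items):
        item_scores = [student[i] for student in responses]
        difficulty = mean(item_scores)  # proportion answering the item correctly
        # Corrected point-biserial: correlate the item with the rest of the test.
        rest = [t - s for t, s in zip(totals, item_scores)]
        sd_rest = pstdev(rest)
        if sd_rest == 0 or difficulty in (0.0, 1.0):
            discrimination = 0.0
        else:
            mean_rest_correct = mean(r for r, s in zip(rest, item_scores) if s == 1)
            discrimination = ((mean_rest_correct - mean(rest)) / sd_rest
                              * (difficulty / (1 - difficulty)) ** 0.5)
        results.append({"item": i + 1,
                        "difficulty": round(difficulty, 3),
                        "discrimination": round(discrimination, 3)})
    return results


if __name__ == "__main__":
    scores = [[1, 1, 0, 1], [1, 0, 0, 1], [0, 1, 1, 1], [1, 1, 1, 0], [0, 0, 0, 1]]
    for row in item_analysis(scores):
        print(row)
```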
Hale, Michael E.; And Others – Computers in the Schools, 1985 (Peer reviewed)
Both high school and college students served as subjects for an evaluation of the effectiveness of a computer-animated test of science problem-solving skills designed to minimize problems inherent in printed tests. The test development process is reviewed, and observations and conclusions relating to screen design and human interaction factors are…
Descriptors: Animation, Computer Assisted Testing, Computer Graphics, Display Systems
Jones, Randall L. – Foreign Language Annals, 1984 (Peer reviewed)
Reacts to Michael Canale's paper, "Considerations in the Testing of Reading and Listening Proficiency," concentrating on three areas: (1) the nature of the receptive skills and the requirements of a valid instrument to measure them, (2) the design features that are consistent with his test design principles, and (3) adaptive testing…
Descriptors: Computer Assisted Testing, Evaluation, Language Proficiency, Language Tests
Lowe, Pardee, Jr. – Foreign Language Annals, 1984 (Peer reviewed)
Examines the suggestions found in Michael Canale's paper, "Considerations in the Testing of Reading and Listening Proficiency," in light of a possible U.S. Government Interagency Language Roundtable receptive skills proficiency test, which must answer the question of how well an individual can understand a particular…
Descriptors: Computer Assisted Testing, Criterion Referenced Tests, Language Proficiency, Language Tests
Wyatt, David H. – Foreign Language Annals, 1984 (Peer reviewed)
Describes and assesses what can be achieved in the learning and testing of the receptive language skills with computer hardware now available. Provides guidelines and suggestions for the development of language learning and testing software. Defines three types of computer programs: instructional, collaborative, and facilitative. (SED)
Descriptors: Computer Assisted Instruction, Computer Assisted Testing, Language Tests, Listening Comprehension
Harrison, David; Pitre, John M. – Physics Teacher, 1983 (Peer reviewed)
Describes a computerized method to test error analysis that helps motivate introductory physics students to learn the topic. The computer generates a test consisting of four topics from a list of 10 that students should know. Numerical data within realistic ranges are also generated. (JN)
Descriptors: College Science, Computer Assisted Testing, Computer Oriented Programs, Computer Programs
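The Harrison and Pitre abstract describes a generator that draws a few topics from a fixed list and fills each question with numerical data in realistic ranges. Their original program is not reproduced in the record; the short Python sketch below shows one way such a generator could be structured, with invented topic names, question templates, and value ranges standing in for theirs.

```python
# Hypothetical sketch of a topic-sampling test generator with randomized
# numerical data; topics, templates, and ranges are invented for illustration.
import random

TOPICS = {
    "significant figures": ("Round {x:.5f} to three significant figures.", (0.001, 999.0)),
    "propagation of sums": ("If a = {x:.2f} ± 0.05 and b = {y:.2f} ± 0.03, find the uncertainty in a + b.", (1.0, 50.0)),
    "propagation of products": ("If a = {x:.2f} ± 2% and b = {y:.2f} ± 3%, find the percent uncertainty in a·b.", (1.0, 50.0)),
    "standard deviation": ("Compute the standard deviation of the readings {x:.1f}, {y:.1f}, and {z:.1f}.", (5.0, 20.0)),
}


def generate_test(n_questions=3, seed=None):
    """Pick n_questions topics at random and fill each template with numbers."""
    rng = random.Random(seed)
    chosen = rng.sample(list(TOPICS), k=n_questions)
    questions = []
    for topic in chosen:
        template, (lo, hi) = TOPICS[topic]
        values = {name: rng.uniform(lo, hi) for name in ("x", "y", "z")}
        questions.append((topic, template.format(**values)))
    return questions


if __name__ == "__main__":
    for topic, text in generate_test(seed=42):
        print(f"[{topic}] {text}")
```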
Barclay, James R. – School Psychology Review, 1983 (Peer reviewed)
This article discusses the crisis-coping model of school psychology practice, the problems brought about by this model, and the advantages of a preventive approach. It then describes the Barclay Classroom Assessment System, a multi-trait, multi-method assessment procedure that utilizes rating technology and computer synthesis to implement a…
Descriptors: Clinical Diagnosis, Computer Assisted Testing, Counseling Services, Crisis Intervention
Minnesota Department of Education, 2005
Minnesota Statute 120B.365 established an Assessment Advisory Committee to advise the commissioner on statewide assessment issues. The committee may consist of up to 11 members. The Committee will make recommendations to the commissioner and/or the legislature about issues involving statewide assessment. This report briefly provides a background…
Descriptors: Large Scale Assessment, Advisory Committees, Computer Assisted Testing, Strategic Planning
O'Neil, Harold F.; Chuang, San-hui; Chung, Gregory K. W. K. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2004
Collaborative problem-solving skills are considered necessary skills for success in today's world of work and school. Cooperative learning refers to learning environments in which small groups of people work together to achieve a common goal, and problem solving is defined as "cognitive processing directed at achieving a common goal when no…
Descriptors: Computer Assisted Testing, Cooperative Learning, Problem Solving, Skill Analysis
VanLehn, Kurt – 2001
Olae is a computer system for assessing student knowledge of physics, and Newtonian mechanics in particular, using performance data collected while students solve complex problems. Although originally designed as a stand-alone system, it has also been used as part of the Andes intelligent tutoring system. Like many other performance assessment…
Descriptors: Bayesian Statistics, Computer Assisted Testing, Intelligent Tutoring Systems, Knowledge Level
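Olae's use of Bayesian statistics to assess knowledge from performance data rests on repeatedly updating the probability that a student has mastered a piece of physics knowledge as correct and incorrect steps are observed. Olae itself builds full Bayesian networks over many rules; the sketch below shows only the single-rule Bayes update, and the slip and guess parameters are illustrative assumptions, not Olae's values.

```python
# Minimal sketch of a Bayesian mastery update (Olae's actual networks are far
# richer); p_slip and p_guess below are assumed values for illustration only.
def update_mastery(p_mastery, correct, p_slip=0.10, p_guess=0.20):
    """Posterior P(mastered | observation) via Bayes' rule."""
    if correct:
        likelihood_m = 1 - p_slip      # P(correct | mastered)
        likelihood_u = p_guess         # P(correct | not mastered)
    else:
        likelihood_m = p_slip
        likelihood_u = 1 - p_guess
    numerator = likelihood_m * p_mastery
    return numerator / (numerator + likelihood_u * (1 - p_mastery))


if __name__ == "__main__":
    p = 0.5  # prior belief that the student knows the targeted rule
    for obs in (True, True, False, True):
        p = update_mastery(p, obs)
        print(f"observed {'correct' if obs else 'error':7s} -> P(mastery) = {p:.3f}")
```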
Williamson, David M.; Bauer, Malcom; Steinberg, Linda S.; Mislevy, Robert J.; Behrens, John T. – 2003
In computer-based simulations meant to support learning, students must bring a wide range of relevant knowledge, skills, and abilities to bear jointly as they solve meaningful problems in a learning domain. To function efficiently as an assessment, a simulation system must also be able to evoke and interpret observable evidence about targeted…
Descriptors: College Students, Computer Assisted Testing, Computer Networks, Computer Simulation
Bashford, Joanne – 1998
This information capsule explores the effectiveness of score ranges on the Computerized Placement Test (CPT), used to assess the skills of entry-level students at Miami-Dade Community College and place first-time-in-college students in classes. Data are provided for students entering in Fall terms 1996 and 1997 showing the number of students…
Descriptors: Community Colleges, Computer Assisted Testing, Developmental Studies Programs, Evaluation
Bishop, Thomas D. – Journal of Computers in Mathematics and Science Teaching, 1983 (Peer reviewed)
Describes a computer-assisted instruction program designed to help teachers develop and maintain continuously updated diagnostic records of student achievement. Gives hardware requirements and describes the program's eight components. These include deleting/changing records and lists, a testing manager, list producer, and student record reviewer.…
Descriptors: Computer Assisted Instruction, Computer Assisted Testing, Computer Programs, Elementary Education
Lansman, Marcy; And Others – Intelligence, 1982 (Peer reviewed)
Several measures of the speed of information processing were related to ability factors derived from the Cattell-Horn theory of fluid and crystallized intelligence. Correlations among the ability measures, among the information processing measures, and between the two domains were analyzed using confirmatory factor analysis. (Author/PN)
Descriptors: Cognitive Ability, Cognitive Processes, Computer Assisted Testing, Factor Analysis
Edwards, John S. – Journal of Research and Development in Education, 1980 (Peer reviewed)
The most common functions of computer-assisted testing are item-banking, in which test items are collected and stored; test-construction, specifying item attributes and determining information required for identification of the test; and test scoring. (JN)
Descriptors: Computer Assisted Instruction, Computer Assisted Testing, Computer Oriented Programs, Computer Science
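Edwards's three common functions can be pictured as a small data structure plus two operations: an item bank that collects and stores items with their attributes, a test-construction step that selects items by those attributes, and a scoring step that compares responses against keyed answers. The Python sketch below is a generic illustration of that division of labor, not a description of any system Edwards surveyed; all field and function names are invented.

```python
# Illustrative sketch of item banking, test construction, and test scoring.
# Field and function names are assumptions made for this example.
from dataclasses import dataclass
import random


@dataclass
class Item:
    item_id: int
    text: str
    answer: str
    topic: str
    difficulty: float  # proportion expected to answer correctly


class ItemBank:
    def __init__(self):
        self._items = []

    def add(self, item):
        """Item banking: collect and store items with their attributes."""
        self._items.append(item)

    def construct_test(self, topic, n, max_difficulty=1.0, seed=None):
        """Test construction: select n items matching the requested attributes."""
        pool = [i for i in self._items
                if i.topic == topic and i.difficulty <= max_difficulty]
        return random.Random(seed).sample(pool, k=min(n, len(pool)))


def score_test(test_items, responses):
    """Test scoring: count responses that match the keyed answers."""
    return sum(1 for item, resp in zip(test_items, responses)
               if resp == item.answer)


if __name__ == "__main__":
    bank = ItemBank()
    bank.add(Item(1, "2 + 2 = ?", "4", "arithmetic", 0.9))
    bank.add(Item(2, "7 * 8 = ?", "56", "arithmetic", 0.7))
    test = bank.construct_test("arithmetic", n=2, seed=1)
    print("score:", score_test(test, ["4", "55"]))
```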


