Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 0 |
Since 2016 (last 10 years) | 0 |
Since 2006 (last 20 years) | 1 |
Descriptor
Comparative Testing | 11 |
Multiple Choice Tests | 11 |
Scores | 11 |
Higher Education | 8 |
Test Items | 6 |
Computer Assisted Testing | 5 |
Test Format | 5 |
College Students | 4 |
Difficulty Level | 4 |
Test Construction | 4 |
Achievement Tests | 3 |
Source
Journal of Educational Measurement | 2 |
Applied Measurement in Education | 1 |
British Journal of Educational Technology | 1 |
Educational Measurement: Issues and Practice | 1 |
Evaluation and the Health Professions | 1 |
Inquiry | 1 |
Publication Type
Reports - Research | 9 |
Journal Articles | 7 |
Reports - Evaluative | 2 |
Speeches/Meeting Papers | 2 |
Numerical/Quantitative Data | 1 |
Tests/Questionnaires | 1 |
Education Level
Higher Education | 2 |
Elementary Education | 1 |
Elementary Secondary Education | 1 |
Grade 5 | 1 |
Postsecondary Education | 1 |
Two Year Colleges | 1 |
Audience
Researchers | 2 |
Location
Connecticut | 1 |
Virginia | 1 |
Assessments and Surveys
Graduate Record Examinations | 1 |
Park, Jooyong – British Journal of Educational Technology, 2010
The newly developed computerized Constructive Multiple-choice Testing system is introduced. The system combines short answer (SA) and multiple-choice (MC) formats by asking examinees to respond to the same question twice, first in the SA format, and then in the MC format. This manipulation was employed to collect information about the two…
Descriptors: Grade 5, Evaluation Methods, Multiple Choice Tests, Scores
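
The two-pass item flow the abstract describes (free recall first, recognition second on the same question) can be sketched briefly. The sketch below is a hypothetical illustration, not Park's system; the function and data names are invented.

```python
# Hypothetical sketch of a two-pass item: short-answer (SA) first, then
# multiple-choice (MC) on the same question. Not Park's implementation.

def administer_item(stem, options, answer_key, ask=input):
    """Collect an SA response, then an MC response, for one question."""
    sa_response = ask(f"{stem}\nType your answer: ")                 # recall pass
    numbered = "\n".join(f"{i + 1}. {opt}" for i, opt in enumerate(options))
    mc_choice = int(ask(f"{stem}\n{numbered}\nPick a number: "))     # recognition pass
    return {
        "sa_response": sa_response,
        "sa_correct": sa_response.strip().lower() == answer_key.lower(),
        "mc_response": options[mc_choice - 1],
        "mc_correct": options[mc_choice - 1].lower() == answer_key.lower(),
    }

if __name__ == "__main__":
    print(administer_item(
        "Which organ pumps blood through the body?",
        ["Liver", "Heart", "Lung", "Kidney"],
        answer_key="Heart",
    ))
```

Recording both passes for the same question is what allows the two response formats to be compared item by item.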

Frary, Robert B. – Applied Measurement in Education, 1991
The use of the "none-of-the-above" option (NOTA) in 20 college-level multiple-choice tests was evaluated for classes with 100 or more students. Eight academic disciplines were represented, and 295 NOTA and 724 regular test items were used. It appears that the NOTA can be compatible with good classroom measurement. (TJH)
Descriptors: College Students, Comparative Testing, Difficulty Level, Discriminant Analysis
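
Comparisons of NOTA and regular items typically rest on classical item statistics. The sketch below, using invented data, computes item difficulty (proportion correct) and point-biserial discrimination so the two item types could be set side by side; it is an illustration, not Frary's analysis.

```python
# Classical item statistics on a toy response matrix; all data are made up.
import statistics

def difficulty(item_scores):
    """Proportion of examinees answering the item correctly."""
    return sum(item_scores) / len(item_scores)

def point_biserial(item_scores, total_scores):
    """Correlation between a 0/1 item score and the examinee's total score."""
    correct = [t for s, t in zip(item_scores, total_scores) if s == 1]
    wrong = [t for s, t in zip(item_scores, total_scores) if s == 0]
    p = len(correct) / len(item_scores)
    sd = statistics.pstdev(total_scores)
    return (statistics.mean(correct) - statistics.mean(wrong)) / sd * (p * (1 - p)) ** 0.5

# rows = examinees, columns = items (1 = correct)
responses = [[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1], [0, 0, 1]]
totals = [sum(row) for row in responses]
for j, label in enumerate(["NOTA item", "regular item", "regular item"]):
    scores = [row[j] for row in responses]
    print(label, "difficulty:", difficulty(scores),
          "discrimination:", round(point_biserial(scores, totals), 2))
```
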
Sawyer, Richard; Welch, Catherine – 1990
The frequency of multiple testing on the Proficiency Examination Program (PEP) multiple-choice tests, the characteristics of examinees who retest, and the effects of retesting on test scores were examined. Tests in the PEP program cover a broad range of academic disciplines and generally include material covered in one or two semesters of an…
Descriptors: Achievement Gains, Achievement Tests, College Students, Comparative Testing
Laird, Barbara B. – Inquiry, 2003
Laird examines the relationship between two computerized nursing tests and finds that the two sets of scores are related. (Contains 2 tables.)
Descriptors: Nursing Education, Nurses, Computer Assisted Testing, Comparative Testing

Harasym, P. H.; And Others – Evaluation and the Health Professions, 1980
Coded items, as opposed to free-response items, in a multiple-choice physiology test had a cueing effect that raised students' scores, especially for lower achievers. Reliability of the coded items was also lower. Item format and scoring method had an effect on test results. (GDC)
Descriptors: Achievement Tests, Comparative Testing, Cues, Higher Education
Chissom, Brad; Chukabarah, Prince C. O. – 1985
The comparative effects of various sequences of test items were examined for over 900 graduate students enrolled in an educational research course at The University of Alabama, Tuscaloosa. The experiment, which was conducted a total of four times using four separate tests, presented three different arrangements of 50 multiple-choice items: (1)…
Descriptors: Analysis of Variance, Comparative Testing, Difficulty Level, Graduate Students

Bridgeman, Brent – Journal of Educational Measurement, 1992
Examinees in a regular administration of the quantitative portion of the Graduate Record Examination responded to particular items in a machine-scannable multiple-choice format. Volunteers (n=364) used a computer to answer open-ended counterparts of these items. Scores for both formats demonstrated similar correlational patterns. (SLD)
Descriptors: Answer Sheets, College Entrance Examinations, College Students, Comparative Testing
National Evaluation Systems, Inc., Amherst, MA. – 1981
The Connecticut Assessment of Educational Progress (CAEP) is a continuing program designed to measure the efficiency of educational programs offered by public schools. During 1979-80, grades 4, 8, and 11 were tested in science. The assessment goals were: to provide information on the quality of education in Connecticut; to collect and compare data…
Descriptors: Academic Achievement, Comparative Testing, Educational Assessment, Elementary Secondary Education
Anderson, Paul S.; Kanzler, Eileen M. – 1985
Test scores were compared for two types of objective achievement tests--multiple choice tests and the recently developed Multi-Digit Test (MDT) procedure. MDT is an approximation of the fill-in-the-blank technique. Students select their answers from long lists of alphabetized terms, with each answer corresponding to a number from 001 to 999. The…
Descriptors: Achievement Tests, Cloze Procedure, Comparative Testing, Computer Assisted Testing
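
A toy rendering of the Multi-Digit Test idea as the abstract sketches it: each term in a long alphabetized answer list carries a three-digit code, and scoring compares the codes a student records against the key. Terms, codes, and names below are invented.

```python
# Hypothetical Multi-Digit Test (MDT) lookup and scoring; not the authors' system.

answer_list = sorted(["mitochondrion", "ribosome", "nucleus", "chloroplast"])
code_for = {term: f"{i + 1:03d}" for i, term in enumerate(answer_list)}   # 001..999

def score_mdt(responses, key):
    """Count items where the student's recorded code matches the keyed code."""
    return sum(1 for given, correct in zip(responses, key) if given == correct)

key = [code_for["nucleus"], code_for["ribosome"]]           # keyed codes per item
student = [code_for["nucleus"], code_for["mitochondrion"]]  # codes the student recorded
print("codes:", code_for)
print("score:", score_mdt(student, key), "of", len(key))
```
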

Wise, Steven L.; And Others – Journal of Educational Measurement, 1992
Performance of 156 undergraduate and 48 graduate students on a self-adapted test (SFAT)--students choose the difficulty level of their test items--was compared with performance on a computer-adapted test (CAT). Those taking the SFAT obtained higher ability scores and reported lower posttest state anxiety than did CAT takers. (SLD)
Descriptors: Adaptive Testing, Comparative Testing, Computer Assisted Testing, Difficulty Level
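
The contrast between the two item-selection modes is easy to show in outline. The sketch below is a rough, hypothetical rendering with a made-up item pool and crude difficulty targeting; it is neither Wise's SFAT nor an operational CAT algorithm.

```python
# Self-adapted vs. computer-adaptive item selection, in caricature.
import random

# item pool keyed by difficulty level 1 (easiest) to 5 (hardest); names invented
item_pool = {d: [f"item_{d}_{i}" for i in range(5)] for d in range(1, 6)}

def self_adapted_next(choose_difficulty):
    """Self-adapted test: the examinee picks the difficulty of the next item."""
    return random.choice(item_pool[choose_difficulty()])

def cat_next(ability_estimate):
    """Computer-adaptive test: the algorithm picks the level nearest the estimate."""
    level = min(item_pool, key=lambda d: abs(d - ability_estimate))
    return random.choice(item_pool[level])

print(self_adapted_next(lambda: 2))   # examinee asks for an easier item
print(cat_next(3.7))                  # CAT targets the current ability estimate
```
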

Armstrong, Anne-Marie – Educational Measurement: Issues and Practice, 1993
The effects on test performance of differentially written multiple-choice tests and of test takers' cognitive style were studied for 47 graduate students and 35 public school and college teachers. Adhering to item-writing guidelines resulted in mean scores that were essentially the same for the two groups of differing cognitive style. (SLD)
Descriptors: Cognitive Style, College Faculty, Comparative Testing, Graduate Students