Arnau, Randolph C.; Thompson, Russel L.; Cook, Colleen – Educational and Psychological Measurement, 2001
Used two different coherent cut kinetics taxometric procedures to examine the latent structure of responses to a survey of library service quality using an unnumbered slider-bar user interface and a radio-button user interface. Results for 354 college students show that both interfaces yield similar latent structures of survey item responses. (SLD)
Descriptors: College Students, Higher Education, Responses, Surveys

Sanjivamurthy, P.T.; Kumar, V.K. – Contemporary Educational Psychology, 1983
After six weeks of testing college algebra students (n=84) either on recall or recognition tests, the test modes were changed without warning. Results showed that performance suffered when the test mode was changed for students anticipating a recognition test. Students anticipating a recall test did equally well in both test modes. (Author/PN)
Descriptors: Algebra, Higher Education, Long Term Memory, Recall (Psychology)

Schuldberg, David – 1988
Indices were constructed to measure individual differences in the effects of the automated testing format and repeated testing on Minnesota Multiphasic Personality Inventory (MMPI) responses. Two types of instability measures were studied within a data set from the responses of 150 undergraduate students who took a computer-administered and…
Descriptors: College Students, Computer Assisted Testing, Higher Education, Individual Differences

Huntley, Renee M.; Plake, Barbara S. – 1981
This investigation of the effect of making a set of alternatives conform grammatically to a test item stem showed that examinees are sensitive to such cues. Content-free versions of American College Testing Assessment Experimental Social Science items representing singular-plural and vowel-consonant agreement without inappropriate grammatical…
Descriptors: Cues, Grammatical Acceptability, Higher Education, Response Style (Tests)

Aiken, Lewis R. – Educational and Psychological Measurement, 1983
Each of six forms of a 10-item teacher evaluation rating scale, having two to seven response categories per form, was administered to over 100 college students. Means of item responses and item variances increased with the number of response categories. Internal consistency of total scores did not change systematically. (Author/PN)
Descriptors: College Students, Higher Education, Item Analysis, Rating Scales

Tollefson, Nona; Tripp, Alice – 1983
This study compared the item difficulty and item discrimination of three multiple-choice item formats: a complex alternative ("none of the above") as the correct answer; a complex alternative as a foil; and the one-correct-answer format. One hundred four graduate students were randomly assigned to complete…
Descriptors: Analysis of Variance, Difficulty Level, Graduate Students, Higher Education

Ory, John C. – Educational and Psychological Measurement, 1982
In two studies, selections of evaluation form items were negatively worded and presented before or after overall student ratings. Ratings of courses and instructors were not significantly affected by wording. Differences in the global assessment of the courses are discussed. (Author/CM)
Descriptors: Course Evaluation, Evaluation Methods, Higher Education, Item Analysis

Marshall, Thomas E.; And Others – Journal of Educational Technology Systems, 1996
Examines the strategies used in answering a computerized multiple-choice test where all questions on a semantic topic were grouped together or randomly distributed. Findings indicate that students grouped by performance on the test used different strategies in completing the test due to distinct cognitive processes between the groups. (AEF)
Descriptors: Academic Achievement, Cognitive Processes, Computer Assisted Testing, Higher Education

Enger, John M.; And Others – 1993
The response rates of university graduates and the cost per return were studied for a 20-item questionnaire presented in 3 formats as follows: (1) a 2-page questionnaire with an accompanying self-addressed stamped envelope; (2) a condensed format with smaller type, on 1 page, with a self-addressed stamped envelope; and (3) the single-page…
Descriptors: Attitude Measures, College Graduates, Comparative Testing, Cost Effectiveness

Blumberg, Phyllis – Evaluation and the Health Professions, 1981
The utility of general or select response formats for evaluating certain types of clinical competence is studied. Consideration of the suitability of an examination format to fulfill its intended purpose and the appropriateness of the questions included is recommended when designing an examination. (Author/AL)
Descriptors: Certification, Clinical Diagnosis, Clinical Experience, Higher Education

Parshall, Cynthia G.; Stewart, Rob; Ritter, Judy – 1996
While computer-based tests might be as simple as computerized versions of paper-and-pencil examinations, more innovative applications also exist. Examples of innovations in computer-based assessment include the use of graphics or sound, some measure of interactivity, a change in the means by which examinees respond to items, and the application…
Descriptors: College Students, Computer Assisted Testing, Educational Innovation, Graphic Arts

Holley, Charles D.; And Others – Contemporary Educational Psychology, 1979
College students were trained on a hierarchical mapping technique designed to facilitate prose processing. The students studied a geology passage and five days later were given four types of tests. The treatment group significantly outperformed a control group; the major differences were attributable to concept cloze and essay exams. (Author/RD)
Descriptors: Cloze Procedure, Educational Testing, Essay Tests, Higher Education

Leitner, Dennis W.; And Others – 1979
To discover factors which contribute to a high response rate for questionnaire surveys, the preferences of 150 college teachers and teaching assistants were studied. Four different questionnaire formats using 34 common items were sent to the subjects: open-ended; Likert-type (five points, from "strong influence to return," to…
Descriptors: Check Lists, College Faculty, Comparative Testing, Higher Education

Seitz, John L. – PS: Political Science and Politics, 1996
Outlines a comprehensive political science final examination that covers the wide range of material found in introductory courses. The final examination consists of a single question that asks students to propose a solution to a major social problem. Their answer must incorporate political ideology, economic impact, and media representation. (MJP)
Descriptors: Grades (Scholastic), Grading, Higher Education, Instructional Innovation