Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 0
Since 2006 (last 20 years): 1
Descriptor
Comparative Testing: 6
Test Construction: 6
Test Reliability: 4
Higher Education: 3
Test Format: 3
Test Items: 3
Test Validity: 3
College Students: 2
Construct Validity: 2
Distractors (Tests): 2
Factor Analysis: 2
Source
Educational and Psychological Measurement: 6
Author
Caruso, Grace-Ann L.: 1
Carver, Ronald P.: 1
Crehan, Kevin D.: 1
Huynh, Huynh: 1
Kim, Do-Hong: 1
Schriesheim, Chester A.: 1
Trevisan, Michael S.: 1
Publication Type
Journal Articles: 6
Reports - Research: 5
Reports - Evaluative: 2
Speeches/Meeting Papers: 1
Education Level
High Schools: 1
Middle Schools: 1
Kim, Do-Hong; Huynh, Huynh – Educational and Psychological Measurement, 2008
The current study compared student performance between paper-and-pencil testing (PPT) and computer-based testing (CBT) on a large-scale statewide end-of-course English examination. Analyses were conducted at both the item and test levels. The overall results suggest that scores obtained from PPT and CBT were comparable. However, at the content…
Descriptors: Reading Comprehension, Computer Assisted Testing, Factor Analysis, Comparative Testing
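A minimal sketch, in Python, of the kind of test- and item-level comparability check described above, under stated assumptions: the response matrices are synthetic placeholders, the 40-item length and the 0.10 flagging threshold are arbitrary, and nothing here reproduces the study's actual analyses.

# Minimal sketch of a test- and item-level comparability check between
# paper-and-pencil (PPT) and computer-based (CBT) administrations.
# The score matrices below are synthetic placeholders, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
ppt = rng.binomial(1, 0.65, size=(500, 40))   # 500 examinees x 40 items, hypothetical
cbt = rng.binomial(1, 0.64, size=(500, 40))

# Test level: total-score means and a standardized mean difference (Cohen's d).
ppt_total, cbt_total = ppt.sum(axis=1), cbt.sum(axis=1)
pooled_sd = np.sqrt((ppt_total.var(ddof=1) + cbt_total.var(ddof=1)) / 2)
cohens_d = (ppt_total.mean() - cbt_total.mean()) / pooled_sd

# Item level: classical difficulty (proportion correct) per item and mode.
p_ppt, p_cbt = ppt.mean(axis=0), cbt.mean(axis=0)
flagged = np.where(np.abs(p_ppt - p_cbt) > 0.10)[0]  # arbitrary 0.10 screening threshold

print(f"Test-level effect size d = {cohens_d:.3f}")
print(f"Items with difficulty gap > .10: {flagged.tolist()}")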

Caruso, Grace-Ann L. – Educational and Psychological Measurement, 1992
Results from a pilot group of 22 parents of preschoolers and 96 parents of toddlers suggest that three measures developed to assess the supportive relationship between parents and child caregivers--the Caregiver Support Appraisal Scale-V, the Caregiver Support Appraisal Scale-P, and the Caregiver Supportive Behavior Scale--are valid and reliable.…
Descriptors: Child Caregivers, Comparative Testing, Construct Validity, Factor Analysis

Crehan, Kevin D.; And Others – Educational and Psychological Measurement, 1993
Studies with 220 college students found that multiple-choice test items with three options are more difficult than those with four options, and that items offering a none-of-these option are more difficult than those without it. Neither format manipulation affected item discrimination. Implications for test construction are discussed. (SLD)
Descriptors: College Students, Comparative Testing, Difficulty Level, Distractors (Tests)
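A minimal sketch of the classical item statistics underlying such comparisons: difficulty as the proportion correct and discrimination as a corrected item-total correlation. The 220 x 30 scored response matrix is a hypothetical placeholder; none of the values come from the study.

# Minimal sketch of classical item difficulty and point-biserial discrimination.
# The response matrix is a synthetic placeholder, not data from the study.
import numpy as np

rng = np.random.default_rng(1)
responses = rng.binomial(1, 0.6, size=(220, 30))  # 220 examinees x 30 scored items, hypothetical

difficulty = responses.mean(axis=0)               # proportion answering each item correctly
total = responses.sum(axis=1)

def point_biserial(item_scores, total_scores):
    # Correlation of an item with the total score after removing that item.
    rest = total_scores - item_scores             # corrected (item-excluded) total
    return np.corrcoef(item_scores, rest)[0, 1]

discrimination = np.array([point_biserial(responses[:, j], total)
                           for j in range(responses.shape[1])])
print(difficulty.round(2))
print(discrimination.round(2))

Using the corrected (item-excluded) total is a common design choice: it keeps the item's own score from inflating its apparent discrimination, which matters most on short tests.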

Schriesheim, Chester A.; And Others – Educational and Psychological Measurement, 1991
Effects of item wording on questionnaire reliability and validity were studied, using 280 undergraduate business students who completed a questionnaire comprising 4 item types: (1) regular; (2) polar opposite; (3) negated polar opposite; and (4) negated regular. Implications of results favoring regular and negated regular items are discussed. (SLD)
Descriptors: Business Education, Comparative Testing, Higher Education, Negative Forms (Language)
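Comparing regular and negated wordings presupposes that negated items are reverse-scored before scale scores or reliability coefficients are computed. A minimal sketch of that rescoring step, assuming a hypothetical 5-point response matrix and arbitrary positions for the negated items:

# Minimal sketch of reverse-scoring negated (reverse-worded) Likert items
# before computing scale scores. The 5-point scale, item positions, and
# response matrix are hypothetical, not the study's data.
import numpy as np

rng = np.random.default_rng(3)
responses = rng.integers(1, 6, size=(280, 8))    # 280 respondents x 8 items, 1-5 scale
negated_items = [1, 3, 5, 7]                     # hypothetical positions of negated items

rescored = responses.copy()
rescored[:, negated_items] = 6 - rescored[:, negated_items]   # reverse: 1<->5, 2<->4

scale_scores = rescored.mean(axis=1)
print(scale_scores[:5].round(2))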

Carver, Ronald P. – Educational and Psychological Measurement, 1992
Reliability and validity of a new measure of cognitive speed, the Speed of Thinking Test (SST), were investigated with 129 college students, who also completed a vocabulary test, a test of reading speed, and a test of reading comprehension. The SST appears to be a reliable and valid measure. (SLD)
Descriptors: Cognitive Ability, Cognitive Tests, College Students, Comparative Testing

Trevisan, Michael S.; And Others – Educational and Psychological Measurement, 1991
The reliability and validity of multiple-choice tests were computed as a function of the number of options per item and student ability for 435 parochial high school juniors, who were administered the Washington Pre-College Test Battery. Results suggest the efficacy of the three-option item. (SLD)
Descriptors: Ability, Comparative Testing, Distractors (Tests), Grade Point Average
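A minimal sketch of the internal-consistency index typically reported in option-count studies: Cronbach's alpha, which equals KR-20 for dichotomously scored items. The 435 x 50 response matrix below is synthetic and is not the Washington Pre-College data.

# Minimal sketch of Cronbach's alpha on a synthetic dichotomous response matrix.
import numpy as np

rng = np.random.default_rng(2)
responses = rng.binomial(1, 0.7, size=(435, 50))  # 435 examinees x 50 items, hypothetical

def cronbach_alpha(x):
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

print(f"alpha = {cronbach_alpha(responses):.3f}")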