Showing all 14 results
Peer reviewed
Slepkov, Aaron D.; Shiell, Ralph C. – Physical Review Special Topics - Physics Education Research, 2014
Constructed-response (CR) questions are a mainstay of introductory physics textbooks and exams. However, because of the time, cost, and scoring reliability constraints associated with this format, CR questions are being increasingly replaced by multiple-choice (MC) questions in formal exams. The integrated testlet (IT) is a recently developed…
Descriptors: Science Tests, Physics, Responses, Multiple Choice Tests
Peer reviewed
Park, Jooyong – British Journal of Educational Technology, 2010
The newly developed computerized Constructive Multiple-choice Testing system is introduced. The system combines short answer (SA) and multiple-choice (MC) formats by asking examinees to respond to the same question twice, first in the SA format, and then in the MC format. This manipulation was employed to collect information about the two…
Descriptors: Grade 5, Evaluation Methods, Multiple Choice Tests, Scores
Peer reviewed
Vuk, Jasna; Morse, David T. – Research in the Schools, 2009
In this study we observed college students' behavior on two self-tailored, multiple-choice exams. Self-tailoring was defined as an option to omit up to five items from being scored on an exam. Participants, 80 undergraduate college students enrolled in two sections of an educational psychology course, statistically significantly improved their…
Descriptors: College Students, Educational Psychology, Academic Achievement, Correlation
Thiede, Keith W.; And Others – 1991
A correlational analysis was performed to examine the relationship between recognition and recall test formats. A total of 236 college students completed one of four 80-item general knowledge tests; the forms contained 20 items of each of four formats: (1) true; (2) false; (3) multiple-choice; and (4) free response. Ninety-three of the subjects…
Descriptors: Cognitive Processes, College Students, Comparative Testing, Correlation
Peer reviewed
Frary, Robert B. – Applied Measurement in Education, 1991
The use of the "none-of-the-above" option (NOTA) in 20 college-level multiple-choice tests was evaluated for classes with 100 or more students. Eight academic disciplines were represented, and 295 NOTA and 724 regular test items were used. It appears that the NOTA can be compatible with good classroom measurement. (TJH)
Descriptors: College Students, Comparative Testing, Difficulty Level, Discriminant Analysis
Sawyer, Richard; Welch, Catherine – 1990
The frequency of multiple testing on the Proficiency Examination Program (PEP) multiple-choice tests, the characteristics of examinees who retest, and the effects of retesting on test scores were examined. Tests in the PEP program cover a broad range of academic disciplines and generally include material covered in one or two semesters of an…
Descriptors: Achievement Gains, Achievement Tests, College Students, Comparative Testing
Peer reviewed
Bennett, Randy Elliot; And Others – Applied Psychological Measurement, 1990
The relationship of an expert-system-scored constrained free-response item type to multiple-choice and free-response items was studied using data for 614 students on the College Board's Advanced Placement Computer Science (APCS) Examination. Implications for testing and the APCS test are discussed. (SLD)
Descriptors: College Students, Comparative Testing, Computer Assisted Testing, Computer Science
Peer reviewed
Crehan, Kevin D.; And Others – Educational and Psychological Measurement, 1993
Studies with 220 college students found that multiple-choice test items with three options are more difficult than those with four options, and items with the none-of-these option are more difficult than those without it. Neither format manipulation affected item discrimination. Implications for test construction are discussed. (SLD)
Descriptors: College Students, Comparative Testing, Difficulty Level, Distractors (Tests)
Peer reviewed
Bridgeman, Brent; Rock, Donald A. – Journal of Educational Measurement, 1993
Exploratory and confirmatory factor analyses were used to explore relationships among existing item types and three new computer-administered item types for the analytical scale of the Graduate Record Examination General Test. Results with 349 students indicate constructs the item types are measuring. (SLD)
Descriptors: College Entrance Examinations, College Students, Comparative Testing, Computer Assisted Testing
Anderson, Paul S.; Hyers, Albert D. – 1991
Three descriptive statistics (difficulty, discrimination, and reliability) of multiple-choice (MC) test items were compared to those of a new (1980s) format of machine-scored questions. The new method, answer-bank multi-digit testing (MDT), uses alphabetized lists of up to 1,000 alternatives and approximates the completion style of assessment…
Descriptors: College Students, Comparative Testing, Computer Assisted Testing, Correlation
Yachimowicz, David J.; And Others – 1990
The psychometric properties of a paper-and-pencil instrument for assessing individual differences in cerebral dominance are explored. The instrument, Your Style of Learning and Thinking (SOLAT), contains 50 multiple-choice questions. The study subjects consisted of three groups: 235 undergraduate and graduate students, 124 undergraduate and…
Descriptors: Adults, Brain Hemisphere Functions, College Students, Comparative Testing
Pike, Gary – 1989
Responses to American College Test College Outcome Measures Program (ACT-COMP) items by 481 black and 9,237 white students at the University of Tennessee (Knoxville) were analyzed using F. Samejima's graded model to determine the level of differential item functioning (DIF). Students had been tested using Form 8 of the ACT-COMP objective test…
Descriptors: Black Students, College Entrance Examinations, College Students, Comparative Testing
Peer reviewed
Bridgeman, Brent – Journal of Educational Measurement, 1992
Examinees in a regular administration of the quantitative portion of the Graduate Record Examination responded to particular items in a machine-scannable multiple-choice format. Volunteers (n=364) used a computer to answer open-ended counterparts of these items. Scores for both formats demonstrated similar correlational patterns. (SLD)
Descriptors: Answer Sheets, College Entrance Examinations, College Students, Comparative Testing
Hyers, Albert D.; Anderson, Paul S. – 1991
Using matched pairs of geography questions, a new testing method for machine-scored fill-in-the-blank, multiple-digit testing (MDT) questions was compared to the traditional multiple-choice (MC) style. Data came from 118 matched (parallel) test items across 4 tests administered to 764 college geography students. The new method produced superior results when…
Descriptors: College Students, Comparative Testing, Computer Assisted Testing, Difficulty Level