Showing all 13 results
Peer reviewed
Tasdemir, Mehmet – Journal of Instructional Psychology, 2010
This study compares the difficulty levels, discrimination power, and achievement-testing power of multiple-choice and true-false tests, in order to test the commonly held hypothesis that multiple-choice tests do not share the same properties as true-false tests. The research was performed with…
Descriptors: Achievement Tests, Multiple Choice Tests, Objective Tests, Student Evaluation
National Assessment Governing Board, 2012
Since 1973, the National Assessment of Educational Progress (NAEP) has gathered information about student achievement in mathematics. Results of these periodic assessments, produced in print and web-based formats, provide valuable information to a wide variety of audiences. They inform citizens about the nature of students' comprehension of the…
Descriptors: Academic Achievement, Mathematics Achievement, National Competency Tests, Grade 4
Peer reviewed
Hsu, Louis M. – Educational and Psychological Measurement, 1980
Relative difficulty of Separate (Form S) and Grouped (Form G) True-False tests may be expected to be dependent on the ability levels of examinees. At some levels Form S should be less difficult, at others equally difficult, and at still others, more difficult, than Form G. (Author/RL)
Descriptors: Academic Ability, Cluster Grouping, Difficulty Level, Knowledge Level
Ebel, Robert L. – 1981
An alternate-choice test item is a simple declarative sentence, one portion of which is given with two different wordings. For example, "Foundations like Ford and Carnegie tend to be (1) eager (2) hesitant to support innovative solutions to educational problems." The examinee's task is to choose the alternative that makes the sentence…
Descriptors: Comparative Testing, Difficulty Level, Guessing (Tests), Multiple Choice Tests
Perrin, David W.; Kerasotes, Dean L. – 1979
It was hypothesized that using asterisks as attention-focusing devices would cause students to read all asterisked test items more carefully and would improve test scores of undergraduate education students. Sixty-three undergraduates majoring in elementary or special education were administered a 36-item objective test. Asterisks were used to…
Descriptors: Difficulty Level, Higher Education, Objective Tests, Response Style (Tests)
Peer reviewed
McMorris, Robert F.; And Others – Journal of Educational Measurement, 1987
Consistency of gain from changing test answers was tested for students instructed about answer-changing research results, and composition of the gain was analyzed by examining the students' reasons for changing. Mean gain remained positive and consistent with gain for previously studied uninstructed groups; amount of change was also stable.…
Descriptors: Difficulty Level, Graduate Students, Higher Education, Instruction
Dodds, Jeffrey – 1999
Basic precepts for test development are described and explained as they are presented in measurement textbooks commonly used in the fields of education and psychology. The five building blocks discussed as the foundation of well-constructed tests are: (1) specification of purpose; (2) standard conditions; (3) consistency; (4) validity; and (5)…
Descriptors: Difficulty Level, Educational Research, Grading, Higher Education
Peer reviewed
Tsai, Fu-Ju; Suen, Hoi K. – Educational and Psychological Measurement, 1993
Six methods of scoring multiple true-false items were compared in terms of reliabilities, difficulties, and discrimination. Results suggest that, for norm-referenced score interpretations, there is insufficient evidence to support any one of the methods as superior. For criterion-referenced score interpretations, effects of scoring method must be…
Descriptors: Comparative Analysis, Criterion Referenced Tests, Difficulty Level, Guessing (Tests)
Peer reviewed
Frisbie, David A.; Sweeney, Daryl C. – Journal of Educational Measurement, 1982
A 100-item five-choice multiple-choice (MC) biology final exam was converted to multiple true-false (MTF) form to yield two content-parallel test forms comprising the two item types. Students found the MTF items easier and preferred MTF over MC; the MTF subtests were more reliable. (Author/GK)
Descriptors: Biology, College Science, Comparative Analysis, Difficulty Level
Peer reviewed
Kolstad, Rosemarie K.; And Others – Journal of Research and Development in Education, 1983
A study compared college students' performance on complex multiple-choice tests with scores on multiple true-false clusters. Researchers concluded that the multiple-choice tests did not accurately measure students' knowledge and that cueing and guessing led to grade inflation. (PP)
Descriptors: Achievement Tests, Difficulty Level, Guessing (Tests), Higher Education
Huntley, Renee M.; Plake, Barbara S. – 1988
The combinational-format item (CFI), a multiple-choice item with combinations of alternatives presented as response choices, was studied to determine whether CFIs differ from regular multiple-choice items in item characteristics or in cognitive-processing demands. Three undergraduate Foundations of Education classes (consisting of a total of…
Descriptors: Cognitive Processes, Computer Assisted Testing, Difficulty Level, Educational Psychology
Maihoff, N. A.; Mehrens, Wm. A. – 1985
A comparison is presented of alternate-choice and true-false item forms used in an undergraduate natural science course. The alternate-choice item is a modified two-choice multiple-choice item in which the two responses are included within the question stem. This study (1) compared the difficulty level, discrimination level, reliability, and…
Descriptors: Classroom Environment, College Freshmen, Comparative Analysis, Comparative Testing
Wu, Margaret; Donovan, Jenny; Hutton, Penny; Lennon, Melissa – Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2008
In July 2001, the Ministerial Council on Education, Employment, Training and Youth Affairs (MCEETYA) agreed to the development of assessment instruments and key performance measures for reporting on student skills, knowledge and understandings in primary science. It directed the newly established Performance Measurement and Reporting Taskforce…
Descriptors: Foreign Countries, Scientific Literacy, Science Achievement, Comparative Analysis