Showing 1 to 15 of 16 results
Peer reviewed
Perret, Cecile A.; Johnson, Amy M.; McCarthy, Kathryn S.; Guerrero, Tricia A.; Dai, Jianmin; McNamara, Danielle S. – Grantee Submission, 2017
This paper introduces StairStepper, a new addition to Interactive Strategy Training for Active Reading and Thinking (iSTART), an intelligent tutoring system (ITS) that provides adaptive self-explanation training and practice. Whereas iSTART focuses on improving comprehension at levels geared toward answering challenging questions associated with…
Descriptors: Reading Comprehension, Reading Instruction, Intelligent Tutoring Systems, Reading Strategies
Peer reviewed
Koçdar, Serpil; Karadag, Nejdet; Sahin, Murat Dogan – Turkish Online Journal of Educational Technology - TOJET, 2016
This is a descriptive study which intends to determine whether the difficulty and discrimination indices of the multiple-choice questions show differences according to cognitive levels of the Bloom's Taxonomy, which are used in the exams of the courses in a business administration bachelor's degree program offered through open and distance…
Descriptors: Multiple Choice Tests, Difficulty Level, Distance Education, Open Education
Peer reviewed
Towns, Marcy H. – Journal of Chemical Education, 2014
Chemistry faculty members are highly skilled in obtaining, analyzing, and interpreting physical measurements, but often they are less skilled in measuring student learning. This work provides guidance for chemistry faculty from the research literature on multiple-choice item development in chemistry. Areas covered include content, stem, and…
Descriptors: Multiple Choice Tests, Test Construction, Psychometrics, Test Items
Peer reviewed
Domyancich, John M. – Journal of Chemical Education, 2014
Multiple-choice questions are an important part of large-scale summative assessments, such as the Advanced Placement (AP) chemistry exam. However, past AP chemistry exam items often lacked the ability to test conceptual understanding and higher-order cognitive skills. The redesigned AP chemistry exam shows a distinctive shift in item types toward…
Descriptors: Multiple Choice Tests, Science Instruction, Chemistry, Summative Evaluation
Peer reviewed
Andrich, David; Marais, Ida; Humphry, Stephen – Journal of Educational and Behavioral Statistics, 2012
Andersen (1995, 2002) proves a theorem relating variances of parameter estimates from samples and subsamples and shows its use as an adjunct to standard statistical analyses. The authors show an application where the theorem is central to the hypothesis tested, namely, whether random guessing on multiple-choice items affects their estimates in the…
Descriptors: Test Items, Item Response Theory, Multiple Choice Tests, Guessing (Tests)
Peer reviewed
Schroeder, Jacob; Murphy, Kristen L.; Holme, Thomas A. – Journal of Chemical Education, 2012
General chemistry tests from the Examinations Institute of the Division of Chemical Education of the American Chemical Society have been analyzed to identify factors that may influence how individual test items perform. In this paper, issues of item order (position within a set of items that comprise a test) and answer order (position of correct…
Descriptors: Chemistry, Test Items, Individual Testing, Test Construction
Sawchuk, Stephen – Education Digest: Essential Readings Condensed for Quick Review, 2010
Most experts in the testing community have presumed that the $350 million promised by the U.S. Department of Education to support common assessments would promote those that made greater use of open-ended items capable of measuring higher-order critical-thinking skills. But as measurement experts consider the multitude of possibilities for an…
Descriptors: Educational Quality, Test Items, Comparative Analysis, Multiple Choice Tests
Peer reviewed
Jensen, Murray; Duranczyk, Irene; Staats, Susan; Moore, Randy; Hatch, Jay; Somdahl, Chas – American Biology Teacher, 2006
This paper describes and evaluates a new type of multiple-choice test question that is relatively easy to construct and that challenges students' understandings of biological concepts. The questions involve a small narrative of scientific text that students must evaluate for accuracy. These are termed "You are the Teacher" questions because the…
Descriptors: Reciprocal Teaching, Multiple Choice Tests, Biology, Evaluation
Alonzo, Julie; Liu, Kimy; Tindal, Gerald – Behavioral Research and Teaching, 2007
In this technical report, the authors describe the development and piloting of reading comprehension measures as part of a comprehensive progress monitoring literacy assessment system developed in 2006 for use with students in Kindergarten through fifth grade. They begin with a brief overview of the two conceptual frameworks underlying the…
Descriptors: Reading Comprehension, Emergent Literacy, Test Construction, Literacy Education
Peer reviewed
Revuelta, Javier – Psychometrika, 2004
Two psychometric models are presented for evaluating the difficulty of the distractors in multiple-choice items. They are based on the criterion of rising distractor selection ratios, which facilitates interpretation of the subject and item parameters. Statistical inferential tools are developed in a Bayesian framework: modal a posteriori…
Descriptors: Multiple Choice Tests, Psychometrics, Models, Difficulty Level
Matlock-Hetzel, Susan – 1997
When norm-referenced tests are developed for instructional purposes, to assess the effects of educational programs, or for educational research purposes, it can be very important to conduct item and test analyses. These analyses can evaluate the quality of items and of the test as a whole. Such analyses can also be employed to revise and improve…
Descriptors: Difficulty Level, Distractors (Tests), Elementary Secondary Education, Item Analysis
Smith, Richard M. – 1982
There have been many attempts to formulate a procedure for extracting information from incorrect responses to multiple choice items, i.e., the assessment of partial knowledge. The results of these attempts can be described as inconsistent at best. It is hypothesized that these inconsistencies arise from three methodological problems: the…
Descriptors: Difficulty Level, Evaluation Methods, Goodness of Fit, Guessing (Tests)
Linacre, John M. – 1987
This paper describes a computer program in Microsoft BASIC which selects and administers test items from a small item bank. The level of the difficulty of the item selected depends on the test taker's previous response. This adaptive system is based on the Rasch model. The Rasch model uses a unit of measurement based on the logarithm of the…
Descriptors: Adaptive Testing, Computer Assisted Testing, Difficulty Level, Individual Testing
Velanoff, John – 1987
This report describes courseware for comprehensive computer-assisted testing and instruction. With this program, a personal computer can be used to: (1) generate multiple test versions to meet test objectives; (2) create study guides for self-directed learning; and (3) evaluate student and teacher performance. Numerous multiple-choice examples,…
Descriptors: Computer Assisted Instruction, Computer Assisted Testing, Computer Uses in Education, Courseware
Arth, Thomas O. – 1986
The process of revising and validating two English language tests used by the United States armed forces in hiring foreign nationals overseas is described. Development of the item banks and classification of items are outlined, and field testing in the United States and overseas is described. The tests were the basic and intermediate level…
Descriptors: Armed Forces, Comparative Analysis, Correlation, Difficulty Level