Showing 1 to 15 of 21 results
Peer reviewed
PDF on ERIC Download full text
Valentina Albano; Donatella Firmani; Luigi Laura; Jerin George Mathew; Anna Lucia Paoletti; Irene Torrente – Journal of Learning Analytics, 2023
Multiple-choice questions (MCQs) are widely used in educational assessments and professional certification exams. Managing large repositories of MCQs, however, poses several challenges due to the high volume of questions and the need to maintain their quality and relevance over time. One of these challenges is the presence of questions that…
Descriptors: Natural Language Processing, Multiple Choice Tests, Test Items, Item Analysis
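The abstract above is truncated, so the authors' actual approach is not shown here. Purely as an illustration of one common way to screen a large MCQ repository for redundant items, the Python sketch below flags near-duplicate question stems by token-set Jaccard similarity; the tokenizer and threshold are arbitrary choices for the example, not anything from the paper.

```python
# Illustrative sketch: flag near-duplicate multiple-choice stems with
# token-set Jaccard similarity. This is a generic technique, not the
# method used in the cited article (its abstract is truncated above).

def tokens(text: str) -> set[str]:
    """Lowercase word set, ignoring surrounding punctuation."""
    return {w.strip(".,;:?!()") for w in text.lower().split()} - {""}

def jaccard(a: set[str], b: set[str]) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

def near_duplicates(stems: list[str], threshold: float = 0.8):
    """Return index pairs whose stems overlap above the threshold."""
    toks = [tokens(s) for s in stems]
    return [(i, j)
            for i in range(len(stems))
            for j in range(i + 1, len(stems))
            if jaccard(toks[i], toks[j]) >= threshold]

if __name__ == "__main__":
    bank = [
        "What is the boiling point of water at sea level?",
        "At sea level, what is the boiling point of water?",
        "Which gas is most abundant in Earth's atmosphere?",
    ]
    print(near_duplicates(bank))  # [(0, 1)]
```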
Peer reviewed
PDF on ERIC Download full text
Kyeng Gea Lee; Mark J. Lee; Soo Jung Lee – International Journal of Technology in Education and Science, 2024
Online assessment is an essential part of online education and, if conducted properly, has been found to effectively gauge student learning. Generally, text-based questions have been the cornerstone of online assessment. Recently, however, the emergence of generative artificial intelligence has added a significant challenge to the integrity of…
Descriptors: Artificial Intelligence, Computer Software, Biology, Science Instruction
Peer reviewed
Direct link
Sheng, Yanyan – Measurement: Interdisciplinary Research and Perspectives, 2019
The classical approach to test theory has been the foundation for educational and psychological measurement for over 90 years. This approach is concerned with measurement error and hence test reliability, which in part relies on individual test items. The CTT package, developed in light of this, provides functions for test- and item-level analyses of…
Descriptors: Item Response Theory, Test Reliability, Item Analysis, Error of Measurement
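The CTT package itself is an R library; as a language-neutral illustration (Python here, not the package's API), the sketch below computes the two item-level statistics at the heart of such analyses: item difficulty (proportion correct) and point-biserial discrimination against the rest of the test. The response matrix is made up for the example.

```python
# Illustrative sketch (Python 3.10+), not the CTT package's R API:
# the two core item-level statistics of classical test theory.

import statistics

def item_difficulty(item: list[int]) -> float:
    """Proportion of examinees answering the item correctly (0/1 scores)."""
    return sum(item) / len(item)

def point_biserial(item: list[int], rest: list[int]) -> float:
    """Pearson correlation between item score and rest-of-test score
    (with a dichotomous item this equals the point-biserial)."""
    return statistics.correlation(item, rest)

# Rows = examinees, columns = items (1 = correct, 0 = incorrect).
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
]

for j in range(len(responses[0])):
    item = [row[j] for row in responses]
    rest = [sum(row) - row[j] for row in responses]  # total minus this item
    print(f"item {j}: difficulty={item_difficulty(item):.2f}, "
          f"discrimination={point_biserial(item, rest):.2f}")
```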
Peer reviewed
PDF on ERIC Download full text
Yusof, Safiah Md; Lim, Tick Meng; Png, Leo; Khatab, Zainuriyah Abd; Singh, Harvinder Kaur Dharam – Journal of Learning for Development, 2017
Open University Malaysia (OUM) is progressively moving towards implementing assessment on demand and online assessment. This move is deemed necessary for OUM to continue to be the leading provider of flexible learning. OUM serves a very large number of students each semester, and these students are widely distributed throughout the country. As the…
Descriptors: Foreign Countries, Computer Assisted Testing, Computer Managed Instruction, Management Systems
Peer reviewed
PDF on ERIC Download full text
Koçdar, Serpil; Karadag, Nejdet; Sahin, Murat Dogan – Turkish Online Journal of Educational Technology - TOJET, 2016
This descriptive study intends to determine whether the difficulty and discrimination indices of multiple-choice questions show differences according to the cognitive levels of Bloom's Taxonomy. The questions are used in the exams of the courses in a business administration bachelor's degree program offered through open and distance…
Descriptors: Multiple Choice Tests, Difficulty Level, Distance Education, Open Education
National Assessment Governing Board, 2017
The National Assessment of Educational Progress (NAEP) is the only continuing and nationally representative measure of trends in academic achievement of U.S. elementary and secondary school students in various subjects. For more than four decades, NAEP assessments have been conducted periodically in reading, mathematics, science, writing, U.S.…
Descriptors: Mathematics Achievement, Multiple Choice Tests, National Competency Tests, Educational Trends
Peer reviewed
Direct link
Towns, Marcy H. – Journal of Chemical Education, 2014
Chemistry faculty members are highly skilled in obtaining, analyzing, and interpreting physical measurements, but often they are less skilled in measuring student learning. This work provides guidance for chemistry faculty from the research literature on multiple-choice item development in chemistry. Areas covered include content, stem, and…
Descriptors: Multiple Choice Tests, Test Construction, Psychometrics, Test Items
Peer reviewed
Direct link
Wang, Wen-Chung; Huang, Sheng-Yun – Educational and Psychological Measurement, 2011
The one-parameter logistic model with ability-based guessing (1PL-AG) has been recently developed to account for the effect of ability on guessing behavior in multiple-choice items. In this study, the authors developed algorithms for computerized classification testing under the 1PL-AG and conducted a series of simulations to evaluate their…
Descriptors: Computer Assisted Testing, Classification, Item Analysis, Probability
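The snippet above does not reproduce the 1PL-AG's exact specification, so the sketch below uses an assumed, illustrative parameterization of an ability-based guessing model: the guessing component g(theta) itself rises with ability through a slope parameter lam. Treat the formula and parameter names as assumptions for illustration, not the paper's model.

```python
import math

def logistic(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def p_correct(theta: float, b: float, lam: float, c: float) -> float:
    """Correct-response probability under an ability-based guessing model.

    Assumed illustrative parameterization (not quoted from the paper):
        g(theta)   = logistic(lam * theta + c)   # guessing improves with ability
        P(correct) = g(theta) + (1 - g(theta)) * logistic(theta - b)
    """
    g = logistic(lam * theta + c)
    return g + (1.0 - g) * logistic(theta - b)

# With lam > 0, more able examinees also guess more successfully.
for theta in (-2.0, 0.0, 2.0):
    print(theta, round(p_correct(theta, b=0.0, lam=0.5, c=-1.0), 3))
```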
Setzer, J. Carl – GED Testing Service, 2009
The GED[R] English as a Second Language (GED ESL) Test was designed to serve as an adjunct to the GED test battery when an examinee takes either the Spanish- or French-language version of the tests. The GED ESL Test is a criterion-referenced, multiple-choice instrument that assesses the functional English reading skills of adults whose first…
Descriptors: Language Tests, High School Equivalency Programs, Psychometrics, Reading Skills
Peer reviewed
Direct link
Wentzel, Carolyn – Journal of Science Education and Technology, 2006
INTEGRITY, an online application for item analysis and statistical collusion (answer-copying) detection, was reviewed. Features of the software and examples of program output are described in detail. INTEGRITY was found to be easy to use, with abundant, well-organized documentation and built-in features designed to guide the user through the…
Descriptors: Item Analysis, Computer Software, Multiple Choice Tests, Costs
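INTEGRITY's actual collusion statistics are not described in the snippet above, so the following is only a generic illustration of the idea behind answer-copying screens: shared incorrect answers between two examinees are far more suggestive of copying than shared correct ones. The cutoff and data here are hypothetical, and a real screen would apply a proper statistical index rather than a raw count.

```python
# Illustrative screening sketch only, not INTEGRITY's method:
# count identical *incorrect* answers for each pair of examinees.

from itertools import combinations

def shared_errors(a: list[str], b: list[str], key: list[str]) -> int:
    """Count items where both examinees chose the same wrong option."""
    return sum(1 for x, y, k in zip(a, b, key) if x == y != k)

def screen(answers: dict[str, list[str]], key: list[str], cutoff: int):
    """Return (examinee, examinee, count) triples at or above the cutoff."""
    return [(p, q, n)
            for p, q in combinations(answers, 2)
            if (n := shared_errors(answers[p], answers[q], key)) >= cutoff]

key = ["A", "C", "B", "D", "A", "B"]
answers = {
    "s1": ["A", "C", "D", "D", "B", "B"],
    "s2": ["A", "C", "D", "D", "B", "A"],
    "s3": ["B", "C", "B", "A", "A", "B"],
}
print(screen(answers, key, cutoff=2))  # [('s1', 's2', 2)]
```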
Vacc, Nicholas A.; Loesch, Larry C.; Lubik, Ruth E. – 2001
Multiple-choice tests are widely viewed as the most effective and objective means of assessment. Item development is the central component of creating an effective test, but test developers often lack a background in item development. This document describes recall, application, and analysis, the three cognitive levels of test items. It…
Descriptors: Educational Assessment, Evaluation, Item Analysis, Measures (Individuals)
McCowan, Richard J. – Online Submission, 1999
Item writing is a major responsibility of trainers. Too often, qualified staff who prepare lessons carefully and teach conscientiously use inadequate tests that do not validly reflect the true level of trainee achievement. This monograph describes techniques for constructing multiple-choice items that measure student performance accurately. It…
Descriptors: Multiple Choice Tests, Item Analysis, Test Construction, Test Items
Matlock-Hetzel, Susan – 1997
When norm-referenced tests are developed for instructional purposes, to assess the effects of educational programs, or for educational research purposes, it can be very important to conduct item and test analyses. These analyses can evaluate the quality of items and of the test as a whole. Such analyses can also be employed to revise and improve…
Descriptors: Difficulty Level, Distractors (Tests), Elementary Secondary Education, Item Analysis
Peer reviewed
PDF on ERIC Download full text
Pommerich, Mary – Journal of Technology, Learning, and Assessment, 2004
As testing moves from paper-and-pencil administration toward computerized administration, how to present tests on a computer screen becomes an important concern. Of particular concern are tests in which all the information needed for an item cannot be displayed on screen at once. Ideally, the method of presentation should not interfere…
Descriptors: Test Content, Computer Assisted Testing, Multiple Choice Tests, Computer Interfaces
Leuba, Richard J. – Engineering Education, 1986
Explains how multiple choice test items can be devised to measure higher-order learning, including engineering problem solving. Discusses the value and information provided in item analysis procedures with machine-scored tests. Suggests elements to consider in test design. (ML)
Descriptors: College Science, Creative Thinking, Engineering Education, Evaluation Methods