Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 0 |
Since 2016 (last 10 years) | 15 |
Since 2006 (last 20 years) | 36 |
Descriptor
Multiple Choice Tests | 54 |
Statistical Analysis | 54 |
Test Reliability | 54 |
Test Validity | 28 |
Test Construction | 23 |
Foreign Countries | 14 |
Guessing (Tests) | 12 |
College Students | 11 |
Test Items | 11 |
Item Response Theory | 10 |
Reading Comprehension | 10 |
Author
Alonzo, Julie | 7 |
Irvin, P. Shawn | 6 |
Lai, Cheng-Fei | 6 |
Park, Bitnara Jasmine | 6 |
Tindal, Gerald | 6 |
Biancarosa, Gina | 2 |
Carlson, Sarah E. | 2 |
Davison, Mark L. | 2 |
Frary, Robert B. | 2 |
Liu, Bowen | 2 |
Seipel, Ben | 2 |
Education Level
Higher Education | 19 |
Postsecondary Education | 16 |
Elementary Education | 9 |
Secondary Education | 6 |
Elementary Secondary Education | 5 |
High Schools | 4 |
Middle Schools | 4 |
Grade 7 | 3 |
Junior High Schools | 3 |
Grade 2 | 2 |
Grade 3 | 2 |
Audience
Practitioners | 1 |
Assessments and Surveys
SAT (College Admission Test) | 1 |
Test of English as a Foreign… | 1 |
Test of English for… | 1 |
Davison, Mark L.; Biancarosa, Gina; Carlson, Sarah E.; Seipel, Ben; Liu, Bowen – Assessment for Effective Intervention, 2018
The computer-administered Multiple-Choice Online Causal Comprehension Assessment (MOCCA) for Grades 3 to 5 has an innovative, 40-item multiple-choice structure in which each distractor corresponds to a comprehension process upon which poor comprehenders have been shown to rely. This structure requires revised thinking about measurement issues…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Pilot Projects, Measurement
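The distractor design described above lends itself to diagnostic scoring: rather than reporting only a total score, each incorrect choice can be tallied against the comprehension process its distractor represents. The following Python sketch shows one way such a tally might be computed; the option-to-process mapping, the fixed answer key, and the process labels are illustrative assumptions, not MOCCA's actual items or scoring procedure.

from collections import Counter

# Hypothetical mapping from distractor options to the comprehension process
# each is assumed to represent (labels are illustrative only).
DISTRACTOR_PROCESS = {"B": "paraphrase", "C": "lateral_connection", "D": "elaboration"}

def process_profile(responses, key):
    """Tally which comprehension process an examinee's errors point to.

    responses: dict of item_id -> chosen option letter.
    key: dict of item_id -> correct option letter.
    Returns (number correct, Counter of processes behind the incorrect choices).
    """
    correct = 0
    errors = Counter()
    for item, choice in responses.items():
        if choice == key[item]:
            correct += 1
        else:
            errors[DISTRACTOR_PROCESS.get(choice, "other")] += 1
    return correct, errors

# Example: a reader whose errors mostly fall on "paraphrase"-type distractors.
answers = {1: "A", 2: "B", 3: "B", 4: "D", 5: "A"}
answer_key = {1: "A", 2: "A", 3: "A", 4: "A", 5: "A"}
print(process_profile(answers, answer_key))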
Quaigrain, Kennedy; Arhin, Ato Kwamina – Cogent Education, 2017
Item analysis is essential for improving items that will be used again in later tests; it can also be used to eliminate misleading items in a test. The study focused on item and test quality and explored the relationship of the difficulty index (p-value) and the discrimination index (DI) with distractor efficiency (DE). The study was conducted among…
Descriptors: Item Analysis, Teacher Developed Materials, Test Reliability, Educational Assessment
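For readers unfamiliar with these indices, the following Python sketch shows how the difficulty index (p-value), discrimination index (DI), and distractor efficiency (DE) are commonly computed from a matrix of multiple-choice responses. The 27% upper/lower grouping and the 5% threshold for a functional distractor are conventional choices assumed here for illustration, not details taken from the study.

import numpy as np

def item_analysis(responses, key, n_options=4):
    """Classical item analysis of an (examinees x items) matrix of chosen options.

    responses: 2-D array of option indices (0..n_options-1), one row per examinee.
    key: 1-D array giving the index of the correct option for each item.
    Returns per-item difficulty (p-value), discrimination (upper-lower 27% groups),
    and distractor efficiency (share of distractors chosen by at least 5% of examinees).
    """
    responses = np.asarray(responses)
    key = np.asarray(key)
    scored = (responses == key).astype(float)   # 1 = correct, 0 = incorrect
    total = scored.sum(axis=1)                  # total score per examinee

    # Difficulty index: proportion of examinees answering each item correctly.
    p_value = scored.mean(axis=0)

    # Discrimination index: p(upper 27% by total score) - p(lower 27%).
    order = np.argsort(total)
    k = max(1, int(round(0.27 * len(total))))
    discrimination = scored[order[-k:]].mean(axis=0) - scored[order[:k]].mean(axis=0)

    # Distractor efficiency: fraction of an item's distractors that are
    # "functional", i.e. chosen by at least 5% of examinees.
    efficiency = []
    for j in range(responses.shape[1]):
        distractors = [o for o in range(n_options) if o != key[j]]
        functional = sum((responses[:, j] == o).mean() >= 0.05 for o in distractors)
        efficiency.append(functional / len(distractors))
    return p_value, discrimination, np.array(efficiency)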
Alonzo, Julie; Anderson, Daniel – Behavioral Research and Teaching, 2018
In response to a request for additional analyses, in particular reporting confidence intervals around the results, we re-analyzed the data from prior studies. This supplementary report presents the results of the additional analyses addressing classification accuracy, reliability, and criterion-related validity evidence. For ease of reference, we…
Descriptors: Curriculum Based Assessment, Computation, Statistical Analysis, Classification
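As an illustration of the kind of re-analysis described, the sketch below pairs an internal-consistency reliability estimate (Cronbach's alpha) with a percentile-bootstrap confidence interval. Both the alpha statistic and the bootstrap interval are assumptions chosen for illustration; the report's own estimators and interval methods may differ.

import numpy as np

def cronbach_alpha(scores):
    """Internal-consistency reliability of an (examinees x items) score matrix."""
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    k = scores.shape[1]
    return k / (k - 1) * (1 - item_var / total_var)

def bootstrap_ci(scores, statistic, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for a statistic of the score matrix.

    Examinees (rows) are resampled with replacement; the statistic is recomputed
    on each resample and the interval is read from the percentiles of those values.
    """
    rng = np.random.default_rng(seed)
    n = scores.shape[0]
    estimates = [statistic(scores[rng.integers(0, n, n)]) for _ in range(n_boot)]
    lo, hi = np.percentile(estimates, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return statistic(scores), (lo, hi)

# Example with simulated 0/1 item scores for 200 examinees on 20 items.
rng = np.random.default_rng(1)
sim = (rng.random((200, 20)) < 0.6).astype(float)
print(bootstrap_ci(sim, cronbach_alpha))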
Fiedler, Daniela; Tröbst, Steffen; Harms, Ute – CBE - Life Sciences Education, 2017
Students of all ages face severe conceptual difficulties regarding key aspects of evolution, the central, unifying, and overarching theme in biology. Aspects strongly related to abstract "threshold" concepts like randomness and probability appear to pose particular difficulties. A further problem is the lack of an appropriate instrument…
Descriptors: College Students, Concept Formation, Probability, Evolution
Krell, Moritz – Cogent Education, 2017
This study evaluates a 12-item instrument for subjective measurement of mental load (ML) and mental effort (ME) by analysing different sources of validity evidence. The findings of an expert judgement (N = 8) provide "evidence based on test content" that the formulation of the items corresponds to the meaning of ML and ME. An empirical…
Descriptors: Cognitive Processes, Test Validity, Secondary School Students, Multiple Choice Tests
Kalinowski, Steven T.; Leonard, Mary J.; Taper, Mark L. – CBE - Life Sciences Education, 2016
We developed and validated the Conceptual Assessment of Natural Selection (CANS), a multiple-choice test designed to assess how well college students understand the central principles of natural selection. The expert panel that reviewed the CANS concluded its questions were relevant to natural selection and generally did a good job sampling the…
Descriptors: Science Instruction, Science Tests, Genetics, Evolution
Guffey, Sarah Katie; Slater, Timothy F.; Slater, Stephanie J. – Journal of Astronomy & Earth Sciences Education, 2017
Geoscience education researchers have considerable need for criterion-referenced, easy-to-administer, easy-to-score conceptual surveys for undergraduates taking introductory science survey courses, so that faculty can monitor the learning impacts of innovative teaching. In response, this study establishes the reliability and validity of a…
Descriptors: Geology, Scientific Concepts, Science Tests, Undergraduate Students
Akkanat, Cigdem; Gokdere, Murat – Cypriot Journal of Educational Sciences, 2017
Students' ability to use and manipulate scientific concepts has been widely explored; however, there is still a need to define the characteristics and nature of science ability. Also, tests and performance scales that require minimal conceptual knowledge to measure this ability are relatively uncommon. The aim of this study was to develop an…
Descriptors: Scientific Concepts, Academically Gifted, Middle School Students, Multiple Choice Tests
Negari, Giti Mousapour; Azizi, Aliye; Arani, Davood Khedmatkar – International Journal of Instruction, 2018
The present study investigated the effects of audio input enhancement on EFL learners' retention of intensifiers. To this end, two research questions were formulated, and two corresponding null hypotheses were tested. A pretest-posttest control group quasi-experimental design was employed to…
Descriptors: English (Second Language), Second Language Learning, Second Language Instruction, Quasiexperimental Design
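One straightforward way to analyse a pretest-posttest control group design such as the one described is to compare gain scores between the treatment and control groups. The Python sketch below takes that approach; the gain-score Welch t-test is an illustrative choice, not necessarily the analysis used in the study.

import numpy as np
from scipy import stats

def gain_score_comparison(pre_treat, post_treat, pre_ctrl, post_ctrl):
    """Compare treatment and control gains in a pretest-posttest design.

    Each argument is a 1-D array of scores. Returns the difference in mean
    gains and a Welch t-test on the individual gain scores.
    """
    gain_t = np.asarray(post_treat) - np.asarray(pre_treat)
    gain_c = np.asarray(post_ctrl) - np.asarray(pre_ctrl)
    t_stat, p_value = stats.ttest_ind(gain_t, gain_c, equal_var=False)
    return gain_t.mean() - gain_c.mean(), t_stat, p_value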
Tarun, Prashant; Krueger, Dale – Journal of Learning in Higher Education, 2016
In the United States system of education, the use of student evaluations grew from 29% in 1973 to 86% between 1973 and 1993, which in turn has increased the importance of student evaluations for faculty retention, tenure, and promotion. However, the impact student evaluations have had on student academic development generates complex educational…
Descriptors: Critical Thinking, Teaching Methods, Multiple Choice Tests, Essay Tests
Maltepe, Sadet – Eurasian Journal of Educational Research, 2016
Problem Statement: Critical reading refers to individuals' thinking about what they read, assessing what they have read, and using their own judgment about what they have read. In order to teach critical reading skills to students, a teacher is expected to have knowledge about text selection, use of appropriate methods, preparation of functional…
Descriptors: Critical Reading, Preservice Teachers, Language Teachers, Turkish
Schwichow, Martin; Christoph, Simon; Boone, William J.; Härtig, Hendrik – International Journal of Science Education, 2016
The so-called control-of-variables strategy (CVS) incorporates the important scientific reasoning skills of designing controlled experiments and interpreting experimental outcomes. As CVS is a prominent component of science standards, appropriate assessment instruments are required to measure these scientific reasoning skills and to evaluate the…
Descriptors: Thinking Skills, Science Instruction, Science Experiments, Science Tests
McFarland, Jenny L.; Price, Rebecca M.; Wenderoth, Mary Pat; Martinková, Patrícia; Cliff, William; Michael, Joel; Modell, Harold; Wright, Ann – CBE - Life Sciences Education, 2017
We present the Homeostasis Concept Inventory (HCI), a 20-item multiple-choice instrument that assesses how well undergraduates understand this critical physiological concept. We used an iterative process to develop a set of questions based on elements in the Homeostasis Concept Framework. This process involved faculty experts and undergraduate…
Descriptors: Scientific Concepts, Multiple Choice Tests, Science Tests, Test Construction
Bichi, Ado Abdu; Hafiz, Hadiza; Bello, Samira Abdullahi – International Journal of Evaluation and Research in Education, 2016
High-stakes testing is used to provide results that have important consequences. Validity is the cornerstone upon which all measurement systems are built. This study applied Item Response Theory principles to analyse Northwest University Kano Post-UTME Economics test items. The fifty (50) developed economics test items were…
Descriptors: Item Response Theory, Test Items, Difficulty Level, Statistical Analysis
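For context on the IRT analysis mentioned above, the Python sketch below implements the two-parameter logistic (2PL) item characteristic curve, which gives the probability of a correct response as a function of examinee ability and an item's difficulty and discrimination parameters. The parameter values are illustrative only; the study's actual model choice and estimates are not reproduced here.

import numpy as np

def icc_2pl(theta, a, b):
    """Two-parameter logistic item characteristic curve.

    theta: examinee ability; a: item discrimination; b: item difficulty.
    Returns the probability of a correct response, 1 / (1 + exp(-a * (theta - b))).
    """
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# An item of average difficulty (b = 0) and moderate discrimination (a = 1.2)
# evaluated across a range of abilities from -3 to +3.
abilities = np.linspace(-3, 3, 7)
print(icc_2pl(abilities, a=1.2, b=0.0))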