Showing 1 to 15 of 47 results
Peer reviewed
PDF on ERIC Download full text
Howard, Matt C. – Practical Assessment, Research & Evaluation, 2018
Scale pretests analyze the suitability of individual scale items for further analysis, whether by judging their face validity, flagging wording concerns, or examining other aspects. The current article reviews scale pretests, separated into qualitative and quantitative methods, in order to identify the differences, similarities, and even existence of the…
Descriptors: Pretesting, Measures (Individuals), Test Items, Statistical Analysis
Peer reviewed
Direct link
Batsell, W. Robert, Jr.; Perry, Jennifer L.; Hanley, Elizabeth; Hostetter, Autumn B. – Teaching of Psychology, 2017
The testing effect is the enhanced retention of learned information by individuals who have studied and completed a test over the material relative to individuals who have only studied the material. Although numerous laboratory studies and simulated classroom studies have provided evidence of the testing effect, data from a natural class setting…
Descriptors: Tests, Psychology, Introductory Courses, Quasiexperimental Design
Peer reviewed
PDF on ERIC Download full text
Smith, Tamarah; Smith, Samantha – International Journal of Teaching and Learning in Higher Education, 2018
The Research Methods Skills Assessment (RMSA) was created to measure psychology majors' statistics knowledge and skills. The American Psychological Association's Guidelines for the Undergraduate Major in Psychology (APA, 2007, 2013) served as a framework for development. Results from a Rasch analysis with data from n = 330 undergraduates showed…
Descriptors: Psychology, Statistics, Undergraduate Students, Item Response Theory
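The RMSA entry above reports a Rasch analysis of dichotomously scored items. As a rough, hedged illustration only (this is not the authors' code, and the response matrix below is simulated), the dichotomous Rasch item response function and crude item-difficulty start values can be sketched in Python:

import numpy as np

def rasch_probability(theta, b):
    """Dichotomous Rasch model: probability of a correct response given ability theta and item difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Hypothetical 0/1 response matrix: rows = examinees, columns = items (simulated, illustrative only).
rng = np.random.default_rng(0)
responses = rng.integers(0, 2, size=(330, 20))

# Crude item-difficulty start values: logit of the proportion answering incorrectly.
p_correct = responses.mean(axis=0)
item_difficulty = np.log((1.0 - p_correct) / p_correct)
print(np.round(item_difficulty, 2))

A full Rasch calibration would refine these start values by joint or conditional maximum likelihood; the point here is only the form of the model.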
Peer reviewed
Direct link
Foss, Donald J.; Pirozzolo, Joseph W. – Journal of Educational Psychology, 2017
We carried out 4 semester-long studies of student performance in a college research methods course (total N = 588). Two sections of it were taught each semester with systematic and controlled differences between them. Key manipulations were repeated (with some variation) across the 4 terms, allowing assessment of replicability of effects.…
Descriptors: Undergraduate Students, Student Evaluation, Testing, Incidence
Peer reviewed
PDF on ERIC Download full text
Eleje, Lydia I.; Esomonu, Nkechi P. M. – Asian Journal of Education and Training, 2018
A test to measure achievement in quantitative economics among secondary school students was developed and validated in this study. The test is made up of 20 multiple-choice test items constructed based on quantitative economics sub-skills. Six research questions guided the study. Preliminary validation was done by two experienced teachers in…
Descriptors: Item Response Theory, Secondary Schools, Foreign Countries, Secondary School Students
Peer reviewed
Direct link
Benítez, Isabel; Padilla, José-Luis; Hidalgo Montesinos, María Dolores; Sireci, Stephen G. – Applied Measurement in Education, 2016
Analysis of differential item functioning (DIF) is often used to determine whether cross-lingual assessments are equivalent across languages. However, evidence on the causes of cross-lingual DIF is still elusive. Expert appraisal is a qualitative method useful for obtaining detailed information about problematic elements in the different linguistic…
Descriptors: Test Bias, Mixed Methods Research, Questionnaires, International Assessment
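The Benítez et al. entry concerns quantitative evidence of cross-lingual DIF. One standard screening statistic (not necessarily the one used in the article) is the Mantel-Haenszel common odds ratio with the ETS delta transformation; a minimal numpy sketch on assumed, simulated data:

import numpy as np

def mantel_haenszel_dif(item, group, total_score):
    """Mantel-Haenszel DIF statistic for one dichotomous item.

    item        : 0/1 responses to the studied item
    group       : 0 = reference group, 1 = focal group (e.g., two language versions)
    total_score : matching variable (e.g., total test score) used to form strata
    """
    num = den = 0.0
    for k in np.unique(total_score):
        mask = total_score == k
        ref, foc = mask & (group == 0), mask & (group == 1)
        a = np.sum(item[ref] == 1)   # reference correct
        b = np.sum(item[ref] == 0)   # reference incorrect
        c = np.sum(item[foc] == 1)   # focal correct
        d = np.sum(item[foc] == 0)   # focal incorrect
        n = a + b + c + d
        if n == 0:
            continue
        num += a * d / n
        den += b * c / n
    alpha_mh = num / den
    delta_mh = -2.35 * np.log(alpha_mh)   # ETS delta scale; |delta| >= 1.5 flags moderate-to-large DIF
    return alpha_mh, delta_mh

# Illustrative call on simulated data.
rng = np.random.default_rng(1)
n = 400
group = rng.integers(0, 2, n)
total = rng.integers(0, 11, n)
item = rng.integers(0, 2, n)
print(mantel_haenszel_dif(item, group, total))

The article pairs quantitative flags of this kind with expert appraisal to explain why flagged items behave differently across languages.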
Peer reviewed
Direct link
Le Hebel, Florence; Montpied, Pascale; Tiberghien, Andrée; Fontanieu, Valérie – International Journal of Science Education, 2017
The understanding of what makes a question difficult is a crucial concern in assessment. To study the difficulty of test questions, we focus on the case of PISA, which assesses to what degree 15-year-old students have acquired knowledge and skills essential for full participation in society. Our research question is to identify PISA science item…
Descriptors: Achievement Tests, Foreign Countries, International Assessment, Secondary School Students
Pawade, Yogesh R.; Diwase, Dipti S. – Journal of Educational Technology, 2016
Item analysis of Multiple Choice Questions (MCQs) is the process of collecting, summarizing and utilizing information from students' responses to evaluate the quality of test items. Difficulty Index (p-value), Discrimination Index (DI) and Distractor Efficiency (DE) are the parameters which help to evaluate the quality of MCQs used in an…
Descriptors: Test Items, Item Analysis, Multiple Choice Tests, Curriculum Development
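Pawade and Diwase's parameters come from classical test theory: the difficulty index is the proportion of examinees answering correctly, the discrimination index compares upper and lower score groups, and distractor efficiency counts distractors chosen by at least roughly 5% of examinees. A minimal sketch under those standard definitions (illustrative, not the authors' implementation):

import numpy as np

def mcq_item_analysis(choices, key, n_options=4, group_frac=0.27, distractor_cut=0.05):
    """Classical item analysis for a multiple-choice test.

    choices : (n_examinees, n_items) matrix of selected options, coded 0..n_options-1
    key     : length-n_items answer key
    Returns per-item difficulty index (p), discrimination index (DI),
    and distractor efficiency (DE, % of distractors chosen by >= distractor_cut of examinees).
    """
    choices = np.asarray(choices)
    key = np.asarray(key)
    n, m = choices.shape
    correct = (choices == key).astype(float)
    total = correct.sum(axis=1)

    # Upper and lower 27% groups by total score.
    order = np.argsort(total)
    g = max(1, int(round(group_frac * n)))
    lower, upper = order[:g], order[-g:]

    p = correct.mean(axis=0)                                          # difficulty index
    di = correct[upper].mean(axis=0) - correct[lower].mean(axis=0)    # discrimination index

    de = np.empty(m)
    for j in range(m):
        distractors = [o for o in range(n_options) if o != key[j]]
        shares = [(choices[:, j] == o).mean() for o in distractors]
        de[j] = 100.0 * sum(s >= distractor_cut for s in shares) / len(distractors)
    return p, di, de

# Illustrative call on simulated responses: 200 examinees, 10 four-option items.
rng = np.random.default_rng(2)
p, di, de = mcq_item_analysis(rng.integers(0, 4, size=(200, 10)), rng.integers(0, 4, size=10))
print(np.round(p, 2), np.round(di, 2), de)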
Peer reviewed
PDF on ERIC Download full text
Aksakalli, Ayhan; Turgut, Umit; Salar, Riza – Journal of Education and Practice, 2016
The purpose of this study is to investigate whether students are more successful on abstract or illustrated test questions. To this end, the questions on an abstract test were changed into a visual format, and these tests were administered every three days to a total of 240 students at six middle schools located in the Erzurum city center and…
Descriptors: Comparative Analysis, Scores, Middle School Students, Grade 8
Peer reviewed
Direct link
Wesolowski, Brian C.; Amend, Ross M.; Barnstead, Thomas S.; Edwards, Andrew S.; Everhart, Matthew; Goins, Quentin R.; Grogan, Robert J., III; Herceg, Amanda M.; Jenkins, S. Ira; Johns, Paul M.; McCarver, Christopher J.; Schaps, Robin E.; Sorrell, Gary W.; Williams, Jonathan D. – Journal of Research in Music Education, 2017
The purpose of this study was to describe the development of a valid and reliable rubric to assess secondary-level solo instrumental music performance based on principles of invariant measurement. The research questions that guided this study included (1) What is the psychometric quality (i.e., validity, reliability, and precision) of a scale…
Descriptors: Music, Music Education, Musical Instruments, Measurement Techniques
Peer reviewed
PDF on ERIC Download full text
Khaksefidi, Saman – International Education Studies, 2017
This study investigates the psychological effect of a wrong question with wrong items on answering the next question in a test of structure. Forty students, selected through stratified random sampling, were given 15 questions from a standardized test, namely a TOEFL structure test, in which questions 7 and 11 are wrong and their answers…
Descriptors: Language Tests, English (Second Language), Second Language Learning, Statistical Analysis
Peer reviewed
PDF on ERIC Download full text
Pachai, Matthew V.; DiBattista, David; Kim, Joseph A. – Canadian Journal for the Scholarship of Teaching and Learning, 2015
Multiple choice writing guidelines are decidedly split on the use of "none of the above" (NOTA), with some authors discouraging and others advocating its use. Moreover, empirical studies of NOTA have produced mixed results. Generally, these studies have utilized NOTA as either the correct response or a distractor and assessed its effect…
Descriptors: Multiple Choice Tests, Test Items, Introductory Courses, Psychology
Peer reviewed
Direct link
McColgan, Michele W.; Finn, Rose A.; Broder, Darren L.; Hassel, George E. – Physical Review Physics Education Research, 2017
We present the Electricity and Magnetism Conceptual Assessment (EMCA), a new assessment aligned with second-semester introductory physics courses. Topics covered include electrostatics, electric fields, circuits, magnetism, and induction. We have two motives for writing a new assessment. First, we find other assessments such as the Brief…
Descriptors: Energy, Magnets, Scientific Concepts, Student Evaluation
Peer reviewed
PDF on ERIC Download full text
Sener, Nilay; Tas, Erol – Journal of Education and Learning, 2017
The purpose of this study is to prepare a multiple-choice achievement test with high reliability and validity for the "Let's Solve the Puzzle of Our Body" unit. To this end, a multiple-choice achievement test consisting of 46 items was administered to a total of 178 fifth-grade students. As a result of the test and material analysis…
Descriptors: Achievement Tests, Grade 5, Science Instruction, Biology
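Because the Sener and Tas test is dichotomously scored, its internal-consistency reliability is typically summarized with KR-20. A brief sketch of the standard formula (the data below are simulated with the study's dimensions, 178 examinees by 46 items, purely for illustration):

import numpy as np

def kr20(scored):
    """Kuder-Richardson 20 reliability for a 0/1-scored item matrix (rows = examinees)."""
    scored = np.asarray(scored, dtype=float)
    k = scored.shape[1]                          # number of items
    p = scored.mean(axis=0)                      # proportion correct per item
    q = 1.0 - p
    var_total = scored.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1.0 - np.sum(p * q) / var_total)

rng = np.random.default_rng(3)
print(round(kr20(rng.integers(0, 2, size=(178, 46))), 3))

On real data the same function would show how consistently the 46 items rank examinees; on the random matrix above it will hover near zero, as expected.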
Peer reviewed
Direct link
Dirks-Naylor, Amie J. – Advances in Physiology Education, 2016
An active learning activity was used to engage students and enhance in-class learning of cell cycle regulation in a PharmD level integrated biological sciences course. The aim of the present study was to determine the effectiveness and perception of the in-class activity. After completion of a lecture on the topic of cell cycle regulation,…
Descriptors: Active Learning, Student Attitudes, Cytology, Biological Sciences