Showing all 12 results
Peer reviewed
PDF on ERIC
Marli Crabtree; Kenneth L. Thompson; Ellen M. Robertson – HAPS Educator, 2024
Research has suggested that changing one's answers on multiple-choice examinations is more likely to improve scores than to lower them. This study aimed to further understand the relationship between changing answer selections and item attributes, student performance, and time within a population of 158 first-year medical students enrolled in a…
Descriptors: Anatomy, Science Tests, Medical Students, Medical Education
Peer reviewed
Direct link
Son, Seung-Hee Claire; Butcher, Kirsten R.; Liang, Lauren Aimonette – Elementary School Journal, 2020
This study investigates how interactive features embedded in the illustrations of storybook apps influenced young readers' story enjoyment and comprehension. Kindergartners and second graders (N = 91) were randomly assigned to read storybook apps in an interactive or noninteractive condition. Findings showed that children's self-reported enjoyment…
Descriptors: Computer Software, Reading Comprehension, Preferences, Recall (Psychology)
Peer reviewed
PDF on ERIC
Liao, Linyu – English Language Teaching, 2020
As a high-stakes standardized test, IELTS is expected to have comparable forms of test papers so that test takers from different test administrations on different dates receive comparable test scores. Therefore, this study examined the text difficulty and task characteristics of four parallel academic IELTS reading tests to reveal to what extent…
Descriptors: Second Language Learning, English (Second Language), Language Tests, High Stakes Tests
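Text-difficulty comparisons of the sort described above often rest on readability indices. As an illustration only (not necessarily the measures used in the study), the classic Flesch Reading Ease formula can be computed with a rough vowel-group syllable heuristic:

```python
import re

# Illustrative sketch of one coarse text-difficulty index: the Flesch
# Reading Ease formula. The syllable counter is a rough vowel-group
# heuristic, not a dictionary-based count.

def count_syllables(word):
    # Approximate syllables as runs of vowels (minimum one per word)
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / sentences)
            - 84.6 * (syllables / len(words)))

print(round(flesch_reading_ease("The cat sat on the mat. It was warm."), 1))
```

Higher scores indicate easier text; comparing parallel reading passages on such an index is one coarse check of form comparability.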
Phelan, Julia; Vendlinski, Terry; Choi, Kilchan; Dai, Yunyun; Herman, Joan; Baker, Eva L. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2011
The POWERSOURCE[c] intervention is intended as a generalizable and powerful formative assessment strategy that can be integrated with any mathematics curriculum. POWERSOURCE[c] includes both a system of learning-based assessments and an infrastructure to support teachers' use of those assessments to improve student learning. The core undertaking…
Descriptors: Computer Software, Intervention, Formative Evaluation, Mathematics Curriculum
Peer reviewed
Direct link
Kreiner, Svend – Applied Psychological Measurement, 2011
To rule out the need for a two-parameter item response theory (IRT) model during item analysis by Rasch models, it is important to check the Rasch model's assumption that all items have the same item discrimination. Biserial and polyserial correlation coefficients measuring the association between items and restscores are often used in an informal…
Descriptors: Item Analysis, Correlation, Item Response Theory, Models
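The informal check mentioned in the abstract can be sketched in a few lines. This is an illustrative outline, not Kreiner's exact procedure: it simulates Rasch-conforming data and computes the point-biserial correlation between each dichotomous item and its restscore (total score minus the item itself):

```python
import numpy as np

# Simulated data; all parameter values below are invented for illustration.
rng = np.random.default_rng(0)
n_persons, n_items = 500, 10
ability = rng.normal(size=n_persons)
difficulty = np.linspace(-1.5, 1.5, n_items)

# Rasch model: P(correct) = logistic(ability - difficulty)
p = 1.0 / (1.0 + np.exp(-(ability[:, None] - difficulty[None, :])))
responses = (rng.random((n_persons, n_items)) < p).astype(int)

total = responses.sum(axis=1)
for j in range(n_items):
    restscore = total - responses[:, j]   # exclude the item itself
    r = np.corrcoef(responses[:, j], restscore)[0, 1]
    print(f"item {j}: item-restscore r = {r:.3f}")
```

Under the Rasch model these correlations should be roughly similar across items; an item with a markedly higher or lower correlation suggests unequal discrimination and hence the need for a two-parameter model.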
Peer reviewed
PDF on ERIC
Zhang, Mo; Breyer, F. Jay; Lorenz, Florian – ETS Research Report Series, 2013
In this research, we investigated the suitability of implementing "e-rater"® automated essay scoring in a high-stakes large-scale English language testing program. We examined the effectiveness of generic scoring and 2 variants of prompt-based scoring approaches. Effectiveness was evaluated on a number of dimensions, including agreement…
Descriptors: Computer Assisted Testing, Computer Software, Scoring, Language Tests
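One standard agreement dimension in such evaluations is quadratically weighted kappa between human and machine scores. The sketch below is a generic illustration with simulated scores, not ETS's e-rater code:

```python
import numpy as np

def quadratic_weighted_kappa(a, b, n_cats):
    """Quadratically weighted kappa between two integer score vectors."""
    a, b = np.asarray(a), np.asarray(b)
    obs = np.zeros((n_cats, n_cats))
    for x, y in zip(a, b):
        obs[x, y] += 1
    obs /= obs.sum()                                  # observed proportions
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))  # chance agreement
    i, j = np.indices((n_cats, n_cats))
    w = (i - j) ** 2 / (n_cats - 1) ** 2              # quadratic weights
    return 1 - (w * obs).sum() / (w * exp).sum()

# Simulated human scores on a 0-5 scale and a noisy "machine" rater
rng = np.random.default_rng(2)
human = rng.integers(0, 6, 300)
machine = np.clip(human + rng.integers(-1, 2, 300), 0, 5)

print(f"QWK = {quadratic_weighted_kappa(human, machine, 6):.2f}")
```

Values near 1 indicate close human-machine agreement; exact agreement yields 1.0.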
Nering, Michael L., Ed.; Ostini, Remo, Ed. – Routledge, Taylor & Francis Group, 2010
This comprehensive "Handbook" focuses on the most used polytomous item response theory (IRT) models. These models help us understand the interaction between examinees and test questions where the questions have various response categories. The book reviews all of the major models and includes discussions about how and where the models…
Descriptors: Guides, Item Response Theory, Test Items, Correlation
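As a small illustration of the kind of model such a handbook covers, the sketch below computes category probabilities under the partial credit model, one widely used polytomous IRT model; the threshold values are invented:

```python
import numpy as np

def pcm_probs(theta, thresholds):
    """Category probabilities under the partial credit model."""
    # Cumulative sums of (theta - threshold) give each category's logit;
    # category 0 is the reference with logit 0.
    logits = np.concatenate(([0.0], np.cumsum(theta - np.asarray(thresholds))))
    e = np.exp(logits - logits.max())   # numerically stabilized softmax
    return e / e.sum()

# An item with 4 response categories (3 step thresholds), ability theta = 0.5
print(pcm_probs(0.5, [-1.0, 0.0, 1.0]))
```

Each threshold marks the ability level at which adjacent categories become equally likely, which is why such models suit items with ordered partial-credit scoring.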
Peer reviewed
Kim, Seock-Ho – Applied Psychological Measurement, 1997
Reviews the most recent version of the BILOG computer program, which estimates item and trait-level parameters for the one-, two-, and three-parameter logistic unidimensional item response models for dichotomously scored data. Finds this version useful. (SLD)
Descriptors: Computer Software, Item Analysis, Item Response Theory, Scores
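For reference, the three-parameter logistic (3PL) item response function that BILOG-type programs fit can be written in a few lines; the parameter values here are invented:

```python
import numpy as np

def p_3pl(theta, a, b, c):
    """Probability of a correct response under the 3PL model.

    theta: examinee ability; a: discrimination; b: difficulty;
    c: lower asymptote (pseudo-guessing).
    """
    return c + (1 - c) / (1 + np.exp(-a * (theta - b)))

# The 2PL is the special case c = 0; the 1PL additionally fixes a.
print(p_3pl(theta=0.0, a=1.2, b=-0.5, c=0.2))
```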
Peer reviewed
Alderson, J. Charles; Percsich, Richard; Szabo, Gabor – Language Testing, 2000
Reports on the potential problems in scoring responses to sequencing tests, the development of a computer program to overcome these difficulties, and an exploration of the value of scoring procedures. (Author/VWL)
Descriptors: Computer Software, Foreign Countries, Item Analysis, Language Tests
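Two scoring procedures commonly compared for sequencing tasks, exact-position matching and adjacent-pair credit, can be sketched as follows; the key and response are invented and this is not the authors' program:

```python
def exact_position_score(key, response):
    """Credit each element placed in exactly the right position."""
    return sum(k == r for k, r in zip(key, response))

def adjacent_pair_score(key, response):
    """Credit each adjacent pair in the response that also appears
    adjacently, in the same order, in the key."""
    key_pairs = set(zip(key, key[1:]))
    return sum(pair in key_pairs for pair in zip(response, response[1:]))

key = ["A", "B", "C", "D", "E"]
response = ["B", "C", "D", "E", "A"]          # everything shifted by one
print(exact_position_score(key, response))    # 0: no item in its exact slot
print(adjacent_pair_score(key, response))     # 3: B-C, C-D, D-E preserved
```

The contrast shows the scoring problem the article raises: a response that preserves nearly all of the intended order can still score zero under strict position matching.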
Peer reviewed
Direct link
Penfield, Randall D. – Applied Psychological Measurement, 2005
Differential item functioning (DIF) is an important consideration in assessing the validity of test scores (Camilli & Shepard, 1994). A variety of statistical procedures have been developed to assess DIF in tests of dichotomous (Hills, 1989; Millsap & Everson, 1993) and polytomous (Penfield & Lam, 2000; Potenza & Dorans, 1995) items. Some of these…
Descriptors: Test Bias, Item Analysis, Psychological Studies, Evaluation Methods
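One of the classical dichotomous DIF procedures surveyed in this literature is the Mantel-Haenszel statistic. The sketch below is a hedged illustration on simulated data, stratifying on simulated ability as a stand-in for the usual total-score matching:

```python
import numpy as np

# Simulated examinees; one item is given mild DIF against the focal group.
rng = np.random.default_rng(1)
n = 2000
group = rng.integers(0, 2, n)              # 0 = reference, 1 = focal
ability = rng.normal(size=n)
p = 1 / (1 + np.exp(-(ability - 0.3 * group)))
item = (rng.random(n) < p).astype(int)

# Stratify into 5 ability bands (a proxy for matching on total score)
strata = np.digitize(ability, np.quantile(ability, [0.2, 0.4, 0.6, 0.8]))

num = den = 0.0
for k in np.unique(strata):
    m = strata == k
    a = np.sum((group[m] == 0) & (item[m] == 1))   # reference, right
    b = np.sum((group[m] == 0) & (item[m] == 0))   # reference, wrong
    c = np.sum((group[m] == 1) & (item[m] == 1))   # focal, right
    d = np.sum((group[m] == 1) & (item[m] == 0))   # focal, wrong
    t = a + b + c + d
    num += a * d / t
    den += b * c / t

alpha_mh = num / den                    # common odds ratio; 1.0 means no DIF
delta_mh = -2.35 * np.log(alpha_mh)     # ETS delta scale
print(f"MH odds ratio = {alpha_mh:.2f}, MH delta = {delta_mh:.2f}")
```

An odds ratio above 1 indicates the item favors the reference group after conditioning on ability, which is the pattern the simulation builds in.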
Harnisch, Delwyn L. – 1987
A multi-purpose, multi-user evaluation system was designed to improve the quality of reporting for information gathered in testing and evaluation practices. One part of the system is described: the reporting component for a district standardized testing program. The Student-Problem Package software was modified to…
Descriptors: Basic Skills, Charts, Computer Managed Instruction, Computer Software
Switzer, Deborah M.; Connell, Michael L. – 1989
This paper describes teacher usage of the microcomputer programs Test Analysis Package (TAP) and Student Problem Package (SPP) to analyze students' test item responses. These methods of organizing, analyzing, and reporting test results have proven useful to classroom teachers. The TAP consists of four integrated microcomputer programs to edit,…
Descriptors: Academic Achievement, Computer Assisted Testing, Computer Managed Instruction, Computer Software
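The Student-Problem (S-P) display underlying both packages above rests on a simple arrangement: sort students by total score and items by difficulty so that aberrant response patterns stand out. A minimal sketch on simulated data (not the TAP/SPP code itself):

```python
import numpy as np

# Simulated 0/1 response matrix: rows are students, columns are items.
rng = np.random.default_rng(3)
responses = (rng.random((8, 6)) < 0.6).astype(int)

student_order = np.argsort(-responses.sum(axis=1))   # high scorers first
item_order = np.argsort(-responses.mean(axis=0))     # easiest items first
sp_table = responses[student_order][:, item_order]

# In a clean S-P table, 1s cluster toward the upper left; a low scorer
# passing a hard item (or vice versa) breaks the pattern and merits review.
for row in sp_table:
    print("".join(str(x) for x in row))
```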