Showing 151 to 165 of 690 results
Peer reviewed
Setzer, J. Carl; Wise, Steven L.; van den Heuvel, Jill R.; Ling, Guangming – Applied Measurement in Education, 2013
Assessment results collected in low-stakes testing situations are subject to the effects of low examinee effort. The use of computer-based testing allows researchers to develop new ways of measuring examinee effort, particularly using response times. At the item level, responses can be classified as exhibiting either rapid-guessing behavior or…
Descriptors: Testing, Guessing (Tests), Reaction Time, Test Items
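The rapid-guessing classification described above is commonly operationalized with an item-level response-time threshold and summarized per examinee as response time effort (RTE), the proportion of responses showing solution behavior. A minimal Python sketch of that general idea follows; the 3-second default threshold and all names are illustrative assumptions, not the authors' procedure.

    from typing import Dict, List

    def classify_rapid_guess(response_time: float, threshold: float = 3.0) -> bool:
        """True if the response is faster than the item's rapid-guess threshold."""
        return response_time < threshold

    def response_time_effort(times: List[float], thresholds: Dict[int, float]) -> float:
        """Proportion of responses showing solution behavior rather than rapid guessing."""
        flags = [classify_rapid_guess(t, thresholds.get(i, 3.0)) for i, t in enumerate(times)]
        return 1.0 - sum(flags) / len(flags)

    # Example: one examinee's response times (in seconds) on a five-item test.
    times = [12.4, 2.1, 45.0, 1.8, 30.2]
    print(response_time_effort(times, {}))  # 0.6 -> two of five responses were rapid guesses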
Peer reviewed
Andrich, David; Marais, Ida; Humphry, Stephen Mark – Educational and Psychological Measurement, 2016
Recent research has shown how the statistical bias in Rasch model difficulty estimates induced by guessing in multiple-choice items can be eliminated. Using vertical scaling of a high-profile national reading test, it is shown that the dominant effect of removing such bias is a nonlinear change in the unit of scale across the continuum. The…
Descriptors: Guessing (Tests), Statistical Bias, Item Response Theory, Multiple Choice Tests
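For context (notation ours, not reproduced from the article): the dichotomous Rasch model contains no guessing parameter,

    P(X_{ni} = 1 \mid \theta_n, \delta_i) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)},

where \theta_n is person proficiency and \delta_i is item difficulty, so successful guessing by low-proficiency examinees is absorbed into the \delta_i estimates and biases them.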
Peer reviewed
Pokropek, Artur – Journal of Educational and Behavioral Statistics, 2016
A response model that can detect guessing behaviors and produce unbiased estimates in low-stakes conditions using timing information is proposed. The model is a special case of the grade of membership model in which responses are modeled as partial members of a class that is affected by motivation and a class that responds only according to…
Descriptors: Reaction Time, Models, Guessing (Tests), Computation
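A rough sketch of the partial-membership idea (not the article's grade-of-membership specification; the notation and the uniform 1/k guessing probability are assumptions):

    P(X_{pi} = 1 \mid \theta_p, g_p) = g_p \cdot \tfrac{1}{k_i} + (1 - g_p)\, P_{\mathrm{IRT}}(X_{pi} = 1 \mid \theta_p),

where g_p \in [0, 1] is examinee p's membership in the unmotivated (guessing) class, informed by response times, and k_i is the number of options for item i.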
Peer reviewed
Stenlund, Tova; Sundström, Anna; Jonsson, Bert – Educational Psychology, 2016
This study examined whether practice testing with short-answer (SA) items benefits learning over time compared to practice testing with multiple-choice (MC) items, and rereading the material. More specifically, the aim was to test the hypotheses of "retrieval effort" and "transfer appropriate processing" by comparing retention…
Descriptors: Short Term Memory, Long Term Memory, Test Format, Testing
Peer reviewed
Couchman, Justin J.; Miller, Noelle E.; Zmuda, Shaun J.; Feather, Kathryn; Schwartzmeyer, Tina – Metacognition and Learning, 2016
Students often gauge their performance before and after an exam, usually in the form of rough grade estimates or general feelings. Are these estimates accurate? Should they form the basis for decisions about study time, test-taking strategies, revisions, subject mastery, or even general competence? In two studies, undergraduates took a real…
Descriptors: Higher Education, College Students, Tests, Metacognition
Peer reviewed
Lin, Jing-Wen – Journal of Science Education and Technology, 2016
This study adopted a quasi-experimental design with follow-up interviews to develop a computer-based two-tier assessment (CBA) on the science topic of electric circuits and to evaluate the diagnostic power of the assessment. Three assessment formats (i.e., paper-and-pencil, static computer-based, and dynamic computer-based tests) using…
Descriptors: Computer Assisted Testing, Diagnostic Tests, Quasiexperimental Design, Interviews
Falk, Carl F.; Cai, Li – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2015
We present a logistic function of a monotonic polynomial with a lower asymptote, allowing additional flexibility beyond the three-parameter logistic model. We develop a maximum marginal likelihood based approach to estimate the item parameters. The new item response model is demonstrated on math assessment data from a state, and a computationally…
Descriptors: Guessing (Tests), Item Response Theory, Mathematics Instruction, Mathematics Tests
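The general form implied by the abstract (notation ours) combines a lower asymptote c with a logistic function of a monotonic polynomial m(\theta):

    P(X = 1 \mid \theta) = c + (1 - c)\,\frac{1}{1 + \exp\{-m(\theta)\}},

which reduces to the three-parameter logistic (3PL) model when m(\theta) = a(\theta - b).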
Novacek, Paul – International Association for Development of the Information Society, 2013
Traditional knowledge assessments rely on multiple-choice questions that report only a right or wrong answer. The education system's reliance on this technique implies that a student who provides a correct answer purely through guesswork possesses knowledge equivalent to that of a student who actually knows the correct answer. A more complete…
Descriptors: Adult Learning, Multiple Choice Tests, Guessing (Tests), Confidence Testing
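Confidence-based assessment typically scores each answer jointly on correctness and the examinee's stated confidence. The Python sketch below illustrates that general idea; the payoff values are illustrative assumptions, not the scoring scheme proposed in the paper.

    # Confidence-weighted scoring for a single multiple-choice response.
    # The payoff table is illustrative only.
    PAYOFF = {
        ("correct", "high"): 2.0,
        ("correct", "low"): 1.0,
        ("wrong", "high"): -2.0,   # confident errors are penalized most
        ("wrong", "low"): -0.5,
    }

    def score_item(is_correct: bool, confidence: str) -> float:
        """Score one response given its correctness and stated confidence."""
        return PAYOFF[("correct" if is_correct else "wrong", confidence)]

    # A correct answer given with low confidence earns less than one given with
    # high confidence, distinguishing knowledge from lucky guessing.
    print(score_item(True, "low"), score_item(True, "high"))   # 1.0 2.0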
Peer reviewed
Huff, Mark J.; Balota, David A.; Hutchison, Keith A. – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2016
We examined whether 2 types of interpolated tasks (i.e., retrieval-practice via free recall or guessing a missing critical item) improved final recognition for related and unrelated word lists relative to restudying or completing a filler task. Both retrieval-practice and guessing tasks improved correct recognition relative to restudy and filler…
Descriptors: Testing, Guessing (Tests), Memory, Retention (Psychology)
Peer reviewed
Sideridis, Georgios; Tsaousis, Ioannis; Al Harbi, Khaleel – Educational and Psychological Measurement, 2017
The purpose of the present article was to illustrate, using an example from a national assessment, the value of analyzing the behavior of distractors in measures that use the multiple-choice format. A secondary purpose of the present article was to illustrate four remedial actions that can potentially improve the measurement of the…
Descriptors: Multiple Choice Tests, Attention Control, Testing, Remedial Instruction
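Distractor analysis of the kind described above usually tabulates, for each response option, the proportion of low- and high-scoring examinees who chose it; well-functioning distractors attract mainly low scorers. A minimal Python sketch under those assumptions (the data and cut score are invented for illustration, not taken from the article):

    from collections import Counter

    def distractor_table(choices, total_scores, cut):
        """Return {option: (p_low, p_high)} choice proportions by score group."""
        low = [c for c, s in zip(choices, total_scores) if s < cut]
        high = [c for c, s in zip(choices, total_scores) if s >= cut]
        low_n, high_n = Counter(low), Counter(high)
        return {o: (low_n[o] / max(len(low), 1), high_n[o] / max(len(high), 1))
                for o in sorted(set(choices))}

    choices = ["A", "B", "A", "C", "D", "A", "B", "A"]
    scores  = [ 12,  30,  14,  11,  10,  28,  29,  27]
    print(distractor_table(choices, scores, cut=20))
    # Options C and D are chosen only by low scorers, as effective distractors should be.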
Peer reviewed
Kornell, Nate – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2014
Attempting to retrieve information from memory enhances subsequent learning even if the retrieval attempt is unsuccessful. Recent evidence suggests that this benefit materializes only if subsequent study occurs immediately after the retrieval attempt. Previous studies have prompted retrieval using a cue (e.g., "whale-???") that has no…
Descriptors: Memory, Feedback (Response), Recall (Psychology), Word Lists
Peer reviewed
Kong, Xiaojing; Davis, Laurie Laughlin; McBride, Yuanyuan; Morrison, Kristin – Applied Measurement in Education, 2018
Item response time data were used in investigating the differences in student test-taking behavior between two device conditions: computer and tablet. Analyses were conducted to address the questions of whether or not the device condition had a differential impact on rapid guessing and solution behaviors (with response time effort used as an…
Descriptors: Educational Technology, Technology Uses in Education, Computers, Handheld Devices
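Building on the response-time-effort sketch given earlier in this listing, a device comparison can be summarized as the mean RTE within each condition; the values and group labels below are illustrative assumptions, not the study's data.

    def mean_rte_by_group(rte_by_examinee, group_by_examinee):
        """Average RTE within each device condition (e.g., computer vs. tablet)."""
        groups = {}
        for examinee, rte in rte_by_examinee.items():
            groups.setdefault(group_by_examinee[examinee], []).append(rte)
        return {g: sum(v) / len(v) for g, v in groups.items()}

    rte = {"s1": 0.95, "s2": 0.80, "s3": 0.99, "s4": 0.70}
    device = {"s1": "computer", "s2": "tablet", "s3": "computer", "s4": "tablet"}
    print(mean_rte_by_group(rte, device))  # roughly {'computer': 0.97, 'tablet': 0.75}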
Sahin, Füsun – ProQuest LLC, 2017
Examining the testing processes, as well as the scores, is needed for a complete understanding of the validity and fairness of computer-based assessments. Examinees' rapid guessing and insufficient familiarity with computers have been found to be major issues that weaken the validity arguments for scores. This study has three goals: (a) improving…
Descriptors: Computer Assisted Testing, Evaluation Methods, Student Evaluation, Guessing (Tests)
Peer reviewed
San Martin, Ernesto; Rolin, Jean-Marie; Castro, Luis M. – Psychometrika, 2013
In this paper, we study the identification of a particular case of the 3PL model, namely when the discrimination parameters are all constant and equal to 1. We term this the 1PL-G model. The identification analysis is performed under three different specifications. The first specification considers the abilities as unknown parameters. It is…
Descriptors: Item Response Theory, Models, Identification, Statistical Analysis
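The 1PL-G model named in the abstract is the 3PL with all discriminations fixed at 1 (notation ours):

    P(X_{pi} = 1 \mid \theta_p) = c_i + (1 - c_i)\,\frac{\exp(\theta_p - b_i)}{1 + \exp(\theta_p - b_i)},

where b_i is the item difficulty and c_i is the item's pseudo-guessing (lower-asymptote) parameter.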
Peer reviewed
Campbell, Mark L. – Journal of Chemical Education, 2015
Multiple-choice exams, while widely used, are necessarily imprecise due to the contribution of guessing to the final student score. This past year at the United States Naval Academy, the construction and grading scheme for the department-wide general chemistry multiple-choice exams were revised with the goal of decreasing the contribution of…
Descriptors: Multiple Choice Tests, Chemistry, Science Tests, Guessing (Tests)
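The abstract does not state the revised scheme, but the classical correction for guessing gives the flavor of how the guessing contribution can be removed from a raw score:

    S = R - \frac{W}{k - 1},

where R is the number of correct answers, W the number of wrong answers, and k the number of options per item; under blind guessing, the expected penalty exactly offsets the expected number of lucky correct answers.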