Showing all 5 results
Peer reviewed
Bostic, Jonathan David – Applied Measurement in Education, 2021
Think alouds are valuable tools for academicians, test developers, and practitioners as they provide a unique window into a respondent's thinking during an assessment. The purpose of this special issue is to highlight novel ways to use think alouds as a means to gather evidence about respondents' thinking. An intended outcome from this special…
Descriptors: Protocol Analysis, Cognitive Processes, Data Collection, STEM Education
Peer reviewed
Bostic, Jonathan David; Sondergeld, Toni A.; Matney, Gabriel; Stone, Gregory; Hicks, Tiara – Applied Measurement in Education, 2021
Response process validity evidence provides a window into a respondent's cognitive processing. The purpose of this study is to describe a new data collection tool called a whole-class think aloud (WCTA). This work is performed as part of test development for a series of problem-solving measures to be used in elementary and middle grades. Data from…
Descriptors: Data Collection, Protocol Analysis, Problem Solving, Cognitive Processes
Peer reviewed
Bonner, Sarah M.; D'Agostino, Jerome V. – Applied Measurement in Education, 2012
We investigated examinees' cognitive processes while they solved selected items from the Multistate Bar Exam (MBE), a high-stakes professional certification examination. We focused on ascertaining those mental processes most frequently used by examinees, and the most common types of errors in their thinking. We compared the relationships between…
Descriptors: Cognitive Processes, Test Items, Problem Solving, Thinking Skills
Peer reviewed
Leighton, Jacqueline P.; Cui, Ying; Cor, M. Ken – Applied Measurement in Education, 2009
The objective of the present investigation was to compare the adequacy of two cognitive models for predicting examinee performance on a sample of algebra I and II items from the March 2005 administration of the SAT™. The two models included one generated from verbal reports provided by 21 examinees as they solved the SAT™ items, and the…
Descriptors: Test Items, Inferences, Cognitive Ability, Prediction
Peer reviewed
Kobrin, Jennifer L.; Young, John W. – Applied Measurement in Education, 2003
Studied the cognitive equivalence of computerized and paper-and-pencil reading comprehension tests using verbal protocol analysis. Results for 48 college students indicate that the only significant difference between the computerized and paper-and-pencil tests was in the frequency of identifying important information in the passage. (SLD)
Descriptors: Cognitive Processes, College Students, Computer Assisted Testing, Difficulty Level