Showing all 3 results
Peer reviewed
Brian E. Clauser; Victoria Yaneva; Peter Baldwin; Le An Ha; Janet Mee – Applied Measurement in Education, 2024
Multiple-choice questions have become ubiquitous in educational measurement because the format allows for efficient and accurate scoring. Nonetheless, there remains continued interest in constructed-response formats. This interest has driven efforts to develop computer-based scoring procedures that can accurately and efficiently score these items.…
Descriptors: Computer Uses in Education, Artificial Intelligence, Scoring, Responses
Peer reviewed
Fail, Stefanie; Schober, Michael F.; Conrad, Frederick G. – International Journal of Social Research Methodology, 2021
To explore socially desirable responding in telephone surveys, this study examines response latencies in answers to 27 questions in a corpus of 319 audio-recorded voice interviews on iPhones. Response latencies were compared when respondents (a) answered questions on sensitive vs. nonsensitive topics (as classified by online raters); (b) produced…
Descriptors: Telephone Surveys, Handheld Devices, Responses, Interviews
Peer reviewed
Wang, Cong; Liu, Xiufeng; Wang, Lei; Sun, Ying; Zhang, Hongyan – Journal of Science Education and Technology, 2021
Assessing scientific argumentation is one of the main challenges in science education. Constructed-response (CR) items can be used to measure the coherence of student ideas and inform science instruction on argumentation. Published research on automated scoring of CR items has been conducted mostly in English writing, rarely in other languages. The…
Descriptors: Automation, Scoring, Accuracy, Responses