Showing all 3 results
Peer reviewed
Campbell, Rebecca; Goodman-Williams, Rachael; Feeney, Hannah; Fehler-Cabral, Giannina – American Journal of Evaluation, 2020
The purpose of this study was to develop triangulation coding methods for a large-scale action research and evaluation project and to examine how practitioners and policy makers interpreted both convergent and divergent data. We created a color-coded system that evaluated the extent of triangulation across methodologies (qualitative and…
Descriptors: Mixed Methods Research, Action Research, Data Interpretation, Coding
Peer reviewed
Hilton, Lara G.; Azzam, Tarek – American Journal of Evaluation, 2019
Evaluations that include stakeholders aim to understand their perspectives and to ensure that their views are represented. This article offers a new approach to gaining stakeholder perspectives through crowdsourcing. We recruited a sample of individuals with chronic low back pain through a crowdsourcing site. This sample coded textual data…
Descriptors: Qualitative Research, Stakeholders, Data Collection, Chronic Illness
Peer reviewed
Jacobson, Miriam R.; Whyte, Cristina E.; Azzam, Tarek – American Journal of Evaluation, 2018
Evaluators can work with brief units of text-based data, such as open-ended survey responses, text messages, and social media postings. Online crowdsourcing is a promising method for quantifying large amounts of text-based data by engaging hundreds of people to categorize the data. To further develop and test this method, individuals were…
Descriptors: Mixed Methods Research, Evaluation Methods, Comparative Analysis, Feedback (Response)