An investigation of using mobile and situated crowdsourcing to collect annotated travel activity data in real-world settings

Yung-Ju Chang*, Gaurav Paruthi, Hsin Ying Wu, Hsin Yu Lin, Mark W. Newman

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

15 Scopus citations

Abstract

Collecting annotated activity data is vital to many forms of context-aware system development. Leveraging a crowd of smartphone users to collect annotated activity data in the wild is a promising direction because the data collected are realistic and diverse. However, current research lacks a systematic analysis that compares different approaches for collecting such data and investigates how users employ these approaches in real-world settings. In this paper, we report results from a field study investigating the use of mobile crowdsourcing to collect annotated travel activity data through three approaches: Participatory, Context-Triggered In Situ, and Context-Triggered Post Hoc. We conducted two phases of analysis. In Phase One, we analyzed and compared the data collected via the three approaches, along with the associated user experience. In Phase Two, we analyzed users’ recording and annotation behavior, as well as the annotation content, for each approach in the field. Our results suggested that although the Context-Triggered approaches produced a larger number of recordings, they did not necessarily yield a larger quantity of data than the Participatory approach, because many of the recordings were unlabeled, incomplete, or fragmented owing to imperfect context detection. In addition, recordings collected with the Participatory approach tended to be more complete and to contain less noise. In terms of user experience, while users appreciated the convenience of automated recording and reminders, they highly valued the control over what and when to record and annotate that the Participatory approach provided. Finally, we showed that activity type (Driver, Riding as Passenger, Walking) influenced users’ behavior in recording and annotating their activity data: it affected the timing of recording and annotating with the Participatory approach, users’ receptivity with the Context-Triggered In Situ approach, and the characteristics of the annotation content. Based on these findings, we provide design and methodological recommendations for future work that aims to leverage mobile crowdsourcing to collect annotated activity data.

Original language: English
Pages (from-to): 81-102
Number of pages: 22
Journal: International Journal of Human Computer Studies
Volume: 102
DOIs
State: Published - 1 Jun 2017

Keywords

  • Annotated activity data collection
  • Crowdsensing
  • Mobile crowdsourcing
  • Travel activity
  • Wearable camera
