Conversation Workflows with Micro-breaks in Conversational Crowdsourcing

Abstract

With the growing popularity of conversational interfaces, new use cases in which conversation may be an effective medium for human-computer interaction are being explored. Recent work has shown that conversational interfaces can improve worker engagement in micro-task crowdsourcing and that task performance is comparable to that of web-based interfaces. This research explores how conversation workflows can be designed with micro-breaks to effectively support task execution in conversational crowdsourcing. A database model that supports conversation workflows with micro-breaks is proposed.
Furthermore, two experiments with 10 participants each were conducted to evaluate how micro-breaks affect worker engagement, preferences, and task performance. The results show that break proposals by the conversational system slightly increase worker engagement, task quality, and task duration. Moreover, workers were found to take fewer breaks than they would prefer, and break preferences vary among workers.
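The abstract does not detail the proposed database model. As a rough illustration only, a conversation workflow with interleaved task and break steps could be stored relationally as sketched below; all table and column names here are hypothetical and need not match the paper's actual model.

```python
import sqlite3

# Hypothetical minimal schema: a workflow is an ordered sequence of steps,
# where each step is either a crowdsourcing task or a micro-break.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE workflow (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE step (
    id            INTEGER PRIMARY KEY,
    workflow_id   INTEGER NOT NULL REFERENCES workflow(id),
    position      INTEGER NOT NULL,  -- order of the step in the conversation
    kind          TEXT NOT NULL CHECK (kind IN ('task', 'break')),
    prompt        TEXT,              -- message the conversational agent sends
    break_seconds INTEGER            -- duration, only used when kind = 'break'
);
""")

conn.execute("INSERT INTO workflow (id, name) VALUES (1, 'image-labeling')")
conn.executemany(
    "INSERT INTO step (workflow_id, position, kind, prompt, break_seconds) "
    "VALUES (?, ?, ?, ?, ?)",
    [
        (1, 1, 'task', 'Please label the next image.', None),
        (1, 2, 'break', 'Time for a short break!', 60),
        (1, 3, 'task', 'Please label the next image.', None),
    ],
)

# Replaying the workflow in conversation order yields task/break interleaving.
kinds = [row[0] for row in conn.execute(
    "SELECT kind FROM step WHERE workflow_id = 1 ORDER BY position")]
print(kinds)  # ['task', 'break', 'task']
```

Keeping breaks as first-class workflow steps lets the conversational agent propose them at predefined points while still recording, per worker, whether the break was accepted.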