@MISC{Yin_bonusor, author = {Ming Yin and Yiling Chen}, title = {Bonus or Not? Learn to Reward in Crowdsourcing}, year = {} }
Abstract
Recent work has shown that the quality of work produced in a crowdsourcing working session can be influenced by the presence of performance-contingent financial incentives, such as bonuses for exceptional performance, in the session. We take an algorithmic approach to deciding when to offer bonuses in a working session so as to improve the overall utility that a requester derives from the session. Specifically, we propose and train an input-output hidden Markov model to learn the impact of bonuses on work quality, and then use this model to dynamically decide whether to offer a bonus on each task in a working session to maximize a requester's utility. Experiments on Amazon Mechanical Turk show that our approach leads to higher utility for the requester than fixed and random bonus schemes do. Simulations on synthesized data sets further demonstrate the robustness of our approach against different worker populations and worker behaviors in improving requester utility.
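The dynamic decision the abstract describes can be illustrated with a toy sketch: worker effort is a hidden state, the per-task transition matrix depends on whether a bonus is offered (the input-output HMM structure), and the requester offers a bonus only when the expected value gain exceeds the bonus cost. All numbers below are illustrative assumptions, not the trained parameters or the exact decision procedure from the paper.

```python
import numpy as np

# Hypothetical two-state model: state 0 = low effort, state 1 = high effort.
# T[bonus][i][j] = P(next state j | current state i, bonus offered or not).
T = np.array([
    [[0.9, 0.1],   # no bonus: effort tends to decay toward the low state
     [0.3, 0.7]],
    [[0.5, 0.5],   # bonus: pushes workers toward the high-effort state
     [0.2, 0.8]],
])
VALUE = np.array([0.2, 1.0])  # assumed requester value per task in each state
BONUS_COST = 0.15             # assumed cost of offering a bonus

def choose_bonus(belief):
    """One-step lookahead: offer a bonus iff the expected value gain
    from shifting the belief toward high effort exceeds the bonus cost."""
    payoffs = []
    for b in (0, 1):
        next_belief = belief @ T[b]              # belief after this task
        payoffs.append(next_belief @ VALUE - b * BONUS_COST)
    return int(payoffs[1] > payoffs[0]), payoffs

# A bonus pays off when the worker is likely in the low-effort state ...
print(choose_bonus(np.array([0.8, 0.2])))  # bonus offered
# ... but not when the worker is probably already exerting high effort.
print(choose_bonus(np.array([0.1, 0.9])))  # no bonus
```

In the paper the model is learned from Mechanical Turk data and the policy optimizes utility over the whole session rather than one step ahead; this sketch only shows why a state-dependent bonus policy can beat fixed or random schemes.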