How Crowdsourcable is Your Task?

Lately, crowdsourcing has become an accepted means of creating resources for tasks that require human intelligence. Information Retrieval and related fields frequently rely on it for system building and evaluation purposes. However, malicious workers often try to maximise their financial gains by producing generic answers rather than actually working on the task. Identifying these individuals is a challenging process into which both crowdsourcing providers and requesters invest significant amounts of time.
Based on our experience from several crowdsourcing experiments, Arjen P. de Vries and I compiled an overview of malicious strategies typically observed on crowdsourcing platforms and evaluated how careful HIT design can discourage cheaters. Across a range of experiments, we conclude that malicious workers are encountered less frequently in novel tasks that involve a degree of creativity and abstraction. While there are various means of identifying forged submissions, setting tasks up in a non-repetitive way and requiring creative input can greatly increase the share of faithful workers.

“How Crowdsourcable is Your Task?” will be presented at the WSDM 2011 Workshop on Crowdsourcing for Search and Data Mining (CSDM) in Hong Kong, China.