PROJECT TITLE:
Personalized and Diverse Task Composition in Crowdsourcing - 2018
We study task composition in crowdsourcing and the effect of personalization and diversity on performance. A central process in crowdsourcing is task assignment, the mechanism through which workers find tasks. On popular platforms such as Amazon Mechanical Turk, task assignment is facilitated by the ability to sort tasks by dimensions such as creation date or reward amount. Task composition improves task assignment by producing, for each worker, a personalized summary of tasks, referred to as a Composite Task (CT). We propose different ways of producing CTs and formulate an optimization problem that finds, for a given worker, the most relevant and diverse CTs. We show empirically that workers' experience is greatly improved by personalization, which matches CTs to workers' skills and preferences. We also study and formalize various ways of diversifying the tasks within each CT. Task diversity is grounded in organization studies, which have shown its impact on worker motivation. Our experiments show that diverse CTs contribute to improving outcome quality. More specifically, we show that while task throughput and worker retention are best with ranked lists, crowdwork quality reaches its best with CTs diversified by requester, confirming that workers seek to expose their good work to many requesters.
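To make the idea concrete, here is a minimal sketch of how a CT might be assembled greedily, trading off relevance to a worker's skills against requester diversity. The scoring model, the `relevance_weight` parameter, and the task fields (`requester`, `skills`) are illustrative assumptions, not the exact optimization formulation used in the project.

```python
# Illustrative sketch only: greedily build one Composite Task (CT) for a worker,
# balancing task relevance (skill overlap) against requester diversity.
# All field names and the scoring model are assumptions for demonstration.

def build_ct(tasks, worker_skills, ct_size=3, relevance_weight=0.7):
    """Greedily select `ct_size` tasks, scoring each candidate by a weighted
    mix of relevance to the worker's skills and requester diversity."""
    selected = []
    requesters_seen = set()
    candidates = list(tasks)
    while candidates and len(selected) < ct_size:
        def score(task):
            # Relevance: fraction of the task's required skills the worker has.
            skills = task["skills"]
            relevance = len(skills & worker_skills) / len(skills) if skills else 0.0
            # Diversity: reward tasks from requesters not yet in this CT.
            diversity = 0.0 if task["requester"] in requesters_seen else 1.0
            return relevance_weight * relevance + (1 - relevance_weight) * diversity
        best = max(candidates, key=score)
        candidates.remove(best)
        selected.append(best)
        requesters_seen.add(best["requester"])
    return selected


# Hypothetical usage: three tasks from two requesters, one bilingual worker.
tasks = [
    {"id": 1, "requester": "A", "skills": {"translate"}},
    {"id": 2, "requester": "A", "skills": {"translate", "review"}},
    {"id": 3, "requester": "B", "skills": {"review"}},
]
ct = build_ct(tasks, worker_skills={"translate", "review"}, ct_size=2)
# Tasks 2 and 3 are equally relevant, but task 3 wins the second slot
# because it adds a new requester to the CT.
```

Note the design choice: with diversity by requester, the second slot goes to requester B's task even though requester A offers another perfectly relevant task, which mirrors the finding that diversifying by requester improves crowdwork quality.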