Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem
Ujwal Gadiraju
Web Science 2016, Hannover, Germany
May 22nd, 2016
Vignettes from Previous Studies (1/2)
Sumita from Gujarat, India: "The requester gives the link, but when you open the link it shows that the survey is already closed; but first we accept the task only then we can open the link. [..] Sometimes we don’t get the completion code (at the end of a survey), then we cannot understand, we spent so much time – 1 hour, and wasted our time and we are not getting the code, then how are we going to submit the survey. [..] At that time, we have to return the HIT and write to the requester. Some requesters are very good and give immediate reply – ’sorry this happened, we will see to it’, but some don’t even bother to give us a reply whether it was their fault".
Vignettes from Case Studies (2/2)
From an AMT-related forum
general65: "I don’t like it. Another idiot professor who thinks he knows what’s best for the private market. This will only mean the government getting involved and regulating the requester’s which in turn will end up in less pay for us. Someone please tell this idiot professor to stay in the classroom."
Marketplace Dynamics & Reputation
Crowd workers aim to maintain high reputation scores.
● A worker’s good reputation gives access to more tasks.
Are crowdsourced tasks always
CLEAR?
STUDY - Methodology
● Survey deployed on CrowdFlower
● 100 independent workers
● Quality control (highest level workers)
● Attention-check questions
● 3 workers failed to pass at least one attention-check question
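The quality-control step above (excluding workers who fail an attention check) can be sketched as a simple filter. This is a minimal illustrative sketch, not the study's actual pipeline; the data structures and function name are assumptions.

```python
def filter_reliable_workers(responses, gold_answers):
    """Keep only workers who answered every attention-check question correctly.

    responses:    {worker_id: {question_id: answer}}  (hypothetical shape)
    gold_answers: {question_id: correct_answer} for the attention checks
    """
    return {
        worker_id: answers
        for worker_id, answers in responses.items()
        # A worker is kept only if all attention checks match the gold answer.
        if all(answers.get(q) == a for q, a in gold_answers.items())
    }


# Toy usage: one of three workers fails the attention check.
responses = {
    "w1": {"q1": "agree", "check1": "blue"},
    "w2": {"q1": "disagree", "check1": "red"},   # fails the check
    "w3": {"q1": "agree", "check1": "blue"},
}
reliable = filter_reliable_workers(responses, {"check1": "blue"})
```

In the study, this kind of filter would reduce the 100 respondents to the 97 who passed all attention checks.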
Experience of Workers
● 36% of workers ⇒ Crowd work is their primary source of income
● 32.6% of workers ⇒ Crowd work for 3-6 months
● 63.2% of workers ⇒ Crowd work for 1-2 years
● 3.2% of workers ⇒ Crowd work for 3-5 years
● Remaining workers ⇒ Crowd work for more than 5 years
● 74% completed > 500 tasks
What factors make tasks unclear?
Instructions & Task Description
vague, blank, unclear, inconsistent,
imprecise, ambiguous, or poor
Language Used in Description
too many words, high standard of English,
broken English, spelling
Task Clarity & Influence on Worker Performance (1/2)
● 49% of workers claimed that up to 30% of tasks were unclear
● 37% of workers claimed that between 31% - 60% of tasks were unclear
● 14% of workers claimed that more than 60% of tasks were unclear
Task Clarity & Influence on Worker Performance (2/2)
How do workers deal with unclear tasks? (1/2)
● 18% of workers used dictionaries or other tools to help them complete 50% of tasks
● 20% of workers used language translators to help them complete over 50% of tasks
How do workers deal with unclear tasks? (2/2)
Conclusions & Future Work
● Workers confront unclear tasks on a regular basis.
● Workers attempt to overcome these difficulties by using external help such as dictionaries or translators.
● Workers tend to complete unclear tasks despite not understanding the objectives.
WE (workers, requesters, platform) need mechanisms that can ensure task clarity
so that workers are not penalized for suboptimal performance on poorly
designed tasks.
More here . . .
https://www.l3s.de/~gadiraju/TrustInCW/
Paid Microtask Crowdsourcing
Crowd Workers
Task Administrators / Requesters
Payment vs. Quality of Work Conundrum: How Requesters Deal With Crowd Work
Qualification Tests to Select a Reliable Crowd
⇒ Using a sample/simulating real task as a qualification test.
● Should requesters pay workers for participating in pre-screening tests?
● Should this be proportional to the length & effort required during the pre-screening phase?
Impact of Poor Task Design
● Is a requester within her ethical rights to reject work without paying when the task design is poor?
● How can requesters share accountability of poor quality work?
● Could crowd workers rate the requesters’ task design and clarity of instructions before requesters are allowed to deploy the HITs?
Reject Work With/Without Paying Workers
● Should a piece of work be accepted or rejected? Requesters have complete authority in this adjudication.
● Is it fair for workers to accept full pay despite providing suboptimal work?
● What happens when optimal work is rejected by the requesters?
How Workers Deal With Crowd Work
Aspects that Hinder Crowd Workers
● Barriers such as language, technology and poor task design.
● The high variability of task types poses challenges with regard to workers’ familiarity with tasks.
● Where work is rejected unfairly, workers make efforts to get in touch with the platform / the requesters to state their case. This does not always receive a positive response.
Risks Crowd Workers Take
● Work environments may not always be appropriate, and the devices that workers use to complete tasks may not be ergonomically suitable.
● Workers handle daily chores in tandem with completing available work, subjecting them to distractions and disrupting their concentration and flow of work.
● Risks and issues generally remain invisible to the requesters, whose focus is usually on drawing out good work from the crowd.
● Should a worker who is using poorer equipment to provide a high quality of work be awarded a bonus (similar to corporates rewarding employees for working overtime or exhibiting extra efforts)?
● Should monetary incentives be a function of socio-economic aspects to an extent? Should requesters consider this in their task design?
Non-aggressive Filtering of Crowd Workers in Paid Crowdsourcing Microtasks