
Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem



Page 1: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Ujwal Gadiraju

Web Science 2016, Hannover, Germany

May 22nd, 2016


Page 2: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Vignettes from Previous Studies (1/2)

Sumita from Gujarat, India: "The requester gives the link, but when you open the link it shows that the survey is already closed; but first we accept the task only then we can open the link. [..] Sometimes we don’t get the completion code (at the end of a survey), then we cannot understand, we spent so much time – 1 hour, and wasted our time and we are not getting the code, then how are we going to submit the survey. [..] At that time, we have to return the HIT and write to the requester. Some requesters are very good and give immediate reply – ’sorry this happened, we will see to it’, but some don’t even bother to give us a reply whether it was their fault".

Page 3: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Vignettes from Case Studies (2/2)

From an AMT-related forum

general65: "I don’t like it. Another idiot professor who thinks he knows what’s best for the private market. This will only mean the government getting involved and regulating the requester’s which in turn will end up in less pay for us. Someone please tell this idiot professor to stay in the classroom."

Page 4: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Marketplace Dynamics & Reputation

Crowd workers aim to maintain high reputation scores.

● A worker’s good reputation gives access to more tasks.
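For intuition, here is a minimal sketch of this kind of reputation gating, assuming a hypothetical platform that tracks approved and submitted assignment counts per worker and shows a task only to workers whose approval rate meets the task's requirement; the field names and thresholds are illustrative, not any real platform's API.

```python
# Illustrative reputation gate: a worker's approval rate decides which
# tasks (each with a minimum-reputation requirement) are visible to them.
# All field names and thresholds here are hypothetical.

def approval_rate(approved: int, submitted: int) -> float:
    """Fraction of submitted assignments that were approved."""
    return approved / submitted if submitted else 0.0

def eligible_tasks(worker, tasks):
    """Return the tasks whose reputation requirement the worker meets."""
    rate = approval_rate(worker["approved"], worker["submitted"])
    return [t for t in tasks if rate >= t["min_approval_rate"]]

if __name__ == "__main__":
    worker = {"approved": 970, "submitted": 1000}  # 97% approval rate
    tasks = [
        {"name": "image labeling", "min_approval_rate": 0.95},
        {"name": "survey", "min_approval_rate": 0.99},
    ]
    print([t["name"] for t in eligible_tasks(worker, tasks)])  # ['image labeling']
```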

Page 5: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Are crowdsourced tasks always

CLEAR?

Page 6: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

STUDY - Methodology

● Survey deployed on CrowdFlower

● 100 independent workers

● Quality control (highest level workers)

● Attention-check questions

● 3 workers failed to pass at least one attention-check question (filtering sketched below)
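Below is a minimal sketch (not the study's actual analysis code) of the attention-check filtering described above, assuming survey responses are stored as dictionaries with hypothetical keys worker_id and attention_checks.

```python
# Hypothetical filtering step: discard workers who failed at least one
# attention-check question before analyzing the survey responses.

def passed_all_checks(response):
    """A response is kept only if every attention-check answer was correct."""
    return all(response["attention_checks"])

def filter_responses(responses):
    """Split responses into retained and discarded sets."""
    retained = [r for r in responses if passed_all_checks(r)]
    discarded = [r for r in responses if not passed_all_checks(r)]
    return retained, discarded

if __name__ == "__main__":
    # Toy data: 'attention_checks' holds one boolean per check question.
    responses = [
        {"worker_id": "w1", "attention_checks": [True, True]},
        {"worker_id": "w2", "attention_checks": [True, False]},  # fails one check
        {"worker_id": "w3", "attention_checks": [True, True]},
    ]
    retained, discarded = filter_responses(responses)
    print(f"retained {len(retained)} workers, discarded {len(discarded)}")
```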

Page 7: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Experience of Workers

● 36% of workers ⇒ crowd work is their primary source of income

● 32.6% of workers ⇒ doing crowd work for 3-6 months

● 63.2% of workers ⇒ doing crowd work for 1-2 years

● 3.2% of workers ⇒ doing crowd work for 3-5 years

● Remaining workers ⇒ doing crowd work for more than 5 years

● 74% of workers have completed > 500 tasks

Page 8: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

What factors make tasks unclear?

Page 9: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

What factors make tasks unclear?

Instructions & Task Description

vague, blank, unclear, inconsistent, imprecise, ambiguous, or poor

Language Used in Description

too many words, high standard of English, broken English, spelling

Page 10: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Task Clarity & Influence on Worker Performance (1/2)

● 49% of workers claimed that up to 30% of tasks were unclear

● 37% of workers claimed that between 31% and 60% of tasks were unclear

● 14% of workers claimed that more than 60% of tasks were unclear

Page 11: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Task Clarity & Influence on Worker Performance (2/2)

Page 12: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

How do workers deal with unclear tasks? (1/2)

● 18% of workers used dictionaries or other tools to help them complete 50% of tasks

● 20% of workers used language translators to help them complete over 50% of tasks

Page 13: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

How do workers deal with unclear tasks? (2/2)

Page 14: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Conclusions & Future Work

● Workers confront unclear tasks on a regular basis.

● Workers attempt to overcome these difficulties by using external help such as dictionaries or translators.

● Workers tend to complete unclear tasks despite not understanding the objectives.

Page 15: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

WE (workers, requesters, platform) need mechanisms that can ensure task clarity, so that workers are not penalized for suboptimal performance on poorly designed tasks.

Page 16: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Contact Details:

[email protected]

Twitter: @UjLaw

http://www.l3s.de/~gadiraju/

Page 17: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

More here . . .

https://www.l3s.de/~gadiraju/TrustInCW/

Page 18: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Paid Microtask Crowdsourcing

[Diagram: interaction between Crowd Workers and Task Administrators / Requesters]

Page 19: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Payment vs. Quality of Work Conundrum: How Requesters Deal With Crowd Work

Qualification Tests to Select a Reliable Crowd

⇒ Using a sample of, or a simulation of, the real task as a qualification test (a selection sketch follows the questions below).

● Should requesters pay workers for participating in pre-screening tests?

● Should this be proportional to the length & effort required during the pre-screening phase?
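A minimal sketch of the qualification-test selection mentioned above, under assumptions: the requester holds gold answers for a small sample of the real task and admits only candidates whose accuracy on that sample clears a threshold. The 0.8 cutoff and field names are hypothetical, not a prescribed policy.

```python
# Hypothetical pre-screening: score each candidate's answers on a sample
# of the real task against gold answers, then admit the high scorers.

def qualification_score(answers, gold):
    """Fraction of sample questions the candidate answered correctly."""
    correct = sum(a == g for a, g in zip(answers, gold))
    return correct / len(gold)

def select_crowd(candidates, gold, threshold=0.8):
    """Admit candidates whose qualification score meets the threshold."""
    return [c["worker_id"] for c in candidates
            if qualification_score(c["answers"], gold) >= threshold]

if __name__ == "__main__":
    gold = ["cat", "dog", "dog", "cat", "cat"]
    candidates = [
        {"worker_id": "w1", "answers": ["cat", "dog", "dog", "cat", "cat"]},  # 1.0
        {"worker_id": "w2", "answers": ["dog", "dog", "cat", "cat", "cat"]},  # 0.6
    ]
    print(select_crowd(candidates, gold))  # ['w1']
```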

Page 20: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Payment vs. Quality of Work Conundrum: How Requesters Deal With Crowd Work

Impact of Poor Task Design

● Is a requester within her ethical rights to reject work without paying when the task design is poor?

● How can requesters share accountability of poor quality work?

● Could crowd workers rate the requesters’ task design and clarity of instructions before requesters are allowed to deploy the HITs?
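One way the last question could work in practice, sketched under assumptions: a small pilot batch of workers rates the clarity of the instructions on a 1-5 scale, and the HITs are deployed only if the mean rating clears a threshold. The function names and the 3.5 cutoff are hypothetical.

```python
# Hypothetical pre-deployment gate: average pilot workers' clarity ratings
# (1-5) of the task instructions and block deployment if the mean is low.

from statistics import mean

def clarity_gate(pilot_ratings, threshold=3.5):
    """Return (deployable, mean_rating) for a batch of pilot ratings."""
    avg = mean(pilot_ratings)
    return avg >= threshold, avg

if __name__ == "__main__":
    ratings = [4, 5, 3, 4, 2]  # clarity ratings from five pilot workers
    ok, avg = clarity_gate(ratings)
    if ok:
        print(f"deploy HITs (mean clarity {avg:.1f})")
    else:
        print(f"revise instructions first (mean clarity {avg:.1f})")
```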

Page 21: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Payment vs. Quality of Work Conundrum: How Requesters Deal With Crowd Work

Reject Work With/Without Paying Workers

● Should a piece of work be accepted or rejected? Requesters have complete authority in this adjudication.

● Is it fair on the part of the workers to accept full pay despite providing sub-optimal work?

● What happens when optimal work is rejected by the requesters?

Page 22: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

How Workers Deal With Crowd Work

Aspects that Hinder Crowd Workers

● Barriers such as language, technology and poor task design.

● The high variability of task types poses challenges with regard to workers’ familiarity with tasks.

● When work is rejected unfairly, workers make efforts to get in touch with the platform or the requesters to state their case. This does not always receive a positive response.

Page 23: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

How Workers Deal With Crowd Work

Risks Crowd Workers Take

● Work environments may not always be appropriate, and the devices that workers use to complete tasks may not be ergonomically suitable.

● Workers often handle daily chores in tandem with completing available work, which subjects them to distractions and disrupts their concentration and flow of work.

● Risks and issues generally remain invisible to the requesters, whose focus is usually on drawing out good work from the crowd.

Page 24: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

How Workers Deal With Crowd Work

Risks Crowd Workers Take

● Should a worker who uses poorer equipment but provides high-quality work be awarded a bonus (similar to corporations rewarding employees for working overtime or exhibiting extra effort)?

● Should monetary incentives be a function of socio-economic aspects to an extent? Should requesters consider this in their task design?

Page 25: Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem

Non-aggressive Filtering of Crowd Workers in Paid Crowdsourcing Microtasks