Crisp Answers to Fuzzy Questions: Design lessons for crowdsourcing decision inputs



Alex Quinn, Ben Bederson

“Market research firm J.D. Power and Associates says […] more than 80% of buyers have already spent an average of 18 hours online researching models and prices, according to Google data.” Wall Street Journal, 2/26/2013

R. L. Polk & Co. / Autotrader.com, Automotive Buyer Study, 2011

Vacation itinerary

Location for headquarters

Grad school applications

Pediatrician

Car

Smartphone

DATA-DRIVEN DECISIONS

Building blocks for Mechanical Turk:

HITs (human intelligence tasks)

Keep instructions short.

Input labels should be unambiguous.

HITs must be grouped by common templates.

See Mechanical Turk Requester Best Practices Guide
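The template lessons above can be sketched with the MTurk Requester API's HTMLQuestion envelope, which is what a boto3 `create_hit` call would submit. The task prompt, input label, and the `build_html_question` helper are hypothetical illustrations; only the XML envelope and the (commented-out) `create_hit` parameters follow the real API.

```python
# Minimal sketch of a HIT template: short instructions, one
# unambiguous input label. The prompt/label values are invented.
import textwrap

def build_html_question(prompt, input_label):
    """Wrap a tiny form in MTurk's HTMLQuestion XML envelope."""
    html = textwrap.dedent(f"""\
        <!DOCTYPE html>
        <html><body>
          <p>{prompt}</p>
          <form>
            <label>{input_label}</label>
            <input type="text" name="answer">
          </form>
        </body></html>""")
    return (
        '<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/'
        'AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">'
        f"<HTMLContent><![CDATA[{html}]]></HTMLContent>"
        "<FrameHeight>450</FrameHeight>"
        "</HTMLQuestion>"
    )

# Keep instructions short; make the input label unambiguous.
question_xml = build_html_question(
    prompt="Find the pediatrician's office phone number.",
    input_label="Phone number (digits only):",
)

# Every HIT created from this one template lands in the same group:
# client = boto3.client("mturk")
# client.create_hit(Title="Look up a phone number", Reward="0.10",
#                   AssignmentDurationInSeconds=300,
#                   LifetimeInSeconds=86400, Question=question_xml)
```

Reusing one template keeps the HITs grouped, so workers who learn the task once can do many of them.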

Example #1: Find a pediatrician

Requirements

• Accepts my insurance

• ≥4 stars at RateMDs.com

• >80% positive at HealthGrades.com

• ≤15 minutes drive from home

Effort should be proportional to the reward.

HITs in a group share a common base price.

Information sources should be traceable.

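Screening the crowd's answers against the requirements above is mechanical once each submission carries its numbers and a source link. A sketch, with invented field names and sample data; the thresholds are the ones on the slide:

```python
# Screen crowd-collected pediatrician candidates against the slide's
# four requirements. Field names and records are hypothetical.
def meets_requirements(candidate):
    return (
        candidate["accepts_insurance"]                    # insurance
        and candidate["ratemds_stars"] >= 4               # RateMDs
        and candidate["healthgrades_positive_pct"] > 80   # HealthGrades
        and candidate["drive_minutes"] <= 15              # drive time
    )

candidates = [
    {"name": "Dr. A", "accepts_insurance": True, "ratemds_stars": 4.5,
     "healthgrades_positive_pct": 91, "drive_minutes": 10,
     "source_url": "https://example.com/dr-a"},  # traceable source
    {"name": "Dr. B", "accepts_insurance": True, "ratemds_stars": 3.5,
     "healthgrades_positive_pct": 85, "drive_minutes": 12,
     "source_url": "https://example.com/dr-b"},
]

shortlist = [c["name"] for c in candidates if meets_requirements(c)]
# shortlist == ["Dr. A"]  (Dr. B misses the 4-star threshold)
```

Requiring a `source_url` with every record is one way to keep the information sources traceable.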

Example #2: Buy a stroller

Requirements

• Fits a 30-pound baby

• Reclines for sleeping

• Medium/large-sized soft tires

• Can purchase online in US

Bonus offers allow reward to scale with effort

Find creative ways to track sources
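One way to make reward scale with effort, sketched below: every HIT in the group keeps its common base price, and a per-item bonus covers extra findings. The rates are hypothetical; on MTurk the bonus itself would be granted after review via the SendBonus operation.

```python
# Effort-proportional pay: a fixed base price shared by the HIT group
# plus a bonus per verified extra item. Rates are invented.
BASE_REWARD = 0.10     # USD, common base price for every HIT in the group
BONUS_PER_ITEM = 0.05  # USD, paid for each item beyond the first

def total_pay(items_found):
    extra = max(items_found - 1, 0)
    return round(BASE_REWARD + BONUS_PER_ITEM * extra, 2)

# A worker who finds three qualifying strollers (each submitted with a
# product URL so the source stays traceable):
pay = total_pay(3)  # 0.10 + 2 * 0.05 = 0.20
```

The base price stays constant across the group, so the listing is honest, while the bonus absorbs the variance in effort.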

Design lessons

1) Consider effort-reward balance from the start.

2) Look for implicit ways of capturing sources.

3) Use word economy to conserve vertical space.

4) Choose unambiguous input labels.

Alex Quinn
aq@cs.umd.edu
