Participatory Privacy in Urban Sensing
Katie Shilton, Jeff Burke, Deborah Estrin, Mark Hansen, Mani B. Srivastava
MODUS 2008: April 21, 2008
Talked to your IRB today?
• Respect
• Beneficence
• Justice
Broad principles balance risk & discovery for many kinds of investigations.
Are there principles like this for urban sensing?
• Confidentiality
• Informed consent
• Statement of risks
Investigations by the public
Close to individuals & intermixed in daily life.
Wide-spread ability to collect & share data.
Pilots: PEIR, CBE

Participatory sensing: campaigns to help people gather data and make a case.

Influences on participatory sensing:
• Community-based participatory research (CBPR)
• Participatory action research (PAR) [1, 2]
• Participatory design (PD) [3]
Privacy
• Contextual privacy [4]
• Information ethics [5-7]

Participatory Privacy Regulation
Where privacy regulation fits

[Diagram: system designers, campaign groups, and participants, connected by decisions about boundaries, trust & commitment, and the process of system design and use]
[Diagram: decisions about boundaries (control over capture, resolution, sharing, and retention) mapped onto stages of participation: campaign goals, instrument design, data collection, and data analysis]
Technical approaches to privacy
Existing toolbox includes:
• Privacy warning, notification, or feedback systems [8-10];
• User control over data sharing [11];
• Identity management systems [12];
• Selective retention systems [8];
• Encryption, privacy-enhancing technologies [13];
• Statistical anonymization of data [4];
• Data retention or its opposite, ‘forgetting’ [14, 15].
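As a concrete illustration of one toolbox entry (reducing the resolution of shared data before it leaves the device), here is a minimal sketch; the function name and the two-decimal default are illustrative assumptions, not part of any cited system.

```python
def coarsen_location(lat, lon, decimals=2):
    """Round GPS coordinates to reduce spatial resolution.

    At two decimal places, positions blur to roughly a 1 km grid,
    trading precision for privacy before data are shared.
    """
    return (round(lat, decimals), round(lon, decimals))

# A precise fix becomes a coarse grid cell:
print(coarsen_location(34.06894, -118.44518))  # (34.07, -118.45)
```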
7
Participatory privacy regulation design guidelines
1. Participant primacy
2. Participatory design
3. Participant autonomy
4. Minimal, auditable information
5. Synergy between policy & technology
8
Design guideline 1: participant primacy
Feature examples:
• Data visualization, interfaces (where did I go today?)
• Alerts and reminders (it’s 9 pm: turn sensing off!)
Challenges:
• Legible interfaces
• Developing effective alert mechanisms that do not disrupt data collection or annoy participants

Help users take on the role and responsibilities of investigators.
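The "it's 9 pm: turn sensing off!" reminder above could be as simple as a quiet-hours check in the sensing client. This is a hypothetical sketch; the window boundaries and function name are assumptions for illustration.

```python
from datetime import datetime, time

# Assumed participant-configured quiet window, 9 pm to 7 am.
QUIET_START = time(21, 0)
QUIET_END = time(7, 0)

def should_prompt_pause(now: datetime) -> bool:
    """True if the current time falls in the quiet window (which wraps midnight)."""
    t = now.time()
    return t >= QUIET_START or t < QUIET_END

print(should_prompt_pause(datetime(2008, 4, 21, 21, 30)))  # True
print(should_prompt_pause(datetime(2008, 4, 21, 12, 0)))   # False
```

The design tension the slide names is real: prompting too often annoys participants, while prompting too rarely risks capturing data they meant to suppress.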
Design guideline 2: participatory design
Customizable features:
• Data representation (can you see our houses?)
• Selective sharing (share only with campaign leaders)
• Retention, reuse (we don’t need data after Jan 1, 2009)
Challenges:
• Building flexible systems to adjust capture, storage, and representation of data.
• Achieving flexibility early in the design process.

Customize systems to campaign needs.
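Campaign-negotiated choices like "share only with campaign leaders" and "we don't need data after Jan 1, 2009" could be captured as a small policy object that the system enforces. A minimal sketch, with all names and fields assumed for illustration:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CampaignPolicy:
    """Illustrative campaign-level settings negotiated during design."""
    share_with: set        # roles allowed to view raw data
    retain_until: date     # delete everything after this date

policy = CampaignPolicy(share_with={"campaign_leader"},
                        retain_until=date(2009, 1, 1))

def may_view(role: str) -> bool:
    return role in policy.share_with

def is_expired(today: date) -> bool:
    return today > policy.retain_until

print(may_view("campaign_leader"))   # True
print(is_expired(date(2009, 2, 1)))  # True
```

Representing the negotiated agreement as data, rather than hard-coding it, is what makes the system adjustable campaign by campaign.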
Design guideline 3: participant autonomy
Feature examples:
• Discretion tools (replace this trip with ‘average’ trip)
• Selective retention (delete from 9 to 10 am)

Challenges:
• Building discretion tools
• Analyzing incomplete and/or falsified data
• Logging use of discretion tools
Enabling participants to negotiate privacy context
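Selective retention ("delete from 9 to 10 am") combined with the logging challenge above might look like the following sketch, which removes samples in a window while recording that a redaction occurred, so analysts can tell deliberate gaps from sensing failures. Names and record shapes are assumptions.

```python
from datetime import datetime

def redact_window(samples, start, end, log):
    """Drop samples with timestamps in [start, end) and log the redaction."""
    kept = [s for s in samples if not (start <= s["t"] < end)]
    log.append({"redacted": (start, end),
                "n_removed": len(samples) - len(kept)})
    return kept

samples = [{"t": datetime(2008, 4, 21, h)} for h in (8, 9, 10)]
log = []
kept = redact_window(samples,
                     datetime(2008, 4, 21, 9),
                     datetime(2008, 4, 21, 10),
                     log)
print(len(kept), log[0]["n_removed"])  # 2 1
```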
Design guideline 4: minimal, auditable information
Feature examples:
• Parsimonious sensors (collect location using only cell tower triangulation)
• Processing close to source
• Audit mechanisms (log who accesses data)

Challenges:
• Designing systems that support and benefit from minimal data collection.
• Building auditing mechanisms that are viewable, legible, and usable by participants.
Parsimonious capture, watchdogs
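The "log who accesses data" audit mechanism reduces, in its simplest form, to wrapping every read in an append-only record. A sketch under assumed names; a real system would persist the log in participant-viewable storage:

```python
from datetime import datetime, timezone

AUDIT_LOG = []  # stands in for append-only, participant-viewable storage

def read_record(store, record_id, accessor):
    """Return a record and log who accessed it and when."""
    AUDIT_LOG.append({
        "who": accessor,
        "record": record_id,
        "when": datetime.now(timezone.utc).isoformat(),
    })
    return store[record_id]

store = {"trip-42": {"miles": 3.1}}
read_record(store, "trip-42", "analyst@campaign")
print(AUDIT_LOG[0]["who"])  # analyst@campaign
```

The slide's harder challenge is not capturing these entries but presenting them so participants can actually read and act on them.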
Design guideline 5: synergy between policy & technology
Achieved through:
• Sharing responsibility
• Discussing which problems are best addressed by policy vs. technology.

Challenges:
• Authoring policy to support technology
• Designing technology to support policy.
Software and hardware can’t do everything
In the future: evaluation
How well do these principles – and resultant software – work?
Meet design challenges: negotiating policy, building discretion tools, audit mechanisms, etc.
Log data: measuring use of the privacy regulation features
Interviews: evaluating participant trust of systems
Participant observation: determining when and why participants feel boundary or identity sensitivities; evaluating whether systems adequately address these sensitivities
Participant critique: of design methods, software, and conclusions
Conclusions
Participation over restriction
Balance between privacy and participation enables sensing systems to reach their research, empowerment, and documentary potential.
Participation in restriction
Enables participants to limit sensing according to their needs and values.
Citations
[1] M. Cargo and S. L. Mercer, "The value and challenges of participatory research: strengthening its practice," Annual Review of Public Health, vol. 29, 2008.
[2] E. Byrne and P. M. Alexander, "Questions of ethics: Participatory information systems research in community settings," in SAICSIT, Cape Winelands, South Africa, 2006, pp. 117-126.
[3] S. Pilemalm and T. Timpka, "Third generation participatory design in health informatics - making user participation applicable to large-scale information system projects," Journal of Biomedical Informatics, 2007 (in press).
[4] H. Nissenbaum, "Privacy as contextual integrity," Washington Law Review, vol. 79, pp. 119–158, 2004.
[5] J. Waldo, H. S. Lin, and L. I. Millett, Engaging privacy and information technology in a digital age. Washington, D.C.: The National Academies Press, 2007.
[6] L. Palen and P. Dourish, "Unpacking "privacy" for a networked world," in CHI 2003. vol. 5 Ft. Lauderdale, FL: ACM, 2003, pp. 129-136.
[7] J. E. Cohen, "Privacy, Visibility, Transparency, and Exposure," University of Chicago Law Review, vol. 75, 2008.
[8] G. R. Hayes, E. S. Poole, G. Iachello, S. N. Patel, A. Grimes, G. D. Abowd, and K. N. Truong, "Physical, social and experiential knowledge in pervasive computing environments," Pervasive Computing, vol. 6, pp. 56-63, 2007.
Citations cont.
[9] M. S. Ackerman and L. Cranor, "Privacy critics: UI components to safeguard users' privacy," in Conference on Human Factors in Computing Systems CHI '99: ACM Publications, 1999, pp. 258-259.
[10] D. H. Nguyen and E. D. Mynatt, "Privacy mirrors: understanding and shaping socio-technical ubiquitous computing systems," Georgia Institute of Technology GIT-GVU-02-16, 2002.
[11] D. Anthony, D. Kotz, and T. Henderson, "Privacy in location-aware computing environments," Pervasive Computing, vol. 6, pp. 64-72, 2007.
[12] S. Patil and J. Lai, "Who gets to know what when: configuring privacy permissions in an awareness application," in SIGCHI Conf. Human Factors in Computing Systems (CHI 05) Portland, Oregon: ACM Press, 2005, pp. 101–110.
[13] H. Burkert, "Privacy-enhancing technologies: Typology, critique, vision," in Technology and privacy: The new landscape, P. E. Agre and M. Rotenberg, Eds. Cambridge, MA and London: The MIT Press, 1998, pp. 125-142.
[14] L. Bannon, "Forgetting as a feature, not a bug: the duality of memory and implications for ubiquitous computing," CoDesign, vol. 2, pp. 3-15, 2006.
[15] J.-F. Blanchette and D. G. Johnson, "Data retention and the panoptic society: the social benefits of forgetfulness," The Information Society, vol. 18, 2002.