
Context Fabric: Privacy Support for Ubiquitous Computing

Jason I. Hong

Group for User Interface Research

University of California, Berkeley

Apr 24 2003 2

Ubiquitous Computing Scenario

• Diversity of devices
• Mobile and embedded
• Many kinds of interactions
• Many kinds of sensors
• All networked together

Apr 24 2003 3

Privacy and Ubicomp

• Tension: information can be used for great benefit and great harm

• Privacy is the most often-cited criticism of Ubicomp
– “The Boss That Never Blinks” [San Jose Mercury News 1992]

• What is new here is the scale of Ubicomp
– Past: costly to collect, store, and use info
– Future: everywhere, always on, far easier to collect data

Apr 24 2003 4

Problem

• Hard to create privacy-aware Ubicomp systems

– Hard to analyze privacy
• What should the privacy goals be?
• Which system interactions should we focus on?

– Hard to implement privacy-aware systems
• What are the basic abstractions?
• What are the privacy mechanisms?

Apr 24 2003 5

Solution Overview

• Approximate Information Flows
– Framework for analyzing privacy in terms of info flow
– Minimize flow out of sensitive data, maximize flow back in of how that data is used

• Context Fabric, infrastructure for privacy-aware apps
– InfoSpaces, repositories of personal data
– Operators, reusable mechanisms for managing info flow

• Evaluation through building apps
– Person Finder & Building Emergency Response

Apr 24 2003 6

Talk Outline

• Motivation
• Privacy and Managing Information Flows
• Architectural Overview of Context Fabric
• Applications Built with Context Fabric
• Conclusions

Apr 24 2003 7

Defining Information Privacy

• Different kinds of privacy
– Territorial, Bodily, Communications, Information

• Information Privacy conflates many issues
– Security, Confidentiality, Anonymity

• Defining Information Privacy [Westin 1967]
– “Privacy is the claim of individuals, groups or institutions to determine for themselves when, how, and to what extent information about them is communicated to others”

• My work is on providing end-users with greater control and understanding

Apr 24 2003 8

Privacy & Managing Information Flows

• Control & understanding hard due to how info flows

• Examples
– Collecting info without person knowing
– Sharing (or selling) info without person knowing

• Design Goal: Manage information flows by
– Minimizing flow of outgoing sensitive data (control)
– Maximizing flow of incoming data about use (understanding)

[Diagram: information flows out from You to Service Providers, and information about how the data is used flows back in]

Apr 24 2003 9

Example of Managing Information Flow

Alice, a Tourist
• First time touring France
• PDA, GPS, maps, wireless

Bob, Provides Real-time Tourist Info
• Lines at Museums
• Current Events
• Recommendations
• Route Finder

Carol, a Tour Operator
• Sets up tour packages
• Wants demographics
• Wants places visited

Apr 24 2003 10

Example of Managing Information Flow

Alice, a Tourist

• Reads a review
• Finds Bob's website
• Skims privacy policy
• Decides to try

Bob, Provides Real-time Tourist Info

Apr 24 2003 11

Example of Managing Information Flow

Alice, a Tourist

• Basic service
– Demographics + City
– Events, museum lines

• Gold service
– Demographics + GPS
– Recs, route finder

• Will sell aggregated data

Bob, Provides Real-time Tourist Info

Apr 24 2003 12

Example of Managing Information Flow

Alice, a Tourist

• Opts for Basic Service
• Logs outgoing data

Bob, Provides Real-time Tourist Info

Apr 24 2003 13

Example of Managing Information Flow

• Lower Precision
• Aggregate
• Garbage Collect
• Log outgoing

Carol, a Tour Operator

Bob, Provides Real-time Tourist Info

Apr 24 2003 14

Approximate Information Flows

• Approximate Information Flows is a framework for analyzing information flows in Ubicomp systems

• Two questions:
– When does data flow to others?
– What strategies can protect that data?

Apr 24 2003 15

When Does Data Flow to Others

• Collection, when data is gathered
– Ex. When Alice gets her location data (ex. GPS)

• Access, when data is first requested or provided
– Ex. Alice sends her location data to Bob

• Second use, sharing data after access
– Ex. Bob shares data with Carol

Apr 24 2003 16

Strategies for Protecting Data

• Prevent privacy violations from occurring
– Ex. Refuse request, Turn off device
– Minimizing flow out of sensitive data

• Avoid potential privacy risks
– Ex. Lowering precision, Notification
– Minimizing flow out & maximizing flow in

• Detect any privacy violations
– Ex. Internal and Third party audits of Bob and Carol
– Maximizing flow in about how data is used

• I am focusing on Avoidance & Detection

Apr 24 2003 17

Technical & Non-Technical Solutions

• Privacy cannot be managed by Technology alone

• Appropriate Technology can make it easier for other forces to act

[Diagram: Privacy shaped by four forces, Social, Market, Legal, and Technology; after Lessig, “Architecture of Privacy”]

Apr 24 2003 18

Talk Outline

• Motivation
• Privacy and Managing Information Flows
• Architectural Overview of Context Fabric
• Applications Built with Context Fabric
• Conclusions

Apr 24 2003 19

Assumptions

• Pessimistic case: Designers and service providers don't care or are trying to violate users' privacy

• My work is on the Optimistic case: Designers and service providers trying to deploy privacy-aware apps
– Minimize privacy risks (perceived and real) for their users

• Market, Social, Legal Forces support the optimistic case
– Market: Toysrus.com
– Social: Code of ethics for Doctors
– Legal: EU Data Protection

• Ex. AT&T m-life uses privacy as a key selling point

Apr 24 2003 20

Building Privacy-aware Apps Today

• P3P (Platform for Privacy Preferences)
– Focuses on communicating policy and obtaining consent

• Privacy Mirrors
– GUIs for helping people understand how the system is tracking them
– No control over how information flows or how to build

• Cricket Location Beacons
– Does not deal with sharing of information

• Ubicomp infrastructures [ParcTab system, Context Toolkit]
– No support for privacy or end-user control

• Today, would have to be done in an ad hoc manner

Apr 24 2003 21

Architectural Requirements

• Easy to create privacy-aware Ubicomp apps

• Low barrier to entry– Make it simple for programmers, admin, end-users

• Easy to add or modify app-specific privacy controls

• Easy for end-users to control and understand
• Easy to share info at a level users are comfortable with

Apr 24 2003 22

High-Level Architecture

[Diagram: high-level architecture showing a GPS source producing Loc(GPS) tuples, Alice's Information (her InfoSpace), Operators between the InfoSpaces, Bob's Information (his InfoSpace), Bob's Service, and the Tourguide App]

Apr 24 2003 23

High-Level Architecture

[Diagram: the same architecture, now showing Loc(GPS) stored in Alice's Information and Loc(City) flowing through the Operators to Bob's Information, Bob's Service, and the Tourguide App]

Apr 24 2003 24

High-Level Architecture

[Diagram: the same architecture, highlighting the data model pieces: Tuples such as Loc(GPS) stored in InfoSpaces, Operators on the links, and Events delivered to Bob's Service and the Tourguide App]

Apr 24 2003 25

InfoSpaces

• Key abstraction is the InfoSpace
– Represents data about a single entity
– Like an object with dynamic properties
– Decentralize data, put it in the user’s hands
– Implemented as a TupleSpace

• InfoSpaces contain Tuples
– Represents a single piece of data
– Sensors & Apps can add or query Tuples
– Default value is UNKNOWN
– Tuples can point to other InfoSpaces

[Diagram: Alice’s InfoSpace holds Loc, Activity, and Name tuples; Room 525’s InfoSpace holds Temp and SoundLevel tuples]

Apr 24 2003 26

Tuples

• Metadata
– Data type (ex. "location")
– Data format (ex. "edu.berkeley.soda.room")

• Values
– Value (ex. "525" with 88% confidence)
– Link to an InfoSpace (ex. "http://guir.berkeley.edu/rooms/525/")

• Privacy Tag
– When this Tuple should be garbage collected
– Who to notify on second use

(A code sketch of this data model follows below.)
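To make the data model concrete, here is a minimal Java sketch of a tuple with metadata, a value, and a privacy tag, held in a per-entity InfoSpace. The class and field names are hypothetical illustrations of the ideas above, not the actual Confab classes.

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of the data model described above.
class Tuple {
    final String dataType;      // ex. "location"
    final String dataFormat;    // ex. "edu.berkeley.soda.room"
    final String value;         // ex. "525"
    final double confidence;    // ex. 0.88
    final String link;          // optional link to another InfoSpace
    final PrivacyTag privacyTag;

    Tuple(String dataType, String dataFormat, String value,
          double confidence, String link, PrivacyTag tag) {
        this.dataType = dataType;
        this.dataFormat = dataFormat;
        this.value = value;
        this.confidence = confidence;
        this.link = link;
        this.privacyTag = tag;
    }
}

// Usage preferences that travel with the tuple (see the Privacy Tags slide).
class PrivacyTag {
    final long maxAgeMillis;      // garbage collect after this age
    final String notifyAddress;   // who to notify on second use

    PrivacyTag(long maxAgeMillis, String notifyAddress) {
        this.maxAgeMillis = maxAgeMillis;
        this.notifyAddress = notifyAddress;
    }
}

// An InfoSpace holds the tuples describing a single entity.
class InfoSpace {
    private final String owner;
    private final Map<String, List<Tuple>> byType = new HashMap<>();

    InfoSpace(String owner) { this.owner = owner; }

    // Sensors and apps add tuples describing the entity.
    synchronized void add(Tuple t) {
        byType.computeIfAbsent(t.dataType, k -> new ArrayList<>()).add(t);
    }

    // The default answer is UNKNOWN, modeled here as an empty list.
    synchronized List<Tuple> query(String dataType) {
        return new ArrayList<>(byType.getOrDefault(dataType, new ArrayList<>()));
    }

    String owner() { return owner; }
}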

Apr 24 2003 27

Operators

• Pieces of chainable code for manipulating Tuples
– Designed for reusability and extensibility

Apr 24 2003 28

Operators

• Pieces of chainable code for manipulating Tuples
– Designed for reusability and extensibility

• In-Operators modify incoming tuples
– Ex. Check that we are only receiving data we are allowed to see ("please don't pass on to other people")

[Diagram: Tuples flow from a Source through In Operators into Alice’s InfoSpace]

Apr 24 2003 29

Operators

• Pieces of chainable code for manipulating Tuples
– Designed for reusability and extensibility

• In-Operators modify incoming tuples
• Out-Operators modify outgoing tuples
– Ex. Lowering precision of data

[Diagram: Tuples flow from Alice’s InfoSpace through Out Operators to a Sink]

Apr 24 2003 30

Operators

• Pieces of chainable code for manipulating tuples
– Designed for reusability and extensibility

• In-Operators modify incoming tuples
• Out-Operators modify outgoing tuples
• On-Operators run periodically on tuples in the InfoSpace
– Ex. Garbage Collection

[Diagram: On Operators run over the Tuples stored in Alice’s InfoSpace]

(An operator-chain sketch follows below.)
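A minimal Java sketch of the chainable-operator idea, assuming operators are simple tuple transformers composed into a chain that tuples pass through on the way in, on the way out, or periodically. The names and the string-based tuple are illustrative, not the Confab API.

import java.util.ArrayList;
import java.util.List;

// A chain of operators applied to each tuple; a null result drops the tuple.
interface Operator {
    String apply(String tuple);   // return the (possibly modified) tuple, or null to drop it
}

class OperatorChain {
    private final List<Operator> ops = new ArrayList<>();

    OperatorChain add(Operator op) { ops.add(op); return this; }

    // Run every operator in order; stop early if a tuple is dropped.
    String run(String tuple) {
        String current = tuple;
        for (Operator op : ops) {
            if (current == null) break;
            current = op.apply(current);
        }
        return current;
    }

    public static void main(String[] args) {
        // Example: an out-chain that tags the precision level, then logs the disclosure.
        OperatorChain outChain = new OperatorChain()
            .add(t -> t + " [precision=city]")                            // placeholder transform
            .add(t -> { System.out.println("OUT: " + t); return t; });    // log outgoing tuple
        System.out.println(outChain.run("entity=Alice type=location value=Paris"));
    }
}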

Apr 24 2003 31

Suite of Privacy Techniques

• Privacy Techniques for Managing Info Flows
– Lowering Precision
– Access Control
– Logging and Periodic Reports
– Privacy Tags
– Garbage Collection

• All implemented as in-, out-, or on-operators

Apr 24 2003 32

Privacy Technique: Lowering Precision

• Problem: Some tuples provide too much info

• Solution: Lower precision of data
– Minimize flow of outgoing data by reducing quality

• Tourist Example
– "Alice is at 56°N 36°E" => "Alice is in Paris"
– Implemented as an out-operator using a region lookup (sketch below)

[Map: a precise coordinate reading generalized to the city level, ex. Paris vs. Marseilles]
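A minimal sketch of a lowering-precision out-operator as a region lookup, assuming a coarse bounding-box table; the cities and coordinates below are made up for illustration.

import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical lowering-precision out-operator: map a lat/lon reading to the
// enclosing city via a coarse bounding-box lookup.
public class LowerPrecisionOperator {
    // city -> {minLat, maxLat, minLon, maxLon}
    private static final Map<String, double[]> REGIONS = new LinkedHashMap<>();
    static {
        REGIONS.put("Paris",     new double[]{48.7, 49.0, 2.2, 2.5});
        REGIONS.put("Marseille", new double[]{43.2, 43.4, 5.3, 5.5});
    }

    // Replace a precise coordinate with a city name before the tuple leaves.
    public static String lowerToCity(double lat, double lon) {
        for (Map.Entry<String, double[]> e : REGIONS.entrySet()) {
            double[] b = e.getValue();
            if (lat >= b[0] && lat <= b[1] && lon >= b[2] && lon <= b[3]) {
                return e.getKey();
            }
        }
        return "unknown region";
    }

    public static void main(String[] args) {
        // A precise reading becomes "Alice is in Paris" on the way out.
        System.out.println("Alice is in " + lowerToCity(48.86, 2.35));
    }
}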

Apr 24 2003 33

Privacy Technique: Access Control

• Problem: Want to provide different info in different situations

• Tourist Example
– Let Bob see my location at city level only

• Emergency Response Example
– Let firefighters see my room location when I am in Soda Hall

• Other Examples
– Let all people in Soda Hall see my location at floor level
– Let co-workers see my location if between 9AM and 5PM

Apr 24 2003 34

Privacy Technique: Access Control

• Solution: Fine-grained control through Conditions (rule sketch below)

• Conditions
– Age of data
– Data Format
– Data Type
– Requestor Domain
– Requestor ID
– Requestor Location
– Current Time

• Actions
– Allow
– Lower Precision
– Set (fake value)
– Hide (data is removed)
– Invisible (no out data)
– Timeout (fake network load)
– Interactive
– Deny (forbidden)
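One possible representation of the condition/action rules above, sketched in Java with hypothetical names. It covers only two of the listed conditions (requestor domain and current time) and a few of the actions, with a default-deny policy; it is not the actual Confab rule engine.

import java.util.ArrayList;
import java.util.List;

// Hypothetical condition -> action access rules. The first matching rule
// decides how (or whether) the data is disclosed.
public class AccessRules {
    enum Action { ALLOW, LOWER_PRECISION, HIDE, DENY }

    static class Rule {
        final String requesterDomain;   // ex. "berkeley.edu" (illustrative)
        final int fromHour, toHour;     // current-time condition, 24h clock
        final Action action;
        Rule(String domain, int fromHour, int toHour, Action action) {
            this.requesterDomain = domain;
            this.fromHour = fromHour;
            this.toHour = toHour;
            this.action = action;
        }
        boolean matches(String domain, int hour) {
            return domain.endsWith(requesterDomain) && hour >= fromHour && hour < toHour;
        }
    }

    private final List<Rule> rules = new ArrayList<>();

    void add(Rule r) { rules.add(r); }

    Action decide(String requesterDomain, int hourOfDay) {
        for (Rule r : rules) {
            if (r.matches(requesterDomain, hourOfDay)) return r.action;
        }
        return Action.DENY;   // default-deny when no rule matches
    }

    public static void main(String[] args) {
        AccessRules rules = new AccessRules();
        // "Let co-workers see my location if between 9AM and 5PM"
        rules.add(new Rule("berkeley.edu", 9, 17, Action.ALLOW));
        // "Let Bob see my location at city level only", at any time (hypothetical domain)
        rules.add(new Rule("bob.example.com", 0, 24, Action.LOWER_PRECISION));
        System.out.println(rules.decide("guir.berkeley.edu", 10));   // ALLOW
        System.out.println(rules.decide("bob.example.com", 22));     // LOWER_PRECISION
    }
}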

Apr 24 2003 35

Privacy Technique: Logging and Periodic Reports

• Problem: Need better understanding of who knows what about you, for auditing purposes

• Solution: Logs and periodic reports (sketch below)
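A minimal sketch of a logging out-operator: every outgoing disclosure is recorded so the InfoSpace owner can later review who was told what, and when. Names are illustrative, not the actual Confab classes.

import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

// Hypothetical disclosure log backing the "periodic reports" idea.
public class DisclosureLog {
    private final List<String> entries = new ArrayList<>();
    private final SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

    // Called whenever a tuple leaves the InfoSpace.
    public synchronized void record(String recipient, String dataType, String value) {
        entries.add(fmt.format(new Date()) + "  " + recipient
                    + " received " + dataType + "=" + value);
    }

    // A "periodic report" is the accumulated log, rendered for the user.
    public synchronized String report() {
        StringBuilder sb = new StringBuilder("Disclosures this period:\n");
        for (String e : entries) sb.append("  ").append(e).append('\n');
        return sb.toString();
    }

    public static void main(String[] args) {
        DisclosureLog log = new DisclosureLog();
        log.record("bob.example.com", "location", "Paris");   // hypothetical recipient
        System.out.print(log.report());
    }
}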

Apr 24 2003 36

Privacy Technique: Logging and Periodic Reports

• US Federal Trade Commission recently established a National Do-Not-Call list for telemarketers
– James Haverly 404-333-3456
– Jason Hong 510-345-3456
– Tommy Horn 212-567-8910

• Question: What guarantees do we have that telemarketers will not use this list to spam people?

• Answer: Seed with fake data, monitor phone calls to fake entries, punish violators (detection)

Apr 24 2003 37

Privacy Technique: Seeding Fake Data and Periodic Reports

Alice, a Tourist / Bob, Provides Real-time Info

• Alice, EPIC, or Consumer Reports sends fake data
• Checks they are receiving periodic reports properly
• Also monitors for spam and other misuses

Apr 24 2003 38

Privacy Technique: Seeding Fake Data and Periodic Reports

Bob, Provides Real-time Info / Carol, a Tour Operator

• Bob wants assurances Carol won't abuse data
• Notifies Carol
• Creates fake people with email addresses
• Monitors results for abuses

Apr 24 2003 39

Privacy Technique: Privacy Tags

• Problem: Need a way of controlling data after it has left one’s InfoSpace (second use)

• Solution: Tag each tuple with usage preferences

• Email Analogy
– “Please don't forward this to anyone else”
– “Please delete this in three days”

• Example Privacy Tags
– For Bob and only Bob
– Garbage collect if data over a week old (enforcement sketch below)
– Garbage collect if user leaves Soda Hall
– Who to notify on violations (Ex. [email protected])
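A minimal sketch of how an on-operator could enforce the "garbage collect if data over a week old" tag; the fields and names are illustrative, not the actual Confab schema.

import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Hypothetical on-operator that periodically removes tuples whose privacy
// tag says they are too old to keep.
public class GarbageCollectOperator {
    static class TaggedTuple {
        final String value;
        final long createdMillis;
        final long maxAgeMillis;   // from the tuple's privacy tag
        TaggedTuple(String value, long createdMillis, long maxAgeMillis) {
            this.value = value;
            this.createdMillis = createdMillis;
            this.maxAgeMillis = maxAgeMillis;
        }
    }

    // Returns the number of tuples removed from the InfoSpace.
    public static int collect(List<TaggedTuple> infoSpace, long nowMillis) {
        int removed = 0;
        for (Iterator<TaggedTuple> it = infoSpace.iterator(); it.hasNext(); ) {
            TaggedTuple t = it.next();
            if (nowMillis - t.createdMillis > t.maxAgeMillis) {
                it.remove();
                removed++;
            }
        }
        return removed;
    }

    public static void main(String[] args) {
        long now = System.currentTimeMillis();
        long week = 7L * 24 * 60 * 60 * 1000;
        List<TaggedTuple> space = new ArrayList<>();
        space.add(new TaggedTuple("loc=Paris", now - 8 * 24 * 60 * 60 * 1000L, week)); // expired
        space.add(new TaggedTuple("loc=Soda 525", now, week));                          // fresh
        System.out.println("Collected " + collect(space, now) + " tuple(s)");           // 1
    }
}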

Apr 24 2003 40

Privacy Technique: Peer Enforcement of Privacy Tags

[Diagram: a Loc tuple and its PTag ("Delete in 7 days") flow from Alice’s InfoSpace to Bob’s InfoSpace and on to Carol’s InfoSpace; the tag travels with the data and flags that Bob has data he shouldn't pass on. A sketch of the second-use check follows below.]
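A minimal sketch of the peer-enforcement check, assuming a tuple's privacy tag carries a second-use flag and a notification address; the names and addresses are hypothetical.

// Hypothetical second-use check run by the receiving InfoSpace before it
// forwards a tuple to a third party.
public class SecondUseCheck {
    static class PrivacyTag {
        final boolean secondUseAllowed;
        final String notifyAddress;   // owner's contact for violation notices
        PrivacyTag(boolean secondUseAllowed, String notifyAddress) {
            this.secondUseAllowed = secondUseAllowed;
            this.notifyAddress = notifyAddress;
        }
    }

    // Returns true if forwarding may proceed; otherwise blocks and notifies.
    public static boolean mayForward(PrivacyTag tag, String recipient) {
        if (tag.secondUseAllowed) return true;
        // A real system would send a notification; this sketch just prints.
        System.out.println("Blocked forwarding to " + recipient
                           + "; notifying " + tag.notifyAddress);
        return false;
    }

    public static void main(String[] args) {
        PrivacyTag tag = new PrivacyTag(false, "owner@example.org");        // hypothetical owner
        System.out.println(mayForward(tag, "carol.example.com"));           // false
    }
}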

Apr 24 2003 41

Architectural Overview

• Tuples

<ContextTuple dataformat="edu.berkeley.soda.room"
              datatype="location"
              description="Represents location of an entity"
              entity-name="Jason Hong"
              timestamp-created="2003.Mar.12 10:10:30 PST"
              timestamp-received="2003.Mar.12 10:10:30 PST"
              tuple-id="tmp14975.xml">
  <Values>
    <Value value="525" link="http://guir.berkeley.edu:8080/infospace/soda525/" confidence="0.88" />
    <Value value="527" link="http://guir.berkeley.edu:8080/infospace/soda527/" confidence="0.12" />
  </Values>
  <Sources>
    <Source dataformat="edu.berkeley.soda.room"
            datatype="location"
            link="http://guir.berkeley.edu:8080/sim/loc/map.jsp"
            source-name="Soda Hall Location Simulator"
            timestamp-out="2003.Mar.12 10:10:30 PST"
            value="525" />
  </Sources>
  <PrivacyTags notify-address="im:[email protected]">
    <GarbageCollect can-aggregate="true" notify="true">
      <Where max-age-of-data="48 hours" />
      <Where current-location="not edu.berkeley.soda.*" />
    </GarbageCollect>
    <SecondUse allow="false" notify="true" />
  </PrivacyTags>
</ContextTuple>

<PrivacyTags notify-address="im:[email protected]">
  <Aggregate allow="true" notify="true" />
  <GarbageCollect>
    <Where max-age-of-data="48 hours" />
    <Where current-location="not edu.berkeley.soda.*" />
  </GarbageCollect>
  <SecondUse allow="false" notify="true" />
</PrivacyTags>

Apr 24 2003 42

Architectural Details

• InfoSpaces
– Leverages web servers, HTTP (client sketch below)
– Ex. http://www.cs.berkeley.edu/~jasonh/infospace
– Push data to the edge, to where end-users have control
– Low barrier to entry for admin, programmers, & end-users

• Operators
– Separate functionality into composable components
– Easy for programmers & end-users to add or modify

• Tuples
– Uses XML docs vs. mobile objects or RPC (hidden state)
– Transparent data model, easy to view & understand
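A hypothetical client-side sketch of the web-based addressing above: since an InfoSpace is named by a URL and tuples are XML documents, a query can be an ordinary HTTP GET. The query parameter and the exact response format are assumptions for illustration, not the actual Confab protocol.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Fetch a context tuple document from an InfoSpace URL over plain HTTP.
public class InfoSpaceClient {
    public static String fetch(String infoSpaceUrl, String dataType) throws Exception {
        URL url = new URL(infoSpaceUrl + "?datatype=" + dataType);   // query parameter is hypothetical
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            StringBuilder xml = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) xml.append(line).append('\n');
            return xml.toString();   // e.g. a ContextTuple XML document like the one above
        } finally {
            conn.disconnect();
        }
    }

    public static void main(String[] args) throws Exception {
        // Example InfoSpace URL taken from the slides; it may no longer resolve.
        System.out.println(fetch("http://guir.berkeley.edu:8080/infospace/soda525/",
                                 "location"));
    }
}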

Apr 24 2003 43

Implementation Details

• All written in Java 1.4
– http://sourceforge.net/projects/confab
– 410 source classes
– 18,500 lines of code

• Uses Apache Tomcat web server

• Other infrastructure has been built using my APIs
– Liquid Distributed Querying package
– C++ Confab-lite PDA version (subset of full version)

Apr 24 2003 44

Talk Outline

• Motivation
• Privacy and Managing Information Flows
• Architectural Overview of Context Fabric
• Applications Built with Context Fabric
• Conclusions

Apr 24 2003 45

Applications

• Person Finder
– Like AT&T m-life's Find Friends app, but for the web

• Building Emergency Response Service
– Based on field studies with firefighters

• Currently using Wizard of Oz simulations for data rather than actual sensors
– Make sure we're on the right track before devoting time to sensors
– Sensors being deployed in Berkeley now

Apr 24 2003 46

Building Emergency Response Service

• One of a suite of applications we are developing

• Keep track of people in a building
– Help building managers check if a building is clear in the event of an evacuation
– Help firefighters understand where people are

• Also provide reasonable privacy protection
– People don't like to be tracked
– Emergency situations relatively rare
– Ensuring that data is used properly

Apr 24 2003 47

Field Study and Iterative Design with Firefighters

• What are big problems sensors can help with?
– Four-month field study, 30+ hours
– Iterative prototyping and evaluation with firefighters
– Gives the tools we build a better chance of succeeding

• Firefighters said knowing where people were in a building would help determine their strategy

Apr 24 2003 48

Building Emergency Response Service: Software Prototype 1

[Diagram: Location beacons feed Alice’s InfoSpace (Alice’s personal info); data flows from there to a Building General Use InfoSpace serving general purpose apps and to an Emergency Response InfoSpace serving emergency response apps]

Apr 24 2003 49

Building Emergency Response Service: Software Prototype 1

[Diagram: a location beacon delivers an Alice(Room) tuple into Alice’s InfoSpace, alongside the Building General Use and Emergency Response InfoSpaces]

Apr 24 2003 50

Building Emergency Response Service: Software Prototype 1

[Diagram: Alice(Room) sits in Alice’s InfoSpace; an out-operator controls what leaves toward the Building General Use and Emergency Response InfoSpaces]

Apr 24 2003 51

Building Emergency Response Service: Software Prototype 1

[Diagram: the out-operator releases Alice(Floor) to the Building General Use InfoSpace and Alice(Room) to the Emergency Response InfoSpace, with a Notify back to Alice]

Apr 24 2003 52

Building Emergency Response Service: Software Prototype 1

[Diagram: data flow among the InfoSpaces, showing Alice(Room), Alice(Floor), an anonymized Person(Room) tuple, and the out-operator]

Apr 24 2003 53

Building Emergency Response Service: Software Prototype 1

[Diagram: final state, with Alice(Room) in the Emergency Response InfoSpace and Alice(Floor) plus anonymized Person(Room) tuples in the Building General Use InfoSpace]

Apr 24 2003 54

Architecture Evaluation Plan

• First iteration of infrastructure done
– Two prototype apps with simulated data
– C++ Confab-lite for PDAs (3 weeks by 1 person)
– Liquid Distributed Querying (4 weeks by 3 people)

• Low barrier to entry
– Person Finder and Emergency Response ~1 week each
– Further evaluation by having others create simple apps

• Easy to add or modify app-specific privacy controls
– Each Privacy Operator ~5 days

• Easy for end-users to control and understand; easy to share info at a level users are comfortable with
– Running preliminary user studies

Apr 24 2003 55

Long-term Evaluation of Privacy

• How effective are the privacy techniques?
– Requires long-term deployment of sensors and apps

• Surveys on privacy since the 1990s have shown three basic groups [Westin]
– Fundamentalists: 25%
– Pragmatists: 63%
– Unconcerned: 12%

• Risk / benefit sweet spot?
– Privacy for Safety
– Privacy for Convenience

• My work makes this possible

Apr 24 2003 56

Contributions

• Framework: Approximate Information Flows
– Minimize flow out, maximize flow in

• Architecture: Context Fabric, infrastructure for privacy-aware apps
– Privacy Operators for controlling information flow
– Privacy Tags for limiting dissemination of information
– Peer enforcement for privacy

• Evaluation
– Applications: Person Finder and Emergency Response
– Ease of creating apps by others, in progress

Apr 24 2003 57

Future Work: Iterative Design of Ubicomp Applications

• Iterative design is the best practice for creating UIs

• Getting it right the first time is hard

• Lots of experience in iterative design for the Web that can apply to Ubicomp

[Diagram: the Design, Prototype, Evaluate cycle]

Apr 24 2003 58

Future Work: Design of Ubicomp Applications

• Co-authored book on Web Design Patterns
– Used in several classes
– E-commerce sites, Shopping Carts, Action Buttons

• What are Design Patterns for Ubicomp?
– Context-sensitive I/O
– …

• Which existing GUI/Web patterns apply to Ubicomp?

• Can patterns improve the speed with which we can build Ubicomp apps?

Apr 24 2003 59

Future Work: Prototyping of Ubicomp Applications

• Developed SATIN toolkit [UIST 2000]
– Infrastructure for sketching apps
– Ink, interpretation, & zooming
– Downloaded over 1200 times
• Georgia Tech, PARC, NRL, UCB

• What are the infrastructure needs of Ubicomp apps?
– Privacy
– Scalability

• How do you simulate services that are not yet ubiquitous?

• What types of higher-level inference services are needed?

Apr 24 2003 60

Future Work: Prototyping of Ubicomp Applications

• Co-developed DENIM [CHI 2000]
– Informal Web prototyping tool
– Downloaded over 13,000 times

• What do rapid prototyping tools for Ubicomp apps look like?
– Sketch-based
– Multimodal
– Wizard of Oz
– Programming by Demonstration

Apr 24 2003 61

Future Work: Evaluation of Ubicomp Applications

• Started WebQuilt Project [WWW10]
– Remote Web site usability testing & analysis tool
– Downloaded over 600 times

• How can we evaluate Ubicomp apps?

• What are new methodologies & tools?

• Ubicomp apps are often mobile, so remote evaluation tools may work well!

Apr 24 2003 62

Conclusions

• Approximate Information Flows, a framework for analyzing privacy in Ubicomp systems

• Context Fabric, an architecture for privacy-aware apps

• Evaluation with two applications

• Privacy is just one aspect of Ubicomp
– Future work lies in better tools and methods to Design, Prototype, and Evaluate Ubicomp

• Ubicomp is coming
– Let’s guide it in the right directions

Jason I. Hong
http://guir.berkeley.edu/cfabric

Group for User Interface Research
University of California, Berkeley

Thanks to: DARPA Expeditions, PARC, Intel Fellowship, NSF ITR, GUIR

Apr 24 2003 64

Backup Slides

Apr 24 2003 65

Verifying Trust Online

• Seals of Approval, Branding
• Audits
• Consumer Reports, Epinions for ratings
• Open Source code that is visible to all
• Web of Trust
– A trusts B, B trusts C, A has a reason to trust C

• Extend research on Web site credibility to Ubicomp

Apr 24 2003 66

Incentives for Service Providers

• Market, Social, and Legal forces
– Lower costs for “good guys”, raise costs for “bad guys”
– Make it harder for “bad guys”, more money by being “good”

• We are in the early phases of Ubicomp
– We can help set people’s expectations high
– Make it easy for service providers to do, no excuses

• Bias the system towards privacy
– Even without privacy operators, still pushes data to the edge

• Educate future designers and engineers

• Tipping Point for privacy?
– How much buy-in before peer enforcement is really effective?

Apr 24 2003 67

Categorizing Privacy Techniques

[Matrix: Strategies for Protecting Data (Prevent, Avoid, Detect) crossed with the Data Lifecycle (Collection, Access, Second Use), populated with techniques such as Anonymization, Pseudonymization, P3P, Access Control, Location Support, Privacy Mirrors, Wearables, User Interfaces for Feedback, Notification, and Consent, Lowering Precision, Logging and Periodic Reports, Audits, Privacy Tags, and Garbage Collection]

Apr 24 2003 68

Privacy Technique: Interactive Mode

Apr 24 2003 69

Personal Insights

• When starting, I focused on prevention
– Digital Rights Management, Encryption, Mobile Code
– But prevention only goes so far

• Now, focused more on avoidance and detection
– Increase transparency of Ubicomp systems
– "Trust but verify"

Apr 24 2003 70

Ultimate Ubicomp App

Apr 24 2003 71

Yoda

Beware the Dark Side of Ubicomp you must!

Jedi Master, Kickass Dude

Apr 24 2003 72

Context Data Model: Division of Responsibilities

• InfoSpace Server
– Analogous to web servers
– Manages a collection of InfoSpaces
– Unit of administration
– Unit of deployment

• InfoSpace
– Analogous to a web site / homepage
– Represents context data about an entity
– Represents a zone of protection
– Manages a collection of context tuples
– Unit of ownership and addressing

• Tuple
– Analogous to an individual web page
– Represents a single piece of context data
– Contains privacy preferences and metadata
– Unit of storage

Apr 24 2003 73

Architecture Recap

• InfoSpaces represent entities and contain Tuples

• Operators modify the flow of Tuples, primarily for privacy
– Lowering Precision
– Access Control
– Logging and Periodic Reports
– Privacy Tags
– Garbage Collection

[Diagram: Tuples flow from Alice’s InfoSpace through Out Operators to a Sink]

Apr 24 2003 74

Thinking about Privacy and Ubicomp

• “The problem, while often couched in terms of privacy, is really one of control. If the computational system is invisible as well as extensive, it becomes hard to know:
– what is controlling what
– what is connected to what
– where information is flowing
– how it is being used
– what is broken
– what are the consequences of any given action (including simply walking into a room)” [Weiser 1999]

Apr 24 2003 75

Context Data Model: InfoSpaces

• TupleSpace meets Web

• TupleSpace (interface sketch below)
– A shared data space
– add(), remove(), query(), subscribe(), unsubscribe()
– Complexity shifted into data model and query language

• Web
– Leverages existing technology (ex. firewalls)
– Leverages well-understood models for administration, deployment, authoring, and programming
– End-user mental model
– Independent deployment & anarchic scalability [Fielding]
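A sketch of what the TupleSpace API named above might look like in Java. The five operations come from the slide; modeling tuples as attribute maps, matching against partial templates, and the Listener callback are assumptions for illustration.

import java.util.List;
import java.util.Map;

// Hypothetical TupleSpace interface: a shared data space with add/remove/query
// plus subscriptions for asynchronous updates. A query is a partial-match
// template; any tuple containing all of the template's key/value pairs matches.
interface TupleSpace {
    void add(Map<String, String> tuple);
    boolean remove(Map<String, String> template);
    List<Map<String, String>> query(Map<String, String> template);
    void subscribe(Map<String, String> template, Listener listener);
    void unsubscribe(Listener listener);

    // Callback invoked when a newly added tuple matches a subscription.
    interface Listener {
        void tupleAdded(Map<String, String> tuple);
    }
}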

Apr 24 2003 76

Quotes on Privacy

• “You know it when you lose it”

• “My own hunch is that Big Brother, if he comes to the United States, will turn out to be not a greedy power-seeker but a relentless bureaucrat obsessed with efficiency” [Vance Packard]

• Privacy relatively new concept in society, “ultimately a psychological construct, with malleable ties to specific objective conditions” [Grudin 2001]

Apr 24 2003 77

Backup Slides

Personal Perspectives

Apr 24 2003 78

My Personal Perspectives on Privacy: Privacy will be a Continuous Struggle

• Privacy will never be “solved”
– Ongoing struggle about the relation of the individual and society

• Old issues are now manageable risks
– Photography
– Telephone

• New issues will arise
– Digital rights management
– Genetic databases and genetic profiling
– Detecting AIDS by shaking someone’s hand

• But the fundamentals are still the same
– Flow of personal information

Apr 24 2003 79

My Personal Perspectives on Privacy: Our Role as Researchers

• Core problem is the rate of change
– Social, legal, and market forces can’t adjust fast enough

• Our job as researchers should be:
– Identifying new privacy risks
– Figuring out better legal, social, market, and technical approaches for managing these risks
• Better architectures and UIs
– Better education – we’re the ones that build these!
• Professionalization of engineers

Apr 24 2003 80

My Personal Perspectives on Privacy: Terrorism and Privacy

• Do it if:
– It does not unduly treat average people as suspects
– Benefits far outweigh costs

• Total Information Awareness fails because of questionable technology
– Ensure Transparency, Accountability, and Oversight

• Focus on things that have multiple benefits
– Ex. Monitoring of the national medical system

• Let’s not forget other issues too
– ~43,000 people die every year due to car accidents
– $1 billion plus in damage due to fire every year

Apr 24 2003 81

Backup Slides

Why Privacy?

Apr 24 2003 82

Why Privacy? Idealistic Reasons

• UN Universal Declaration of Human Rights, Article 12
– "No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks."

• Old Hippocratic Oath
– "What I may see or hear in the course of the treatment or even outside of the treatment in regard to the life of men, which on no account one must spread abroad, I will keep to myself, holding such things shameful to be spoken about."

Apr 24 2003 83

Why Privacy? Pragmatic Reasons

• Identity theft
• Stalking
• Excessive monitoring
• Data for one purpose tends to be used for others
– Ex. SSN

• The Trackable Society [Kling]
– Stronger enforcement of laws that cannot be done today
– Ex. Speeding, but better tech could enforce it in theory

Apr 24 2003 84

Why Privacy? Cannot Always Reject Technology

• Oakland nurses successfully rejected active badges

• Stakeholders
– Admin wanted it for efficiency and accountability
– People at the desk liked it to find people
– Nurses hated it because there was no benefit to them

• However, nurses could reject it only because they had the economic upper hand, i.e., a shortage of nurses

• We build these systems; we have a responsibility to make sure they benefit as many people as possible

Apr 24 2003 85

Why Privacy? Privacy and Technology, Gary Marx

• Anonymity important for encouraging honesty and risk-taking

• Confidentiality can improve communication flows
– Doctors, lawyers

• Resource in inter-personal relations
• American ideal of starting over
• Some information can be used unfairly
– Ex. Religious discrimination

• Mental health and creativity
• Totalitarian systems lack respect for individuals

Apr 24 2003 86

Medical Record Risks: 1997 National Research Council Report

• Insiders who make innocent mistakes and cause accidental disclosure of confidential information

• Insiders who abuse their record access privileges

• Insiders who knowingly access information for spite or for profit

• An unauthorized physical intruder who gains access to information

• Vengeful employees and outsiders

Apr 24 2003 87

Backup Slides

Arguments Against Privacy

Apr 24 2003 88

Arguments Against Privacy

• “I have nothing to hide”
• Transparent Society
• Communitarian argument
• We’ll adapt

Apr 24 2003 89

Arguments Against Privacy: “I have nothing to hide”

• So why close the door when changing clothes?
• The real issue is civil rights and human dignity
– Surveillance gives the impression that activity is not proper
– Surveillance can be a pervasive form of repression
– Privacy also protects us from excessive norms [Goffman]
– Empower people to choose what is disclosed and when

Apr 24 2003 90

Arguments Against Privacy: The Transparent Society, by David Brin

• Openness and accountability are key to a democratic society
– The technology is coming
– Let’s opt for complete transparency

• “In all of human history, no government has ever known more about its people than our government knows about us. And in all of human history, no people have ever been anywhere near as free.”

• Quis Custodiet Ipsos Custodes?

Apr 24 2003 91

Arguments Against Privacy: The Limits of Privacy, by Amitai Etzioni

• Communitarian argument
• Ex. Public safety and Health
– HIV testing for newborns

• Ex. Megan’s laws

• Communities and Ubicomp
– Can modify flow of info to match community and individual needs
– But, not enough experience with communities & Ubicomp to judge
– Strong potential for abuse in Ubicomp, let’s be conservative

Apr 24 2003 92

Arguments Against Privacy: We’ll adapt

• Warren and Brandeis’ quote, that privacy is “the right to be left alone”, was about photography (!)

• “One common complaint… was that the telephone permitted intrusion… by solicitors, purveyors of inferior music, eavesdropping operators, and even wire-transmitted germs. Messages come unbidden; background sounds reveal intimacies of the home to the caller…” [Fischer 1994]

Apr 24 2003 93

Arguments Against Privacy: We’ll adapt

• Credit card slips in restaurants easily stolen
• Cell Phones provide rough location
• Credit card charges (fraud liability) limited to $50

• Let’s take it slow and be conservative here; we can remove privacy over time, but can’t put it back in

Apr 24 2003 94

Backup Slides

Related Work

Apr 24 2003 95

Related Work: P3P

• Platform for Privacy Preferences Project
– Standard machine-readable format for defining privacy practices on web sites
– Designed to be integrated into Web Browsers

• Orthogonal to Confab
– Confab focuses on several techniques for privacy
– P3P could be integrated as one of them

• Related Work
– Langheinrich is looking at integrating P3P into Ubicomp

Apr 24 2003 96

Related Work: Research on Lying (1/2)

• Lies are pretty common [DePaulo and Kashy 1996]
– 77 university students and 70 community members
– 1-2 lies daily, 1 in 3 interactions (students)
– Quality of relationships with same gender => fewer lies
– Kind of lie related to gender
• Self-centered lies told to men
• Other-centered lies told to women
– Socially adroit people told more lies
– Easier to lie when not face-to-face

Apr 24 2003 97

Related Work: Research on Lying (2/2)

• Cultural differences in lying [Aune and Waters 1994]
– Collectivist (Samoan) vs individualist (USA) societies
– Collectivists more likely to attempt to deceive when related to group / family or authority-based concerns
– Individualists more likely to lie to protect privacy or the feelings of the target person

Apr 24 2003 98

Related Work: Fair Information Practices

• Notice - Notice of data collection
• Choice - Consent over collection
• Onward Transfer - Consent over secondary use
• Access - See data about self
• Security - Reasonable safeguards
• Data Integrity - Data is accurate
• Enforcement - Enforcing policies and redress

Apr 24 2003 99

Related Work: OECD Fair Information Practices

• Collection Limitation - Limited collection with consent
• Data Quality - Relevant and up-to-date
• Purpose Specification - Purpose at time of collection
• Use Limitation - Restrict use to said purposes
• Security Safeguards - Reasonable security
• Openness Principle - Existence of data known
• Individual Participation - Obtain and correct the data
• Accountability - Someone accountable

Apr 24 2003 100

Related Work: Commentary on Fair Information Practices

• FIPs meant for governments and corporations
– Need a framework that also deals with individuals
– Also a wide range of trust, from family to friends to co-workers

• Spectrum of apps require different kinds of practices
– Commercial apps vs. Firefighter apps vs. National Security apps
– App running at home vs. App running at work

• Notification and Consent impractical in some cases
– Cannot always readily notify (ex. traffic monitoring)
– Possibly no alternatives (cannot opt out of building security cameras)
– Pervasive sensors significantly increase scale

• Need a framework that considers:
– Risks / Benefits, Identifiability, Quality, Quantity, and Scope of data

Apr 24 2003 101

Related Work: Smart Dust / TinyOS / TinyDB

• Small, reconfigurable wireless sensors
• Strongly driven from a systems perspective
– Power management
– Dead nodes
– Continuous and adaptive query processing

• This work starts from a human-centered perspective
– Privacy concerns
– Cost / benefit of ubicomp
– Providing people control over their data

Apr 24 2003 102

Related Work: Context Toolkit

• Uniform abstraction for sensors
– GPS, beacons, Active Badges map to a Location widget
– Interpreters for transforming data

• Confab has a different focus
– Data model and data management rather than sensors
– Privacy focus

Apr 24 2003 103

Related Work: HP CoolTown

• Web presence for people, places, things
– Associate dynamically updated web pages with entities
– Beacon out or scan in URLs

• Confab
– Focuses on context for machines vs. for people
– Focuses on privacy issues
– Can leverage technologies for transmitting URLs

Apr 24 2003 104

Related Work: ParcTab System

• Confab is an evolution of the ParcTab system
– Splits ParcTab Dynamic Environments into multiple independent InfoSpaces
• Tries to push data to the edge, to end-users
– Confab focuses on privacy and data manipulation

Apr 24 2003 105

Related Work: EventHeap / iRoom

• iRoom uses a TupleSpace to coordinate events for a Smart Room
– The level of indirection provides fault-tolerance and a uniform level of abstraction

• Confab also uses a TupleSpace
– For the same reasons: simple and uniform API
– Focuses on privacy and sharing of data

Apr 24 2003 106

Related Work: Semantic Web / DAML

• Semantic Web
– Markup for computers rather than for people
– Rich (and complex!) language for modeling

(motherOf subProperty parentOf)
(Mary motherOf Bill)
(Mary parentOf Bill)

• Semantic Web has no story for:
– Privacy, i.e. individuals managing personal data
– Handling sensor data and dynamic updates

• Confab aims for a simpler model
– Complexity of the Semantic Web is a huge barrier to entry
– Start simple

Apr 24 2003 107

Related Work: Web Services

• Three parts to web services
– SOAP, remote procedure call over the web
– WSDL, description of a service API
– UDDI, registry for web services (like white pages)

• Strength and weakness of web services is the specialized API
– Pro: Lots of semantics, highly tuned for a specific app
– Con: No network effects, need new apps for each service
– Key insight of the web is only a few methods on lots of datatypes (like a TupleSpace)

Apr 24 2003 108

Related Work: Grid Computing

• Similar constraints but very different goals
– Scale
– Heterogeneity
– Unpredictable structure
– Multiple administrative domains

• Grid is focused on creating a networked virtual supercomputer rather than ubicomp services

Apr 24 2003 109

Backup Slides

Future Work

Apr 24 2003 110

Future Work: Liquid Distributed Queries

• Querying across multiple InfoSpaces (sketch below)
– Ex. “Average age of all people in the room”
• average(room.people.age)
• Update as people go in and out
– Ex. “Average temperature for division X”
• average(division.members.temperature)
– Ex. “Monitor water pressure for all companies”
• company.water-pressure
• Update as companies go in and out

• Prototype Liquid already works
– Lots of weird cases that need to be handled, though

Apr 24 2003 111

Future Work: Query Planning

• Went for the simple case first
– Hardwired chain of operators

• Really need query planning here
– Which operators to apply and in what order
– Some weird cases when processing privacy operators

• Should take the same approach as databases
– Generate several plans
– Avoid the really bad ones
– Execute it

Apr 24 2003 112

Future Work: Imperfect Mirror Worlds

• Perfect Mirror Worlds may not be desirable
– Highly reliable, highly connected, continuously updated

• Reasons
– No room to hide
– No room for ambiguity and deniability
– No freedom of action

• Better metaphors?
– Cell phones or Instant Messenger?
– End-user is in control

• How to build the physical layer to support this?

Apr 24 2003 113

Future Work: Physical Layer Privacy

• Inspired by Peter Swire’s “TrustWrap”
– “Johnson and Johnson built trust into every transaction. Customers use their own senses to reaffirm that the Tylenol is safe. They touch the plastic wrap, and they see both the plastic wrap and the foil before they take a pill.”

• How to build trust into sensors?
– People can see and understand the privacy model
– Ex. Cameras that have translucent plastic in front
– Ex. Simple motion sensors (familiarity)
– Local scope, local and immediate feedback

Apr 24 2003 114

Future Work: Matching Social Expectations

• Berkeley Sproul Plaza
• Seen by hundreds, but no privacy worries. Why?
– Low identifiability
– Limited scope of data
– Reciprocity
– Forgetfulness
– Expectations

• What kinds of applications does this suggest?

Apr 24 2003 115

Future Work: Trust Mechanisms

• Privacy depends on trust that people will do the right thing

• Need better trust mechanisms
– Web of Trust (“I trust Alice, Alice trusts Bob”)
– Transparency via Audits (“Trust but verify”)

• Also a problem encountered on e-commerce sites
– Branding
– Well-designed professional sites
– Ease-of-use

• Need this for Ubicomp

Apr 24 2003 116

Future Work: Perceptions of Privacy Change Over Time

• Perceptions change with experience [Pew Internet]
– Lowest privacy concerns with white males (first adopters)
– Internet privacy concerns greater among novices, parents, the elderly, and women (middle-late adopters)
– Online experience increases the number of trusting activities and commercial activities (small sample set though)

• Likely that this will apply to Ubicomp

• However, due to the pervasiveness of Ubicomp, we really should be conservative here
– Early failures could cost us a lot here
– Also, it is our responsibility to respect human dignity here

Apr 24 2003 117

Future Work: Designing Context-Aware Systems

• Minimize automatic actions
– Calculate cost-to-benefit, both statically and dynamically

• Feedback
– What is being captured?
– Why did the system do that?

• Feed-forward
– If you do that, then the system will do this

• Confirmation
– The system just did the following action

• Less identifiability
– Model Places and Things rather than people
– Less identifiable entities, ex. “Some person” rather than “Jason”

• Endpoint
– How much context is for people, how much for computers?