
Privacy in Wearable Computing

Thad Starner

Contextual Computing Group

College of Computing

Georgia Tech

Handouts

The Challenges of Wearable Computing (Starner)

Privacy Protection in the 1980’s (Turn)
Excerpts from Agre and Rotenberg, Technology and Privacy: The New Landscape

Background

College of Computing, Georgia Tech

Founder, Charmed Technologies

Founder, MIT Wearable Computing Project

IEEE ISWC and Wearable Information Systems TC

Everyday use since 1993

Wearable Challenges

Power and heat (mips/watt)
On and off-body networking (bits/joule)
Privacy
Interface (additional capability vs. load)
– User interface (cognitive load)
– Ergonomics/human factors (weight, heat, etc.)
(Intertwined – changing one affects the others)

Resources (wearable and ubiquitous computing)

Phil Agre’s Red Rock Eater list
ACM technology alerts
Foner, “Political Artifacts and Personal Privacy: The Yenta Multi-Agent Distributed Matchmaking System” (MIT PhD thesis)
Langheinrich, “Privacy by Design”
EPIC, EFF, ACLU, CPSR, Privacy Journal (on-line), privacyinternational.org, privacy.org
Colleagues: Rhodes, Bruckman, Foner, Kapor, Mann, Pentland, wearables research mailing list, privacy panel ISWC98

Resources (general)

Pool: Technologies of Freedom, The Social Impact of the Telephone
Agre and Rotenberg: Technology and Privacy: The New Landscape
Westin: Privacy and Freedom
R.E. Smith: Our Vanishing Right to Privacy
Rothfeder: Privacy for Sale
Miller: The Assault on Privacy
McCarthy: The Rights of Publicity and Privacy
Rosen: The Unwanted Gaze (also the article “Is Nothing Private?”)

Computers, Freedom, and Privacy
IEEE Security and Privacy

Definitions

Privacy – right of individuals to control the collection and use of personal information about themselves

Security – protection of information from unauthorized users

Definition (Gellman)

“No definition … is possible, because issues are fundamentally matters of values, interests, and power”

Black’s Law Dictionary

Right of privacy: The right to be let alone; the right of a person to be free from unwanted publicity; and the right to live without unwarranted interference by the public in matters with which the public is not necessarily concerned … concept of ordered liberty, and such right prevents governmental interference in the intimate personal relationships or activities, freedoms of the individual to make fundamental choices involving himself, his family, and his relationship with others.

Privacy Violation

Tort – “civil wrong”
Unreasonable intrusion upon the seclusion of another individual if such intrusion would be highly offensive to a reasonable person
Appropriation of the other’s name or likeness for one’s own use or benefit

Privacy Violation (cont.)

Unreasonable publicity given to the other’s private life if the published matter would be highly offensive to a reasonable person and is no concern of the public

Publicity that unreasonably places the other in a false light before the public where the false light would be highly offensive to a reasonable person and the publisher knows the falsity of the published matter

Revolutionary War to Now

Wilkes case in England
Packwood diaries
Clinton

What are PDAs and wearables?

Filing cabinets?
– Who bought the machine?
– Who owns it?
Diaries?
– What is its use?
– Separate section for private info?

A Selection of Cases

“Fatty” Arbuckle
– Tabloids, ubiquitous surveillance, and the changing social perception of smear campaigns
Larry Flynt and the Republican witch hunt
Linda Tripp
Cellular phone monitoring
– Newt Gingrich
– Prince Charles
– 911 emergency phone call
EZ Pass

Anti-privacy Arguments

If you have nothing to hide then you should have no concern for your privacy
– Many personal situations one might not want exposed (victim of rape, child abuse, fraud, …)
– Opponents will use facts in the worst possible light (politicians, tabloids, etc.)
– Unfair, unregulated environments (racism, health concerns, etc.)

Anti-privacy Arguments

“I don’t care about privacy”
– But other people have the right to care about theirs
– Equivalent to saying “I don’t say anything controversial so therefore I don’t care about free speech”

Anti-privacy Arguments

Privacy discussion is overblown. Big organizations don’t really care about individuals, just narrow goals.
– Most harm is in the aggregate, but can still have large effects on the individual (e.g., high-cost home mortgage loans)
– FBI files – sabotage against nonviolent dissidents
– Whistle blowers: Ralph Nader and GM
– Individuals against individuals – university professors and the Freedom of Information Act

Anti-privacy Arguments

Surveillance is inevitable – the real issue is achieving a balance of power where we can watch the people who are watching us
– Equal access to information is not the same as equal ability to use that information. A corporation or government can, and often does, dedicate resources to watching a person or group of people which few individuals can match.
– Political/technical feasibility of forcing corporations/governments/elite to comply

Anti-privacy Arguments

Computer technology is just returning us to the rural village of yesteryear / technology brings nothing new to the privacy arguments
– Simply incorrect. Computers allow large-scale data mining that was previously impossible with paper
– In rural villages, one did not have to worry about corporations with large budgets and large databases. A rural village implies some sense of parity.

Anti-privacy Arguments

We must balance privacy and industrial concerns/society/government expense
– Assumes you cannot have the same services in a privacy-preserving manner. In most cases you can – even at lower cost!
– Ignores the industry created by having active privacy protections

Anti-privacy Arguments

Individuals can make up their own minds about what to reveal
– Not if they don’t have the proper information
– Currently, the companies that would like to benefit from your information inform the individual of risks – if they bother at all
– This opinion pits one person’s capabilities against those of large companies with specialists who concentrate on exploiting such data. It also assumes technology will not make new uses of such data possible in the future.

Anti-privacy Arguments

There is no privacy in public
– Reasonable expectation
– U.S. law (used to be) designed to protect people, not places (1967)
– Aggregation of data, not individual instances
– Just because it can be done doesn’t mean it isn’t wrong.

Anti-privacy Arguments

Companies that distribute collections of personal data are protected under free speech laws
– Who owns the bits? Is personal information property? If so, the rules of copyright and patent are accepted restrictions on the use of such speech

Anti-privacy Arguments

Tagging a car or cellular phone does not equal tagging a person – circumstantial evidence
– Circumstantial evidence is used all the time – both in and out of court (tabloids)
– License plate
– EZ Pass in NYC
– License for more directed investigation/harassment

Anti-privacy Arguments

People on welfare should expect a reduction of privacy for benefits / the elite can violate privacy with enough money – why not make that available to everyone
– The right to privacy should not vary according to social class – there is no reason it should
– In any implementation, someone can take advantage. That does not mean we should not design our systems as well as possible.

U.S. Privacy Act of 1974 (Turn and Langheinrich)

Openness and transparency – no secret record keeping
Individual participation – ability to see own records
Collection limitation – don’t exceed needs
Data quality – relevant to purpose and updated
Use limitation – only authorized personnel for a specific purpose
Reasonable security
Accountability

U.S. Privacy Act of 1974

Sounds reasonable, but…
Only applicable to federal agencies and certain contractors!
Conflicts with Freedom of Information Acts!

EU Directive 95/46/EC

Data may be transferred only to non-EU countries with “adequate” levels of privacy protection
Explicit consent
U.S. Safe Harbor – companies self-certify
– HP only major player

Detecting Privacy Violations

Violations must be punishable and detectable
Different aliases (see the sketch after this list)
– E-mail
– True names, addresses
Trap entries
– Bogus street names
– Bogus student names/addresses
Ralph Nader and General Motors
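The alias and trap-entry ideas above can be made concrete with a small bookkeeping tool. The sketch below is illustrative only (the example.org domain and the aliases.json file are assumptions): each organization is handed a unique address, so any later misuse of that address identifies who shared it.

# Minimal sketch of per-recipient e-mail aliases as trap entries.
# example.org and aliases.json are hypothetical; the idea is that an
# address found in unsolicited mail traces back to the one organization
# that received it.
import json
import secrets
from pathlib import Path

ALIAS_FILE = Path("aliases.json")

def load_aliases() -> dict:
    return json.loads(ALIAS_FILE.read_text()) if ALIAS_FILE.exists() else {}

def new_alias(recipient: str) -> str:
    """Mint a unique address to give to exactly one organization."""
    aliases = load_aliases()
    tag = secrets.token_hex(4)              # random, hard-to-guess tag
    address = f"me+{tag}@example.org"       # hypothetical catch-all domain
    aliases[address] = recipient
    ALIAS_FILE.write_text(json.dumps(aliases, indent=2))
    return address

def who_leaked(address_seen_in_spam: str) -> str | None:
    """Map an address found in unsolicited mail back to its original recipient."""
    return load_aliases().get(address_seen_in_spam)

if __name__ == "__main__":
    addr = new_alias("Acme Vending Co.")
    print("Give out:", addr)
    print("Leak traced to:", who_leaked(addr))

The same bookkeeping works for bogus street or student names: seed each copy of a dataset with a unique fictitious entry and watch where it resurfaces.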

Leonard Foner

“Those who design systems which handle personal information therefore have a special duty: They must not design systems which unnecessarily require, induce, persuade, or coerce individuals into giving up personal privacy in order to avail themselves of the benefit of the system being designed”

Risks of Ubiquitous Computing (Langheinrich)

Ubiquity
Invisibility
Sensing
Memory amplification
– Bush’s Memex

Wearable vs. Environmental Approach

Data collected on user
Released by user (see the sketch after this list)
Extreme form: all sensing powered by user
– On-body sensing
– RFID
Wearable confounder
Little brother vs. big brother
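A minimal sketch of the wearable-side property that data is both collected and released by the user: readings are filtered on the body against a wearer-controlled policy before anything is transmitted. The reading kinds and the policy here are illustrative assumptions, not part of the slides.

# Sketch of a user-controlled release filter for on-body sensor data.
# Reading kinds and the policy are illustrative; the point is that the
# filter runs on the wearable, before any data leaves the body.
from dataclasses import dataclass

@dataclass
class SensorReading:
    kind: str          # e.g. "location", "heart_rate", "audio"
    value: object
    timestamp: float

# Wearer-editable policy: which kinds of readings may leave the device.
RELEASE_POLICY = {"location": False, "heart_rate": True, "audio": False}

def release(readings: list[SensorReading]) -> list[SensorReading]:
    """Return only the readings the wearer has agreed to share off-body."""
    return [r for r in readings if RELEASE_POLICY.get(r.kind, False)]

if __name__ == "__main__":
    data = [
        SensorReading("location", (33.77, -84.40), 0.0),
        SensorReading("heart_rate", 62, 0.0),
    ]
    print(release(data))   # only the heart-rate reading is released

In the environmental approach the equivalent filter, if it exists at all, runs on infrastructure the user does not control.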

Privacy Barriers (Starner)

Physical
Technological
– Encryption, biometrics, etc.
Legislative
– Changing laws, software monopolies, speed of innovation, enforcement
Social
Obscuring

Privacy by Design (Langheinrich)

Notice
Choice and consent
Anonymity and pseudonymity (see the sketch after this list)
Proximity and locality
Adequate security
Access and recourse
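To make one of these principles concrete, the sketch below shows one way to support pseudonymity: a keyed hash (HMAC) replaces a stable badge or device ID, so per-period analysis stays possible without exposing the true identifier. This is an illustration of mine, not Langheinrich’s design; the key size and rotation scheme are assumptions.

# Illustrative pseudonymization for the "anonymity and pseudonymity" principle:
# an HMAC turns a stable badge or device ID into a pseudonym. Rotating the
# secret key (e.g., daily -- an assumption) breaks long-term linkability.
import hashlib
import hmac
import secrets

SECRET_KEY = secrets.token_bytes(32)   # assumed to be rotated periodically

def pseudonym(device_id: str) -> str:
    """Stable within one key period, unlinkable across key rotations."""
    return hmac.new(SECRET_KEY, device_id.encode(), hashlib.sha256).hexdigest()[:16]

if __name__ == "__main__":
    print(pseudonym("badge-0042"))   # same input and key -> same pseudonym
    print(pseudonym("badge-0042"))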

Case Study – RAVE

Case Studies – Active Badge

Active badge system for security and convenience at a U.S. state university
– Access to secure areas at a distance
– Purchasing of snacks at vending machines
– Tracking telephone calls
– Auto-login
– …

Dangers

Security
– Spoofing
– Tracking
– Traffic analysis
– Social hacking
– Bribery
Legal access to data
– FOI
– Discrimination/harassment lawsuits

Case Studies

Global Positioning System
Locust
Cellular phone 911 tracking
Pagers
EZ Pass
Automobile black box

Case Studies (cont.)

Patent search system
MIT Wearable Computing Project
– Remembrance Agent
  – Augmented memory of previous conversations
  – Deniability
  – Implicit, unpredictable violations of shared databases
– Webcam

Case Studies (cont.)

Face Recognition
– FaceIt
– Eigenfaces (see the sketch below)
– Social engagement
Augmented Reality and Snow Crash’s CIC
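As a concrete illustration of the eigenfaces approach listed above (the standard PCA formulation, not code from the talk), the sketch below projects flattened face images onto their principal components and matches a probe by nearest distance in that subspace; the image size and component count are assumptions.

# Minimal eigenfaces sketch: PCA on flattened grayscale face images, then
# nearest-neighbor matching in the reduced "face space". Image size and the
# number of components are illustrative assumptions.
import numpy as np

def fit_eigenfaces(faces: np.ndarray, n_components: int = 20):
    """faces: (n_images, n_pixels) array of flattened grayscale images."""
    mean = faces.mean(axis=0)
    centered = faces - mean
    # SVD of the centered data yields the principal components (eigenfaces).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:n_components]          # (n_components, n_pixels)
    weights = centered @ components.T       # training images in face space
    return mean, components, weights

def match(probe: np.ndarray, mean, components, weights) -> int:
    """Return the index of the closest training face in eigenface space."""
    w = (probe - mean) @ components.T
    return int(np.argmin(np.linalg.norm(weights - w, axis=1)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    train = rng.random((10, 64 * 64))       # stand-in for real face images
    mean, comps, wts = fit_eigenfaces(train, n_components=5)
    print("Best match:", match(train[3], mean, comps, wts))   # -> 3

Commercial systems such as FaceIt use more sophisticated features, but the privacy concern is the same: once a template exists, a face becomes a trackable identifier.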

Technologists Are the First Line of Protection

Design the system so that it is easier and more economical to preserve privacy than violate it

Provide mechanisms by which privacy violation can be detected

Use combinations of mechanisms so that privacy and security can be adjusted for different social conditions and new threats

Design with guidelines in mind

Benjamin Franklin

“They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety”
