Privacy Preserving In LBS: Evaluating Privacy of LBS Algorithms In Dynamic Context




Page 1: Privacy Preserving In LBS

1

Privacy Preserving In LBS

Evaluating Privacy of LBS Algorithms In Dynamic Context

Page 2: Privacy Preserving In LBS

2

Outline

Introduction
Design Model & Workflow System
Design Specification: General Approach
Build Privacy Case Based Database
Conclusion & Future Work

Page 3: Privacy Preserving In LBS

3

Introduction (1)

What is Context? [1]
Any information that can be used to characterize the situation of an entity, where an entity is a person, place, or object considered relevant to the interaction between a user and an application, including the user and the application themselves.

Page 4: Privacy Preserving In LBS

4

Introduction (2)

The problem of privacy preserving in a dynamic context:

Different services require different algorithms, and even within a single service the requirements can vary.

How do we evaluate privacy algorithms in a dynamic context?

Page 5: Privacy Preserving In LBS

5

Outline

Introduction
Design Model & Workflow System
  Design Model System
  Workflow System
Design Specification: General Approach
Build Privacy Case Based Database
Conclusion & Future Work

Page 6: Privacy Preserving In LBS

6

Design Model System (1)

Page 7: Privacy Preserving In LBS

7

Evaluation Module [2][3]

Page 8: Privacy Preserving In LBS

8

Workflow System

Page 9: Privacy Preserving In LBS

9

Outline

Introduction
Design Model & Workflow System
Design Specification: General Approach
Build Privacy Case Based Database
Conclusion & Future Work

Page 10: Privacy Preserving In LBS

10

Design Specification: General Approach

Introduction to Privacy Attack Models
Location Distribution Attack
Maximum Movement Boundary Attack
Query Tracking Attack
Message attributes summary

Page 11: Privacy Preserving In LBS

11

Introduction to Privacy Attack Models

Privacy attacks are categorized by privacy attack model (adversary model).

Attack models differ by:
The target information that is collected.
The attacker's ability to capture messages during service provisioning.
The attacker's background knowledge.

Page 12: Privacy Preserving In LBS

12

Privacy Attack Models: Introduction (cont.)

Attack model | Conditions / when it takes place | Appropriate techniques & algorithms

Location Distribution Attack | User locations are known; some users have outlier locations; the employed spatial cloaking tends to generate minimum areas. | Add the k-sharing region property. Algorithm: CliqueCloak, ...

Maximum Movement Boundary (MMB) Attack | Continuously captured queries; the same pseudonym for 2 consecutive updates; the maximum speed is known. | Add the safe update property. Techniques: patching, delaying, ...

Query Tracking Attack | User locations are known; continuously captured queries; the same pseudonym for 2 consecutive updates. | Add the memorization property.

Page 13: Privacy Preserving In LBS

13

Privacy Attack Models: Contents

Introduction
Location Distribution Attack
Maximum Movement Boundary Attack
Query Tracking Attack
Message attributes summary

Page 14: Privacy Preserving In LBS

14

Privacy Attack Models: Location Distribution Attack

Location Distribution Attack takes place when:
User locations are known.
Some users have outlier locations.
The employed spatial cloaking algorithm tends to generate minimum areas.

[Figure: users A-F; user A lies in a sparse area while B, C, and D form a dense cluster.]

Given a cloaked spatial region covering a sparse area (user A) and a partially dense area (users B, C, and D), an adversary can easily figure out that the query issuer is the outlier.

Page 15: Privacy Preserving In LBS

15

Solution to Location Distribution Attack: k-Sharing Property

k-Sharing Region Property: a cloaked spatial region not only contains at least k users, it is also shared by at least k of these users.

The same cloaked spatial region is produced for k users, so an adversary cannot link the region to an outlier.

[Figure: users A-F grouped under the k-sharing property.]

This results in an overall more privacy-aware environment. Techniques that are free from this attack include CliqueCloak. A minimal check of the property is sketched below.
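The following is a minimal sketch (not from the slides; region representation and names are assumptions) of how one could test whether a batch of anonymized messages satisfies the k-sharing region property: every produced region must be reported by at least k users.

```python
# Sketch: checking the k-sharing region property on a batch of cloaked messages.
from collections import defaultdict

def satisfies_k_sharing(cloaked, k):
    """cloaked: dict mapping user_id -> cloaked region (hashable, e.g. an MBR tuple)."""
    users_per_region = defaultdict(set)
    for user_id, region in cloaked.items():
        users_per_region[region].add(user_id)
    # Every produced region must be shared by at least k of the users inside it.
    return all(len(users) >= k for users in users_per_region.values())

# Example: three users share one MBR, so the batch is 3-sharing.
regions = {"A": (0, 0, 4, 4), "B": (0, 0, 4, 4), "C": (0, 0, 4, 4)}
print(satisfies_k_sharing(regions, k=3))  # True
```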

Page 16: Privacy Preserving In LBS

16

Solution to Location Distribution Attack: CliqueCloak Algorithm

Each user requests:
A level of k-anonymity.
A constraint area.

Build an undirected constraint graph: two nodes are linked if their constraint areas contain each other's locations.

The cloaked region is the MBR (cloaking box) that includes the user and the neighboring nodes. All users within an MBR use that MBR as their cloaked region.

[Figure: constraint graph with nodes A (k=3), B (k=4), C (k=2), D (k=4), E (k=3), F (k=5), H (k=4) and a new user m (k=3).]

For a new user m, add m to the graph, then find the set of nodes that are linked to m in the graph and have an anonymity level not exceeding m.k.

Page 17: Privacy Preserving In LBS

17

Solution to Location Distribution Attack: CliqueCloak Pseudo-code

• while TRUE do
•   pick a message m from S
•   N ← all messages in range B(m)
•   for each n in N do:
•     if P(m) is in B(n) then: add the edge (m, n) into G
•   M ← local_k_search(m.k, m, G)
•   if M ≠ Ø then
•     Bcl(M) ← the minimal area that contains M
•     for each n in M do
•       remove n from S
•       remove n from G
•       nT ← < n.uid, n.rno, Bcl(M), n.C >
•       output transformed message nT
•   remove expired messages from S

Notes:
The edge-adding loop builds the constraint graph G.
local_k_search finds a subset M of S s.t. m is in M, m.k = |M|, for each n in M, n.k ≤ |M|, and M forms a clique in G.
The output loop builds transformed messages from all messages in M.
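Below is a small Python sketch (my own, with assumed message fields and rectangle tuples) of two pieces of the loop above: the mutual-containment test that adds edges to the constraint graph G, and the computation of the cloaking box Bcl(M) as the minimal MBR over a clique.

```python
# Sketch: constraint-graph building and cloaking-box computation for CliqueCloak.
from dataclasses import dataclass

@dataclass
class Msg:
    uid: str
    P: tuple   # (x, y) current location
    B: tuple   # (xmin, ymin, xmax, ymax) constraint area
    k: int     # requested anonymity level

def inside(point, box):
    x, y = point
    xmin, ymin, xmax, ymax = box
    return xmin <= x <= xmax and ymin <= y <= ymax

def build_constraint_graph(messages):
    """Adjacency sets: m and n are linked iff P(m) is in B(n) and P(n) is in B(m)."""
    graph = {m.uid: set() for m in messages}
    for i, m in enumerate(messages):
        for n in messages[i + 1:]:
            if inside(m.P, n.B) and inside(n.P, m.B):
                graph[m.uid].add(n.uid)
                graph[n.uid].add(m.uid)
    return graph

def cloaking_box(group):
    """Bcl(M): minimal bounding rectangle covering all points in the clique M."""
    xs = [m.P[0] for m in group]
    ys = [m.P[1] for m in group]
    return (min(xs), min(ys), max(xs), max(ys))

users = [Msg("A", (1, 1), (0, 0, 4, 4), 3), Msg("B", (2, 2), (0, 0, 5, 5), 3)]
print(build_constraint_graph(users))  # {'A': {'B'}, 'B': {'A'}}
print(cloaking_box(users))            # (1, 1, 2, 2)
```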

Page 18: Privacy Preserving In LBS

18

Solution to Location Distribution Attack: CliqueCloak Pseudo-code (cont.)

local_k_search(k, m, G)

• U ← { n | (m, n) is an edge in G and n.k ≤ k }
• if |U| < k-1 then
•   return Ø
• l ← 0
• while l ≠ |U| do
•   l ← |U|
•   for each u in U do
•     if |{G neighbors of u in U}| < k-2 then U ← U \ {u}
• find any subset M in U s.t. |M| = k-1 and M ∪ {m} forms a clique
• return M ∪ {m}

Notes:
Find a group U of neighbors of m in G whose anonymity value does not exceed k.
Remove members of U with fewer than k-2 neighbors in U, since they cannot be part of a (k-1)-clique.
Look for a (k-1)-clique inside U, which together with m forms a k-clique.
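A direct Python transcription of this pseudo-code is sketched below (assumptions: the graph is a dict of adjacency sets keyed by user id, and the requested anonymity levels are kept in a separate dict). The brute-force subset search is fine here because CliqueCloak targets small k.

```python
# Sketch: local_k_search as in the pseudo-code above.
from itertools import combinations

def local_k_search(k, m, graph, k_level):
    """Return a k-clique containing m whose members all requested anonymity <= k,
    or None if no such clique exists."""
    U = {n for n in graph[m] if k_level[n] <= k}
    if len(U) < k - 1:
        return None
    # Iteratively drop members that cannot belong to a (k-1)-clique inside U.
    changed = True
    while changed:
        changed = False
        for u in list(U):
            if len(graph[u] & U) < k - 2:
                U.discard(u)
                changed = True
    # Any (k-1)-subset of U that is itself a clique forms a k-clique with m,
    # because every member of U is already a neighbor of m.
    for M in combinations(U, k - 1):
        if all(b in graph[a] for a, b in combinations(M, 2)):
            return set(M) | {m}
    return None
```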

Page 19: Privacy Preserving In LBS

19

Solution to Location Distribution Attack: CliqueCloak Message Specification

A plain message (from client to server) m consists of:

• m.uid = Unique identifier of the sender

• m.rno = Message’s reference number

• P(m) = Message’s spatial point (e.g. the client’s current location).

• B(m) = Message’s spatial constraint area

• m.t = Message’s temporal constraint (expiration time)

• m.C = Message’s content

• m.k = Message’s anonymity level

Page 20: Privacy Preserving In LBS

20

Solution to Location Distribution Attack: CliqueCloak Message Specification (cont.)

A transformed message (from server to database) mT consists of:

• m.uid , m.rno

• Bcl(m) = Message’s spatial cloaking box

• m.C

Page 21: Privacy Preserving In LBS

21

Solution to Location Distribution Attack: Evaluation of CliqueCloak

Pros:
Free from the location distribution attack (query sampling attack).

Cons:
Suffers from high computational cost, so it supports k-anonymity only up to about k = 10.
Searching for a clique in a graph is costly.
Requests that cannot be anonymized are dropped when their lifetimes expire.

Page 22: Privacy Preserving In LBS

22

Privacy Attack Models: Contents

Introduction
Location Distribution Attack
Maximum Movement Boundary Attack
Query Tracking Attack
Message attributes summary

Page 23: Privacy Preserving In LBS

23

Privacy Attack Models: Maximum Movement Boundary Attack

Maximum movement boundary attack takes place when:
Continuous location updates or continuous queries are considered.
The same pseudonym is used for two consecutive updates.
The maximum possible speed is known.

The maximum speed is used to derive a maximum movement boundary (MMB) around the previous cloaked region Ri. The user must be located in the intersection of the MMB with the new cloaked region Ri+1 ("I know you are here!"), as sketched below.

[Figure: regions Ri and Ri+1; the adversary locates the user in the intersection of MMB(Ri) with Ri+1.]
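A minimal sketch of the adversary's reasoning, assuming axis-aligned rectangular regions (the representation is mine, not from the slides): expand Ri by the maximum travel distance, then intersect with Ri+1.

```python
# Sketch: shrinking the user's possible location via the MMB of the previous region.
def mmb(region, v_max, dt):
    """Expand R_i by the maximum distance v_max * dt in every direction."""
    xmin, ymin, xmax, ymax = region
    d = v_max * dt
    return (xmin - d, ymin - d, xmax + d, ymax + d)

def intersect(a, b):
    """Rectangle intersection; None when the rectangles do not overlap."""
    xmin, ymin = max(a[0], b[0]), max(a[1], b[1])
    xmax, ymax = min(a[2], b[2]), min(a[3], b[3])
    return (xmin, ymin, xmax, ymax) if xmin < xmax and ymin < ymax else None

R_i, R_next = (0, 0, 2, 2), (1, 1, 8, 8)
leak = intersect(mmb(R_i, v_max=1.0, dt=1.0), R_next)
print(leak)  # (1, 1, 3, 3): the adversary knows the user is inside this smaller area
```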

Page 24: Privacy Preserving In LBS

24

Solution to Maximum Movement Boundary Attack: Safe Update Property [4]

Two consecutive cloaked regions Ri and Ri+1 from the same user are free from the maximum movement boundary attack if one of these three conditions holds:

① The overlapping area of Ri and Ri+1 satisfies the user's requirements.

② Ri totally covers Ri+1.

③ The MMB of Ri totally covers Ri+1.

A sketch of this check follows.
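The sketch below (rectangle representation and the minimum-overlap threshold a_min are assumptions of mine) checks the three safe-update conditions in order.

```python
# Sketch: the three safe-update conditions for consecutive regions R_i, R_{i+1}.
def area(r):
    return max(0.0, r[2] - r[0]) * max(0.0, r[3] - r[1])

def overlap(a, b):
    return (max(a[0], b[0]), max(a[1], b[1]), min(a[2], b[2]), min(a[3], b[3]))

def covers(outer, inner):
    return (outer[0] <= inner[0] and outer[1] <= inner[1]
            and outer[2] >= inner[2] and outer[3] >= inner[3])

def mmb(r, v_max, dt):
    d = v_max * dt
    return (r[0] - d, r[1] - d, r[2] + d, r[3] + d)

def is_safe_update(r_i, r_next, v_max, dt, a_min):
    return (area(overlap(r_i, r_next)) >= a_min       # (1) overlap satisfies the user
            or covers(r_i, r_next)                    # (2) R_i covers R_{i+1}
            or covers(mmb(r_i, v_max, dt), r_next))   # (3) MMB(R_i) covers R_{i+1}

print(is_safe_update((0, 0, 4, 4), (1, 1, 8, 8), v_max=1.0, dt=1.0, a_min=5.0))  # True
```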

Page 25: Privacy Preserving In LBS

25

Solution to Maximum Movement Boundary Attack: Patching and Delaying [4][9]

Patching: combine the current cloaked spatial region with the previous one.

Delaying: postpone the update until the MMB covers the current cloaked spatial region.

[Figure: patching and delaying illustrated with regions Ri and Ri+1.]
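For completeness, a small sketch of the two countermeasures under the same assumed rectangle representation as above.

```python
# Sketch: patching and delaying against the MMB attack.
def patch(r_prev, r_curr):
    """Patching: report the MBR that covers both the previous and current regions."""
    return (min(r_prev[0], r_curr[0]), min(r_prev[1], r_curr[1]),
            max(r_prev[2], r_curr[2]), max(r_prev[3], r_curr[3]))

def delay_needed(r_prev, r_curr, v_max, dt):
    """Delaying: hold the update until MMB(r_prev) after dt covers r_curr."""
    d = v_max * dt
    m = (r_prev[0] - d, r_prev[1] - d, r_prev[2] + d, r_prev[3] + d)
    return not (m[0] <= r_curr[0] and m[1] <= r_curr[1]
                and m[2] >= r_curr[2] and m[3] >= r_curr[3])

print(patch((0, 0, 2, 2), (1, 1, 8, 8)))                              # (0, 0, 8, 8)
print(delay_needed((0, 0, 2, 2), (1, 1, 8, 8), v_max=1.0, dt=1.0))    # True: wait
```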

Page 26: Privacy Preserving In LBS

26

Solution to Maximum Movement Boundary Attack: Using the ICliqueCloak Algorithm [10]

Main idea: incrementally maintain maximal cliques for location cloaking in an undirected graph that takes into consideration the effect of continuous location updates.

Use a graph model to formulate the problem:
Each mobile user is represented by a node in the graph.
An edge exists between two nodes/users only if they are within the MMB of each other and can potentially be cloaked together.

Page 27: Privacy Preserving In LBS

27

Solution to Maximum Movement Boundary Attack: Using the ICliqueCloak Algorithm [10]

Graph modeling:
Let G(V, E) be an undirected graph where V is the set of nodes/users who submitted location-based query requests, and E is the set of edges.
There exists an edge e_vw between two nodes/users v and w if and only if each lies within the other's maximum movement boundary, so that they can potentially be cloaked together.

Page 28: Privacy Preserving In LBS

28

Solution to Maximum Movement Boundary Attack: Using the ICliqueCloak Algorithm [10]

Algorithm:
A maximal clique is a clique that is not contained in any other clique.
Start with a graph without any edges; all nodes themselves constitute a set of 1-node cliques.
Then add the edges to the graph one by one and incrementally update the set of maximal cliques.
The cliques in which the user of the new request is involved are candidate cloaking sets, classified into three classes: positive candidates, negative candidates, and non-candidates.

Page 29: Privacy Preserving In LBS

29

Solution to Maximum Movement Boundary Attack: Using the ICliqueCloak Algorithm [10]

Performance

Page 30: Privacy Preserving In LBS

30

Maximum Movement Boundary Attack: Attributes

Location
Time
Maximum velocity
Privacy level k
User-tolerated maximum area Amax

Page 31: Privacy Preserving In LBS

31

Privacy Attack Models: Contents

Introduction
Location Distribution Attack
Maximum Movement Boundary Attack
Query Tracking Attack
Message attributes summary

Page 32: Privacy Preserving In LBS

32

Query attacks

k-anonymity: Interval Cloak, CliqueCloak, Uncertainty Cloaking, ...

Query attacks:
Query sampling attacks
Query homogeneity attacks
Query tracking attacks

Page 33: Privacy Preserving In LBS

33

Query homogeneity attacks [12]

Page 34: Privacy Preserving In LBS

34

Query tracking attacks [4]

[Figure: users A-K; the cloaked set around the query issuer changes over time.]
At time ti: {A, B, C, D, E}
At time ti+1: {A, B, F, G, H}
At time ti+2: {A, F, G, H, I}

This attack takes place when:
Continuous location updates or continuous queries are considered.
The same pseudonym is used for several consecutive updates.
User locations are known.

Once a query is issued, all users in the query region are candidates to be the query issuer. If the query is reported again, the intersection of the candidate sets between the query instances reduces the user's privacy (see the sketch below).
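The slide's example can be replayed in a few lines of Python: intersecting the candidate sets from the three snapshots leaves only the query issuer.

```python
# Sketch: why reusing a pseudonym over consecutive snapshots leaks identity.
snapshots = [
    {"A", "B", "C", "D", "E"},   # users in the cloaked region at t_i
    {"A", "B", "F", "G", "H"},   # at t_{i+1}
    {"A", "F", "G", "H", "I"},   # at t_{i+2}
]
candidates = set.intersection(*snapshots)
print(candidates)  # {'A'}: the query issuer is exposed
```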

Page 35: Privacy Preserving In LBS

35

Solutions

Memorizing
m-Invariance
Historical k-Anonymity
...

Page 36: Privacy Preserving In LBS

36

Memorizing [4]

[Figure: users A-K.]

Remember a set of users S that is contained in the cloaked spatial region when the query is initially registered with the database server.

Adjust the subsequent cloaked spatial regions to contain at least k of these users.

If a user s is not contained in a subsequent cloaked spatial region, this user is immediately removed from S.

This may result in a very large cloaked spatial region. At some point, the server may decide to disconnect the query and restart it with a new identity.
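A rough sketch of the memorization property (data shapes and the users_in helper are assumptions of mine): keep the initial user set S and signal a restart once fewer than k of those users remain coverable.

```python
# Sketch: maintaining the memorized set S across consecutive cloaked regions.
def users_in(region, positions):
    xmin, ymin, xmax, ymax = region
    return {u for u, (x, y) in positions.items()
            if xmin <= x <= xmax and ymin <= y <= ymax}

class MemorizedQuery:
    def __init__(self, initial_region, positions, k):
        self.k = k
        self.S = users_in(initial_region, positions)  # users seen at registration

    def update(self, region, positions):
        """Drop memorized users no longer covered; restart when fewer than k remain."""
        self.S &= users_in(region, positions)
        if len(self.S) < self.k:
            return "restart-query-with-new-identity"
        return region
```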

Page 37: Privacy Preserving In LBS

37

Query m-Invariance [11][13]

Query ℓ-diversity: ensures that a user cannot be linked to fewer than ℓ distinct service attribute values.

Page 38: Privacy Preserving In LBS

38

Query m-Invariance (cont.)

Satisfying location k-anonymity.
Satisfying query ℓ-diversity.
Example: a cloaking that is query 3-diverse and location 3-anonymous.

Page 39: Privacy Preserving In LBS

39

Query m-Invariance (cont.)

Query m-invariance: the number of possible query association attacks increases when a user can be associated with a larger number of service attribute values.
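A loose sketch of these two checks (data shapes and helper names are mine, and the over-time check only approximates the m-invariance idea): one cloaked group must mix at least ℓ distinct service attribute values, and the union of values a user is grouped with across snapshots is what an adversary can associate with that user.

```python
# Sketch: query l-diversity for one cloaked group, and the accumulated association set.
def is_l_diverse(query_types, l):
    """query_types: service attribute values of the queries grouped into one region."""
    return len(set(query_types)) >= l

def accumulated_values(history):
    """history: per-snapshot sets of service attribute values the user was grouped with."""
    linked = set()
    for snapshot in history:
        linked |= snapshot
    return linked

print(is_l_diverse(["gas", "bar", "hospital"], l=3))           # True
print(accumulated_values([{"gas", "bar"}, {"gas", "hotel"}]))  # {'gas', 'bar', 'hotel'}
```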

Page 40: Privacy Preserving In LBS

40

Attributes

A plain message sent from the user:
Id: unique identifier of the sender.
Ref: message's reference number.
P: message's spatial point (e.g. the user's current location).
C: message's content.
k: message's anonymity level.
ℓ: message's diversity level.
m: message's invariance level.

Page 41: Privacy Preserving In LBS

41

Privacy Attack Models: Contents

Introduction
Location Distribution Attack
Maximum Movement Boundary Attack
Query Tracking Attack
Message attributes summary

Page 42: Privacy Preserving In LBS

42

Attack Model Privacy: Message Attributes Summary

A plain message sent from the user must consist of 11 attributes (see the sketch below):
Id: unique identifier of the sender.
Ref: message's reference number.
P: message's spatial point (e.g. the user's current location).
B: message's spatial constraint area.
t: message's temporal constraint (expiration time).
v: velocity / maximum speed.
QoS: quality of service.
C: message's content.
k: message's anonymity level.
ℓ: message's diversity level.
m: message's invariance level.
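The summary above maps naturally onto a single record type; a minimal sketch (field types are assumptions) follows.

```python
# Sketch: a plain client message carrying the 11 attributes listed above.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class PlainMessage:
    id: str                               # unique identifier of the sender
    ref: int                              # message reference number
    p: Tuple[float, float]                # spatial point (current location)
    b: Tuple[float, float, float, float]  # spatial constraint area
    t: float                              # temporal constraint (expiration time)
    v: float                              # velocity / maximum speed
    qos: float                            # quality of service
    c: str                                # content
    k: int                                # anonymity level
    l: int                                # diversity level
    m: int                                # invariance level
```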

Page 43: Privacy Preserving In LBS

43

Outline

Introduction
Design Model & Workflow System
Design Specification: General Approach
Build Privacy Case Based Database
Conclusion & Future Work

Page 44: Privacy Preserving In LBS

44

Build Privacy Case Based Database

From the attack models and the attributes we found, a case will include:
Input attributes.
Graph.
The algorithm used to protect privacy.

Specification:
Define an interval for each attribute.
Define properties which the input must satisfy.

Notes:
To reduce computation, we only calculate on the subgraph related to the query issuer.
The database will delete queries that have expired.

A sketch of a case entry follows.
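One way to picture a case-base entry is sketched below; every name here is hypothetical and only illustrates the interval/property matching described above.

```python
# Sketch: one entry in the privacy case base and a simple matching rule.
from dataclasses import dataclass
from typing import Dict, Tuple, List

@dataclass
class PrivacyCase:
    intervals: Dict[str, Tuple[float, float]]   # e.g. {"k": (2, 10), "v": (0, 30)}
    required_properties: List[str]              # e.g. ["k-sharing", "safe-update"]
    algorithm: str                              # e.g. "CliqueCloak"

    def matches(self, attributes: Dict[str, float], properties: List[str]) -> bool:
        in_range = all(lo <= attributes.get(name, lo) <= hi
                       for name, (lo, hi) in self.intervals.items())
        has_props = all(p in properties for p in self.required_properties)
        return in_range and has_props

case = PrivacyCase({"k": (2, 10)}, ["k-sharing"], "CliqueCloak")
print(case.matches({"k": 3}, ["k-sharing"]))  # True -> apply CliqueCloak
```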

Page 45: Privacy Preserving In LBS

45

Outline

Introduction
Design Model & Workflow System
Design Specification: General Approach
Build Privacy Case Based Database
Conclusion & Future Work

Page 46: Privacy Preserving In LBS

46

Conclusion

Evaluating privacy algorithms in a dynamic context needs a flexible technique:
Case-based calculation.
Ontology reasoner.

Attack models are the core component of the case-based calculation.

Page 47: Privacy Preserving In LBS

47

Future Work

Continue the case-base specification:
Research other attack models.
Study the User, CP, and SP.

Select an appropriate structure for the case-base data (e.g. a tree structure in which a parent node represents a more general case).

Specify the Ontology Reasoner.

Page 48: Privacy Preserving In LBS

48

Reference

[1] Anind K. Dey and Gregory D. Abowd. Towards a Better Understanding of Context and Context-Awareness. Graphics, Visualization and Usability Center and College of Computing, Georgia Tech, Atlanta, GA, USA, 2000.

[2] Yonnim Lee and Ohbyung Kwon. An index-based privacy preserving service trigger in context-aware computing environments. Expert Systems with Applications 37, pp. 5192-5200, 2010.

[3] Claudio Bettini, Linda Pareschi, and Daniele Riboni. Efficient profile aggregation and policy evaluation in a middleware for adaptive mobile applications. Pervasive and Mobile Computing, ISSN 1574-1192, pp. 697-718, October 2008.

[4] Mohamed F. Mokbel. Privacy in Location-based Services: State-of-the-art and Research Directions. 2007 International Conference on Mobile Data Management.

[5] B. Gedik and L. Liu. A Customizable k-Anonymity Model for Protecting Location Privacy. Proc. IEEE Int'l Conf. on Distributed Computing Systems (ICDCS '05), pp. 620-629, 2005.

[6] B. Gedik and L. Liu. Location Privacy in Mobile Systems: A Personalized Anonymization Model. In ICDCS, 2005.

Page 49: Privacy Preserving In LBS

49

Reference

[7] Z. Xiao, X. Meng, and J. Xu. Quality Aware Privacy Protection for Location-based Services. Proc. of the 12th Int. Conf. on Database Systems for Advanced Applications (DASFAA '07), Bangkok, Thailand, April 2007.

[8] Chi-Yin Chow and Mohamed F. Mokbel. Enabling Private Continuous Queries for Revealed User Locations. Proc. Int'l Symp. on Spatial and Temporal Databases (SSTD), 2007.

[9] Reynold Cheng, Yu Zhang, Elisa Bertino, and Sunil Prabhakar. Preserving User Location Privacy in Mobile Data Management Infrastructures. In Proceedings of the Privacy Enhancing Technologies Workshop (PET), 2006.

[10] X. Pan, J. Xu, and X. Meng. Protecting Location Privacy against Location-Dependent Attack in Mobile Services. In Proceedings of the 17th ACM Conference on Information and Knowledge Management (CIKM 2008), Napa Valley, California, USA, October 26-30, 2008.

[11] Rinku Dewri, Indrakshi Ray, Indrajit Ray, and Darrell Whitley. Query m-Invariance: Preventing Query Disclosures in Continuous Location-Based Services. 11th International Conference on Mobile Data Management (MDM 2010), Kansas City, Missouri, USA, May 23-26, 2010.

Page 50: Privacy Preserving In LBS

50

Reference

[12] Fuyu Liu, Kien A. Hua, Ying Cai. Query l-diversity in Location-Based Services, in Proceedings of the 10th International Conference on Mobile Data Management: Systems, Services and Middleware, 2009, pp. 436–442.

[13] Panos Kalnis, Gabriel Ghinita, Kyriakos Mouratidis, and Dimitris Papadias. Preventing Location-Based Identity Inference in Anonymous Spatial Queries, IEEE Transactions on Knowledge and Data Engineering, vol. 19, no. 12, pp. 1719-1733, Aug. 2007