
Page 1: Solving trust issues using Z3

Solving trust issues using Z3

Z3 SIG, November 2011

Moritz Y. Becker, Nik Sultana (Microsoft Research, Cambridge), Alessandra Russo (Imperial College), Masoud Koleini (University of Birmingham)

Page 2: Solving trust issues using Z3

What can be detected about policy A0?

probe → observe → infer

e.g. SecPAL, DKAL, Binder, RT, ...

Page 3: Solving trust issues using Z3

A simple probing attack

Svc holds the policy A0, which contains the fact "Svc says secretAg(Bob)" (not known to Alice). Alice submits probes (A, q): Svc answers Yes if A0 ∪ A ⊢ q and No if A0 ∪ A ⊬ q.

Probe 1: A = {Alice says foo if secretAg(Bob)}, q = access?
Answer: No (A0 ∪ A ⊬ q).

Probe 2: A = {Alice says foo if secretAg(Bob), Alice says Svc can say secretAg(Bob)}, q = access?
Answer: Yes (A0 ∪ A ⊢ q), since "Svc says secretAg(Bob)" together with the can-say credential yields "Alice says secretAg(Bob)".

From the differing answers, Alice can detect "Svc says secretAg(Bob)"!

[Gurevich et al., CSF 2008]

(There’s also an attack on DKAL2, to appear in: “Information Flow in Trust Management Systems”, Journal of Computer Security.)
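A minimal sketch of this attack in Python, treating ground SecPAL-style assertions as propositional Datalog clauses. The slide does not show the contents of A0, so the service policy below (a secret-agent fact plus an access-granting rule) is an assumption, and Alice's "can say" credential is written directly as the ground delegation rule it induces.

# Sketch of the probing attack. A clause is (head, (body atoms...));
# the policy A0 used here is hypothetical (the slide does not show it).

def closure(clauses):
    """Forward chaining: all atoms derivable from the given ground clauses."""
    derived, changed = set(), True
    while changed:
        changed = False
        for head, body in clauses:
            if head not in derived and all(b in derived for b in body):
                derived.add(head)
                changed = True
    return derived

def entails(policy, credentials, query):
    """Does policy ∪ credentials ⊢ query?"""
    return query in closure(policy | credentials)

# Hypothetical service policy A0 (assumed for illustration).
A0 = {
    ("Svc says secretAg(Bob)", ()),                    # the secret fact
    ("Svc says access", ("Alice says foo",)),          # grant access if Alice says foo
}

# Probe 1: only "Alice says foo if secretAg(Bob)".
probe1 = {("Alice says foo", ("Alice says secretAg(Bob)",))}

# Probe 2: additionally "Alice says Svc can say secretAg(Bob)",
# encoded here as the ground rule it induces.
probe2 = probe1 | {("Alice says secretAg(Bob)", ("Svc says secretAg(Bob)",))}

print(entails(A0, probe1, "Svc says access"))   # False -> "No"
print(entails(A0, probe2, "Svc says access"))   # True  -> "Yes"
# The differing answers reveal that "Svc says secretAg(Bob)" is in A0.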

Page 4: Solving trust issues using Z3

Challenges
1. What does "attack", "detect", etc. mean?*
2. What can the attacker (not) detect?
3. How do we automate?

*Based on "Information Flow in Credential Systems", Moritz Y. Becker, CSF 2010

Page 5: Solving trust issues using Z3
Page 6: Solving trust issues using Z3

probe

Page 7: Solving trust issues using Z3

Available probes


Page 8: Solving trust issues using Z3

Available probes: the attacker submits a sequence of probes (A1, q1), ..., (An, qn) and observes the answers.

(A1, q1), ..., (An, qn)?  →  A0   →  Yes, No, Yes, Yes, ...!
(A1, q1), ..., (An, qn)?  →  A0'  →  Yes, No, Yes, Yes, ...!

Policies A0 and A0' are observationally equivalent (A0 ≡ A0') iff for all available probes (A, q):
A0 ∪ A ⊢ q  ⟺  A0' ∪ A ⊢ q.

The attacker can't distinguish A0 and A0'.
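This definition can be phrased directly as code. A minimal sketch, assuming an entails(policy, credentials, query) oracle for the policy language (the oracle itself is not part of the slides):

def observationally_equivalent(A0, A0_prime, probes, entails):
    """probes: (credentials, query) pairs available to the attacker;
    entails(policy, credentials, query) decides policy ∪ credentials ⊢ query."""
    return all(entails(A0, creds, q) == entails(A0_prime, creds, q)
               for creds, q in probes)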

Page 9: Solving trust issues using Z3

The attacker knows the available probes and their answers on A0 (Yes, No, Yes, Yes, ...), and hence the class of policies observationally equivalent to A0.

A query p is detectable in A0 iff A0' ⊢ p for every policy A0' with A0' ≡ A0.

If p holds in every policy the attacker cannot distinguish from A0, the attacker learns p!

Page 10: Solving trust issues using Z3

A query p is opaque in A0 iff there is a policy A0' with A0' ≡ A0 and A0' ⊬ p.

If some indistinguishable policy does not entail p, the attacker cannot tell whether p holds in A0.
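For intuition only, both notions can be written as a brute-force check over a finite set of candidate policies; the names candidates and entails are assumptions here, and the actual decision procedure (later slides) does not enumerate policies.

def equivalent(A0, A1, probes, entails):
    """Observational equivalence w.r.t. the given probes."""
    return all(entails(A0, c, q) == entails(A1, c, q) for c, q in probes)

def detectable(p, A0, probes, candidates, entails):
    """p is detectable in A0 iff every candidate the attacker cannot
    distinguish from A0 entails p on its own."""
    return all(entails(A1, set(), p)
               for A1 in candidates if equivalent(A0, A1, probes, entails))

def opaque(p, A0, probes, candidates, entails):
    """p is opaque in A0 iff it is not detectable, i.e. some observationally
    equivalent candidate does not entail p."""
    return not detectable(p, A0, probes, candidates, entails)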

Page 11: Solving trust issues using Z3

Available probes and their answers:

({Alice says foo if secretAg(Bob)}, access?) → No!
({Alice says Svc can say secretAg(Bob), Alice says foo if secretAg(Bob)}, access?) → Yes!

Every policy A0' with A0' ≡ A0 under these answers must entail "Svc says secretAg(Bob)".

So "Svc says secretAg(Bob)" is detectable in A0!

Page 12: Solving trust issues using Z3

Challenges
1. What does "attack", "detect", etc. mean?
2. What can the attacker (not) detect?*
3. How do we automate?

*Based on "Opacity Analysis in Trust Management Systems", Moritz Y. Becker and Masoud Koleini (U Birmingham), ISC 2011

Page 13: Solving trust issues using Z3

Is p opaque in A0?
• Policy language: Datalog clauses
• Input: the policy A0, the query p, and the available probes
• Output: "p is opaque in A0" or "p is detectable in A0"
• Sound, complete, terminating

Recall: a query p is opaque in A0 iff there is a policy A0' with A0' ≡ A0 and A0' ⊬ p.

Page 14: Solving trust issues using Z3

Example 1

What do we learn about ... and ... in A0?

Any policy observationally equivalent to A0 must satisfy one of these: ...

Page 15: Solving trust issues using Z3

Example 2

What do we learn about, e.g., ... and ... in A0? Any observationally equivalent policy must satisfy one of these: ...

Page 16: Solving trust issues using Z3

Challenges
1. What does "attack", "detect", etc. mean?
2. What can the attacker (not) detect?
3. How do we automate?

Page 17: Solving trust issues using Z3

How do we automate?
• Previous approach: build a policy in which the sought fact is opaque.
• Approach described here: search for a proof that a property is detectable.

Page 18: Solving trust issues using Z3

Reasoning framework
• Policies/credentials and their properties are mathematical objects
• Better still, they are terms in a logic (object-level)
• Probes are just a subset of the theorems in the logic
• Semantic constraints: Datalog entailment, hypothetical reasoning

Page 19: Solving trust issues using Z3

Policies

Empty policy

Fact

Rule

Policy union

Page 20: Solving trust issues using Z3

Properties

“phi holds if gamma”
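A sketch of the object-level term language named on the two slides above, in Python. The constructor and field names are my own, since the slides list the cases ("empty policy", "fact", "rule", "policy union", "phi holds if gamma") but not their concrete syntax.

from dataclasses import dataclass
from typing import Tuple

class Policy: pass

@dataclass(frozen=True)
class Empty(Policy):        # the empty policy
    pass

@dataclass(frozen=True)
class Fact(Policy):         # a single fact
    atom: str

@dataclass(frozen=True)
class Rule(Policy):         # head if body1, ..., bodyn
    head: str
    body: Tuple[str, ...]

@dataclass(frozen=True)
class Union(Policy):        # union of two policies
    left: Policy
    right: Policy

@dataclass(frozen=True)
class HoldsIf:              # property "phi holds if gamma"
    phi: object             # the property asserted to hold
    gamma: object           # the hypothesis it is conditioned on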

Page 21: Solving trust issues using Z3

Example 1

Page 22: Solving trust issues using Z3

Example 2

Page 23: Solving trust issues using Z3

Calculus+ PL + ML + Hy

Page 24: Solving trust issues using Z3

Reduced calculus(modulo normalisation)

Page 25: Solving trust issues using Z3

Axioms C1 and C2

Page 26: Solving trust issues using Z3

Props 8 and 9

Page 27: Solving trust issues using Z3

Normal form

Page 28: Solving trust issues using Z3

Naïve propositionalisation
• Normalise the formula
• Apply Prop9 (until fixpoint)
• Instantiate C1, C2 and Prop8 for each box-formula
• Abstract the boxes
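A hedged sketch of the last two steps (instantiate axioms, abstract the boxes) using Z3's Python API. The concrete forms of C1, C2, Prop8 and Prop9 are not reproduced on these slides, so axiom_instances and the sample instance below are placeholders; the point is only that each distinct box-formula becomes a fresh Boolean and the resulting propositional problem is handed to Z3.

from z3 import Bool, Solver, Implies, Not, sat

def abstract_boxes(box_formulas):
    """Map each distinct (normalised) box-formula, given here as a string key,
    to a fresh propositional variable."""
    return {f: Bool(f"box_{i}") for i, f in enumerate(box_formulas)}

def propositional_check(box_formulas, axiom_instances, goal):
    """axiom_instances and goal build Z3 constraints over the abstraction
    table; returns True iff the abstracted problem is satisfiable."""
    table = abstract_boxes(box_formulas)
    solver = Solver()
    solver.add(*axiom_instances(table))
    solver.add(goal(table))
    return solver.check() == sat

# Illustrative use with made-up box-formulas and one made-up axiom instance
# (adding credentials preserves Datalog entailment):
boxes = ["[A0]q", "[A0 u A]q"]
axioms = lambda table: [Implies(table["[A0]q"], table["[A0 u A]q"])]
goal = lambda table: Not(table["[A0 u A]q"])
print(propositional_check(boxes, axioms, goal))   # satisfiable here -> True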

Page 29: Solving trust issues using Z3

Improvements
• Prop9 is very productive: in many cases it can be avoided, so applying it could be delayed.
• Axiom C1 can be used as a filter.

Page 30: Solving trust issues using Z3

Summary
1. What does "attack", "detect", etc. mean?
– Observational equivalence, opacity and detectability
2. What can the attacker (not) infer?
– Algorithm for deciding opacity in Datalog policies
– Tool with optimizations
3. How do we automate?
– Encode as a SAT problem