
Verification of Security Protocols

Analysis of an E-voting Protocol in the Applied Pi Calculus

May 7, 2012

Solution

Consider the following protocol:

B → A : nb; text1
A: begin(A,B,nb)
A → B : pkA; A; {na; nb; text2}skA
B: end(A,B,nb)
B: begin(B,A,na)
B → A : pkB; B; {na; text3}skB
A: end(B,A,na)

◦ A public-key infrastructure is assumed: pkA, pkB are the public keys of A and B, respectively
◦ skA, skB are the signing keys of A and B, respectively
◦ text1, text2, text3 are publicly known messages

Write a model of this protocol in ProVerif
Check whether the protocol correctly realizes mutual authentication between A and B; fix it if needed


free c. (* channel *)

free A,B. (* identifiers *)

free text1,text2,text3. (* publicly known messages *)

fun sign/2.

fun pk/1.

private fun sk/1.

reduc check(sign(x,sk(y)),pk(y))=x.

(* queries *)

query evinj:end(x,y,z) ==> evinj:begin(x,y,z).

query evinj:end2(x,y,z) ==> evinj:begin2(x,y,z).

Solution (1)

B → A : nb; text1
A → B : pkA; A; {na; nb; text2}skA
B → A : pkB; B; {na; text3}skB

let initiatorB = new nb;

out(c,(B,nb,text1));

in(c,(x));

let ((pka,=A),mess1) = x in

let (na,=nb,=text2) = check(mess1,pka) in

event end(A,B,nb);

event begin2(B,A,na);

out(c,(pk(B),B,sign((na,text3),sk(B)))).

Solution (2)

B → A : nb; text1
A → B : pkA; A; {na; nb; text2}skA
B → A : pkB; B; {na; text3}skB

let responderA = in(c,x);

let (B,nb,=text1) = x in

event begin(A,B,nb);

new na;

out(c,((pk(A),A),sign((na,nb,text2),sk(A))));

in(c,(pkb, =B,mess1));

let (=na,=text3) = check(mess1,pkb) in

event end2(B,A,na).

process

!responderA | !initiatorB

Solution (3)

Mutual authentication? NO! The attacker can impersonate the users by cheating on the keys: nothing ties the public key presented in a message to the claimed identity

How to fix: certificates (pkID, ID) should be signed by a TTP, and identities should be checked (a sketch follows the protocol recap below)

B → A : nb; text1
A → B : pkA; A; {na; nb; text2}skA
B → A : pkB; B; {na; text3}skB
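A minimal sketch of this fix, in the same untyped ProVerif syntax as above; the TTP identity T is an assumption, and the same change applies symmetrically to both certificates:

free T. (* trusted third party; its verification key pk(T) is public *)

let responderA = in(c,x);
  let (B,nb,=text1) = x in
  event begin(A,B,nb);
  new na;
  (* send a TTP-signed certificate instead of a bare key/identity pair *)
  out(c,(sign((pk(A),A),sk(T)), sign((na,nb,text2),sk(A))));
  in(c,(certb,mess2));
  let (pkb,=B) = check(certb,pk(T)) in (* verify B's certificate *)
  let (=na,=text3) = check(mess2,pkb) in
  event end2(B,A,na).

let initiatorB = new nb;
  out(c,(B,nb,text1));
  in(c,(certa,mess1));
  let (pka,=A) = check(certa,pk(T)) in (* verify A's certificate *)
  let (na,=nb,=text2) = check(mess1,pka) in
  event end(A,B,nb);
  event begin2(B,A,na);
  out(c,(sign((pk(B),B),sk(T)), sign((na,text3),sk(B)))).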

Today’s class is based on a paper by Steve Kremer and Mark Ryan:
◦ “Analysis of an Electronic Voting Protocol in the Applied Pi Calculus”

Analysis of an e-voting protocol by Fujioka, Okamoto and Ohta, known as FOO92
◦ partly carried out using ProVerif
◦ where ProVerif is not powerful enough, by hand proofs

ProVerif as a proof assistant

Fairness
◦ No early results can be obtained (to avoid influencing the remaining voters)

Eligibility
◦ Only legitimate voters can vote, and only once

Privacy
◦ The association of a voter with her vote is not revealed to anyone

Individual verifiability
◦ A voter can verify that her vote was really counted

Universal verifiability
◦ A voter can verify that the published outcome really is the sum of all votes

Receipt-freeness
◦ A voter cannot prove that she voted in a certain way (to protect voters from coercion)

… in the presence of corrupt election authorities

E-voting protocols: Security goals

It is hard to achieve all the security goals simultaneously:
◦ Eligibility: only legitimate voters can vote, and only once
◦ Individual verifiability: a voter can verify that her vote was really counted
◦ Privacy: the association of a voter with her vote is not revealed to anyone
◦ Receipt-freeness: a voter cannot prove that she voted in a certain way

Most of these security goals look different from the ones studied so far, but they can be expressed in terms of
◦ Secrecy: ProVerif supports reasoning about secrecy for direct flows
◦ Testing equivalence: ProVerif supports testing equivalence, but its reasoning is notably incomplete

Formalizing the security goals

Kremer and Ryan express FOO92 in the ProVerif input language

Then they prove that it satisfies:
◦ Fairness (with ProVerif)
◦ Eligibility (with ProVerif)
◦ Privacy (by hand)

FOO92 Analysis

Voter

Administrator
◦ Checks if the voter is legitimate
  The voter has the right to vote
  The voter has not voted already

Collector
◦ Collects and publishes the votes

FOO92: The Agents

Constructor
◦ commit/2.

Destructor
◦ open/2.

Reduction rule
◦ open(commit(v,r),r) = v.

From our abstract point of view, bit commitment is exactly like symmetric encryption
◦ v is a vote
◦ r is a random key to encrypt the vote

Cryptographic Primitives: Bit Commitment
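As a ProVerif declaration, this is a direct transcription of the rules above:

fun commit/2.
reduc open(commit(v,r),r) = v.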

Constructors
◦ enc/1.
◦ dec/1.
◦ sign/2.

Destructor
◦ checksign/2.

Reduction rule
◦ checksign(sign(m,enc(kp)),dec(kp)) = m.

Cryptographic Primitives: Digital Signing
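Likewise in ProVerif, following the slide's convention that enc(kp) and dec(kp) are the two halves of a key pair kp:

fun enc/1.
fun dec/1.
fun sign/2.
reduc checksign(sign(m,enc(kp)),dec(kp)) = m.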

Constructor
◦ blind/2.

Destructor
◦ unblind/2.

Reduction rules (where b is the blinding factor)
◦ unblind(blind(m,b),b) = m. (like symmetric encryption)
◦ unblind(sign(blind(m,b),sk),b) = sign(m,sk).

In order to legitimize a vote, an administrator signs the vote that has been blinded by a voter

The voter can unblind it while preserving the administrator’s signature, before forwarding it to the vote collector

Cryptographic Primitives: Blinding
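In ProVerif, the two rules can be declared as alternative rewrite rules of the same destructor (a sketch; the variable skey stands for an arbitrary signing key):

fun blind/2.
reduc unblind(blind(m,b),b) = m;
      unblind(sign(blind(m,b),skey),b) = sign(m,skey).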

An observer who sees a message on a channel must not be able to tell the origin and the destination of the message

That is exactly how channels are modeled in the spi-calculus (and in ProVerif)

Implementing anonymous channels in reality is problematic, but there are some solutions, like MIX-nets and onion routing

Cryptographic Primitives: Anonymous Channels

Three consecutive phases:
◦ Legitimization phase: the administrator legitimizes the votes
◦ Voting phase: the voters send their votes to the collector
◦ Opening phase: the collector publishes the votes

The end of each phase is a global synchronization point
◦ the next phase does not start before the previous phase has ended

The protocol: phases

Voter V selects a vote v and computes the commitment x of v under a random key r
◦ x = commit(v,r)

V computes the message e using a blinding function and a random blinding factor b
◦ e = blind(x,b) = blind(commit(v,r),b)

V digitally signs e and sends the signature to the administrator A together with her identity
◦ V → A : V, sign(e,sv) = sign(blind(commit(v,r),b),sv)

A verifies that
◦ V has the right to vote
◦ V has not voted yet
◦ the signature is valid

If so, A sends her digital signature to V
◦ A → V : sign(blind(commit(v,r),b),sa)

V unblinds the message, obtaining y = sign(commit(v,r),sa)

The protocol: Phase 1 - Legitimization
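The unblinding step works because of the commuting rule for blind and sign given earlier:

y = unblind(sign(blind(commit(v,r),b),sa), b) = sign(commit(v,r),sa)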

V sends y, A’s signature on the commitment to V’s vote, to the collector C using an anonymous channel
◦ V → C : y = sign(x,sa) = sign(commit(v,r),sa)

C checks the correctness of the signature y and, if the test succeeds, enters (l,x,y) onto a list as the l-th item

The protocol: Phase 2 - Voting
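In destructor terms (writing pka for A’s verification key, as in the ProVerif model below), C’s check computes:

checksign(y, pka) = commit(v,r) = x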

The collector C has received all votes; the voters now reveal their random keys r so that C can open the votes and publish them
◦ C publishes the list of entries (li,xi,yi)
◦ V verifies that her commitment is in the list and sends l,r to C via an anonymous channel (V → C : l, r)
◦ C opens the l-th ballot using the random key r and publishes the vote v

The protocol: Phase 3 - Opening
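Again in destructor terms, C’s final step computes:

open(checksign(y, pka), r) = open(commit(v,r), r) = v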

Which security goal would be violated if blinding was omitted?
◦ Privacy: the first message, without blinding, would create an observable link between vote and voter (once the voter has revealed the random key r)

Which security goal would be violated if bit commitment was not collision-free, i.e., if commit(v,r) = commit(v’,r’) for some (v,r) ≠ (v’,r’)?
◦ Fairness: the voter could change her vote after the voting phase (by publishing r’ instead of r)

Motivations for blinding and commitment

let voter =
  new r;                                (* random key for the commitment *)
  new b;                                (* random blinding factor *)
  let (bcv) = blind(commit(v,r),b) in   (* blinded committed vote *)
  out(net,(V, sign(bcv,sv)));           (* V's identity and signature, with V's signing key sv *)
  in(net,lbcv);
  let (=bcv) = checksign(lbcv,pka) in   (* verify A's signature on the blinded commitment *)
  let (lcv) = unblind(lbcv,b) in        (* lcv = sign(commit(v,r),sa) *)
  phase 1;
  out(net,lcv);                         (* send the signed commitment to the collector *)
  in(net,(l,=lcv));                     (* learn the list index l *)
  phase 2;
  out(net,(l,r)).                       (* reveal the commitment key *)

The vote v is:
◦ a free variable of the system... votes are guessable
◦ ... and not a new-generated name... new names model unguessable data

ProVerif phases specify global synchronization points

The voter process

let administrator =
  in(privCh,(V,pkv));      (* a legitimate voter V and her public key *)
  in(net,(=V,sbcv));       (* V's signed blinded committed vote *)
  let (bcv) = checksign(sbcv,pkv) in
  out(net,sign(bcv,ska)).  (* A's signature legitimizes the vote *)

In Kremer and Ryan’s model, the administrator does not check for duplicate votes

The administrator process

let collector =
  phase 1;
  in(net,lcv);        (* a signed committed vote *)
  new l;              (* fresh list index *)
  out(net,(l,lcv));   (* publish the list entry *)
  phase 2;
  in(net,(=l,r));     (* the commitment key for entry l *)
  let (v) = open(checksign(lcv,pka),r) in
  out(net,v).         (* publish the vote *)

Remark: There is a small discrepancy
◦ Informal description: the collector checks the administrator’s signature in the voting phase
◦ ProVerif model: the collector checks the administrator’s signature in the opening phase

The collector process

Fairness ensures that no early results can be obtained

Kremer and Ryan verify fairness as a secrecy property
◦ It should be impossible for an attacker to learn a vote before the opening phase (2)
◦ Strong secrecy of the votes up to the end of the voting phase

ProVerif successfully proves that FOO92 guarantees that (in this model) an intruder cannot obtain the votes or learn any information about them before the voting phase ends

Fairness analysis
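A sketch of how this strong-secrecy check can be phrased in ProVerif's untyped syntax (v must be a free variable, as in the voter process above; the query asks whether the attacker can distinguish runs that differ only in the value of v):

noninterf v.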

Eligibility verifies that
◦ only legitimate voters can vote…
  The attacker has a challengeVote (a global name)
  The attacker is illegitimate (he does not have a valid signing key)
  Kremer and Ryan modify the collector process: it publishes a fresh name attack if and only if it receives the challengeVote
  Then attack becomes public if and only if the collector receives the challengeVote from the attacker
  In order to verify that the challengeVote from the attacker never reaches the collector, it suffices to show that attack remains secret: the problem is reduced to secrecy
◦ … and only once
  This cannot be verified in the model, because all voters share the same key

Eligibility analysis

let collector =
  phase 1;
  in(net,lcv);
  new l;
  out(net,(l,lcv));
  phase 2;
  in(net,(=l,r));
  let (v) = open(checksign(lcv,pka),r) in
  new attack;
  if (v) = challengeVote then out(net,attack) (* attack escapes iff the challengeVote got through *)
  else out(net,v).

ProVerif succeeds in verifying the standard secrecy of attack; therefore FOO92 guarantees eligibility

Eligibility analysis: Modified collector process
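The corresponding secrecy query, in the untyped syntax used earlier (ProVerif reports that attack is not derivable by the attacker):

query attacker:attack.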

Privacy aims to guarantee that the association of a voter with her vote is not revealed to anyone
◦ We need to suppose that at least two voters are honest (if there is only one honest voter, then privacy can never be guaranteed)
  Voter V1 – vote1
  Voter V2 – vote2
◦ Privacy: P[vote1/v1, vote2/v2] ≈ P[vote2/v1, vote1/v2]

ProVerif fails on this, because its reasoning about testing equivalence is incomplete (proof by hand)

Privacy analysis
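Equivalences of this swapped-votes shape are what ProVerif's choice construct (biprocesses) tries to check. The following minimal, self-contained illustration of the mechanism is an assumption of ours, not the FOO92 model: it asks whether publishing a commitment to vote1 is indistinguishable from publishing a commitment to vote2 while the key r stays secret.

free net, vote1, vote2.
fun commit/2.
reduc open(commit(v,r),r) = v.

process
  new r;
  out(net, commit(choice[vote1,vote2], r))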

The phase separator between the legitimization (1) and voting (2) phases is crucial for privacy

Without the separator, the following attack on privacy would be possible:
◦ The attacker blocks all messages coming from voters other than V until he has seen on the network two messages that are signed by A
◦ The attacker knows that the second of these A-signed messages contains V’s committed vote (unblinded!)
◦ Once V publishes her random key r, the attacker can open V’s committed vote (knowing that this is V’s vote)

Phases and privacy

For their hand proof of privacy for FOO92, Kremer and Ryan use a powerful proof method for testing equivalence, called labeled bisimilarity
◦ Abadi, Fournet: “Mobile Values, New Names, and Secure Communication”

Proving Testing Equivalence by Labeled Bisimilarity

Abadi and Fournet’s article is based on the applied pi calculus

Applied pi calculus vs ProVerif

ProVerif:
1. distinguishes between constructors and destructors
2. reduction rules
3. the ProVerif language is restricted to enable automatic analysis
4. ProVerif actually allows certain equations, too (keyword: equation), but internally translates these to reduction rules

Applied pi calculus:
1. no distinction between constructors and destructors
2. equations
3. more general than the ProVerif language
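For instance, symmetric decryption in the two styles (a sketch; the function names senc, sdec are assumptions):

(* ProVerif: sdec is a destructor, defined by a rewrite rule *)
fun senc/2.
reduc sdec(senc(m,k),k) = m.

(* Applied pi calculus: sdec is an ordinary function symbol, and
   sdec(senc(m,k),k) = m is an equation of the equational theory *)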

The trouble with testing equivalence is that, by definition, proving P ≃ Q requires quantifying over all possible contexts

Labeled bisimilarity is a relation that is contained in testing equivalence, and whose definition avoids this infinite quantification

Why labeled bisimilarity?

Abadi/Fournet enrich the syntax domain of processes with active substitutions {M/x}

They enrich the operational semantics with:
◦ a labeled rule that allows outputs to be reduced without a matching input:
  new u; (Q | out c M; P)  --new x. out c x-->  new u; (Q | P | {M/x})
  where x is not a free variable in P or Q
◦ a rule that allows inputs to be reduced without a matching output:
  new u; (Q | inp c x; P)  --inp c M-->  new u; (Q | {M/x}P)
  where (fv(M) ∪ fn(M)) ∩ u = ∅

Labeled operational semantics
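For example, a process that outputs a term built from a restricted name reduces as follows:

new k; (out c f(k); P)  --new x. out c x-->  new k; (P | {f(k)/x})

The environment can now refer to the (otherwise unknown) term f(k) only through the handle x.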

A frame is a process that is built up from stop and active substitutions, using parallel composition and new-generation

We let ψ range over frames and σ over substitutions. Every enriched process P can be mapped to a frame ψ(P) by replacing by stop all processes that are not active substitutions, parallel compositions or new-generations.

Example:
ψ(new c; new d; (inp c x;P | {M/x} | out d N;Q))
  = new c; new d; (stop | {M/x} | stop)
  ≡ new c; new d; {M/x}

Frames

(M = N)ψ iff ∃ n, σ such that
◦ ψ = new n. σ
◦ σM = σN
◦ n ∩ (fn(M) ∪ fn(N)) = ∅

Example: Assume fun f/1, fun g/1, and no equations
◦ ψ0 = new k; new s; {k/x, s/y}
◦ ψ1 = new k; new s; {f(k)/x, g(k)/y}
◦ ψ2 = new k; new s; {k/x, f(k)/y}

Then (f(x) = y)ψ2, but not (f(x) = y)ψ1 and not (f(x) = y)ψ0

Message Equality Under Frames

Static Equivalence of Frames:
◦ φ ≈s ψ iff dom(φ) = dom(ψ) and (∀M,N)((M=N)φ ⇔ (M=N)ψ)

Example: ψ0 ≉s ψ2 and ψ1 ≉s ψ2 (the test f(x) = y holds only under ψ2), but ψ0 ≈s ψ1

Static Process Equivalence:
◦ P ≈s Q iff ψ(P) ≈s ψ(Q)

Depending on the underlying equational theory, static process equivalence can be quite hard to show, but at least it does not depend on the dynamics of processes

Static Equivalence

Labeled bisimilarity ≈l is an equivalence relation on processes

In order to prove that P ≈l Q, one needs to prove the existence of a relation R such that P R Q and:
◦ P ≈s Q
◦ If P → P’, then (∃Q’)(Q →* Q’ and P’ R Q’)
◦ If P --α--> P’, then (∃Q’)(Q →* --α--> →* Q’ and P’ R Q’)

Furthermore, one needs to prove these three statements with the roles of P and Q reversed

Note that one has to apply these rules iteratively, because R occurs in the 2nd and 3rd rules

Technically, ≈l is defined as the largest symmetric binary relation on processes that satisfies the three rules

Labeled Bisimilarity

Theorem: If P ≈l Q, then P ≃ Q

This theorem tells us that for proving testing equivalence P ≃ Q, it suffices to prove labeled bisimilarity P ≈l Q

Proving labeled bisimilarity is often simpler than proving testing equivalence directly

Bisimilarity implies Testing Equivalence

In order to show privacy for FOO92, Kremer and Ryan show the following labeled bisimilarity:

P[vote1/v1, vote2/v2] ≈l P[vote2/v1, vote1/v2]

Privacy in FOO92


Consider the following protocol between two computers, a server and a client:

S → C : hello; timestamp
C: begin(C,S,timestamp)
C → S : timestamp; hash(kCS)
S: end(C,S,timestamp)

◦ kCS is a long-term key shared between S and C
◦ timestamp may be considered as a nonce

Write a model of this protocol in ProVerif
Check whether the protocol correctly realizes the authentication of C to S; fix it if needed
Check whether the protocol preserves the secrecy of kCS. What about weak secrecy?

Try it yourself (recommended)
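A possible starting skeleton for the exercise, in the untyped syntax used throughout; the process names and the choice of queries are assumptions, not a definitive solution:

free c.           (* public channel *)
free hello, C, S. (* public constant and identities *)
private free kCS. (* long-term shared key *)
fun hash/1.

query evinj:end(x,y,z) ==> evinj:begin(x,y,z).
query attacker:kCS.
(* for the weak-secrecy question, ProVerif's guessing-attack
   declaration can be tried: weaksecret kCS. *)

let server =
  new timestamp;
  out(c,(hello,timestamp));
  in(c,(=timestamp,h));
  if h = hash(kCS) then
  event end(C,S,timestamp).

let client =
  in(c,(=hello,ts));
  event begin(C,S,ts);
  out(c,(ts,hash(kCS))).

process !server | !client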