
Page 1:

Probabilistic Algorithms

Michael Sipser

Presented by: Brian Lawnichak

Page 2:

Introduction

• Probabilistic algorithm
– uses the result of a random process
– “flips a coin” to decide the next step of the execution

• Purpose
– saves on calculating the actual best choice
– avoids introducing a bias
– e.g., querying randomly chosen individuals in a large population, as in the sketch below
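As a concrete illustration (not from the slides), here is a minimal Python sketch of the polling idea: estimate what fraction of a large population satisfies some property by querying a random sample rather than every individual. The function name, population, and predicate are made-up examples.

```python
import random

def estimate_fraction(population, predicate, samples=1000):
    """Estimate the fraction of the population satisfying `predicate`
    by querying `samples` randomly chosen individuals."""
    hits = sum(predicate(random.choice(population)) for _ in range(samples))
    return hits / samples

# Example: fraction of even numbers in 0..999999 (true answer: 0.5)
population = range(1_000_000)
print(estimate_fraction(population, lambda n: n % 2 == 0))
```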

Page 3:

Probabilistic Turing Machine

• Definition 10.3

• Nondeterministic Turing machine M
– each nondeterministic step is a coin flip between two legal next moves
– a probability is assigned to each branch b of M: Pr[b] = 2^−k
– where k is the number of coin flips that occur on branch b

Page 4:

M on input w

• Probability that M accepts input w:
Pr[M accepts w] = Σ Pr[b], summed over the accepting branches b
(made concrete in the sketch below)

• Probability that M rejects input w:
Pr[M rejects w] = 1 − Pr[M accepts w]

• What if there is a bad coin flip?
– the algorithm is then not 100% correct
– errors should be accounted for
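To make the acceptance-probability sum concrete, here is a small Python sketch (my own toy construction, not Sipser's) that enumerates the coin-flip branches of a miniature probabilistic machine and adds up Pr[b] = 2^−k over the accepting branches; `pr_accept` and the `toy` machine are hypothetical names.

```python
from itertools import product

def pr_accept(run, max_flips):
    """Sum Pr[b] = 2^-k over the accepting branches b, where
    run(flips) returns (outcome, k): the decision on the branch
    determined by `flips`, and the number k of flips it consumed."""
    total, seen = 0.0, set()
    for flips in product((0, 1), repeat=max_flips):
        outcome, k = run(flips)
        branch = flips[:k]          # a branch is identified by its k flips
        if branch not in seen:      # count each branch exactly once
            seen.add(branch)
            if outcome == "accept":
                total += 2.0 ** -k
    return total

# Toy machine: flip twice, accept iff both flips agree
toy = lambda f: ("accept" if f[0] == f[1] else "reject", 2)
print(pr_accept(toy, max_flips=3))  # -> 0.5
```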

Page 5:

Error Probability

• Allow the Turing machine an error probability ε, where 0 ≤ ε < ½

• M recognizes language L with error probability ε if
– w ∈ L implies Pr[M accepts w] ≥ 1 − ε
– w ∉ L implies Pr[M rejects w] ≥ 1 − ε

• We say that TM M is bounded by ε

Page 6:

The Class BPP

• Bounded-error Probabilistic Polynomial time Turing machine M1

• Time and space complexity are defined the same way as for a nondeterministic TM

• Definition 10.4
– BPP is the class of languages recognized by such a machine M1 with error probability ε = 1/3

Page 7:

Amplification

• An error probability of 33% is lousy

• Could we improve upon this?

• Amplification lemma
– applies to any error probability ε with 0 ≤ ε < ½
– allows us to make the error probability exponentially small

Page 8:

Lemma 10.5

• Given a fixed ε and a polynomial poly(n)

• If M1 operates with error probability ε, then there is an equivalent BPP TM M2 that operates with error probability 2^−poly(n)

– M2 simulates M1 by running it a polynomial number of times and taking the majority vote of the decisions

Page 9:

Will M2 Really Work?

• Box has 2/3 green and 1/3 red balls
– M1 samples one ball at random to decide
– errs with probability ε = 1/3

• M2 runs M1 poly(n) times to decide
– each run of M1 errs with probability ε
– running it many times and taking the majority gives an exponentially small error probability, as the simulation below illustrates
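A quick Monte Carlo check (an illustration under the assumed ε = 1/3, not part of the lemma's proof) confirms the intuition: the probability that a majority vote over many ε-noisy runs is wrong drops rapidly as the number of runs grows.

```python
import random

def majority_error(eps, runs, trials=100_000):
    """Empirically estimate the probability that a majority vote over
    `runs` independent answers, each wrong with probability `eps`,
    is itself wrong (ties count as wrong)."""
    wrong = 0
    for _ in range(trials):
        errors = sum(random.random() < eps for _ in range(runs))
        if 2 * errors >= runs:
            wrong += 1
    return wrong / trials

for runs in (1, 11, 51, 101):
    print(runs, majority_error(1/3, runs))
# the empirical error probability shrinks sharply with more runs
```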

Page 10:

Proof

• Using M1, show that M2 recognizes the same language with error 2^−poly(n), using the constants

t = 2^poly(n)

a = 1/(4ε(1−ε))

b = max(1, 1/log(a))

c = 2 log(bt)

k = bc
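For intuition, the constants can be instantiated numerically. The sketch below is my own, under assumed values ε = 1/3, poly(n) = n, and n = 10, with logarithms taken base 2 (which makes a^b ≥ 2 on the later slides); it is not part of the proof.

```python
from math import log2

eps, n = 1/3, 10                 # assumed: eps = 1/3, poly(n) = n, n = 10
t = 2 ** n                       # t = 2^poly(n)
a = 1 / (4 * eps * (1 - eps))    # a = 1/(4eps(1-eps)) = 9/8 here
b = max(1, 1 / log2(a))          # b = max(1, 1/log(a))
c = 2 * log2(b * t)              # c = 2 log(bt)
k = b * c                        # k = bc
print(f"t={t} a={a:.3f} b={b:.2f} c={c:.2f} k={k:.1f} (2k = {2*k:.0f} runs)")
```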

Page 11:

Proof [cont.]

• M2 = “On input w:
– compute k and repeat the following 2k times:
– simulate M1 on input w
– if most runs of M1 accepted, then accept; otherwise reject”

Page 12:

Verification

• We have built M2 but must now verify that M2 is equivalent to M1

• Assumptions
– t ≥ 9, without loss of generality
– M1 errs on w with probability ε < ½
– M2 errs only if it obtains at least k erroneous results over the 2k runs of M1

Page 13:

Verification [cont.]

• The probability that M2 obtains at least k erroneous results over 2k runs is at most

Σ_{i=k..2k} C(2k, i) · ε^i · (1−ε)^(2k−i)

• Every term is bounded by its value at i = k, because ε/(1−ε) ≤ 1 when ε ≤ ½

Page 14:

Verification [cont.]

• Taking every term at i = k, we have the upper bound

(k+1) · C(2k, k) · ε^k(1−ε)^k

• We bound the binomial coefficient by 2^2k, the number of all subsets of the 2k runs, giving us

(k+1) · 2^2k · ε^k(1−ε)^k = (k+1)(4ε(1−ε))^k
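The bound can be checked numerically; the short sketch below (an illustration with assumed ε = 1/3, not part of the proof) compares the exact tail probability against (k+1)(4ε(1−ε))^k.

```python
from math import comb

def tail(eps, k):
    """Exact probability of at least k errors in 2k independent runs."""
    return sum(comb(2 * k, i) * eps**i * (1 - eps)**(2 * k - i)
               for i in range(k, 2 * k + 1))

eps = 1/3
for k in (10, 40, 160):
    bound = (k + 1) * (4 * eps * (1 - eps)) ** k
    print(k, tail(eps, k), bound)
# the exact tail always stays below the bound, and the bound
# itself decays exponentially once k is large enough
```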

Page 15:

Verification [cont.]

• Since ε < ½ and, by definition, a = 1/(4ε(1−ε)), this becomes

(k+1)(4ε(1−ε))^k = (k+1)(1/a)^k

• To show that this is at most 2^−poly(n) = 1/t, we show that

a^k ≥ (k+1)t

Page 16:

Verification [cont.]

• We use a series of inequalities (note a^b ≥ 2, since b ≥ 1/log(a)):

a^k = a^bc = (a^b)^c ≥ 2^c = 2^(2 log(bt)) = (bt)^2

• b ≥ 1 and t ≥ 9, so bt ≥ 9 and hence bt ≥ 2 + 2 log(bt); therefore

(bt)^2 ≥ bt · (2 + 2 log(bt)) = t(2b + 2b log(bt))

• Since b ≥ 1, we come up with

a^k ≥ t(2 + 2b log(bt))

≥ t(1 + 2b log(bt))

= t(1 + bc) = (k+1)t
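Plugging in the same assumed constants as in the sketch on page 10 (ε = 1/3, poly(n) = n = 10) shows the final inequality holds with room to spare; this is only a numeric sanity check, not part of the proof.

```python
from math import log2

eps, t = 1/3, 2 ** 10            # same assumptions as the page 10 sketch
a = 1 / (4 * eps * (1 - eps))
b = max(1, 1 / log2(a))
c = 2 * log2(b * t)
k = b * c
print(a ** k >= (k + 1) * t, a ** k, (k + 1) * t)
# -> True, with a^k well above (k+1)t
```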

Page 17:

Applications

• Known:
– BPP = co-BPP

• Unknown:
– whether NP ⊆ BPP
– whether BPP ⊆ NP

• The existence of certain strong pseudorandom number generators would imply that P = BPP

above information was gleaned from http://encyclopedia.thefreedictionary.com/BPP

Page 18:

Applications [cont.]

• Security
– the Chinese Remainder Theorem simplifies modular arithmetic
– it increases the efficiency of decryption
– given pairwise relatively prime moduli {p1, ..., pn} and arbitrary residues {a1, ..., an}
– there exists a unique solution x modulo the product p1···pn with x ≡ ai (mod pi) for each i

above information was gleaned from: “FAQ on Public-Key Crypt” on the Google sci.crypt newsgroup
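A minimal Python sketch of the reconstruction the theorem guarantees (a generic CRT solver of my own, not the newsgroup FAQ's code; `pow(x, -1, m)` for modular inverses needs Python 3.8+):

```python
def crt(residues, moduli):
    """Return the unique x mod p1*...*pn with x = a_i (mod p_i),
    for pairwise relatively prime moduli p_i."""
    N = 1
    for p in moduli:
        N *= p
    x = 0
    for a, p in zip(residues, moduli):
        Np = N // p
        x += a * Np * pow(Np, -1, p)   # pow(Np, -1, p) = inverse of Np mod p
    return x % N

# x = 2 (mod 3), x = 3 (mod 5), x = 2 (mod 7)  ->  23 (mod 105)
print(crt([2, 3, 2], [3, 5, 7]))
```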

Page 19:

Applications [cont.]

• Primality
– don’t bother testing all positive integers less than x to show that x is prime
– instead, select a randomly chosen subset of {1, …, x−1} and test only those numbers
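One classic instance of this random-sampling idea is a Fermat-style test (a sketch of my own; the slides don't specify the test, and a production test would use something stronger such as Miller–Rabin):

```python
import random

def probably_prime(x, trials=20):
    """Fermat test: pick random a and check a^(x-1) = 1 (mod x).
    A prime x always passes; most composites fail with high
    probability (Carmichael numbers are the known exception)."""
    if x < 4:
        return x in (2, 3)
    for _ in range(trials):
        a = random.randrange(2, x - 1)
        if pow(a, x - 1, x) != 1:
            return False   # a witnesses that x is composite
    return True            # x is probably prime

print(probably_prime(104729), probably_prime(104731))  # prime, composite
```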

Page 20:

Conclusions

• Using random processes in place of nondeterminism saves time and avoids bias

• Error should be accounted for

• Amplification bounds the error so that it is exponentially small