
Page 1

2IL50 Data Structures

Spring 2015

Lecture 2: Analysis of Algorithms

Page 2

Analysis of algorithms

the formal way …

Page 3

Analysis of algorithms
Can we say something about the running time of an algorithm without implementing and testing it?

InsertionSort(A)
1. initialize: sort A[1]
2. for j = 2 to A.length
3.   do key = A[j]
4.      i = j - 1
5.      while i > 0 and A[i] > key
6.        do A[i+1] = A[i]
7.           i = i - 1
8.      A[i+1] = key
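
A Python transcription of this pseudocode (a sketch, not part of the original slides; indices shifted to Python's 0-based arrays):

def insertion_sort(a):
    """Sort list a in place, following the slide's pseudocode (0-based)."""
    for j in range(1, len(a)):          # line 2: j = 2 to A.length
        key = a[j]                      # line 3
        i = j - 1                       # line 4
        while i >= 0 and a[i] > key:    # line 5 (i > 0 becomes i >= 0 with 0-based indexing)
            a[i + 1] = a[i]             # line 6: shift the larger element one slot right
            i -= 1                      # line 7
        a[i + 1] = key                  # line 8: insert key in the gap

data = [5, 2, 4, 6, 1, 3]
insertion_sort(data)
print(data)  # [1, 2, 3, 4, 5, 6]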

Page 4

Analysis of algorithms
Analyze the running time as a function of n (the number of input elements):
best case
average case
worst case

Elementary operations: add, subtract, multiply, divide, load, store, copy, conditional and unconditional branch, return, …

An algorithm has worst case running time T(n) if for any input of size n the maximal number of elementary operations executed is T(n).

Page 5

Analysis of algorithms: example

InsertionSort: 15n^2 + 7n - 2
MergeSort: 300 n lg n + 50 n

                 n = 10          n = 100         n = 1000
InsertionSort    1568            150698          1.5 x 10^7
MergeSort        10466           204316          3.0 x 10^6
                 InsertionSort   InsertionSort   MergeSort
                 6 x faster      1.35 x faster   5 x faster

n = 1,000,000: InsertionSort 1.5 x 10^13, MergeSort 6 x 10^9 ➨ MergeSort 2500 x faster!

The rate of growth of the running time as a function of the input is essential!
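
A short Python sketch reproducing the numbers above (using the slide's hypothetical operation counts; lg is the base-2 logarithm):

from math import log2

def t_insertion(n):
    # operation count assumed on the slide: 15 n^2 + 7n - 2
    return 15 * n**2 + 7 * n - 2

def t_merge(n):
    # operation count assumed on the slide: 300 n lg n + 50 n
    return 300 * n * log2(n) + 50 * n

for n in (10, 100, 1000, 10**6):
    print(n, t_insertion(n), round(t_merge(n)))
# prints 1568/10466 at n=10, 150698/204316 at n=100, and the 2500x gap at n=10^6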

Page 6

Θ-notation
Intuition: concentrate on the leading term, ignore constants

19n^3 + 17n^2 - 3n   becomes   Θ(n^3)
2 n lg n + 5n^1.1 - 5   becomes   Θ(n^1.1)
n - ¾ n √n   becomes   ---

Page 7

Θ-notation
Let g(n): N ↦ N be a function. Then we have

Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

“Θ(g(n)) is the set of functions that grow as fast as g(n)”

Page 8

Θ-notation
Let g(n): N ↦ N be a function. Then we have

Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

[Figure: f(n) sandwiched between c1·g(n) and c2·g(n) for all n ≥ n0]

Notation: f(n) = Θ(g(n))

Page 9

Θ-notation
Let g(n): N ↦ N be a function. Then we have

Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

Claim: 19n^3 + 17n^2 - 3n = Θ(n^3)

Proof: Choose c1 = 19, c2 = 36, and n0 = 1. Then we have for all n ≥ n0:
0 ≤ c1·n^3 = 19n^3                       (trivial)
   ≤ 19n^3 + 17n^2 - 3n                  (since 17n^2 > 3n for n ≥ 1)
   ≤ 19n^3 + 17n^3 = 36n^3 = c2·n^3      (since 17n^2 ≤ 17n^3 for n ≥ 1)  ■
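
A quick numerical sanity check of these constants (a Python sketch):

def f(n):
    # the function from the claim
    return 19 * n**3 + 17 * n**2 - 3 * n

c1, c2, n0 = 19, 36, 1
# check 0 <= c1 n^3 <= f(n) <= c2 n^3 over a large range of n
assert all(0 <= c1 * n**3 <= f(n) <= c2 * n**3 for n in range(n0, 10001))
print("bounds hold for n = 1 .. 10000")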

Page 10

Θ-notation
Let g(n): N ↦ N be a function. Then we have

Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

Claim: 19n^3 + 17n^2 - 3n ≠ Θ(n^2)

Proof: Assume that there are positive constants c1, c2, and n0 such that for all n ≥ n0
0 ≤ c1·n^2 ≤ 19n^3 + 17n^2 - 3n ≤ c2·n^2.
Since 19n^3 + 17n^2 - 3n ≤ c2·n^2 implies 19n^3 ≤ c2·n^2 + 3n - 17n^2 ≤ c2·n^2 (because 3n - 17n^2 ≤ 0), we would have 19n ≤ c2 for all n ≥ n0. This is a contradiction: 19n grows without bound, while c2 is a constant.  ■

Page 11

O-notation
Let g(n): N ↦ N be a function. Then we have

O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

“O(g(n)) is the set of functions that grow at most as fast as g(n)”

Page 12

O-notation
Let g(n): N ↦ N be a function. Then we have

O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

[Figure: f(n) bounded above by c·g(n) for all n ≥ n0]

Notation: f(n) = O(g(n))

Page 13

Ω-notation
Let g(n): N ↦ N be a function. Then we have

Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }

“Ω(g(n)) is the set of functions that grow at least as fast as g(n)”

Page 14

Ω-notation
Let g(n): N ↦ N be a function. Then we have

Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }

[Figure: f(n) bounded below by c·g(n) for all n ≥ n0]

Notation: f(n) = Ω(g(n))

Page 15

Asymptotic notation

Θ(…) is an asymptotically tight bound    ("asymptotically equal")
O(…) is an asymptotic upper bound        ("asymptotically smaller or equal")
Ω(…) is an asymptotic lower bound        ("asymptotically greater or equal")

Other asymptotic notation:
o(…) → "grows strictly slower than"
ω(…) → "grows strictly faster than"

Page 16

More notation …

f(n) = n^3 + Θ(n^2) means: there is a function g(n) such that f(n) = n^3 + g(n) and g(n) = Θ(n^2)

f(n) = Σ_{i=1}^{n} O(i) means: there is one function g(i) such that f(n) = Σ_{i=1}^{n} g(i) and g(i) = O(i)

O(1) or Θ(1) means: a constant

2n^2 + O(n) = Θ(n^2) means: for each function g(n) with g(n) = O(n) we have 2n^2 + g(n) = Θ(n^2)

Page 17

Quiz
1. O(1) + O(1) = O(1)                         true
2. O(1) + … + O(1) = O(1)                     false
3. Σ_{i=1}^{n} O(i) = O(Σ_{i=1}^{n} i)        true
4. O(n^2) ⊆ O(n^3)                            true
5. O(n^3) ⊆ O(n^2)                            false
6. Θ(n^2) ⊆ O(n^3)                            true
7. An algorithm with worst case running time O(n log n) is always slower than an algorithm with worst case running time O(n) if n is sufficiently large.    false

Page 18

Quiz
8. n log^2 n = Θ(n log n)      false
9. n log^2 n = Ω(n log n)      true
10. n log^2 n = O(n^{4/3})     true
11. O(2^n) ⊆ O(3^n)            true
12. O(2^n) ⊆ Θ(3^n)            false
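
A Python sketch checking question 10 numerically (not from the slides): the ratio (n log^2 n) / n^{4/3} = log^2 n / n^{1/3} tends to 0, though only for fairly large n:

from math import log2

# the ratio should eventually fall below any positive constant
for k in (10, 20, 30, 40, 50):
    n = 2.0 ** k
    print(k, (n * log2(n) ** 2) / n ** (4 / 3))
# the printed values increase readability of the claim: they only drop below 1 around n = 2^30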

Page 19

Analysis of algorithms

Page 20

Analysis of InsertionSort

InsertionSort(A)
1. initialize: sort A[1]
2. for j = 2 to A.length
3.   do key = A[j]
4.      i = j - 1
5.      while i > 0 and A[i] > key
6.        do A[i+1] = A[i]
7.           i = i - 1
8.      A[i+1] = key

Get as tight a bound as possible on the worst case running time
➨ lower and upper bound for the worst case running time
Upper bound: analyze the worst case number of elementary operations
Lower bound: give a "bad" input example

Page 21

Analysis of InsertionSort

InsertionSort(A)
1. initialize: sort A[1]
2. for j = 2 to A.length
3.   do key = A[j]
4.      i = j - 1
5.      while i > 0 and A[i] > key
6.        do A[i+1] = A[i]
7.           i = i - 1
8.      A[i+1] = key

Upper bound: Let T(n) be the worst case running time of InsertionSort on an array of length n. Line 1 takes O(1) time; in each iteration of the for-loop, lines 3, 4, and 8 take O(1) time each, and in the worst case the while-loop of lines 5-7 takes (j-1)·O(1) time. We have

T(n) = O(1) + Σ_{j=2}^{n} { O(1) + (j-1)·O(1) + O(1) } = Σ_{j=2}^{n} O(j) = O(n^2)

Lower bound: an array sorted in decreasing order forces the while-loop to run j-1 times for every j ➨ Ω(n^2)

The worst case running time of InsertionSort is Θ(n^2).
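
A Python sketch confirming the lower bound: on a decreasingly sorted array, insertion sort performs exactly n(n-1)/2 element shifts (not from the slides):

def shifts_insertion_sort(a):
    """Return the number of element shifts insertion sort performs on a copy of a."""
    a = list(a)
    shifts = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            shifts += 1
            i -= 1
        a[i + 1] = key
    return shifts

for n in (10, 100, 1000):
    worst = list(range(n, 0, -1))   # decreasing order: the Ω(n^2) input
    print(n, shifts_insertion_sort(worst), n * (n - 1) // 2)  # the two counts coincide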

Page 22

Analysis of MergeSort

MergeSort(A)  // divide-and-conquer algorithm that sorts array A[1..n]
1. if A.length = 1
2.   then skip
3.   else
4.     n = A.length; n1 = floor(n/2); n2 = ceil(n/2)
5.     copy A[1..n1] to auxiliary array A1[1..n1]
6.     copy A[n1+1..n] to auxiliary array A2[1..n2]
7.     MergeSort(A1); MergeSort(A2)
8.     Merge(A, A1, A2)

Costs per line: lines 1-2 take O(1) time; line 4 takes O(1) time; lines 5 and 6 take O(n) time each; line 7 takes T(ceil(n/2)) + T(floor(n/2)) time; line 8 (Merge) takes O(n) time.

MergeSort is a recursive algorithm ➨ the running time analysis leads to a recurrence.
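
A Python transcription of this pseudocode (a sketch; Merge is written out inline, assuming the standard two-pointer merge, which the slides do not spell out):

def merge_sort(a):
    """Sort list a in place, following the slide's pseudocode."""
    if len(a) <= 1:
        return                      # lines 1-2: length 0 or 1 is already sorted
    n1 = len(a) // 2                # line 4: n1 = floor(n/2)
    a1, a2 = a[:n1], a[n1:]         # lines 5-6: copy halves to auxiliary arrays
    merge_sort(a1)                  # line 7: sort both halves recursively
    merge_sort(a2)
    # line 8: Merge(A, A1, A2) -- merge the two sorted halves back into a
    i = j = k = 0
    while i < len(a1) and j < len(a2):
        if a1[i] <= a2[j]:
            a[k] = a1[i]; i += 1
        else:
            a[k] = a2[j]; j += 1
        k += 1
    a[k:] = a1[i:] + a2[j:]         # append whichever tail is left over

data = [5, 2, 4, 6, 1, 3]
merge_sort(data)
print(data)  # [1, 2, 3, 4, 5, 6]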

Page 23

Analysis of MergeSort
Let T(n) be the worst case running time of MergeSort on an array of length n. We have

T(n) = O(1)                                    if n = 1
T(n) = T(ceil(n/2)) + T(floor(n/2)) + Θ(n)     if n > 1

The base case is frequently omitted, since it (nearly) always holds.
The recursive part is often written as 2T(n/2).

Page 24

Solving recurrences

Page 25

Solving recurrences

Easiest: the master theorem
caveat: not always applicable

Alternatively: guess the solution and use the substitution method to prove that your guess is correct.

How to guess:
1. expand the recursion
2. draw a recursion tree

Page 26

The master theorem
Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by the recurrence

T(n) = a·T(n/b) + f(n)        (n/b can be rounded up or down)

Then we have:
1. If f(n) = O(n^{log_b a - ε}) for some constant ε > 0, then T(n) = Θ(n^{log_b a}).
2. If f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a} log n).
3. If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

(Note: the exponent in case 1 is (log_b a) - ε, not log_b (a - ε).)

Page 27

The master theorem: Example
T(n) = 4T(n/2) + Θ(n^3)

Master theorem with a = 4, b = 2, and f(n) = n^3
log_b a = log_2 4 = 2
➨ n^3 = f(n) = Ω(n^{log_b a + ε}) = Ω(n^{2 + ε}) with, for example, ε = 1

Case 3 of the master theorem gives T(n) = Θ(n^3), if the regularity condition holds:
choose c = ½ and n0 = 1
➨ a·f(n/b) = 4·(n/2)^3 = n^3/2 ≤ c·f(n) for n ≥ n0
➨ T(n) = Θ(n^3)

Page 28

The substitution method
The master theorem does not always apply. In those cases, use the substitution method:
1. Guess the form of the solution.
2. Use induction to find the constants and show that the solution works.

Use expansion or a recursion tree to guess a good solution.

Page 29

Recursion trees
T(n) = 2T(n/2) + n

level 0:   n
level 1:   n/2   n/2
level 2:   n/4   n/4   n/4   n/4
level i:   n/2^i   n/2^i   …   n/2^i     (2^i nodes)
leaves:    Θ(1)   Θ(1)   …   Θ(1)

Page 30

Recursion trees
T(n) = 2T(n/2) + n

level 0:   n                               cost: n
level 1:   n/2   n/2                       cost: 2 · (n/2) = n
level 2:   n/4   n/4   n/4   n/4           cost: 4 · (n/4) = n
level i:   n/2^i   n/2^i   …   n/2^i       cost: 2^i · (n/2^i) = n
leaves:    Θ(1)   Θ(1)   …   Θ(1)          cost: n · Θ(1) = Θ(n)

log n levels, each contributing n ➨ T(n) = Θ(n log n)

Page 31

Recursion trees
T(n) = 2T(n/2) + n^2

level 0:   n^2
level 1:   (n/2)^2   (n/2)^2
level 2:   (n/4)^2   (n/4)^2   (n/4)^2   (n/4)^2
level i:   (n/2^i)^2   (n/2^i)^2   …   (n/2^i)^2
leaves:    Θ(1)   Θ(1)   …   Θ(1)

Page 32

Recursion trees
T(n) = 2T(n/2) + n^2

level 0:   n^2                                       cost: n^2
level 1:   (n/2)^2   (n/2)^2                         cost: 2 · (n/2)^2 = n^2/2
level 2:   (n/4)^2   (n/4)^2   (n/4)^2   (n/4)^2     cost: 4 · (n/4)^2 = n^2/4
level i:   (n/2^i)^2   …   (n/2^i)^2                 cost: 2^i · (n/2^i)^2 = n^2/2^i
leaves:    Θ(1)   Θ(1)   …   Θ(1)                    cost: n · Θ(1) = Θ(n)

the level costs form a decreasing geometric series ➨ T(n) = Θ(n^2)

Page 33

Recursion trees
T(n) = 4T(n/2) + n

level 0:   n
level 1:   n/2   n/2   n/2   n/2
level 2:   n/4   n/4   n/4   n/4   …   (16 nodes)
leaves:    Θ(1)   Θ(1)   …   Θ(1)

Page 34

Recursion trees
T(n) = 4T(n/2) + n

level 0:   n                                        cost: n
level 1:   n/2   n/2   n/2   n/2                    cost: 4 · (n/2) = 2n
level 2:   n/4   n/4   …   n/4   (16 nodes)         cost: 16 · (n/4) = 4n
leaves:    Θ(1)   Θ(1)   …   Θ(1)  (n^2 leaves)     cost: n^2 · Θ(1) = Θ(n^2)

➨ T(n) = Θ(n^2)
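
A Python sketch evaluating all three recurrences exactly (assuming base case T(1) = 1 and n a power of 2) and dividing by the guessed bound; each ratio settles to a constant as n grows:

from functools import lru_cache
from math import log2

def solve(a, cost):
    """Exact T(n) = a*T(n/2) + cost(n), with assumed T(1) = 1, for n a power of 2."""
    @lru_cache(maxsize=None)
    def T(n):
        return 1 if n == 1 else a * T(n // 2) + cost(n)
    return T

T1 = solve(2, lambda n: n)       # 2T(n/2) + n    -> Theta(n log n)
T2 = solve(2, lambda n: n * n)   # 2T(n/2) + n^2  -> Theta(n^2)
T3 = solve(4, lambda n: n)       # 4T(n/2) + n    -> Theta(n^2)

n = 2**16
print(T1(n) / (n * log2(n)), T2(n) / n**2, T3(n) / n**2)  # roughly 1.06, 2.0, 2.0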

Page 35

The substitution method

T(n) = 2                      if n = 1
T(n) = 2T(floor(n/2)) + n     if n > 1

Claim: T(n) = O(n log n)

Proof: by induction on n.
To show: there are constants c and n0 such that T(n) ≤ c n log n for all n ≥ n0.
T(1) = 2 ➨ choose c = 2 (for now) and n0 = 2.

Why n0 = 2? log 1 = 0, so we cannot prove the bound for n = 1.
How many base cases? floor(3/2) = 1 and floor(4/2) = 2 ➨ n = 3 recurses to the excluded case n = 1, so 3 must also be a base case.

Base cases:
n = 2: T(2) = 2T(1) + 2 = 2·2 + 2 = 6 = c·2 log 2 for c = 3
n = 3: T(3) = 2T(1) + 3 = 2·2 + 3 = 7 ≤ c·3 log 3

Page 36

The substitution method

T(n) = 2                      if n = 1
T(n) = 2T(floor(n/2)) + n     if n > 1

Claim: T(n) = O(n log n)

Proof: by induction on n.
To show: there are constants c and n0 such that T(n) ≤ c n log n for all n ≥ n0. Choose c = 3 and n0 = 2.

Inductive step: n > 3
T(n) = 2T(floor(n/2)) + n
     ≤ 2 · c · floor(n/2) · log(floor(n/2)) + n     (induction hypothesis)
     ≤ c n ((log n) - 1) + n
     = c n log n - cn + n
     ≤ c n log n    (since c ≥ 1)  ■
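
A Python sketch checking the claim numerically for this recurrence, with c = 3 and n0 = 2 as in the proof:

from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 2 if n = 1, else 2*T(floor(n/2)) + n."""
    return 2 if n == 1 else 2 * T(n // 2) + n

# verify T(n) <= 3 n log n over a large range (equality holds at n = 2)
assert all(T(n) <= 3 * n * log2(n) for n in range(2, 100001))
print("T(n) <= 3 n log n holds for n = 2 .. 100000")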

Page 37

The substitution method

T(n) = Θ(1)                   if n = 1
T(n) = 2T(floor(n/2)) + n     if n > 1

Claim: T(n) = O(n)

Proof: by induction on n.
Base case: n = n0
T(2) = 2T(1) + 2 = 2c + 2 = O(2)
Inductive step: n > n0
T(n) = 2T(floor(n/2)) + n
     = 2·O(floor(n/2)) + n     (induction hypothesis)
     = O(n)    ■

But this "proof" is bogus: the constant hidden in the O(·) is not fixed, it grows with every application of the induction hypothesis (and indeed T(n) = Θ(n log n), not O(n)).
Never use O, Θ, or Ω in a proof by induction!

Page 38

Analysis of algorithms

one more example …

Page 39

Example

Example(A)  // A is an array of length n
1. n = A.length
2. if n = 1
3.   then return A[1]
4.   else begin
5.     copy A[1..n/2] to auxiliary array B[1..n/2]
6.     copy A[1..n/2] to auxiliary array C[1..n/2]
7.     b = Example(B); c = Example(C)
8.     for i = 1 to n
9.       do for j = 1 to i
10.        do A[i] = A[j]
11.    return 43
12.  end

Page 40

Example

Example(A)  // A is an array of length n
1. n = A.length
2. if n = 1
3.   then return A[1]
4.   else begin
5.     copy A[1..n/2] to auxiliary array B[1..n/2]
6.     copy A[1..n/2] to auxiliary array C[1..n/2]
7.     b = Example(B); c = Example(C)
8.     for i = 1 to n
9.       do for j = 1 to i
10.        do A[i] = A[j]
11.    return 43
12.  end

Let T(n) be the worst case running time of Example on an array of length n.
If n = 1, lines 1, 2, 3 are executed; else lines 1, 2, and 4 until 12 are executed.

Lines 1, 2, 3, 4, 11, and 12 take Θ(1) time.
Lines 5 and 6 take Θ(n) time.
Line 7 takes Θ(1) + 2T(n/2) time.
Lines 8 until 10 take Σ_{i=1}^{n} Σ_{j=1}^{i} Θ(1) = Σ_{i=1}^{n} Θ(i) = Θ(n^2) time.

➨ T(n) = Θ(1)                  if n = 1
   T(n) = 2T(n/2) + Θ(n^2)     if n > 1

➨ use the master theorem …
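
A Python sketch counting the dominant operations of Example (copies, recursive calls, and the double loop); the per-operation weights are an assumption, but the ratio against n^2 settling to a constant is consistent with the master theorem's answer Θ(n^2):

def example_ops(n):
    """Count dominant operations of Example on input size n:
    copying (lines 5-6), recursion (line 7), and the double loop (lines 8-10)."""
    if n == 1:
        return 1
    copies = 2 * ((n + 1) // 2)              # lines 5-6: two copies of ceil(n/2) elements
    loop = sum(i for i in range(1, n + 1))   # lines 8-10: sum_{i=1}^{n} i = n(n+1)/2
    return copies + 2 * example_ops((n + 1) // 2) + loop

for n in (2**8, 2**10, 2**12):
    print(n, example_ops(n) / n**2)  # ratio tends to a constant -> Theta(n^2)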

Page 41

Tips

Analysis of recursive algorithms: find the recurrence and solve it with the master theorem if possible.

Analysis of loops: summations.

Some standard recurrences and sums:
T(n) = 2T(n/2) + Θ(n)  ➨  T(n) = Θ(n log n)
Σ_{i=1}^{n} i = ½ n(n+1) = Θ(n^2)
Σ_{i=1}^{n} i^2 = Θ(n^3)

Page 42

Announcements

Group sign-up closes tonight (Wed 04-02-2015) at midnight.

Not signed up for a group ➨ steps will be taken on Thursday 05-02-2015.

Some students might be moved to equalize sizes.

Email your assignment as .pdf to your tutor, not to me.