Algorithm review

Page 1: Algorithm review

Review

Page 2: Algorithm review

Overview
- Fundamentals of Analysis of Algorithm Efficiency
- Algorithmic techniques
  - Divide-and-conquer, decrease-and-conquer
  - Dynamic programming
  - Greedy technique
- Data structures
  - Heaps
  - Graphs: adjacency matrices & adjacency linked lists
  - Trees

Page 3: Algorithm review

Fundamentals of Analysis of Algorithm Efficiency
- Basic operations
- Worst-, best-, and average-case time efficiencies
- Orders of growth
- Efficiency of non-recursive algorithms
- Efficiency of recursive algorithms

Page 4: Algorithm review

Worst-Case, Best-Case, and Average-Case Efficiency

- Worst-case efficiency: the number of times the basic operation is executed for the worst-case input of size n, i.e. the input for which the algorithm runs the longest among all possible inputs of size n.

- Best-case efficiency: the number of times the basic operation is executed for the best-case input of size n, i.e. the input for which the algorithm runs the fastest among all possible inputs of size n.

- Average-case efficiency: the number of times the basic operation is executed for a typical/random input. This is NOT the average of the worst and best cases. How do we find the average-case efficiency?

Page 5: Algorithm review

Orders of Growth

Three notations are used to compare orders of growth of algorithms:
- O(g(n)): the class of functions f(n) that grow no faster than g(n)
- Θ(g(n)): the class of functions f(n) that grow at the same rate as g(n)
- Ω(g(n)): the class of functions f(n) that grow at least as fast as g(n)

Page 6: Algorithm review

Theorem: If t1(n) ∈ O(g1(n)) and t2(n) ∈ O(g2(n)), then t1(n) + t2(n) ∈ O(max{g1(n), g2(n)}). The analogous assertions are true for the Ω- and Θ-notations. Thus the algorithm's overall efficiency is determined by the part with the larger order of growth.

Example: 5n^2 + 3n + 4 ∈ Θ(n^2), since the 5n^2 term dominates 3n + 4.

Page 7: Algorithm review


Using Limits for Comparing Orders of Growth

lim(n→∞) T(n)/g(n) =
- 0: the order of growth of T(n) < the order of growth of g(n)
- c > 0: the order of growth of T(n) = the order of growth of g(n)
- ∞: the order of growth of T(n) > the order of growth of g(n)

Examples:
- 10n vs. 2n^2
- n(n+1)/2 vs. n^2
- log_b n vs. log_c n
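The three examples can be checked numerically by evaluating T(n)/g(n) at a large n (a sanity check of my own, not a proof):

```python
import math

def ratio(T, g, n):
    """T(n)/g(n) at a large n, approximating the limit as n grows."""
    return T(n) / g(n)

n = 10**6
# 10n vs. 2n^2: ratio -> 0, so 10n has a smaller order of growth
print(ratio(lambda m: 10 * m, lambda m: 2 * m**2, n))
# n(n+1)/2 vs. n^2: ratio -> 1/2 > 0, so the same order of growth
print(ratio(lambda m: m * (m + 1) / 2, lambda m: m**2, n))
# log_2 n vs. log_10 n: ratio -> log_2 10, a constant, so the same order
print(ratio(lambda m: math.log2(m), lambda m: math.log10(m), n))
```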

Page 8: Algorithm review

Summary of How to Establish Orders of Growth of an Algorithm
- Method 1: using limits
- Method 2: using the theorem
- Method 3: using the definitions of O-, Ω-, and Θ-notation

Page 9: Algorithm review

Basic Efficiency Classes

class         example
constant      1         (high time efficiency: fast)
logarithmic   log n
linear        n
n log n       n log n
quadratic     n^2
cubic         n^3
exponential   2^n
factorial     n!        (low time efficiency: slow)

Page 10: Algorithm review

Time Efficiency Analysis of Nonrecursive Algorithms

Steps in the mathematical analysis of nonrecursive algorithms:
1. Decide on a parameter n indicating input size.
2. Identify the algorithm's basic operation.
3. Determine the worst, average, and best cases for input of size n.
4. Set up a summation for C(n) reflecting the number of times the algorithm's basic operation is executed.
5. Simplify the summation using standard formulas (see Appendix A).
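As a worked instance of these steps (my own example, not from the slides): for finding the maximum element, the basic operation is the comparison inside the loop, executed C(n) = Σ 1 for i = 1..n-1 = n-1 ∈ Θ(n) times for every input of size n.

```python
def find_max(a):
    """Return the maximum of a nonempty list and the number of comparisons made.
    The basic operation is the comparison x > best, executed exactly n-1 times."""
    best = a[0]
    comparisons = 0
    for x in a[1:]:
        comparisons += 1
        if x > best:
            best = x
    return best, comparisons

print(find_max([3, 1, 4, 1, 5, 9, 2, 6]))   # (9, 7): n = 8, so C(n) = n-1 = 7
```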

Page 11: Algorithm review

Time Efficiency Analysis of Recursive Algorithms

1. Decide on a parameter n indicating input size.
2. Identify the algorithm's basic operation.
3. Determine the worst, average, and best cases for input of size n.
4. Set up a recurrence relation and initial condition(s) for C(n), the number of times the basic operation is executed for an input of size n (alternatively, count recursive calls).
5. Solve the recurrence, or estimate the order of magnitude of the solution (see Appendix B).
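A standard worked instance (my example): for recursive factorial the basic operation is multiplication, giving the recurrence M(n) = M(n-1) + 1 with M(0) = 0, whose solution is M(n) = n ∈ Θ(n).

```python
def factorial(n):
    """Return (n!, number of multiplications performed).
    Recurrence: M(n) = M(n-1) + 1, M(0) = 0, so M(n) = n."""
    if n == 0:
        return 1, 0
    value, mults = factorial(n - 1)
    return n * value, mults + 1

print(factorial(5))   # (120, 5)
```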

Page 12: Algorithm review

Master Theorem

For T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^k):
1. if a < b^k, then T(n) ∈ Θ(n^k)
2. if a = b^k, then T(n) ∈ Θ(n^k log n)
3. if a > b^k, then T(n) ∈ Θ(n^(log_b a))

Note: the same results hold with O instead of Θ.
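A small helper (my own, not from the slides) that applies the three cases. For example, mergesort's recurrence T(n) = 2T(n/2) + Θ(n) has a = 2, b = 2, k = 1, so a = b^k and T(n) ∈ Θ(n log n).

```python
import math

def master(a, b, k):
    """Classify T(n) = a*T(n/b) + Theta(n^k) by the Master Theorem."""
    if a < b**k:
        return f"Theta(n^{k})"
    if a == b**k:
        return f"Theta(n^{k} log n)"
    return f"Theta(n^{math.log(a, b):g})"

print(master(2, 2, 1))   # mergesort: Theta(n^1 log n) = Theta(n log n)
print(master(1, 2, 0))   # binary search: Theta(n^0 log n) = Theta(log n)
print(master(4, 2, 1))   # Theta(n^2)
```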

Page 13: Algorithm review

Divide-and-Conquer

Page 14: Algorithm review

Three Steps of the Divide-and-Conquer Approach

The best-known algorithm design strategy:
1. Divide the problem into two or more smaller subproblems.
2. Conquer the subproblems by solving them recursively (or iteratively).
3. Combine the solutions to the subproblems into a solution to the original problem.

Page 15: Algorithm review

Divide-and-Conquer Technique

[Diagram] A problem of size n is divided into subproblem 1 of size n/2 and subproblem 2 of size n/2; the solutions to the two subproblems are combined into a solution to the original problem.

Page 16: Algorithm review

Divide-and-Conquer Examples

Sorting algorithms
- Mergesort: in-place? Worst-case efficiency?
- Quicksort: in-place? Worst-, best-, and average-case efficiency?

Binary tree algorithms
- Definitions: what is a binary tree? A node's/tree's height? A node's level?
- Pre-order, post-order, and in-order traversals
- Find the height
- Find the total number of leaves
- ...
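To review the mergesort questions above, a minimal sketch (my own code): this version is not in-place, since merging uses Θ(n) extra space, and its worst-case (indeed every-case) efficiency is Θ(n log n).

```python
def mergesort(a):
    """Divide-and-conquer sort: split in half, sort each half, merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = mergesort(a[:mid]), mergesort(a[mid:])
    # Combine: merge the two sorted halves.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

print(mergesort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]
```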

Page 17: Algorithm review

Decrease-and-Conquer

Page 18: Algorithm review

Decrease-and-Conquer

Exploit the relationship between a solution to a given instance of a problem and a solution to a smaller instance of the same problem. The problem can be solved top-down (recursively) or bottom-up (iteratively).

Example: computing a^n with a top-down (recursive) solution or a bottom-up (iterative) solution.

Page 19: Algorithm review

Examples of Decrease-and-Conquer

Decrease by one: the size of the problem is reduced by the same constant on each iteration/recursion of the algorithm.
- Insertion sort: in-place? Worst-, best-, and average-case efficiency?
- Graph search algorithms: DFS, BFS

Decrease by a constant factor: the size of the problem is reduced by the same constant factor on each iteration/recursion of the algorithm.
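To review the insertion-sort questions above, a sketch (my own code): insertion sort is in-place, with worst- and average-case Θ(n^2) and best-case Θ(n) on already-sorted input.

```python
def insertion_sort(a):
    """Decrease-by-one sort, in-place: insert a[i] into the sorted prefix a[0..i-1]."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:   # shift larger elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]
```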

Page 20: Algorithm review

A Typical Decrease-by-One Technique

[Diagram] A problem of size n is reduced to a subproblem of size n-1; a solution to the subproblem is extended to a solution to the original problem.

Page 21: Algorithm review

A Typical Decrease-by-a-Constant-Factor (Half) Technique

[Diagram] A problem of size n is reduced to a subproblem of size n/2; a solution to the subproblem yields a solution to the original problem.

Page 22: Algorithm review

What's the Difference?

Consider the problem of exponentiation: compute a^n.

- Divide and conquer: a^n = a^(n/2) * a^(n/2)
- Decrease by one: a^n = a^(n-1) * a (top down); a^n = a*a*a*...*a (bottom up)
- Decrease by a constant factor: a^n = (a^(n/2))^2
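A sketch of the decrease-by-a-constant-factor version (my own code): each call halves the exponent, so only Θ(log n) multiplications are needed, versus Θ(n) for decrease by one.

```python
def power(a, n):
    """Compute a^n by halving the exponent: a^n = (a^(n//2))^2, times an extra a when n is odd."""
    if n == 0:
        return 1
    half = power(a, n // 2)
    return half * half * (a if n % 2 else 1)

print(power(2, 10))   # 1024
```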

Page 23: Algorithm review

Depth-First Search

The idea:
- Traverse "deeper" whenever possible. On reaching a dead end, the algorithm backs up one edge to the parent and tries to continue visiting unvisited vertices from there.
- Break ties by the alphabetical order of the vertices.
- It is convenient to use a stack to track the operation of depth-first search.

Know the DFS forest/tree and the two orderings of DFS. DFS can be implemented with graphs represented as:
- adjacency matrices: Θ(V^2)
- adjacency linked lists: Θ(V+E)

Applications: topological sorting; checking connectivity; finding connected components.
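A minimal recursive sketch over an adjacency-list dictionary (the graph below is my own example), breaking ties alphabetically as described; with adjacency lists this runs in Θ(V+E).

```python
def dfs(graph, start):
    """Return vertices in DFS visit (preorder) order; neighbors tried alphabetically."""
    order, visited = [], set()

    def visit(v):
        visited.add(v)
        order.append(v)
        for w in sorted(graph[v]):     # alphabetical tie-breaking
            if w not in visited:
                visit(w)               # go deeper; backtracks automatically at dead ends

    visit(start)
    return order

g = {'a': ['b', 'c'], 'b': ['a', 'd'], 'c': ['a'], 'd': ['b']}
print(dfs(g, 'a'))   # ['a', 'b', 'd', 'c']
```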

Page 24: Algorithm review

Breadth-First Search

The idea:
- Traverse "wider" whenever possible: discover all vertices at distance k from s (on level k) before discovering any vertices at distance k+1 (on level k+1).
- Similar to a level-by-level tree traversal.
- It is convenient to use a queue to track the operation of breadth-first search.

Know the BFS forest/tree and the one ordering of BFS. BFS has the same efficiency as DFS and can be implemented with graphs represented as:
- adjacency matrices: Θ(V^2)
- adjacency linked lists: Θ(V+E)

Applications: checking connectivity; finding connected components.
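The queue-based counterpart, on the same assumed adjacency-list representation (my own example graph); also Θ(V+E) with adjacency lists.

```python
from collections import deque

def bfs(graph, start):
    """Return vertices in BFS visit order, level by level from start."""
    order, visited = [], {start}
    queue = deque([start])
    while queue:
        v = queue.popleft()            # the queue yields vertices in discovery order
        order.append(v)
        for w in sorted(graph[v]):     # alphabetical tie-breaking
            if w not in visited:
                visited.add(w)
                queue.append(w)
    return order

g = {'a': ['b', 'c'], 'b': ['a', 'd'], 'c': ['a'], 'd': ['b']}
print(bfs(g, 'a'))   # ['a', 'b', 'c', 'd']
```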

Page 25: Algorithm review

Heapsort

Page 26: Algorithm review

Heaps
- Definition
- Representation
- Properties
- Heap algorithms
  - Heap construction: top-down; bottom-up
  - Root deletion
- Heapsort: in-place? Time efficiency?
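To review: heapsort is in-place and runs in Θ(n log n). A sketch (my own code, 0-based array representation of a max-heap) using bottom-up heap construction followed by repeated root deletions:

```python
def heapify(a, i, n):
    """Sift a[i] down within a max-heap of size n."""
    while 2 * i + 1 < n:
        j = 2 * i + 1                      # left child
        if j + 1 < n and a[j + 1] > a[j]:
            j += 1                         # right child is larger
        if a[i] >= a[j]:
            break                          # heap property restored
        a[i], a[j] = a[j], a[i]
        i = j

def heapsort(a):
    """In-place heapsort: bottom-up construction, then n-1 root deletions."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):    # bottom-up heap construction, Theta(n)
        heapify(a, i, n)
    for end in range(n - 1, 0, -1):        # delete the root (max) into final position
        a[0], a[end] = a[end], a[0]
        heapify(a, 0, end)
    return a

print(heapsort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]
```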

Page 27: Algorithm review

Examples of Dynamic Programming Algorithms

Main idea:
- solve several smaller (overlapping) subproblems
- record the solutions in a table so that each subproblem is solved only once
- the final state of the table will be (or contain) the solution

vs. divide and conquer (where the subproblems do not overlap).

Examples:
- computing binomial coefficients
- Warshall's algorithm for transitive closure
- Floyd's algorithm for all-pairs shortest paths
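For the first example, a table-based sketch (my own code) using Pascal's identity C(n, k) = C(n-1, k-1) + C(n-1, k); each table entry is computed exactly once.

```python
def binomial(n, k):
    """Compute C(n, k) by filling a DP table row by row."""
    C = [[0] * (k + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        for j in range(min(i, k) + 1):
            # Base cases C(i, 0) = C(i, i) = 1; otherwise Pascal's identity.
            C[i][j] = 1 if j == 0 or j == i else C[i - 1][j - 1] + C[i - 1][j]
    return C[n][k]

print(binomial(5, 2))   # 10
```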

Page 28: Algorithm review

Greedy Algorithms

Page 29: Algorithm review

Greedy Algorithms

Construct a solution through a sequence of steps, each expanding the partially constructed solution obtained so far, until a complete solution to the problem is reached. The choice made at each step must be:
- feasible: it satisfies the problem's constraints
- locally optimal: it is the best local choice among all feasible choices
- irrevocable: once made, the choice cannot be changed in subsequent steps

Greedy algorithms do not always yield optimal solutions.

Page 30: Algorithm review

Examples of the Greedy Strategy

Minimum spanning tree (MST)
- definition of a spanning tree and an MST
- Prim's algorithm
- Kruskal's algorithm

Single-source shortest paths
- Dijkstra's algorithm
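A heap-based sketch of Dijkstra's algorithm (my own code and example graph; it assumes non-negative edge weights), illustrating the greedy choice: at each step, permanently settle the unsettled vertex with the smallest tentative distance.

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest path distances.
    graph: dict mapping vertex -> list of (neighbor, weight) pairs."""
    dist = {source: 0}
    pq = [(0, source)]                    # priority queue of (distance, vertex)
    while pq:
        d, v = heapq.heappop(pq)          # greedy: closest unsettled vertex
        if d > dist.get(v, float('inf')):
            continue                      # stale queue entry, skip
        for w, weight in graph[v]:
            nd = d + weight
            if nd < dist.get(w, float('inf')):
                dist[w] = nd              # relax the edge (v, w)
                heapq.heappush(pq, (nd, w))
    return dist

g = {'a': [('b', 1), ('c', 4)], 'b': [('c', 2)], 'c': []}
print(dijkstra(g, 'a'))   # {'a': 0, 'b': 1, 'c': 3}
```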

Page 31: Algorithm review

P, NP, and NP-Complete Problems
- Tractable and intractable problems
- The class P
- The class NP
- The relationship between P and NP
- NP-complete problems

Page 32: Algorithm review

Backtracking and Branch-and-Bound

Both guarantee solving the problem exactly, but neither guarantees finding a solution in polynomial time.

Know the similarities and differences between backtracking and branch-and-bound.