

Page 1

Approximation Algorithms

Presentation for use with the textbook, Algorithm Design and Applications, by M. T. Goodrich and R. Tamassia, Wiley, 2015

Page 2

Approximation Ratios: Optimization Problems

- We have some problem instance x that has many feasible "solutions".
- We are trying to minimize (or maximize) some cost function c(S) for a "solution" S to x. For example:
  - Finding a minimum spanning tree of a graph
  - Finding a smallest vertex cover of a graph
  - Finding a smallest traveling salesperson tour in a graph

Page 3

Approximation Ratios

- An approximation algorithm produces a solution T.
- T is a k-approximation to the optimal solution OPT if c(T)/c(OPT) ≤ k (assuming a minimization problem; for a maximization problem the ratio is inverted).
- Put another way, c(T) ≤ k · c(OPT) for a minimization problem.
- For a maximization problem we would instead have c(T) ≥ (1/k) · c(OPT).
- In both cases we assume k ≥ 1.

Page 4

Special Case of the Traveling Salesperson Problem

- OPT-TSP: Given a complete, weighted graph, find a cycle of minimum cost that visits each vertex.
- OPT-TSP is NP-hard.
- Special case: the edge weights satisfy the triangle inequality (which is common in many applications): w(a,b) + w(b,c) ≥ w(a,c).

[Figure: a triangle on vertices a, b, and c with edge weights 5, 4, and 7, illustrating the triangle inequality.]

Page 5

Aside: Euler Tour of a Rooted Tree
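An Euler tour of a rooted tree is the closed walk produced by a depth-first traversal that crosses every tree edge exactly twice: once going down to a child and once coming back up to the parent. A minimal Python sketch of such a traversal (the function and argument names are our own, not the textbook's):

```python
def euler_tour(tree, root):
    """Return the Euler tour of a rooted tree as a list of vertices.

    `tree` maps each vertex to the list of its children; every edge is
    traversed twice, so the tour has 2 * (number of edges) + 1 entries.
    """
    tour = [root]
    for child in tree.get(root, []):
        tour.extend(euler_tour(tree, child))  # walk down into the subtree...
        tour.append(root)                     # ...and back up to the parent
    return tour

# Example: euler_tour({"a": ["b", "c"], "b": ["d"]}, "a")
# returns ['a', 'b', 'd', 'b', 'a', 'c', 'a']
```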

Page 6

A 2-Approximation for TSP Special Case

[Figures: the Euler tour P of the MST M, and the output tour T obtained by shortcutting repeated vertices.]

Algorithm TSPApprox(G):
    Input: a weighted complete graph G satisfying the triangle inequality
    Output: a TSP tour T for G

    M ← a minimum spanning tree for G
    P ← an Euler tour traversal of M, starting at some vertex s
    T ← an empty list
    for each vertex v in P (in traversal order):
        if this is v's first appearance in P then
            T.insertLast(v)
    T.insertLast(s)
    return T
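A minimal Python sketch of TSPApprox (the function and variable names are our own): it builds an MST with Prim's algorithm and takes a preorder traversal of the MST, which is exactly the Euler tour with repeated vertices shortcut, then closes the tour at the start vertex.

```python
import heapq
from collections import defaultdict

def tsp_approx(weights, vertices):
    """2-approximation for metric TSP (a sketch of the TSPApprox idea above).

    `weights[(u, v)]` gives the edge weight of the complete graph; weights are
    assumed symmetric and to satisfy the triangle inequality. Returns a tour
    as a list of vertices that starts and ends at the same vertex.
    """
    def w(u, v):
        return weights[(u, v)] if (u, v) in weights else weights[(v, u)]

    # Prim's algorithm: build a minimum spanning tree M of the complete graph.
    start = vertices[0]
    in_tree = {start}
    mst = defaultdict(list)                                # adjacency list of M
    frontier = [(w(start, v), start, v) for v in vertices if v != start]
    heapq.heapify(frontier)
    while len(in_tree) < len(vertices):
        cost, u, v = heapq.heappop(frontier)
        if v in in_tree:
            continue
        in_tree.add(v)
        mst[u].append(v)
        mst[v].append(u)
        for x in vertices:
            if x not in in_tree:
                heapq.heappush(frontier, (w(v, x), v, x))

    # Preorder DFS of M = the shortcut Euler tour: keep only first appearances.
    tour, seen, stack = [], set(), [start]
    while stack:
        v = stack.pop()
        if v in seen:
            continue
        seen.add(v)
        tour.append(v)
        stack.extend(reversed(mst[v]))
    tour.append(start)                                     # return to the start vertex
    return tour

# Example (distances along a line satisfy the triangle inequality):
# w = {("a","b"): 1, ("b","c"): 1, ("c","d"): 1, ("a","c"): 2, ("b","d"): 2, ("a","d"): 3}
# tsp_approx(w, ["a", "b", "c", "d"]) -> ['a', 'b', 'c', 'd', 'a']
```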

Page 7

A 2-Approximation for TSP Special Case - Proof

[Figure: the Euler tour P of MST M (twice the cost of M), the output tour T (at most the cost of P), and the optimal tour OPT (at least the cost of MST M).]

- The optimal tour is a spanning tour; hence |M| ≤ |OPT|.
- The Euler tour P visits each edge of M twice; hence |P| = 2|M|.
- Each time we shortcut a vertex in the Euler tour, the triangle inequality (w(a,b) + w(b,c) ≥ w(a,c)) ensures we do not increase the total length; hence |T| ≤ |P|.
- Therefore, |T| ≤ |P| = 2|M| ≤ 2|OPT|.

Page 8

Vertex Cover

- A vertex cover of a graph G = (V, E) is a subset W of V such that, for every edge (a,b) in E, a is in W or b is in W.
- OPT-VERTEX-COVER: Given a graph G, find a vertex cover of G of smallest size.
- OPT-VERTEX-COVER is NP-hard.

Page 9

A 2-Approximation for Vertex Cover

- The algorithm: while edges remain, pick any remaining edge e = (a,b), add both a and b to the cover C, and remove every edge incident on a or b.
- Every chosen edge e has both ends in C.
- But e must also be covered by an optimal cover; hence one end of e must be in OPT.
- The chosen edges share no endpoints, so there are at most twice as many vertices in C as in OPT.
- That is, C is a 2-approximation of OPT.
- Running time: O(n + m).
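A minimal Python sketch of this 2-approximation (the function name and input format are our own):

```python
def vertex_cover_approx(edges):
    """2-approximation for vertex cover: add BOTH ends of each uncovered edge.

    The chosen edges share no endpoints, so any optimal cover must contain at
    least one endpoint of each of them; hence |cover| <= 2 * |OPT|.
    """
    cover = set()
    for a, b in edges:                            # any edge order works
        if a not in cover and b not in cover:     # edge e = (a, b) still uncovered
            cover.add(a)
            cover.add(b)
    return cover

# Example: the path a-b-c-d.
# vertex_cover_approx([("a", "b"), ("b", "c"), ("c", "d")]) -> {'a', 'b', 'c', 'd'}
# (size 4, versus the optimal cover {'b', 'c'} of size 2)
```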

Page 10

An Aside: Harmonic Numbers
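For reference, the n-th harmonic number, which appears in the set-cover bound below, is

$$H_n \;=\; \sum_{i=1}^{n} \frac{1}{i} \;\le\; \ln n + 1, \qquad \text{so } H_n = O(\log n).$$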


Page 11

Set Cover (Greedy Algorithm)

- OPT-SET-COVER: Given a collection of m sets, find the smallest number of them whose union is the same as the union of the whole collection.
- OPT-SET-COVER is NP-hard.
- A greedy approach produces an O(log n)-approximation algorithm.
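As a concrete illustration of the greedy approach, a short Python sketch (the names and data format are our own): repeatedly pick the set that covers the largest number of still-uncovered elements.

```python
def greedy_set_cover(sets):
    """Greedy O(log n)-approximation for set cover.

    `sets` is a list of Python sets; returns the indices of the chosen sets,
    whose union equals the union of all the input sets.
    """
    uncovered = set().union(*sets)     # every element that must be covered
    chosen = []
    while uncovered:
        # Pick the set covering the most still-uncovered elements.
        best = max(range(len(sets)), key=lambda j: len(sets[j] & uncovered))
        chosen.append(best)
        uncovered -= sets[best]
    return chosen

# Example:
# greedy_set_cover([{1, 2, 3}, {3, 4}, {4, 5}, {1, 2, 3, 4}]) -> [3, 2]
```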

Page 12

Greedy Set Cover Analysis

Consider the moment in our algorithm when a set Sj is added to C, and let k be the number of previously uncovered elements in Sj. We pay a total charge of 1 to add this set to C (meaning we made our cover one subset larger), so we charge each previously uncovered element i of Sj a charge of c(i) = 1/k. Thus, the total size of our cover is equal to the total charges made.

To prove an approximation bound, we will consider the charges made to the elements in each subset Sj that belongs to an optimal cover C′. So, suppose that Sj belongs to C′. Let us write Sj = {x1, x2, ..., xnj} so that Sj's elements are listed in the order in which they are covered by our algorithm.


Page 13

Greedy Set Cover Analysis, cont.
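The key remaining step of the charging argument, sketched here in our own words rather than the verbatim slide: just before the algorithm covers element xi of Sj, the set Sj is still available and still contains at least nj − i + 1 uncovered elements, so the set chosen greedily at that moment covers at least that many, and the charge satisfies c(xi) ≤ 1/(nj − i + 1). Summing over the elements of Sj,

$$\sum_{i=1}^{n_j} c(x_i) \;\le\; \sum_{i=1}^{n_j} \frac{1}{n_j - i + 1} \;=\; H_{n_j} \;\le\; H_n ,$$

and since every element is covered by some set of the optimal cover C′, the total charge, and hence the size of our cover, is bounded by

$$|C| \;\le\; \sum_{S_j \in C'} H_{n_j} \;\le\; H_n\,|C'| \;=\; O(\log n)\,|C'| .$$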


Page 14

Polynomial-Time Approximation Schemes

- A problem L has a polynomial-time approximation scheme (PTAS) if it has a polynomial-time (1 + ε)-approximation algorithm for any fixed ε > 0 (ε may appear in the running time).
- 0/1 Knapsack has a PTAS, with a running time that is O(n³/ε).
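The slides do not spell out the Knapsack scheme; purely as an illustration, here is a sketch of the usual value-scaling idea behind such results. The function name, input format, and exact scaling are our own assumptions, and this version favors clarity over matching the O(n³/ε) bound exactly.

```python
def knapsack_ptas(items, capacity, eps):
    """Approximation scheme for 0/1 Knapsack via value scaling (a sketch).

    `items` is a list of (value, weight) pairs, each weight assumed to be at
    most `capacity`. Values are rounded down to multiples of
    mu = eps * vmax / n, and an exact DP indexed by scaled value keeps a
    least-weight subset for each reachable scaled value. The returned index
    set has value at least (1 - eps) times the optimum.
    """
    n = len(items)
    vmax = max((v for v, _ in items), default=0)
    if n == 0 or vmax <= 0 or eps <= 0:
        return []
    mu = eps * vmax / n                                  # value scaling factor
    scaled = [int(v // mu) for v, _ in items]

    # dp[p] = (minimum weight achieving scaled value exactly p, chosen indices)
    dp = {0: (0, [])}
    for i, (_, w) in enumerate(items):
        new_dp = dict(dp)
        for p, (wt, chosen) in dp.items():
            q, qw = p + scaled[i], wt + w
            if qw <= capacity and (q not in new_dp or qw < new_dp[q][0]):
                new_dp[q] = (qw, chosen + [i])
        dp = new_dp

    best = max(dp)                                       # largest reachable scaled value
    return dp[best][1]

# Example:
# knapsack_ptas([(60, 10), (100, 20), (120, 30)], capacity=50, eps=0.1) -> [1, 2]
```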

Page 15

Backtracking and Branch-and-Bound

Page 16

Back to NP-Completeness

- Many hard problems are NP-Complete.
- But we still need solutions to them, even if they take a long time to generate.
- Backtracking and branch-and-bound are two design techniques that have proven promising for dealing with NP-Complete problems.
- Often these find "good enough" solutions in a reasonable amount of time.

Page 17

Backtracking

- A backtracking algorithm searches through a large set of possibilities in a systematic way.
- It is typically optimized to avoid symmetries in problem instances (think, e.g., TSP).
- It traverses the search space in such a manner that an "easy" solution is found if one exists.
- The technique takes advantage of an inherent structure possessed by many NP-Complete problems.

Page 18

NP-Complete Problem Structure

- Recall: acceptance of a solution for instance x of an NP-Complete problem can be verified in polynomial time.
- Verification typically involves checking a set of choices (the output of the choice function) to see whether they satisfy a formula or demonstrate a successful configuration.
- E.g., the values assigned to Boolean variables, the vertices of a graph to include in a special set, the items to go in a knapsack.

Page 19

Backtracking

- Systematically searches for a solution to the problem.
- Traverses possible "search paths" to locate solutions or dead ends.
- The configuration at the end of such a path is a pair (x, y):
  - x is the remaining subproblem to be solved
  - y is the set of choices that have been made to get to this subproblem from the original problem instance

Page 20

Backtracking

- Start the search with the pair (x, ∅), where x is the original problem instance.
- Any time the backtracking algorithm discovers a configuration (x, y) that cannot lead to a valid solution, it cuts off all future searches from this configuration and backtracks to another configuration.

Page 21

Backtracking Pseudocode
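A minimal Python sketch of a backtracking template in this spirit (all names are our own, not the textbook's): it maintains a frontier F of configurations, a routine that expands a configuration into subproblem configurations, and a consistency check that reports a solution, a dead end, or "continue".

```python
def backtrack(initial_problem, expand, check):
    """Generic backtracking template (a sketch, not the textbook's verbatim code).

    `expand(x, y)` -> iterable of subproblem configurations derived from (x, y)
    `check(x, y)`  -> "solution found", "dead end", or "continue"
    A stack is used as the frontier F, i.e. a depth-first search of the space.
    """
    frontier = [(initial_problem, ())]          # start from (x, empty choice set)
    while frontier:
        x, y = frontier.pop()                   # select a candidate configuration
        status = check(x, y)
        if status == "solution found":
            return (x, y)
        if status == "dead end":
            continue                            # cut off this branch and backtrack
        frontier.extend(expand(x, y))           # push the subproblem configurations
    return None                                 # the search space is exhausted
```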

Page 22

Backtracking: Required Details

- Define a way of selecting the most promising candidate configuration from the frontier set F.
  - If F is a stack, we get a depth-first search of the solution space; F a queue gives a breadth-first search, etc.
- Specify a way of expanding a configuration (x, y) into subproblem configurations.
  - This process should, in principle, be able to generate all feasible configurations, starting from (x, ∅).
- Describe how to perform a simple consistency check for a configuration (x, y) that returns "solution found", "dead end", or "continue".

Page 23

Example: CNF-SAT

- Problem: Given a Boolean formula S in CNF, we want to know whether S is satisfiable.
- High-level algorithm:
  - Systematically make tentative assignments to the variables in S.
  - Check whether the assignments make S evaluate immediately to 0 or 1, or yield a new formula S′ for which we can continue making tentative value assignments.
- A configuration in our algorithm is a pair (S′, y) where:
  - S′ is a Boolean formula in CNF
  - y is an assignment of values to the variables NOT in S′ such that making these assignments in S yields the formula S′

Page 24

Example: CNF-SAT (details)

- Given the frontier F of configurations, the most promising subproblem S′ is the one with the smallest clause.
  - This formula is the most constrained of all the formulas in F, so we expect it to hit a dead end most quickly (if it is going to hit a dead end).
- To generate new subproblems from S′, locate the smallest clause in S′, pick a variable xi that appears in that clause, and create two new subproblems associated with assigning xi = 0 and xi = 1, respectively.

Page 25

Example: CNF-SAT (details)

Processing S′ for the consistency check, after assigning a value to a variable xi in S′:

- First, reduce any clauses containing xi based on the value assigned to xi.
- If this reduction results in a new single-literal clause (xj or ~xj), assign xj the value that satisfies that clause. Repeat this process until there are no single-literal clauses.
- If we discover a contradiction (clauses xi and ~xi, or an empty clause), return "dead end".
- If we reduce S′ all the way to the constant 1, return "solution found", along with the assignments made to reach it.
- Otherwise, derive a new subformula S′′ such that each clause has at least two literals, along with the value assignments that lead from S to S′′ (this is the reduce operation).
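To make the consistency check and the reduce operation concrete, a small Python sketch (the data representation and all names are our own): it applies the current assignment, propagates single-literal clauses, and reports "dead end", "solution found", or "continue" with the reduced formula.

```python
def reduce_cnf(clauses, assignment):
    """Consistency check with unit propagation for a CNF formula (a sketch).

    Each clause is a set of literals; a literal is a pair (variable, value),
    e.g. ("x1", True) for x1 or ("x1", False) for ~x1. `assignment` maps
    variables to booleans. Returns ("dead end", None), ("solution found",
    assignment), or ("continue", (reduced_clauses, assignment)).
    """
    assignment = dict(assignment)
    while True:
        reduced = []
        for clause in clauses:
            if any(assignment.get(var) == val for var, val in clause):
                continue                              # clause already satisfied
            remaining = {(var, val) for var, val in clause if var not in assignment}
            if not remaining:
                return "dead end", None               # empty clause: contradiction
            reduced.append(remaining)
        if not reduced:
            return "solution found", assignment       # formula reduced to constant 1
        units = [next(iter(c)) for c in reduced if len(c) == 1]
        if not units:
            return "continue", (reduced, assignment)  # every clause has >= 2 literals
        for var, val in units:
            if assignment.get(var, val) != val:
                return "dead end", None               # conflicting single-literal clauses
            assignment[var] = val                     # satisfy the single-literal clause
        clauses = reduced
```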

Page 26

Example: CNF-SAT (details)

- Placing this in the backtracking template results in an algorithm that has an exponential worst-case running time, but it usually does OK in practice.
- In fact, if each clause has no more than two literals, this algorithm runs in polynomial time.

Page 27

Branch-and-Bound

- Backtracking is not designed for optimization problems, where in addition to having some feasibility condition that must be satisfied, we also have a cost f(x) to be optimized.
  - We'll assume it is to be minimized.
- We can extend backtracking to work with optimization problems (the result is branch-and-bound):
  - When a solution is found, we continue processing to find the best solution.
  - We add a scoring mechanism to always choose the most promising configuration to explore in each iteration (best-first search).

Page 28

Branch-and-Bound

- In addition to the details required for the backtracking design pattern, we add one more to handle optimization:
- For any configuration (x, y) we assume we have a lower-bound function, lb(x, y), that returns a lower bound on the cost of any solution derived from this configuration.
  - The only requirement for lb(x, y) is that it must be less than or equal to the cost of any derived solution.
  - Of course, a more accurate lower-bound function results in a more efficient algorithm.

Page 29

Branch-and-Bound Pseudocode
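In the same spirit, a minimal Python sketch of a branch-and-bound template (all names are our own): a best-first frontier ordered by the lower-bound function lb, with configurations pruned against the best solution found so far; minimization is assumed.

```python
import heapq

def branch_and_bound(initial_problem, expand, check, lb, cost):
    """Generic branch-and-bound template (a sketch, not the textbook's verbatim code).

    `expand(x, y)` yields subproblem configurations, `check(x, y)` returns
    "solution found", "dead end", or "continue", `lb(x, y)` lower-bounds the
    cost of any solution derived from (x, y), and `cost(x, y)` is the cost of
    a complete solution.
    """
    best_cost, best_solution = float("inf"), None
    counter = 0                                       # tie-breaker for the heap
    frontier = [(lb(initial_problem, ()), counter, initial_problem, ())]
    while frontier:
        bound, _, x, y = heapq.heappop(frontier)
        if bound >= best_cost:
            continue                                  # cannot beat the best found so far
        status = check(x, y)
        if status == "dead end":
            continue
        if status == "solution found":
            if cost(x, y) < best_cost:                # keep going: we want the best one
                best_cost, best_solution = cost(x, y), (x, y)
            continue
        for cx, cy in expand(x, y):
            if lb(cx, cy) < best_cost:                # prune configurations that can't win
                counter += 1
                heapq.heappush(frontier, (lb(cx, cy), counter, cx, cy))
    return best_solution, best_cost
```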

Page 30

Branch-and-Bound Alg. for TSP

- Problem: the optimization version of TSP.
  - Given a weighted graph G, we want a least-cost tour.
  - For an edge e, let c(e) be the weight (cost) of edge e.
- We compute, for each edge e = (v, w) in G, the minimum-cost path that begins at v and ends at w while visiting all the other vertices of G along the way.
- We generate the path from v to w in G − {e} by augmenting the current path by one vertex in each loop of the branch-and-bound algorithm.

Page 31

Branch-and-Bound Alg. for TSP

- After having built a partial path P, starting, say, at v, we only consider augmenting P with vertices not in P.
- We can classify a partial path P as a "dead end" if the vertices not in P are disconnected in G − {e}.
- We define the lower-bound function to be the total cost of the edges in P plus c(e).
  - This is certainly a lower bound for any tour built from e and P.

Page 32

Branch-and-Bound Alg. for TSP

- After having run the algorithm to completion for one edge of G, we can use the best path found so far over all tested edges, rather than restarting the current best solution b at +∞.
- The running time can still be exponential, but we eliminate a considerable amount of redundancy.
- There are other heuristics for approximating TSP; we won't discuss them here.