
Local Search Introduction to Artificial Intelligence COS302 Michael L. Littman Fall 2001


Page 1: Local Search Introduction to Artificial Intelligence COS302 Michael L. Littman Fall 2001

Local Search

Introduction to
Artificial Intelligence

COS302

Michael L. Littman

Fall 2001


Administration

Questions?

Reading: Boyan thesis for more information on local search.

http://www-2.cs.cmu.edu/afs/cs.cmu.edu/user/jab/mosaic/thesis/
chap3.ps.gz, appen2.ps.gz


Search

Recall the search algorithms for constraint problems (CSP, SAT).

• Systematically check all states.
• Look for a satisfying state.
• In finite spaces, can determine if no state satisfies the constraints (complete).


Local Search

Alternative approach:
• Sometimes more interested in a positive answer than a negative one.
• Once we find it, we're done.
• Guided random search approach.
• Can't determine unsatisfiability.


Optimization Problems

Framework: States, neighbors (symmetric). Add notion of a score for states: V(s).

Goal: Find the state with the highest (sometimes lowest) score.

Generalization of the constraint framework because…


SAT as Optimization

States are complete assignments.

Score is the number of satisfied clauses.

Know the maximum (if the formula is truly satisfiable).


Local Search Idea

In contrast to CSP/SAT search, work with complete states.

Move from state to state, remembering very little (in general).


Hill Climbing

Most basic approach:
• Pick a starting state, s.
• If V(s) ≥ max_{s' in N(s)} V(s'), stop.
• Else, let s = any s' s.t. V(s') = max_{s'' in N(s)} V(s'').
• Repeat.

Like DFS. Stops at a local maximum.
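The loop above can be sketched directly in Python. The score function and integer neighborhood in the demo are hypothetical, chosen only to exercise the loop:

```python
def hill_climb(s, neighbors, V):
    """Greedy hill climbing: move to a best-scoring neighbor until no
    neighbor strictly improves V, i.e. until a local maximum."""
    while True:
        best = max(neighbors(s), key=V)
        if V(s) >= V(best):        # no strictly better neighbor: stop
            return s
        s = best

# Hypothetical toy problem: maximize V(x) = -(x - 7)^2 over the
# integers, where each x has neighbors x - 1 and x + 1.
def V(x):
    return -(x - 7) ** 2

def neighbors(x):
    return [x - 1, x + 1]

print(hill_climb(0, neighbors, V))   # → 7
```

Because this score function has a single peak, the climb reaches the global maximum from any start; on a bumpy landscape it would stop at whichever local maximum it reaches first.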


N-queens Optimization

Minimize violated constraints.

Neighbors: Slide in column.

[Board diagram: an n-queens position with per-square conflict counts]


N-queens Search Space

How far, at most, from the minimum configuration?

Never more than n slides from any configuration.

How many steps, at most, before a local minimum is reached?

Maximum score is n(n-1).


Reaching a Local Min

[Board diagrams: a sequence of column slides reducing conflict counts until a local minimum is reached]


Graph Partitioning

Separate nodes into 2 groups to minimize the edges between them.

[Diagram: example graph with nodes A, B, C, D]


Objective Function

Penalty for imbalance:

V(x) = cutsize(x) + (L − R)²

State space? Neighbors?
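The objective is short enough to code directly. The concrete edge set below (A-B, A-D, B-D, C-D) is an assumption, chosen to be consistent with the scores on the State Space slide:

```python
from itertools import product

# Hypothetical concrete instance of the four-node example graph.
nodes = "ABCD"
edges = [("A", "B"), ("A", "D"), ("B", "D"), ("C", "D")]

def V(x):
    """x maps each node to group 0 or 1.  Score = number of cut edges
    plus a squared penalty for unequal group sizes, (L - R)^2."""
    cutsize = sum(1 for u, v in edges if x[u] != x[v])
    L = sum(1 for n in nodes if x[n] == 0)
    R = len(nodes) - L
    return cutsize + (L - R) ** 2

# Enumerate all 2^4 assignments, as on the State Space slide.
for bits in product([0, 1], repeat=4):
    print("".join(map(str, bits)), V(dict(zip(nodes, bits))))
```

For example, the balanced assignment 0011 scores 2 + 0, while the degenerate 0000 scores 0 + 16.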


State Space

0000: 16+0    1000: 4+2
0001: 4+3     1001: 0+3
0010: 4+1     1010: 0+3
0011: 0+2     1011: 4+2
0100: 4+2     1100: 0+2
0101: 0+3     1101: 4+1
0110: 0+3     1110: 4+3
0111: 4+2     1111: 16+0


Bumpy Landscape

[Plot: score as a function of state (1–16), a bumpy landscape with several local optima]


Simulated Annealing

Jitter out of local minima.

Let T be non-negative.

Loop:
• Pick a random neighbor s' in N(s).
• Let D = V(s) − V(s') (improvement).
• If D > 0 (better), s = s'.
• Else with prob. e^(D/T), s = s'.
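The loop above, sketched for minimization. The geometric cooling schedule and the constants in `anneal` are assumed choices, not part of the slide:

```python
import math
import random

def anneal_step(s, T, neighbors, V):
    """One step of the loop above (minimization): always accept an
    improving neighbor; accept a worsening one with prob. e^(D/T)."""
    s2 = random.choice(neighbors(s))
    D = V(s) - V(s2)                 # > 0 means s2 improves (lower V)
    if D > 0 or random.random() < math.exp(D / T):
        return s2
    return s

def anneal(s, neighbors, V, T0=10.0, alpha=0.999, steps=5000):
    """Run anneal_step with geometric cooling, tracking the best state."""
    T, best = T0, s
    for _ in range(steps):
        s = anneal_step(s, T, neighbors, V)
        if V(s) < V(best):
            best = s
        T *= alpha
    return best
```

Since D ≤ 0 in the else branch, e^(D/T) is at most 1, and it shrinks toward 0 as T falls, so the process gradually turns into hill climbing.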


Temperature

As T goes to zero, hill climbing.

As T goes to infinity, random walk.

In general, start with large T and decrease it slowly.

If decreased infinitely slowly, will find the optimal solution.

Analogy is with metal cooling.


Aside: Analogies

Natural phenomena inspire algorithms:
• Metal cooling
• Evolution
• Thermodynamics
• Societal markets
• Ant colonies
• Immune systems
• Neural networks, …


Annealing Schedule

After each step, update T.

• Logarithmic: T_i = c / log(i+2)
  provable properties, slow
• Geometric: T_i = T_0 · α^int(i/L)
  lots of parameters to tune
• Adaptive
  change T to hit a target accept rate
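The three schedule families can be sketched as follows. All constants here (c, T0, alpha, L, and the 0.44 target) are illustrative assumptions, not prescribed values:

```python
import math

def logarithmic(i, c=1.0):
    """T_i = c / log(i + 2): provable properties, but very slow."""
    return c / math.log(i + 2)

def geometric(i, T0=10.0, alpha=0.95, L=100):
    """T_i = T0 * alpha^int(i/L): drop T by a factor alpha every L steps."""
    return T0 * alpha ** (i // L)

def adaptive(T, accept_rate, target=0.44, gain=1.05):
    """Nudge T up when too few moves are accepted, down when too many."""
    return T * gain if accept_rate < target else T / gain
```

The adaptive variant replaces a fixed schedule with feedback control on the observed acceptance rate, which is the idea the Lam schedule on the next slide takes further.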


Lam Schedule

Ad hoc, but impressive (see Boyan, pp. 190–191). Parameter is run time.

• Start target accept rate at 100%.
• Decrease exponentially to 44% at 15% through the run.
• Continue exponential decline to 0% after 65% through the run.


Locality Assumption

Why does local search work? Why not jump around at random?

We believe scores change slowly from neighbor to neighbor.

Perhaps keep a bunch of states and search around the good ones…


Genetic Algorithms

Search : population genetics ::
State : individual ::
State set : population ::
Score : fitness ::
Neighbors : offspring / mutation
? : sexual reproduction, crossover


Algorithmic Options

Fitness function

Representation of individuals

Who reproduces? *

How to alter individuals (mutation)?

How to combine individuals (crossover)? *


Representation

"Classic" approach: states are fixed-length bit sequences.

Can be more complex objects: genetic programs, for example.

Crossover:
• One point, two point, uniform


GA Example

1101: 11    1011: 13    0100: 2    1001: 9

reproduction:  1101  1011  1011  1001
crossover:     1111  1001  1001  1011
mutation:      1111  0001  1000  1011


Generation to Next

G is a set of N (even) states.

Let p(s) = V(s) / Σ_{s'} V(s'), G' = { }

For k = 1 to N/2:
  choose x, y with prob. p(x), p(y)
  randomly swap bits to get x', y'
  rarely, randomly flip bits in x', y'
  G' = G' ∪ {x', y'}
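The pseudocode above, sketched for states that are equal-length bit tuples. The fitness function in the demo is a hypothetical choice (each state's bits read in reverse as a binary number, which is consistent with the scores on the GA Example slide):

```python
import random

def next_generation(G, V, p_mut=0.01):
    """One generation step: fitness-proportional selection, uniform
    crossover (randomly swap bits), then rare random bit flips."""
    total = sum(V(s) for s in G)
    weights = [V(s) / total for s in G]      # p(s) = V(s) / sum_s' V(s')
    G2 = []
    for _ in range(len(G) // 2):             # N must be even
        x, y = random.choices(G, weights=weights, k=2)
        x, y = list(x), list(y)
        for i in range(len(x)):
            if random.random() < 0.5:        # uniform crossover: swap bits
                x[i], y[i] = y[i], x[i]
        for child in (x, y):
            for i in range(len(child)):
                if random.random() < p_mut:  # rarely, flip a bit
                    child[i] = 1 - child[i]
        G2 += [tuple(x), tuple(y)]
    return G2

# The four states from the GA Example slide.
G = [(1, 1, 0, 1), (1, 0, 1, 1), (0, 1, 0, 0), (1, 0, 0, 1)]

def V(s):
    return sum(b << i for i, b in enumerate(s))   # hypothetical fitness

print(next_generation(G, V))
```

Note V must be positive for every state for the selection weights to make sense; real GAs often shift or rescale fitness to guarantee this.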


GSAT

Score: number of unsat clauses.
• Start with a random assignment.
• While not satisfied…
  – Flip the variable value that satisfies the greatest number of clauses.
  – Repeat (until done or bored)
• Repeat (until done or bored)
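A sketch of GSAT, assuming a DIMACS-style clause encoding (a positive literal v means variable v is true; the slides don't fix a representation). The tiny demo formula is hypothetical:

```python
import random

def gsat(clauses, n_vars, max_flips=1000, max_tries=10):
    """GSAT: greedily flip the variable whose flip satisfies the most
    clauses; restart from a fresh random assignment when bored."""
    def n_sat(a):
        return sum(any((lit > 0) == a[abs(lit)] for lit in c) for c in clauses)

    for _ in range(max_tries):                     # outer repeat: restarts
        a = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}
        for _ in range(max_flips):                 # inner repeat: flips
            if n_sat(a) == len(clauses):
                return a                           # satisfying assignment
            def after_flip(v):                     # clauses sat if v flips
                a[v] = not a[v]
                s = n_sat(a)
                a[v] = not a[v]
                return s
            best = max(range(1, n_vars + 1), key=after_flip)
            a[best] = not a[best]                  # greedy flip
    return None                                    # bored: give up

# Hypothetical formula: (x1 or x2) and (not x1 or x2) and (x1 or not x2).
print(gsat([[1, 2], [-1, 2], [1, -2]], n_vars=2))  # → {1: True, 2: True}
```

Recomputing n_sat for every candidate flip is O(variables × clauses) per step; practical implementations cache per-clause satisfaction counts instead.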


GSAT Analysis

What kind of local search is this most like? Hill climbing? Simulated annealing? GA?

Known to perform quite well for coloring problems.


WALKSAT

On each step, with probability p, do a greedy GSAT move.

With probability 1−p, "walk" (flip a variable in an unsatisfied clause).

Even faster for many problems (planning, random CNF).
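One step of the procedure, under the same assumed DIMACS-style clause encoding; p = 0.5 is just a placeholder default:

```python
import random

def walksat_step(a, clauses, p=0.5):
    """One WALKSAT step on assignment a (dict var -> bool): with
    probability p make the greedy GSAT flip, otherwise flip a random
    variable from a random unsatisfied clause ("walk")."""
    def sat(c):
        return any((lit > 0) == a[abs(lit)] for lit in c)

    unsat = [c for c in clauses if not sat(c)]
    if not unsat:
        return a                                  # already satisfied
    if random.random() < p:                       # greedy GSAT move
        def after_flip(v):
            a[v] = not a[v]
            s = sum(sat(c) for c in clauses)
            a[v] = not a[v]
            return s
        v = max(a, key=after_flip)
    else:                                         # walk move
        v = abs(random.choice(random.choice(unsat)))
    a[v] = not a[v]
    return a
```

The walk move is what lets WALKSAT escape the plateaus where pure greedy flipping stalls, since a variable in an unsatisfied clause always changes that clause's status.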


WALKSAT Analysis

What kind of local search is this most like? Hill climbing? Simulated annealing? GA?

Known to perform quite well for hard random CNF problems.


Random k-CNF

Variables: n
Clauses: m
Variables per clause: k

Randomly generate all m clauses by choosing k of the n variables for each clause at random. Randomly negate.
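The generator is short enough to sketch directly; the specific n, m, k in the demo are arbitrary:

```python
import random

def random_kcnf(n, m, k):
    """m random k-clauses over variables 1..n: choose k distinct
    variables per clause, negating each with probability 1/2."""
    return [[v if random.random() < 0.5 else -v
             for v in random.sample(range(1, n + 1), k)]
            for _ in range(m)]

# E.g., a 3-CNF formula near the phase transition (m/n ≈ 4.3).
f = random_kcnf(n=50, m=215, k=3)
print(len(f), len(f[0]))                # → 215 3
```

Using random.sample (distinct variables within a clause) is one common convention; some generators instead sample with replacement and discard degenerate clauses.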


Satisfiable Formulae

[Plot: probability a random formula is satisfiable vs. the clauses/variables ratio (0–10)]


Explaining Results

Three sections:
• m/n < 4.2, under-constrained: nearly all satisfiable
• m/n > 4.3, over-constrained: nearly all unsatisfiable
• 4.2 ≤ m/n ≤ 4.3, critically constrained: the transition region


Phase Transition

Under-constrained problems are easy: just guess an assignment.

Over-constrained problems aren't too bad: just say "unsatisfiable" and try to verify via DPLL.

The transition region sharpens as n increases.


Local Search Discussion

Often the second best way to solve any problem.

Relatively easy to implement.

So, a good first attempt. If good enough, stop.


What to Learn

Definition of hill climbing, simulated annealing, genetic algorithm.

Definition of GSAT, WALKSAT.

Relative advantages and disadvantages of local search.


Programming Project 1 (due 10/22)

Solve Rush Hour problems.

Implement BFS, A* with a simple blocking heuristic, and A* with a more advanced heuristic.

Input and output examples are available on the course web site.

Speed and accuracy count.


Homework 4 (due 10/17)

1. In graph partitioning, call a balanced state one in which both sets contain the same number of nodes. (a) How can we choose an initial state at random that is balanced? (b) How can we define the neighbor set N so that every neighbor of a balanced state is balanced? (c) Show that standard GA crossover (uniform) between two balanced states need not be balanced. (d) Suggest an alternative crossover operator that preserves balance.

2. More soon…