A survey of Constraint Handling Techniques in Evolutionary Computation Methods
Author: Zbigniew Michalewicz
Presenter: Masoud Mazloom
27th Oct. 2010


Page 1

Page 2

Outline

• Introduction
• Numerical Optimization and Unfeasible Solutions:
  – Nonlinear programming problem
  – Constraint-handling methods:
    • The method of Homaifar, Lai & Qi
    • The method of Joines & Houck
    • The method of Michalewicz & Janikow
    • The method of Michalewicz & Attia
    • The method of Powell & Skolnick
    • The method of Schoenauer
    • Death penalty
    • Repair method
• Five test cases
• Experiments, Results

Page 3

Introduction

• Evaluation function: one of the major components of any evolutionary system.
• Evolutionary computation techniques use efficient evaluation functions for feasible individuals, but they have not developed any guidelines on how to deal with unfeasible solutions.
• Evolutionary programming and evolution strategies (Bäck et al. 1991; Fogel and Stayton 1994): reject unfeasible individuals.
• Genetic algorithms (Goldberg 1989): penalize unfeasible individuals.
• Is there any general rule for designing penalty functions?

Page 4

(figure slide)

Page 5

Introduction (continued)

We don't make any assumptions about these subspaces:
• They need not be convex
• They need not be connected

Page 6

Introduction (continued)

• We have to deal with various feasible and unfeasible individuals.
• A population may contain some feasible individuals (a, c, d) and unfeasible individuals (b, e, f), while the optimum solution is 'x'.
• Problem: how to deal with unfeasible individuals?
• In general we have to design two evaluation functions:
  1. eval_f: for the feasible domain
  2. eval_u: for the unfeasible domain

Page 7

Introduction (continued)

1. How should two feasible individuals be compared?
2. How should two unfeasible individuals be compared?
3. Should we assume that eval_f(s) is better than eval_u(r) for any feasible s and any unfeasible r? In particular, which individual is better: feasible individual 'c' or unfeasible individual 'f' (note that the optimum is 'x')?
4. Should we consider unfeasible individuals harmful and eliminate them from the population? (Maybe they are useful.)
5. Should we 'repair' unfeasible solutions by moving them to the closest point of the feasible space? (Should we use a repair procedure for evaluation purposes only?)
6. Should we choose to penalize unfeasible individuals?

Page 8

Numerical Optimization and Unfeasible Solutions

Nonlinear programming problem:
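The slide's equations did not survive extraction; the nonlinear programming problem referred to here is standardly stated as:

```latex
\begin{aligned}
&\text{optimize } f(x), \qquad x = (x_1,\ldots,x_n) \in \mathbb{R}^n,\\
&\text{subject to } g_j(x) \le 0, \quad j = 1,\ldots,q,\\
&\phantom{\text{subject to }} h_j(x) = 0, \quad j = q+1,\ldots,m,\\
&\phantom{\text{subject to }} l_i \le x_i \le u_i, \quad i = 1,\ldots,n.
\end{aligned}
```

The domain bounds define the search space S, and the constraints carve out the feasible region F ⊆ S.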

Page 9

(figure slide)

Page 10

Constraint Handling Methods

1. Methods based on preserving feasibility of solutions:
   a) Use of specialized operators (the method of Michalewicz & Janikow)
   b) Searching the boundary of the feasible region
2. Methods based on penalty functions:
   a) Static penalty (the method of Homaifar, Lai & Qi)
   b) Dynamic penalty (the method of Joines & Houck)
   c) Annealing penalty (the method of Michalewicz & Attia)
   d) Death penalty (the method of Michalewicz)
3. Methods based on a search for feasible solutions:
   a) Superiority of feasible points (the method of Powell & Skolnick)
   b) Behavioral memory method (the method of Schoenauer & Xanthakis)
4. Multi-objective optimization methods:
   a) The method of Paredis
   b) Cultural algorithms

Page 11

Static penalty (the method of Homaifar, Lai & Qi)

• For every constraint we establish a family of intervals that determine the appropriate penalty value.
• The method works as follows:
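A minimal sketch of the interval-based static penalty, assuming minimization and constraints written as g(x) ≤ 0. The level boundaries and the coefficients `R[i][k]` are user-supplied assumptions here, not values from the paper, and all names are illustrative:

```python
# Sketch of a static penalty in the spirit of Homaifar, Lai & Qi:
# each constraint's violation falls into one of several intervals,
# and each interval carries its own penalty coefficient.

def static_penalty_eval(f, constraints, levels, R, x):
    """eval(x) = f(x) + sum_i R[i][k] * v_i(x)**2, where the level k
    is selected from the size of the i-th violation v_i(x)."""
    penalty = 0.0
    for i, g in enumerate(constraints):
        v = max(0.0, g(x))          # violation of constraint i
        if v == 0.0:
            continue                # constraint satisfied: no penalty
        k = sum(1 for bound in levels[i] if v > bound)
        penalty += R[i][k] * v * v
    return f(x) + penalty
```

For example, minimizing f(x) = x² subject to x ≥ 1 (i.e. g(x) = 1 − x ≤ 0) with one boundary at 0.5 applies a mild coefficient to small violations and a harsh one to large violations.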

Page 12

Static penalty (continued)

• Weakness: the number of parameters is m(2l + 1); for m = 5 and l = 4 that is 45 parameters.
• The quality of solutions heavily depends on the values of these parameters:
  – if the R_ij are moderate, the algorithm converges to an unfeasible solution;
  – if the R_ij are too large, the algorithm rejects unfeasible solutions.
• Finding an optimal set of parameters that reaches a feasible solution near the optimum is quite difficult.

Page 13

Dynamic penalty (the method of Joines & Houck)

• In this method individuals are evaluated (at iteration t) by the following formula:
• C, α, and β are constants.
• The method is quite similar to that of Homaifar et al., but it requires many fewer parameters.
• The method is independent of the total number of constraints.
• The penalty component changes with the generation number.
• The pressure on unfeasible solutions increases as the generation number t grows.
• The quality of the solution was very sensitive to the values of these three parameters (reasonable values are suggested in the paper).
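The formula on this slide did not survive extraction; the Joines & Houck evaluation is commonly written as follows (a reconstruction, with f_j(x) denoting the violation of the j-th constraint):

```latex
\mathrm{eval}(x, t) = f(x) + (C \times t)^{\alpha}\,\mathrm{SVC}(\beta, x),
\qquad
\mathrm{SVC}(\beta, x) = \sum_{j=1}^{m} f_j^{\beta}(x).
```

The factor (C × t)^α grows with the generation number t, which is what raises the pressure on unfeasible solutions over time.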

Page 14

Annealing penalty (the method of Michalewicz & Attia)

• In this method linear and nonlinear constraints are processed separately.
• The method works as follows:
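The formula on this slide did not survive extraction; a reconstruction of the annealing evaluation used by this method, with A the set of active constraints and ζ the temperature:

```latex
\mathrm{eval}(x, \zeta) = f(x) + \frac{1}{2\zeta}\sum_{j \in A} f_j^{2}(x),
```

where ζ is lowered by the cooling scheme between successive stages of the run.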

Page 15

(figure slide)

Page 16

• At every iteration the algorithm considers active constraints only.
• The pressure on unfeasible solutions is increased by the decreasing value of the temperature ζ.
• The method is quite sensitive to the values of its parameters: the starting temperature, the freezing temperature, and the cooling scheme used to decrease the temperature ζ (standard values are suggested in the paper).
• The algorithm may converge to a near-optimum solution in just one iteration.

Page 17

Death penalty (the method of Michalewicz)

• This method is a popular option in many evolutionary techniques, e.g. evolutionary programming.
• Removing unfeasible solutions from the population may work well when the feasible search space F is convex and constitutes a reasonable part of the whole search space.
• For problems where the ratio between the sizes of F and S is small and an initial population consists of unfeasible individuals only, we must improve them first.
• Experiments show that when the ratio between F and S was between 0% and 0.5%, this method performed worse than the other methods.
• Unfeasible solutions should provide information, and not just be thrown away.
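The rule itself is a one-line rejection test; a minimal sketch (with illustrative names, not the paper's notation) also shows why a tiny F is a problem:

```python
# Minimal sketch of the "death penalty" rule: infeasible candidates
# are simply discarded and re-sampled.
import random

def is_feasible(x, constraints):
    """Constraints are functions g with g(x) <= 0 when satisfied."""
    return all(g(x) <= 0 for g in constraints)

def sample_feasible(constraints, bounds, rng, max_tries=10_000):
    """Rejection-sample one feasible individual from the box `bounds`."""
    for _ in range(max_tries):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        if is_feasible(x, constraints):
            return x
    # This is exactly the failure mode discussed above: when |F|/|S|
    # is tiny, rejection sampling almost never succeeds.
    raise RuntimeError("could not find a feasible individual")
```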

Page 18

Methods based on preserving feasibility of solutions: use of specialized operators (the method of Michalewicz & Janikow)

• The idea behind this method is based on specialized operators which transform feasible individuals into feasible individuals.
• The method works only with linear constraints and a feasible starting point.
• A closed set of operators maintains feasibility of solutions (the offspring solution vector is always feasible).
• Mutation: which values may x_i take from its domain?
• Crossover.
• It gave surprisingly good performance on many test functions.
• The weakness of the method lies in its inability to deal with non-convex search spaces (i.e., to deal with nonlinear constraints in general).
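The feasibility-preserving crossover can be sketched as a convex combination of two feasible parents; this is a minimal illustration (the method's full operator set is richer, and the names are not from the paper):

```python
# With linear constraints the feasible region is convex, so a convex
# combination of two feasible parents is feasible by construction.
import random

def arithmetical_crossover(x, y, rng):
    """Offspring z = a*x + (1 - a)*y for a random a in [0, 1)."""
    a = rng.random()
    return [a * xi + (1 - a) * yi for xi, yi in zip(x, y)]
```

Because the offspring lies on the segment between its parents, it never leaves a convex F; that is also exactly why this approach breaks down on non-convex (nonlinearly constrained) spaces.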

Page 19

Methods based on a search for feasible solutions: superiority of feasible points (the method of Powell & Skolnick)

• The key concept behind this method is the assumption of superiority of feasible solutions over unfeasible ones.
• It incorporates a heuristic rule for processing unfeasible solutions: "every feasible solution is better than every unfeasible solution."
• This rule is implemented in the following way:
  – evaluations of feasible solutions are mapped into the interval (-∞, 1);
  – evaluations of unfeasible solutions are mapped into the interval (1, ∞).
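The two-interval ordering can be sketched as follows, assuming minimization; the tanh squashing of f for feasible scores is an illustrative choice, not the paper's exact mapping:

```python
# Feasible scores land strictly below 1, infeasible scores strictly
# above 1, so every feasible solution beats every infeasible one.
import math

def ps_eval(f, constraints, x):
    v = sum(max(0.0, g(x)) for g in constraints)   # total violation
    if v == 0.0:
        return math.tanh(f(x))   # monotone map of f into (-1, 1)
    return 1.0 + v               # greater than any feasible score
```

Within the feasible set the ordering by f is preserved (tanh is monotone), while infeasible solutions are ranked by their total constraint violation.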

Page 20

• Evaluation procedure: (figure slide)

Page 21

• Experiments show that this method works very well; however, the topology of the feasible search space might be an important factor.
• For problems with a small ratio |F|/|S| the algorithm is often trapped in an unfeasible solution.
• The method requires at least one feasible individual to start.

Page 22

Behavioral memory method (the method of Schoenauer & Xanthakis)

• This method is based on the idea of handling constraints in a particular order.

Page 23

This method requires three parameters:
• a sharing factor (to maintain diversity of the population);
• a flip threshold;
• a particular permutation of the constraints, which determines their order.

Page 24

• In the final step of this algorithm the objective function f is optimized.
• For large feasible spaces, however, the method just adds computational overhead.
• For a very small feasible search space it is essential to maintain diversity in the population.
• Experiments indicate that the method provides reasonable performance except when the feasible search space is "too small".

Page 25

(figure slide)

Page 26

Test case #1

Page 27

Test case #2

Page 28

Test case #3

Page 29

Test case #4

Page 30

Test case #5

Page 31

Summary of test cases

Page 32

• In the table below:
  Method #1 is Homaifar's method
  Method #2 is Joines's method
  Method #3 is Xanthakis's method
  Method #4 is Attia's method
  Method #5 is Powell's method
  Method #6 is the death penalty

Page 33

(figure slide)

Page 34

Experiments, Results

Page 35

• Questions?
