Multi-Objective Optimization Using Evolutionary Algorithms


Page 1: 1 Multi-Objective Optimization Using Evolutionary Algorithms


Multi-Objective Optimization Using Evolutionary Algorithms

Page 2: 1 Multi-Objective Optimization Using Evolutionary Algorithms


• Evolution Strategies (ESs) were developed in Germany and have been extensively studied in Europe

1. ESs use real-coding of the design parameters, since they model organic evolution at the level of the individual's phenotype.

2. ESs depend on deterministic selection and mutation for their evolution.

3. ESs use strategy parameters, such as the on-line self-adaptation of mutation parameters.

The selection of parents to form offspring is less constrained than it is in genetic algorithms and genetic programming. For instance, due to the nature of the representation, it is easy to average vectors from many individuals to form a single offspring.

In a typical evolution strategy, μ parents are selected uniformly at random (i.e., not based upon fitness), more offspring than parents are generated through the use of recombination and mutation (i.e., λ > μ), and then survivors are selected deterministically. The survivors are chosen either from the best λ offspring only (i.e., no parents survive, the (μ,λ)-ES) or from the best of the combined μ parents and λ offspring (the (μ+λ)-ES).
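To make the two survivor-selection schemes concrete, here is a minimal Python sketch of one ES generation. It is an illustration only: the helpers `recombine`, `mutate`, and `fitness` are assumed problem-specific functions, not part of any particular library, and self-adaptation of the strategy parameters is omitted.

```python
import random

def es_generation(parents, fitness, recombine, mutate, lam, plus=False):
    """One generation of a simple evolution strategy (sketch, minimization).

    parents : list of real-valued parameter vectors (the mu parents)
    fitness : objective function to be minimized
    lam     : number of offspring, with lam > len(parents)
    plus    : True  -> (mu + lambda)-ES, survivors drawn from parents and offspring
              False -> (mu , lambda)-ES, survivors drawn from the offspring only
    """
    mu = len(parents)
    # Parents are picked uniformly at random, not according to fitness.
    offspring = [mutate(recombine(random.sample(parents, 2))) for _ in range(lam)]
    pool = parents + offspring if plus else offspring
    # Deterministic survivor selection: keep the mu best individuals.
    return sorted(pool, key=fitness)[:mu]
```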

Short Review (I)

Page 3: 1 Multi-Objective Optimization Using Evolutionary Algorithms


• Genetic programming and genetic algorithms are similar in most other respects, except that in genetic programming the reproduction operators are tailored to a tree representation.

– The most commonly used operator is subtree crossover, in which an entire subtree is swapped between two parents.

– In a standard genetic program, the representation used is a variable-sized tree of functions and values. Each leaf in the tree is a label from an available set of value labels. Each internal node in the tree is labeled from an available set of function labels.

– The entire tree corresponds to a single function that may be evaluated. Typically, the tree is evaluated in a left-most depth-first manner. A leaf is evaluated as the corresponding value. A function is evaluated using as arguments the result of the evaluation of its children.
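As a minimal sketch of this evaluation scheme (the nested-tuple tree encoding and the small function set are assumptions made for illustration, not a standard GP library API):

```python
import operator

# Hypothetical function set: each internal node carries one of these labels.
FUNCTIONS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(node):
    """Evaluate a GP tree given as nested tuples.

    A leaf is a plain number; an internal node is (function_label, child_1, child_2).
    Children are evaluated recursively, left-most and depth-first.
    """
    if not isinstance(node, tuple):       # leaf: evaluated as its value
        return node
    label, *children = node
    args = [evaluate(child) for child in children]
    return FUNCTIONS[label](*args)        # apply the node's function to its children

# Example: the tree (* (+ 2 3) 4) evaluates to 20.
print(evaluate(("*", ("+", 2, 3), 4)))
```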

Short Review (II)

Page 4: 1 Multi-Objective Optimization Using Evolutionary Algorithms


Overview

• Principles of Multi-Objective Optimization.

• Difficulties with the classical multi-objective optimization methods.

• Schematic of an ideal multi-objective optimization procedure.

• The original Genetic Algorithm (GA).

• Why use a GA?

• Multi-Objective Evolutionary Algorithm (MOEA).

• An example of using an MOEA to solve an engineering design problem.

Page 5: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Classification of multi-objective algorithms based on how the objectives are integrated

We will use the following simple classification of Evolutionary Multi-Objective Optimization (EMOO) approaches:

• Non-Pareto Techniques
  – Aggregating approaches
  – Lexicographic ordering
  – VEGA (Vector Evaluated Genetic Algorithm)

• Pareto Techniques
  – Pure Pareto ranking
  – MOGA
  – NSGA

• Recent Approaches
  – PAES
  – SPEA

• Bio-inspired Approaches
  – PSO
  – Ant-colony based


Page 6: 1 Multi-Objective Optimization Using Evolutionary Algorithms


Principles of Multi-Objective Optimization

• Real-world problems have more than one objective function, each of which may have a different individual optimal solution.

• The optimal solutions corresponding to different objectives differ, because the objective functions are often conflicting (competing) with each other.

• This gives a set of trade-off optimal solutions instead of a single optimal solution, generally known as “Pareto-optimal” solutions (named after the Italian economist Vilfredo Pareto (1906)).

• No single solution can be considered better than any other with respect to all objective functions; this is the concept of non-dominated solutions.

Page 7: 1 Multi-Objective Optimization Using Evolutionary Algorithms


Multi-Objective Optimization

• The optimization of several objective functions at the same time; at the end, the algorithm returns a set of different optimal solutions, which differs from returning a single value as in a normal (single-objective) optimization problem.

• Thus, there is more than one objective function.

Pareto-optimal solutions and Pareto-optimal front
• Pareto-optimal solutions: the optimal solutions found in a multi-objective optimization problem.
• Pareto-optimal front: the curve formed by joining all these solutions (Pareto-optimal solutions).

Page 8: 1 Multi-Objective Optimization Using Evolutionary Algorithms


Nondominated and dominated solutions

• Non-dominated: given two objectives, two solutions are mutually non-dominated when neither is better than the other with respect to both objectives; the objectives are treated as equally important, e.g. speed and price.

• Dominated: when solution a is no worse than b in all objectives, and solution a is strictly better than b in at least one objective, then solution a dominates solution b.

• Weak dominance: solution a weakly dominates solution b when a is no worse than b in all objectives (see the sketch below).
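A minimal sketch of these definitions in code, assuming all objectives are to be minimized and each solution is represented by its vector of objective values (an illustrative helper, not taken from any of the algorithms discussed here):

```python
def dominates(a, b):
    """True if objective vector a dominates b (all objectives minimized):
    a is no worse than b in every objective and strictly better in at least one."""
    no_worse = all(x <= y for x, y in zip(a, b))
    better   = any(x < y for x, y in zip(a, b))
    return no_worse and better

def weakly_dominates(a, b):
    """True if a is no worse than b in every objective."""
    return all(x <= y for x, y in zip(a, b))

# Two objectives to minimize, e.g. price and 1/speed:
print(dominates((1.0, 2.0), (1.5, 2.0)))   # True: the first vector dominates the second
print(dominates((1.0, 3.0), (1.5, 2.0)))   # False: the two are mutually non-dominated
```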

Page 9: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Multi-Objective Problems: Dominance

• We say x dominates y if it is at least as good as y on all criteria and better on at least one.

[Figure: objective space with axes f1 and f2, showing a solution x, the region dominated by x, and the Pareto front.]

Page 10: 1 Multi-Objective Optimization Using Evolutionary Algorithms


Principles of Multi-Objective Optimization (cont.)

• Simple car design example: two objectives, cost and accident rate, both of which are to be minimized.

• A multi-objective optimization algorithm must:
1. Guide the search towards the global Pareto-optimal front.
2. Maintain solution diversity in the Pareto-optimal front.

A, B, D: one objective can only be improved at the expense of at least one other objective!

Page 11: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Non-Pareto Techniques (Traditional Approaches)

These methods aggregate the objectives into a single, parameterized objective function and perform several runs with different parameter settings to obtain a set of solutions approximating the Pareto-optimal set.

Weighting Method (Cohon, 1978)

Constraint Method (Cohon, 1978)

Goal Programming (Steuer, 1986)

Minimax Approach (Koski, 1984)

Page 12: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Vector Evaluated Genetic Algorithm (VEGA)

• Proposed by Schaffer in the mid-1980s (1984, 1985).

• Only the selection mechanism of the GA is modified, so that at each generation a number of sub-populations is generated by performing proportional selection according to each objective function in turn.

• Thus, for a problem with k objectives and a population size of M, k sub-populations of size M/k each would be generated.

• These sub-populations would be shuffled together to obtain a new population of size M, on which the GA would apply the crossover and mutation operators in the usual way.
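A rough sketch of this modified selection step is shown below. The helper `proportional_selection(population, fitness_fn, n)` is an assumption standing in for ordinary fitness-proportional (roulette-wheel) selection, and M is assumed to be divisible by k; crossover and mutation would follow as in a standard GA.

```python
import random

def vega_selection(population, objectives, proportional_selection):
    """VEGA-style selection (sketch).

    population : list of M individuals
    objectives : list of k fitness functions, each to be maximized
    Returns a shuffled new population of size M, built from k sub-populations
    of size M // k, each selected according to one objective in turn.
    """
    M, k = len(population), len(objectives)
    new_population = []
    for f in objectives:                           # one sub-population per objective
        new_population += proportional_selection(population, f, M // k)
    random.shuffle(new_population)                 # shuffle the sub-populations together
    return new_population
```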


Page 13: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Schematic of VEGA selection


Page 14: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Advantages and Disadvantages of VEGA

• Efficient and easy to implement.

• If proportional selection is used, then the shuffling and merging of all the sub-populations corresponds to averaging the fitness components associated with each of the objectives.

• In other words, under these conditions, VEGA behaves as an aggregating approach and is therefore subject to the same problems as such techniques.


Page 15: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Problems in Multi-Objective Optimization

Weighting Method example

Fitness Function = w1 F1(x) + w2 F2(x)

Consider the problem of minimizing response time and maximizing throughput,

where F1(x) = response time, F2(x) = throughput, and w1, w2 are the weight values.

Then it is hard to find appropriate values of w1 and w2, and it is hard to form a single fitness function, because the two objectives have different units and opposite optimization directions.
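A hedged sketch of such a weighted-sum fitness for this example is given below. The weights and the sign convention are assumptions for illustration: throughput is negated so that the whole expression is minimized, and in practice both objectives would also need normalization to comparable scales, which is exactly where choosing w1 and w2 becomes difficult.

```python
def weighted_fitness(x, response_time, throughput, w1=0.5, w2=0.5):
    """Scalarized fitness to MINIMIZE: w1 * F1(x) + w2 * (-F2(x)).

    F1 = response_time(x), to be minimized as-is.
    F2 = throughput(x), to be maximized, hence the minus sign.
    A single choice of (w1, w2) yields only one point of the Pareto front.
    """
    return w1 * response_time(x) - w2 * throughput(x)
```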

Page 16: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Traditional Approaches

Difficulties with classical methods:

Being sensitive to the shape of the Pareto-optimal front (e.g. weighting method).

Need for problem knowledge which may not be available.

Restrictions on their use in some application areas.

Need for several optimization runs with different parameter settings to obtain an approximation of the Pareto-optimal set.

Page 17: 1 Multi-Objective Optimization Using Evolutionary Algorithms


Difficulties with the classical multi-objective optimization methods

• Methods such as the weighted sum, ε-perturbation, goal programming, min-max, and others:

1. Must be repeated many times to find multiple optimal solutions.
2. Require some knowledge about the problem being solved.
3. Some are sensitive to the shape of the Pareto-optimal front (e.g. non-convex fronts).
4. The spread of optimal solutions depends on the efficiency of the single-objective optimizer.
5. Are not reliable for problems involving uncertainties or stochasticity.
6. Are not efficient for problems having a discrete search space.

Page 18: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Lexicographic Ordering (LO)

• In this method, the user is asked to rank the objectives in order of importance. The optimum solution is then obtained by minimizing the objective functions, starting with the most important one and proceeding according to the assigned order of importance of the objectives.

• It is also possible to select randomly a single objective to optimize at each run of a GA.
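A minimal sketch of a lexicographic comparison between two solutions (the tolerance `eps`, used to decide when two values count as a tie on an objective, is an assumption for illustration):

```python
def lexicographic_better(a, b, objectives, eps=1e-9):
    """True if solution a is preferred over solution b under lexicographic ordering.

    objectives : list of functions to minimize, ordered from most to least important.
    Compare on the most important objective first; only on a (near-)tie
    move on to the next objective in the ranking.
    """
    for f in objectives:
        fa, fb = f(a), f(b)
        if abs(fa - fb) > eps:
            return fa < fb
    return False   # equal on all objectives: neither solution is strictly preferred
```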


Page 19: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Advantages and Disadvantages of LO

• Efficient and easy to implement.

• Requires a pre-defined ordering of the objectives, and its performance is affected by that ordering.

• Randomly selecting an objective is equivalent to a weighted combination of objectives, in which each weight is defined in terms of the probability that each objective has of being selected. However, if tournament selection is used, the technique does not behave like VEGA, because tournament selection does not require scaling of the objectives (because of its pair-wise comparisons). Therefore, the approach may work properly with concave Pareto fronts.

• Inappropriate when there is a large number of objectives.


Page 20: 1 Multi-Objective Optimization Using Evolutionary Algorithms


Schematic of an ideal multi-objective optimization procedure

[Schematic: Step 1: the multi-objective optimization problem (minimize f1, f2, …, fn, subject to constraints) is solved by an IDEAL multi-objective optimizer, producing multiple trade-off solutions. Step 2: one solution is chosen using higher-level information.]

In 1967, Rosenberg hinted at the potential of genetic algorithms in multi-objective optimization. There was no significant study until 1989, when Goldberg outlined a new non-dominated sorting procedure.

There has been a lot of interest recently, because a GA is capable of finding multiple optimal solutions in one single run (more than 630 publications in this research area).

Page 21: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Pareto-based Techniques

• Suggested by Goldberg (1989) to solve the problems with Schaffer’s VEGA.

• Use of non-dominated ranking and selection to move the population towards the Pareto front.

• Requires a ranking procedure and a technique to maintain diversity in the population (otherwise, the GA will tend to converge to a single solution, because of the stochastic noise involved in the process).


Page 22: 1 Multi-Objective Optimization Using Evolutionary Algorithms


The original Genetic Algorithm (GA)

• Initially introduced by Holland in 1975.
• A general-purpose heuristic search algorithm that mimics the natural selection process in order to find optimal solutions:

1. Generate a population of random individuals (candidate solutions to the problem at hand).
2. Evaluate the fitness of each individual in the population.
3. Rank individuals based on their fitness.
4. Select individuals with high fitness to produce the next generation.
5. Use the genetic operations crossover and mutation to generate a new population.
6. Continue the process by going back to step 2 until the problem's objectives are satisfied.

• The best individuals are allowed to survive, mate, and reproduce offspring.
• Evolving solutions over time leads to better solutions. (A minimal sketch of this loop is given below.)
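The sketch below condenses these steps into a generic loop. The problem-specific helpers `random_individual`, `fitness`, `crossover`, and `mutate` are assumptions, and plain tournament selection stands in for the ranking-and-selection steps; it is meant only to make the control flow concrete, not to reproduce any particular GA implementation.

```python
import random

def genetic_algorithm(random_individual, fitness, crossover, mutate,
                      pop_size=100, generations=200, tournament_size=3):
    """Minimal single-objective GA loop (fitness is maximized)."""
    population = [random_individual() for _ in range(pop_size)]        # step 1
    for _ in range(generations):                                       # step 6: repeat
        scored = [(fitness(ind), ind) for ind in population]           # step 2

        def select():                                                  # steps 3-4: tournament
            return max(random.sample(scored, tournament_size),
                       key=lambda pair: pair[0])[1]

        population = [mutate(crossover(select(), select()))            # step 5
                      for _ in range(pop_size)]
    return max(((fitness(ind), ind) for ind in population),
               key=lambda pair: pair[0])[1]
```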

Page 23: 1 Multi-Objective Optimization Using Evolutionary Algorithms


The original Genetic Algorithm (GA) – Flow Chart

A real-coded GA represents the parameters directly as real values, without binary coding, which makes the representation of solutions very close to the natural formulation of many problems.

Special crossover and mutation operators are designed to work with real parameters.

Multi-objective Fitness:
1. Non-dominated (best)
2. Dominated but feasible (average)
3. Infeasible points (worst)

Page 24: 1 Multi-Objective Optimization Using Evolutionary Algorithms


Why use a GA?

• A GA is useful when the search space is large, not well understood, and unstructured.

• A GA can provide a surprisingly powerful heuristic search.

• Simple, yet it performs well on many different types of problems:
  – optimization of functions with linear and nonlinear constraints,
  – the traveling salesman problem,
  – machine learning,
  – parallel semantic networks,
  – simulation of gas pipeline systems,
  – problems of scheduling, web search, software testing, financial forecasting, and others.

Page 25: 1 Multi-Objective Optimization Using Evolutionary Algorithms


Multi-Objective Evolutionary Algorithm (MOEA)

• An EA is a variation of the original GA.
• An MOEA has additional operations to maintain multiple Pareto-optimal solutions in the population.

Advantages:
• Deals simultaneously with a set of possible solutions.
• Enables finding several members of the Pareto-optimal set in a single run of the algorithm.
• Explores solutions over the entire search space.
• Less susceptible to the shape or continuity of the Pareto front.

Disadvantages:
• Not completely supported theoretically yet (compared to other methods such as Stochastic Approximation, which has been around for half a century).

Page 26: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Multi-Objective Genetic Algorithm (MOGA)

• Proposed by Fonseca and Fleming (1993).

• The approach consists of a scheme in which the rank of a certain individual corresponds to the number of individuals in the current population by which it is dominated.

• It uses fitness sharing and mating restrictions.
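A sketch of this ranking rule (Fonseca and Fleming assign each individual a rank of 1 plus the number of population members that dominate it, so all non-dominated individuals get rank 1; the `dominates` test is the one sketched earlier, and fitness sharing and mating restrictions are omitted here):

```python
def moga_rank(objective_vectors, dominates):
    """Rank each solution as 1 + the number of current-population members dominating it."""
    ranks = []
    for i, p in enumerate(objective_vectors):
        dominated_by = sum(1 for j, q in enumerate(objective_vectors)
                           if j != i and dominates(q, p))
        ranks.append(1 + dominated_by)
    return ranks
```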


Page 27: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Advantages and Disadvantages of MOGA

• Efficient and relatively easy to implement.

• Its performance depends on the appropriate selection of the sharing factor.

• MOGA has been very popular and tends to perform well when compared to other EMOO approaches.


Some Applications:
• Fault diagnosis
• Control system design
• Wing planform design

Page 28: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Nondominated Sorting Genetic Algorithm (NSGA)

• Proposed by Srinivas and Deb (1994).

• It is based on several layers of classification of the individuals.

• Nondominated individuals get a certain dummy fitness value and are then removed from the population. The process is repeated until the entire population has been classified.

• To maintain the diversity of the population, classified individuals are shared (in decision variable space) with their dummy fitness values.


Page 29: 1 Multi-Objective Optimization Using Evolutionary Algorithms


NSGA – Flow Chart

Multi-objective Fitness:
1. Non-dominated (best)
2. Dominated but feasible (average)
3. Infeasible points (worst)

Before selection is performed, the population is ranked on the basis of domination: all non-dominated individuals are classified into one category, with a dummy fitness value that is proportional to the population size. To maintain the diversity of the population, these classified individuals are shared (in decision-variable space) with their dummy fitness values. Then this group of classified individuals is removed from the population and the next layer of non-dominated individuals is considered (the remainder of the population is re-classified). The process continues until all the individuals in the population are classified.

Since individuals in the first front have the maximum fitness value, they always get more copies than the rest of the population. This allows the search to focus on non-dominated regions and results in convergence of the population toward such regions. Sharing, on its part, helps to distribute the population over these regions.
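A naive sketch of this layered classification is given below (it returns the successive non-dominated fronts; the dummy fitness assignment and the sharing within each front, which NSGA applies on top of this, are omitted, and `dominates` is the dominance test sketched earlier):

```python
def nondominated_sort(objective_vectors, dominates):
    """Return a list of fronts (lists of indices), from the best layer to the worst."""
    remaining = set(range(len(objective_vectors)))
    fronts = []
    while remaining:
        # Individuals not dominated by anyone still unclassified form the next front.
        front = [i for i in remaining
                 if not any(dominates(objective_vectors[j], objective_vectors[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining -= set(front)        # remove the layer and re-classify the rest
    return fronts
```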

Page 30: 1 Multi-Objective Optimization Using Evolutionary Algorithms


Demo – NSGA-II

[Plot: CPI vs. Energy, comparing the "Run without fuzzy", "Run with fuzzy", and "Manual" configurations.]

http://webspace.ulbsibiu.ro/adrian.florea/html/docs/IET_MultiObjective.pdf

[Gellert et al., 2012] Multi-Objective Optimizations for a Superscalar Architecture with Selective Value Prediction, IET Computers & Digital Techniques, Vol. 6, No. 4 (July), pp. 205-213, ISSN: 1751-8601

- Features of NSGA II

Page 31: 1 Multi-Objective Optimization Using Evolutionary Algorithms


The research area

Problems:

1. The so-called "standard" settings (De Jong, 1990), a population size of 50-100, a crossover rate of 0.6-0.9, and a mutation rate of 0.001, do not work for complex problems.

2. For complex real-world problems, GAs require parameter tuning in order to achieve optimal solutions.

3. The task of tuning GA parameters is not trivial, due to the complex and nonlinear interactions among the parameters and their dependency on many aspects of the particular problem being solved (e.g. the density of the search space).

Research:
1. Self-Adaptive MOEA: use information fed back from the MOEA during its execution to adjust the values of parameters attached to each individual in the population.

2. Improve the performance of MOEAs: finding widely spread Pareto-optimal solutions and reducing computing resources.

3. Make them easier to use and available to more users.

Page 32: 1 Multi-Objective Optimization Using Evolutionary Algorithms

Multi-Objective Evolutionary Algorithms (MOEAs) – references

Some representatives of MOEAs in operational research through past years:

a) Non-Dominated Sorting Genetic Algorithm (NSGA), Srinivas and Deb, 1995.

b) NSGA-II, Deb et al., 2002.

c) Strength Pareto Evolutionary Algorithm (SPEA), Zitzler and Thiele, 1999.

d) SPEA2, Zitzler et al., 2001.

e) Epsilon-NSGAII, Kollat and Reed, 2005.

f) Multi-objective Shuffled Complex Evolution Metropolis Algorithm (MOSCEM-UA), Vrugt et al., 2003.