Angers, 10 June 2010 Multi-Objective Optimisation (II) Matthieu Basseur


Page 1

Angers, 10 June 2010

Multi-Objective Optimisation (II)

Matthieu Basseur

Page 2

Outline

Motivations

Evolutionary Multiobjective Optimisation
    Quality indicators
    Indicator-Based Evolutionary Algorithm

Multiobjective Local Searches
    Indicator-Based Local Search
    Hypervolume-Based Optimisation

Conclusions and perspectives

Page 3

Introduction

About me…
    2001-2005: PhD in Lille (France) – supervisor E-G. Talbi
    Research visitor at ETH Zurich/Switzerland (2005) – with E. Zitzler
    2006-2007: Research Assistant at Nottingham University/England
    Since September 2007: Assistant Professor (University of Angers)

Research interests
    Main area: multiobjective optimisation
    Metaheuristics for multiobjective optimisation (GAs, Local Search, Memetic algorithms, Path relinking, and also exact methods)
    Hybrid and adaptive metaheuristics (cooperation, parallelism)
    MO optimisation under uncertainty
    Applications (continuous test functions, flow-shop problem, routing problem…)

Motivations
    Mainly linked to my previous research activities

Page 4

MultiObjective Optimisation (I)

…by V. Barichard two weeks ago!

Single objective optimisation
    Optimisation problems
    Resolution approaches

Multiobjective optimisation problems
    Description
    Dominance relation

Resolution approaches and result evaluation
    Resolution approaches
    Pareto dominance based algorithms
    Outputs comparison

Today: new trends in MOO

Page 5

Motivations

Efficient optimisation algorithms are often:
    Complex: complex mechanisms (diversification, evaluation…), hybrid algorithms
    Parameter-dependent: numerous parameters with great influence on the results (set by hand, or adaptive setting)
    Dependent on the size of the problem
    Dependent on the problem treated

Need for generic algorithms which are:
    Simple
    Adaptable to a range of optimisation problems
    Small number of parameters
    …but efficient!

Design of generic multi-objective metaheuristics vs. problem-specific optimisation

Page 6

Outline

Motivations

Evolutionary Multiobjective Optimisation
    Quality indicators
    Indicator-Based Evolutionary Algorithm

Multiobjective Local Searches
    Indicator-Based Local Search
    Hypervolume-Based Optimisation

Conclusions and perspectives

Page 7

Multiobjective optimisation

[Figure: objective space (f1, f2) with non-dominated solutions and dominated feasible solutions a, b, c, e, f, g]

Pareto Dominance
y dominates z if and only if ∀ i ∈ {1, …, n}, y_i ≤ z_i and ∃ i ∈ {1, …, n}, y_i < z_i

Non-dominated solution
A solution x is non-dominated if no solution dominating x exists

Goal: find a good-quality and well-diversified set of non-dominated solutions
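The dominance relation above is a one-liner to check; a minimal sketch for minimised objective vectors:

```python
def dominates(y, z):
    """True if objective vector y Pareto-dominates z (minimisation):
    y is no worse on every objective and strictly better on at least one."""
    return (all(yi <= zi for yi, zi in zip(y, z))
            and any(yi < zi for yi, zi in zip(y, z)))
```

Note that two identical vectors do not dominate each other, and two incomparable vectors dominate each other in neither direction.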

Page 8

Multiobjective optimisation

[Figure: objective space (f1, f2) with non-dominated solutions and dominated feasible solutions]

No total order relation exists (unlike the single-objective case): we cannot compare solution a={4,7} with solution b={8,5}.

Resulting specific questions:
    How to assign fitness to solutions in evolutionary algorithms (for selection)?
    How to find good compromise solutions?
    How to evaluate different outputs obtained by different algorithms?

Goal: find a good-quality and well-diversified set of non-dominated solutions

Page 9

Evolutionary Multiobjective Optimisation?

Multiobjective Optimisation: find a set of compromise solutions
Evolutionary Algorithms (EAs): evolve a set of solutions

EAs are naturally well-suited to finding multiple efficient solutions in a single simulation run

A tremendous number of multiobjective evolutionary algorithms have been proposed over the last two decades.

[Figure: a population spread along a Pareto front in (f1, f2) space]

Page 10

Multiobjective Fitness assignment

Fitness assignment: central point of (population-based) multiobjective metaheuristics

Generic population-based search algorithm:

Create initial population P
repeat
    generate a new solution x
    add x to the population P
    evaluate the fitness of solution x (and update P?)
    delete the worst solution of P
until the termination criterion is verified
return P

Need to 'rank' solutions
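As a rough sketch, the generic loop above can be written as follows (`evaluate`, `random_solution` and `neighbour` are hypothetical placeholders, and the fitness is kept as a plain scalar; the multiobjective case replaces it with one of the indicator-based rankings discussed next):

```python
import random

def population_search(evaluate, random_solution, neighbour,
                      n=10, iters=100, seed=0):
    """Generic population-based search from the slide: repeatedly generate
    a new solution, add it to the population, and delete the worst member
    (minimisation of a scalar fitness, for illustration only)."""
    rng = random.Random(seed)
    pop = [random_solution(rng) for _ in range(n)]   # create initial population P
    for _ in range(iters):                           # until termination criterion
        x = neighbour(rng.choice(pop), rng)          # generate a new solution x
        pop.append(x)                                # add x to the population P
        pop.remove(max(pop, key=evaluate))           # delete the worst solution of P
    return pop
```

With a toy objective such as v², the population drifts towards the optimum while keeping its fixed size n.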

Page 11

Multiobjective Fitness assignment

Until the mid-80s: aggregation of the objective functions (e.g. σ1·f1 + σ2·f2)
Now: Pareto dominance based ranking methods (dominance depth, counter of dominance…)

[Figure: a weighted-sum direction and the convex hull of the front in (f1, f2) space]

Page 12

Multiobjective Fitness assignment

Dominance depth [Srinivas & Deb 94]

[Figure: successive non-dominated fronts Rk=1, Rk=2, Rk=3 in (f1, f2) space]

Page 13

Multiobjective Fitness assignment

Counter of dominance [Fonseca & Fleming 93]

[Figure: solutions labelled with their dominance counters, from Rk=0 for non-dominated solutions up to Rk=7, in (f1, f2) space]

Page 14

Multiobjective Fitness assignment

Sum of ranks [Bentley & Wakefield 97]

[Figure: solutions labelled with their rank sums, from RK=4 up to RK=16, in (f1, f2) space]
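For concreteness, here is a small sketch of two of the Pareto-dominance ranking schemes from the previous slides (minimisation assumed throughout; the function names are mine, not the slides'):

```python
def dominates(y, z):
    """Pareto dominance for minimised objective vectors."""
    return (all(a <= b for a, b in zip(y, z))
            and any(a < b for a, b in zip(y, z)))

def dominance_count(pop):
    """Counter of dominance [Fonseca & Fleming 93]: rank of x = number of
    solutions that dominate x (0 for non-dominated solutions)."""
    return [sum(dominates(z, x) for z in pop) for x in pop]

def dominance_depth(pop):
    """Dominance depth [Srinivas & Deb 94]: peel off successive
    non-dominated fronts; rank 1 = first front."""
    ranks = [0] * len(pop)
    remaining = set(range(len(pop)))
    depth = 1
    while remaining:
        # indices not dominated by any other remaining solution
        front = {i for i in remaining
                 if not any(dominates(pop[j], pop[i]) for j in remaining)}
        for i in front:
            ranks[i] = depth
        remaining -= front
        depth += 1
    return ranks
```

Both return one rank per population member, which is exactly what the selection step of a population-based algorithm needs.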

Page 15

Multiobjective Fitness assignment

Pareto dominance ranking methods drawbacks:
    Binary value: no quantification of the dominance
    Comparison is difficult if too many Pareto solutions can be generated (need to add a clustering tool)

General goal of MO optimisation: « Find a good-quality and well-diversified set of non-dominated solutions »

How to achieve this?
    Define indicators which are able to evaluate a set of solutions
    Optimise the indicator value during the search

Page 16

Outline

Motivations

Evolutionary Multiobjective Optimisation
    Quality indicators
    Indicator-Based Evolutionary Algorithm

Multiobjective Local Searches
    Indicator-Based Local Search
    Hypervolume-Based Optimisation

Conclusions and perspectives

Page 17

Quality indicators

Useful to compare two (or more) optimizers: how to compare set A against set B?

[Figure: two approximation sets A and B in (f1, f2) space]


Page 19

Quality indicators

Definition (Quality indicator): an m-ary quality indicator I is a function which assigns a real value I(A1, …, Am) to each vector (A1, A2, …, Am) of m approximation sets [Zitzler 2005].

Unary indicator: I(P1), …, I(Pm) compare real values.
Binary indicator: I(P1, P2) compares two sets!

Comparison of m outputs: use a reference set (e.g. the best known Pareto set) and compare each output against it.

Much research on this subject, and many indicators: hypervolume indicator, ε-indicator, average best weight combination, distance from reference, error ratio, chi-square-like deviation indicator, spacing, generational distance, maximum Pareto front error, maximum spread, coverage error, Pareto spread… [Zitzler 2005]

Page 20

ε-indicator

Binary epsilon indicator [Zitzler & Kuenzli 04]
I_ε(A, B) = minimal translation to apply to the set A so that every solution in set B is dominated by at least one solution in A:

    I_ε(A, B) = min{ ε ∈ ℝ : ∀ x2 ∈ B, ∃ x1 ∈ A such that f_i(x1) − ε ≤ f_i(x2) for i = 1, …, n }

[Figure: I_ε(A, B) and I_ε(B, A) between two approximation sets in the normalised (f1, f2) space]

Page 21

ε-indicator

Unary version of the binary epsilon indicator
I_ε(A) = minimal translation to apply to the set A so that every solution in a reference set R is dominated by at least one solution in A:

    I_ε(A) = min{ ε ∈ ℝ : ∀ x2 ∈ R, ∃ x1 ∈ A such that f_i(x1) − ε ≤ f_i(x2) for i = 1, …, n }

[Figure: I_ε(A) between a set A and the reference set in the normalised (f1, f2) space]
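A sketch of the additive binary ε-indicator (minimisation and a normalised objective space assumed, as on the slides):

```python
def epsilon_indicator(A, B):
    """Additive binary ε-indicator [Zitzler & Kuenzli 04]: the smallest
    translation ε such that every point of B is weakly dominated by some
    point of A shifted by ε (minimised objective vectors)."""
    return max(min(max(ai - bi for ai, bi in zip(a, b)) for a in A)
               for b in B)
```

A negative value means A already strictly dominates all of B; the unary version is obtained by passing the reference set as B.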


Page 24

Hypervolume indicator

Also known as S-metric, Lebesgue measure…
I_HD(A): hypervolume enclosed by approximation A according to a reference point Z, in the normalised space

[Figure: the area dominated by approximation A and bounded by the reference point Z in (f1, f2) space]
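For the bi-objective case, the hypervolume w.r.t. Z can be computed with a simple sweep over the set sorted by f1 (minimisation assumed; this sketch is mine, not the slides' implementation):

```python
def hypervolume_2d(points, ref):
    """Hypervolume (S-metric) of a 2-objective set w.r.t. reference point
    `ref` (minimisation): area dominated by the set and bounded by ref."""
    # keep only points that actually dominate the reference point, sorted by f1
    pts = sorted(p for p in points if p[0] < ref[0] and p[1] < ref[1])
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:                       # skip dominated points
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv
```

Dominated points contribute nothing, so adding one to the set leaves the value unchanged.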

Page 25

Hypervolume indicator

Hypervolume as a binary indicator [Zitzler & Kuenzli 04]
I_HD(A, B): hypervolume enclosed by approximation A and not by approximation B, according to a reference point Z

[Figure: I_HD(A, B) and I_HD(B, A) between two approximation sets, w.r.t. Z in the normalised (f1, f2) space]

Page 26

Outline

Motivations

Evolutionary Multiobjective Optimisation
    Quality indicators
    Indicator-Based Evolutionary Algorithm

Multiobjective Local Searches
    Indicator-Based Local Search
    Hypervolume-Based Optimisation

Conclusions and perspectives

Page 27

IBEA principle

Fitness assignment:
    Define a binary indicator I which allows two solutions to be compared
    When a solution x is added to a population P:
        Compare x against every solution in P using indicator I to compute x's fitness
        For each solution s in P, update its fitness according to I and x

Selection:
    Delete the solution which has the worst fitness value


Page 29

From binary indicator to fitness assignment

[Figures: I_HD(a,b) and I_HD(b,a), and I_ε(a,b) and I_ε(b,a), between two solutions a and b, in the non-dominated case and in the dominated case]

Binary indicator value of a population against a single solution: I(P\{x}, x) → Fitness(x), for instance

    I(P\{x}, x) = Σ_{z ∈ P\{x}} −e^{−I(z,x)/κ}
    I(P\{x}, x) = Σ_{z ∈ P\{x}} I(z, x)
    I(P\{x}, x) = min_{z ∈ P\{x}} I(z, x)
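The first (exponential) aggregation is the one IBEA itself uses; a minimal sketch, assuming the binary indicator I is given as a callable and κ is a scaling parameter:

```python
import math

def ibea_fitness(pop, indicator, kappa=0.05):
    """IBEA-style fitness: aggregate the binary indicator values of every
    other population member against x, amplified exponentially.
    Larger (less negative) fitness = better solution."""
    return [sum(-math.exp(-indicator(z, x) / kappa)
                for z in pop if z is not x)
            for x in pop]
```

With an additive-ε indicator as I, a solution dominated by the rest of the population receives a strongly negative fitness and is the natural candidate for deletion.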

Page 30

IBEA principles [Zitzler & Kuenzli 2004]
Indicator-Based Evolutionary Algorithm

Define a binary indicator I and an initial population P of n solutions
Repeat until the termination criterion is verified:
    Generate a set Q of m new solutions using genetic operators
    Select a set R of N solutions from Q ∪ P which minimizes I(Q ∪ P, R)
Return R

Advantages
    Outperforms NSGA-II and SPEA2 on continuous test functions
    Small number of parameters (population size, m, binary indicator)
    No diversity preservation mechanism required
    Can take the decision-maker's preferences into account

But…
    Optimally deleting m solutions from a population is difficult (greedy in IBEA)
    Evolutionary algorithm convergence is usually slow

Page 31

But…

Optimally deleting m solutions from a population is difficult (greedy in IBEA)

Evolutionary algorithm convergence is usually slow
    Local search methods are known to be efficient metaheuristics for single-objective optimization… application to MOO?

[Figures: in (f1, f2) space, cutting m solutions at once vs. IBEA deleting them 1 by 1 vs. ES(n,1) with 1 to delete]

Page 32

Outline

Motivations

Evolutionary Multiobjective Optimisation
    Quality indicators
    Indicator-Based Evolutionary Algorithm

Multiobjective Local Searches
    Indicator-Based Local Search
    Hypervolume-Based Optimisation

Conclusions and perspectives

Page 33

Single Objective Local Search

Evaluate solutions « around » an initial one, and select a solution which is better.
Efficient heuristic, easy to understand and to implement.
    Several neighborhoods
    Improvement strategy (first, best)
    Iterated version (random population, or other strategy)

[Figure: a landscape f(x) over the solution space (x1, x2)]

Page 34

MO Local searches issues

Difficulties resulting from the multiobjective aspect of the problems:
    Initialisation (random?)
    Solution evaluation (aggregation, Pareto, indicator)
    Neighborhood (related to all objectives?)
    Neighborhood exploration (partial, 1st improvement, best improvement)
    Selection strategy (all improvements, dominance…)
    Population size (unique solution, fixed or variable size)
    Archive of best known?
    Iteration (re-initialisation)
    Stopping criterion (progress threshold, entire set in local optima?)
    …

Page 35

MO Local searches example: PLS

Classical and intuitive dominance-based MO local search [Talbi et al. 2001] [Basseur et al. 2003] [Angel et al. 2004]

[Figure: PLS exploring the neighbourhood of an archive of non-dominated solutions in (f1, f2) space]

Different versions: stopping criterion, archive, selection strategy…
Problems:
    non-dominated solutions are incomparable
    variable population size (can be huge)

→ Indicator-Based MO Local Search!

Page 36

Outline

Motivations

Evolutionary Multiobjective Optimisation
    Quality indicators
    Indicator-Based Evolutionary Algorithm

Multiobjective Local Searches
    Indicator-Based Local Search
    Hypervolume-Based Optimisation

Conclusions and perspectives

Page 37

Indicator-Based MO Local Search

Initialisation of the population P of size N

Fitness assignment:
    For each x ∈ P, Fitness(x) = I(P\{x}, x)

Local search step: for all x ∈ P do
    x* ← one random neighbour of x
    Fitness(x*) = I(P, x*)
    For each z ∈ P, update its fitness: Fitness(z) += I(x*, z)
    Remove w, the solution with minimal fitness value in P ∪ {x*}
    Repeat until all neighbours are tested, or w ≠ x* (new solution found)

Stopping criterion: no new non-dominated solution found during an entire local search step; return the set of non-dominated solutions of P

Iterated IBMOLS: repeat the process with different initial populations
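One local-search step of the pseudocode above might look like this in outline (the summation form of Fitness is one of the variants from the earlier slide, and `neighbours` is a hypothetical callable yielding neighbours of a solution):

```python
def ibmols_step(pop, neighbours, indicator):
    """One IBMOLS local-search step (sketch): for each member, try a
    neighbour x*, insert it, drop the worst-fitness solution w, and stop
    as soon as a neighbour survives (w != x*).
    Fitness(x) = sum of I(z, x) over the rest of the population."""
    def fitness(x, P):
        return sum(indicator(z, x) for z in P if z is not x)

    for x in list(pop):
        for x_star in neighbours(x):
            candidate = pop + [x_star]
            w = min(candidate, key=lambda s: fitness(s, candidate))
            candidate.remove(w)
            pop = candidate
            if w is not x_star:          # the new solution survived
                return pop, True
    return pop, False
```

Iterating this step until no new non-dominated solution appears, then restarting from a fresh initial population, gives the iterated IBMOLS of the slide.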

Page 38

Parameters: indicators

Binary indicators issued from performance assessment studies:
    Iε [Zitzler & Kuenzli 04]
    IHD [Zitzler & Kuenzli 04]

Comparison with classical dominance-based ranking methods, adapted into indicators:
    IBen [Bentley & Wakefield 97]
    ISri [Srinivas & Deb 94]
    IFon [Fonseca & Fleming 93]

Page 39

Parameters: indicators

ISri [Srinivas & Deb 94] (dominance depth), adapted into an indicator:

    I_Sri(P, x) = min_{z ∈ P} I_Sri(z, x)
    I_Sri(x1, x2) = I_Sri(P, x1) + 1 if x1 dominates x2, 0 otherwise

[Figure: dominance-depth fronts Rk=1, Rk=2, Rk=3 in (f1, f2) space]

Page 40

Parameters: indicators

IFon [Fonseca & Fleming 93] (counter of dominance), adapted into an indicator:

    I_Fon(P, x) = Σ_{z ∈ P} I_Fon(z, x)
    I_Fon(x1, x2) = 1 if x1 dominates x2, 0 otherwise

[Figure: dominance counters Rk=0 … Rk=7 in (f1, f2) space]

Page 41

Parameters: indicators

IBen [Bentley & Wakefield 97] (sum of ranks), adapted into an indicator:

    I_Ben(P, x) = Σ_{z ∈ P} I_Ben(z, x)
    I_Ben(x1, x2) = Σ_{i=1}^{n} ⟨f_i(x1), f_i(x2)⟩
    with ⟨a, b⟩ = 1 if a < b, 1/2 if a = b, 0 otherwise

[Figure: rank sums RK=4 … RK=16 in (f1, f2) space]

Page 42

Parameters: population initialisation

Rand: generate a set P of n random permutations.

Cross: apply a classical crossover operator to pairs of solutions selected from the archive A (of size m) of non-dominated solutions.
    If 2n < m, then randomly select 2n solutions from A.
    If 2n ≥ m, then select all of A and complete with random solutions.
    Create n new solutions by applying crossover to the 2n selected solutions.

SA: random noise applied to archived solutions.
    If n < m, then randomly select n solutions from A.
    If n ≥ m, then select all of A and complete with random solutions.
    Create n new solutions by applying random noise (mutations) to the n selected solutions.

Page 43

Application: Ring Star problem

Applications in telecommunication network design and transit systems planning.
Problems from 70 to 300 locations.

Minimise: ring cost, assignment cost

Page 44

Application: Nurse Scheduling

QMC nurse scheduling problem: timetabling staff (allocating nurses to working shifts) over a period of time.

Hard constraints to satisfy
3 objective functions: minimise the violation of 3 soft constraints
    violations of "SingleNight, WeekendSplit, WeekendBalance"
    number of violations of "Coverage"
    penalty for "CoverageBalance"

Problem details:
    Ward of 20 to 30 nurses
    Planning period of 28 days, with 3 shift types: day, evening and night
    Full-time/part-time nurses (e.g. 8h, 15h, 23h, 30h, 40h…)
    Nurse hierarchy, according to qualifications and training
    Coverage demand is different for each shift
    Working regulations to be satisfied (e.g. annual leave)

Page 45

Application: Biobjective Flow-shop problem

N jobs to schedule on M machines
Critical resources
Permutation flow shop
Objectives to minimise:
    Cmax: maximum completion time
    T: sum of tardiness (or average tardiness)

Taillard's benchmarks [Taillard 93], extended to the biobjective case

[Figure: a Gantt chart of a permutation schedule on machines M1, M2, M3, showing Cmax and T]

Page 46

Parameters / Performance assessment

Binary quality indicators: Iε, IHD [Zitzler & Kuenzli 04]
Comparison with classical dominance-based ranking methods, adapted into indicators: IBen [Bentley & Wakefield 97], IFon [Fonseca & Fleming 93], ISri [Srinivas & Deb 94]

Population size: small fixed values (3, 5, 8, 10, 20, 30, 50)
Population initialisation: random, crossover on solutions in the archive, random noise on archived solutions (simulated annealing)

20 runs on each instance, short run times (20" to 20')

Performance assessment:
    Hypervolume indicator difference of the different sets of non-dominated solutions obtained
    Statistical analysis (Mann-Whitney test)

Page 47

Results: table analysis

For each algorithm:
    20 hypervolume indicator differences computed from the 20 runs
    Tables show the average value for each algorithm/instance pair

Statistical analysis:
    Rank the different runs of two different algorithms using the hypervolume difference
    Mann-Whitney test: compute the confidence level that the obtained classification is not due to chance

Results in bold: algorithm never outperformed by another algorithm with a confidence level greater than 95%

[Figure: two example interleaved rankings of the runs of algorithms A and B]

Page 48

Results: indicator sensitivity

Superiority of performance-assessment-based indicators over dominance-based indicators
Superiority of the epsilon indicator over the hypervolume indicator

Page 49

Results: initialisation strategy sensitivity

Superiority of simulated annealing (random mutations) initialisation
Optimal noise rate around 10%

Page 50

Results: population size sensitivity

Best performance obtained with small population sizes
Optimal population size increases with the size of the problem considered

Page 51

Experiments: parameter sensitivity, summary of the best values

Problem    | Run time | Pop. size | Indicator  | Initialisation
20*5 (1)   | 20"      | 3         | IFon, Iε   | SA, Cross
20*5 (2)   | 20"      | 3         | IFon, ISri | Cross
20*10 (1)  | 1'       | 5         | Iε, IHD    | SA
20*10 (2)  | 1'       | 8         | Iε, IHD    | SA
20*20      | 2'       | 8         | Iε, IFon   | SA
50*5       | 5'       | 8         | Iε, IFon   | SA
50*10      | 10'      | 10        | Iε, IHD    | SA
50*20      | 20'      | 30        | Iε, IHD    | SA, Cross

Page 52

IBMOLS: conclusions

IBMOLS: a generic indicator-based local search for MOPs
    Small number of parameters (population size, binary indicator, initialisation function)
    No diversity preservation mechanism required

Superiority of the Iε binary indicator on different problems
Parameter sensitivity analysis:
    Performance-assessment-based indicators
    Small population sizes
    Population initialisation: random noise on archived solutions

Very good overall results obtained (new best-knowns)… BUT the hypervolume indicator is known as the most intuitive performance indicator, and the only one fully sensitive to the Pareto dominance relation. Why are its results disappointing?

Page 53

Outline

Motivations

Evolutionary Multiobjective Optimisation
    Quality indicators
    Indicator-Based Evolutionary Algorithm

Multiobjective Local Searches
    Indicator-Based Local Search
    Hypervolume-Based Optimisation

Conclusions and perspectives

Page 54

Hypervolume UNARY indicator

IHD does not correspond to the definition of the hypervolume indicator!

[Figures: in (f1, f2) space w.r.t. Zref, the hypervolume we would like to compute vs. what a pairwise IHD actually computes]

We would like to compute each solution's hypervolume contribution, but we cannot compute the hypervolume contribution of a solution by comparing only pairs of solutions.

Page 59

Hypervolume-Based MO Local Search

Same algorithm as IBMOLS (see Page 37): initialisation of a population P of size N, fitness assignment Fitness(x) = I(P\{x}, x), local search step, stopping criterion, and iterated restarts with different initial populations; the indicator I is now based on the hypervolume.


Page 65

Fitness update: algorithm

Population P[1..N]: known fitness
New solution x to evaluate
Dominance state needed for each solution

1st case: x is dominated
    x's fitness equals the biggest dominance area between a solution of P and x
    Delete the dominated solution with the worst fitness value
    No further fitness update

[Figure: a dominated new solution x entering the population in (f1, f2) space]


Page 69

Fitness update: algorithm

2nd case: x is non-dominated
    Update fitness: new dominated solutions?
    Compute x's fitness (thanks to its non-dominated neighbours y1 and y0):

        x.fit = (y1[1] − x[1]) × (y0[2] − x[2])

[Figure: x inserted on the front between its neighbours y1 and y0, with next neighbours z1 and z0, w.r.t. Zref in (f1, f2) space]

Page 70

Fitness update: algorithm

2nd case: x is non-dominated
    Compute new fitness values for x's neighbours (thanks to x and to the neighbour which is perhaps newly dominated):

        y0.fit = y0.fit × (x[1] − y0[1]) / (z0[1] − y0[1])
        y1.fit = y1.fit × (x[2] − y1[2]) / (z1[2] − y1[2])

[Figure: x on the front with neighbours y1, y0 and next neighbours z1, z0, w.r.t. Zref in (f1, f2) space]


Page 74: Angers, 10 June 2010 Multi-Objective Optimisation (II) Matthieu Basseur

Angers, 10 June 2010 74

Fitness update: algorithm

Population P[1..N]: fitness known. New solution x to evaluate. The dominance state is needed for each solution.
2nd case: x is non-dominated.
Update fitness:
Any newly dominated solutions?
Compute the fitness of x (from its non-dominated neighbours).
Compute the new fitness of the neighbours of x (from x and the neighbour that may be newly dominated).
Delete the worst solution w:
If w is dominated: no fitness change.
If w is non-dominated: update the fitness of w's neighbours (from them and w).

f2

f1

])1[]1[(

])1[]1[(..

02

0100 yy

yyfityfity

])2[]2[(

])2[]2[(..

12

1011 yy

yyfityfity

y0

y1

y2
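The deletion step only touches the two neighbours of the removed solution, which is what keeps the update cheap. A minimal sketch assuming the front is stored sorted by increasing f1 and each solution's contribution is kept alongside it (names and layout are ours, not the slide's):

```python
def delete_and_update(pts, fit, i, zref):
    """Remove the non-dominated solution at index i from a front
    sorted by increasing f1, and refresh only its two former
    neighbours' contributions (O(1) instead of a full recompute)."""
    del pts[i]
    del fit[i]
    for j in (i - 1, i):  # the old left and right neighbours
        if 0 <= j < len(pts):
            right_f1 = pts[j + 1][0] if j + 1 < len(pts) else zref[0]
            left_f2 = pts[j - 1][1] if j > 0 else zref[1]
            fit[j] = (right_f1 - pts[j][0]) * (left_f2 - pts[j][1])
    return pts, fit
```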


Page 76: Angers, 10 June 2010 Multi-Objective Optimisation (II) Matthieu Basseur

Angers, 10 June 2010 76

General observations

Overall complexity in O(n)!
Special case: when x is non-dominated and at an extremity of the Pareto set, the Zref coordinates replace the missing neighbour's coordinates.
The reference point Zref needs to be fixed. Solution: Zref = {+∞, +∞}! This allows us to maintain the extremities of the Pareto set in the population.
The algorithm is defined for the bi-objective case only; it needs to be extended to the general case. Hypervolume calculation is NP-hard in the number of objective functions, so our algorithm is too. BUT: the multiobjective problems studied mostly deal with 2 objective functions, sometimes 3, and almost never more than 4.
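One way to realise the Zref = {+∞, +∞} choice is simply to use floating-point infinity as the reference coordinates; a small sketch (the helper name is ours, and it assumes a minimisation front with distinct points sorted by increasing f1):

```python
import math

def contributions_2d(front, zref):
    # Exclusive 2D hypervolume contribution of each solution of a
    # non-dominated front (minimisation), processed in increasing-f1 order.
    pts = sorted(front)
    fit = []
    for i, (f1, f2) in enumerate(pts):
        right_f1 = pts[i + 1][0] if i + 1 < len(pts) else zref[0]
        left_f2 = pts[i - 1][1] if i > 0 else zref[1]
        fit.append((right_f1 - f1) * (left_f2 - f2))
    return fit

# With Zref = (+inf, +inf) both extremities get infinite fitness, so the
# worst (smallest-contribution) solution is always an interior one.
fit = contributions_2d([(1, 4), (2, 2), (4, 1)], (math.inf, math.inf))
```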

Page 77: Angers, 10 June 2010 Multi-Objective Optimisation (II) Matthieu Basseur

Angers, 10 June 2010 77

Outlines

Motivations
Multiobjective optimisation
Quality indicators
Indicator-Based MultiObjective Search
Hypervolume-Based Optimisation
  Description
  Experiments
Conclusions and perspectives

Page 78: Angers, 10 June 2010 Multi-Objective Optimisation (II) Matthieu Basseur

Angers, 10 June 2010 78

Application: Biobjective Flow-shop problem

N jobs to schedule on M machines. Critical resources. Permutation flow shop. Objectives to minimise:
Cmax: maximum completion time
T: sum of tardiness (or average tardiness)
Taillard's benchmarks [Taillard 93], extended to the biobjective case

[Figure: Gantt chart of a permutation flow shop on machines M1, M2, M3, with Cmax and T indicated]
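Both objectives can be computed from a job permutation with the standard completion-time recursion. A minimal sketch (the function name and data layout are assumptions; the processing times and due dates in the usage note are made-up illustration data):

```python
def flowshop_objectives(perm, proc, due):
    """Cmax and total tardiness of a permutation flow shop.

    proc[j][m] = processing time of job j on machine m;
    due[j] = due date of job j; perm = processing order of the jobs."""
    n_machines = len(proc[0])
    completion = [0] * n_machines  # current completion time on each machine
    total_tardiness = 0
    for j in perm:
        for m in range(n_machines):
            # A job starts on machine m once the machine is free and the
            # job has finished on the previous machine.
            start = max(completion[m], completion[m - 1] if m > 0 else 0)
            completion[m] = start + proc[j][m]
        total_tardiness += max(0, completion[-1] - due[j])
    return completion[-1], total_tardiness
```

For instance, with proc = [[2, 1], [1, 2]] and due = [4, 5], the order [0, 1] yields (Cmax, T) = (5, 0).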

Page 79: Angers, 10 June 2010 Multi-Objective Optimisation (II) Matthieu Basseur

Angers, 10 June 2010 79

Parameters / Performance assessment

Binary quality indicators: Iε, IHD [Zitzler & Kuenzli 04] and INH, presented previously.
Population size: small fixed values (from 10 to 30). Population initialisation: 30% random noise on archived solutions. 20 runs on each instance; running times from 20 seconds to 60 minutes.
Performance assessment: hypervolume indicator difference of the different sets of non-dominated solutions obtained.
Statistical analysis (Mann-Whitney test).
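For reference, the statistic underlying the Mann-Whitney test can be sketched in a few lines (a bare U statistic only; the actual analysis would still need a p-value from the normal approximation or exact tables):

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic: the number of pairs (x, y), x from a
    and y from b, with x < y, counting ties as 1/2. Used here to
    compare two samples of hypervolume-difference values, one per run."""
    u = 0.0
    for x in a:
        for y in b:
            u += 1.0 if x < y else (0.5 if x == y else 0.0)
    return u
```

A U close to len(a) * len(b) indicates that sample a is stochastically smaller than b, i.e. better when minimising the hypervolume difference.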

Page 80: Angers, 10 June 2010 Multi-Objective Optimisation (II) Matthieu Basseur

Angers, 10 June 2010 80

Results on Flow-shop problem

Page 81: Angers, 10 June 2010 Multi-Objective Optimisation (II) Matthieu Basseur

Angers, 10 June 2010 81

Outlines

Motivations
Multiobjective optimisation
Quality indicators
Indicator-Based MultiObjective Search
Hypervolume-Based Optimisation
Conclusions and perspectives

Page 82: Angers, 10 June 2010 Multi-Objective Optimisation (II) Matthieu Basseur

Angers, 10 June 2010 82

Conclusions and perspectives: Conclusions

Indicator-based multiobjective optimisation: a growing research area
Very simple mechanism (no diversity preservation mechanism needed). Very efficient (outperforms classical generic methods). A new principle: still many research directions to exploit.

IBEA: Indicator-Based Evolutionary Algorithm. Efficient compared to other multiobjective evolutionary algorithms. Superiority of the Iε binary indicator on different problems. As an evolutionary algorithm: slow convergence.

IBMOLS: Indicator-Based MultiObjective Local Search. Combines the advantages of IBEA with the fast convergence of iterated local search algorithms. The hypervolume indicator needed to be improved.

HBMOLS: Hypervolume-Based MultiObjective Local Search. Selection based on hypervolume maximisation. Greatly outperforms the IBMOLS versions.

Page 83: Angers, 10 June 2010 Multi-Objective Optimisation (II) Matthieu Basseur

Angers, 10 June 2010 83

Conclusions and perspectives: Perspectives

Application to other multiobjective problems: real-world problems, academic problems, mathematical functions.

More than 2 objectives? Propose adaptations to optimise more than 2 objectives (in progress with Ron-Qiang Zeng). Study the limitations of the proposed algorithm in terms of complexity. Study the possible use of approximation algorithms.

Study different versions of hypervolume-based selection, mainly for the fitness computation of dominated solutions.

Application of the indicator-based strategy to other search methods: path relinking (in progress with Ron-Qiang Zeng).

Page 84: Angers, 10 June 2010 Multi-Objective Optimisation (II) Matthieu Basseur

Angers, 10 June 2010 84

Global conclusion / research perspectives

Lessons from past research (thanks to E. Zitzler):
EMO provides information about a problem (search space exploration).
EMO can help in single-objective scenarios (multiobjectivization).
But… MOO is part of the decision making process (preferences).
But… MOO for large n is different from n = 2 (high-dimensional objective spaces).

Research perspectives:
MOO as part of the decision making process: how can the decision maker and MOO collaborate?
Uncertainty and robustness. Expensive objective function evaluations. Hybridisation: metaheuristics and OR methods (examples). Multi-multiobjective problem definition. Many-objective optimisation.