A Crash Course in Robust Optimization
Arie M.C.A. [email protected]
INRIA – Project COATI – 12 February 2013
Lehrstuhl II für Mathematik
Outline
1 Motivation
2 Chance-Constrained Programming
3 Robust Optimization
4 Γ-Robust Optimization
5 Recoverable Robustness
6 Robust Network Design with Affine Recourse
7 Multi-Band Robustness
8 Conclusions
Arie Koster – RWTH Aachen University 2 / 48
How robust is a network design?
Demand uncertainties: traffic fluctuations in the US Abilene Internet2 network, in time intervals of 5 minutes during one week:
- Traffic fluctuates heavily between node pairs
- The load of the links fluctuates alike
- To avoid congestion, demand is overestimated by, e.g., ≥ 300%
- Can we do better?
Alternative Approaches
- Lower overestimation: the network is designed with capacities as small as possible; traffic fluctuations might result in high network congestion.
- Stochastic programming: the network design has to be computed for many scenarios; high computational effort.
- Multi-period network design: many traffic matrices have to be considered simultaneously; high computational effort.
Chance-Constrained Programming
Chance-constrained Optimization Problem (COP)
Find, among all solutions that satisfy all constraints with high probability, a solution with optimal objective value.

How to solve a COP?

Stochastic optimization:
- Modelling with random variables
- The resulting problems are quite challenging to solve
- Probability distributions have to be determined

Robust optimization:
- Uncertainty comes from a known set, the uncertainty set
- No information on the probability distribution is needed
- Seeks the solution with the best worst-case objective guarantee
Chance-Constrained Programming
Chance-Constrained Linear Programming
min cT x
s.t. Ax ≤ b
x ≥ 0
with Entries of A, b and/or c are not constant but random variables
With a joint chance constraint:

min  c^T x
s.t. P(A x ≤ b) ≥ 1 − ε
     x ≥ 0
With individual chance constraints:

min  c^T x
s.t. P(A_i x ≤ b_i) ≥ 1 − ε_i   ∀ i = 1, …, m
     x ≥ 0
Example
Chance-constrained knapsack: a knapsack with n items, profits c_i, uncertain weights a_i, and capacity b:

max  ∑_{i=1}^n c_i x_i
s.t. P( ∑_{i=1}^n a_i x_i ≤ b ) ≥ 1 − ε
     x ∈ {0, 1}^n

How to solve this problem?
Assumption: the weights are independently and normally distributed with expectation m_i and standard deviation σ_i.
Example (cont.)
Assumption: the weights are independently and normally distributed with expectation m_i and standard deviation σ_i.

P( ∑_{i=1}^n a_i x_i ≤ b )
  = P( ∑_{i=1}^n (a_i x_i − m_i x_i) / √(∑_{i=1}^n σ_i² x_i²) ≤ (b − ∑_{i=1}^n m_i x_i) / √(∑_{i=1}^n σ_i² x_i²) )
  = P( Z ≤ (b − ∑_{i=1}^n m_i x_i) / √(∑_{i=1}^n σ_i² x_i²) ) ≥ 1 − ε

with Z = ∑_{i=1}^n (a_i x_i − m_i x_i) / √(∑_{i=1}^n σ_i² x_i²), a standard normal random variable.

Let Φ(·) be the cumulative distribution function of the standard normal distribution. Then,

(b − ∑_{i=1}^n m_i x_i) / √(∑_{i=1}^n σ_i² x_i²) ≥ Φ⁻¹(1 − ε)
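The derivation above can be sanity-checked numerically. The following sketch (Python standard library only; the instance data m, σ, x, and ε are made-up illustration values) sets the capacity b exactly at the reformulated threshold and then estimates P(∑ a_i x_i ≤ b) by Monte Carlo simulation, which should come out close to 1 − ε.

```python
import math
import random
from statistics import NormalDist

# Made-up instance: expectations m_i, standard deviations sigma_i, fixed x, eps
m = [3.0, 4.0]
sigma = [1.0, 1.0]
x = [1, 1]
eps = 0.1

# Capacity chosen exactly at the threshold of the reformulated constraint:
# b = sum m_i x_i + Phi^{-1}(1 - eps) * sqrt(sum sigma_i^2 x_i^2)
q = NormalDist().inv_cdf(1 - eps)
b = sum(mi * xi for mi, xi in zip(m, x)) + q * math.sqrt(
    sum((si * xi) ** 2 for si, xi in zip(sigma, x)))

# Monte Carlo estimate of P(sum a_i x_i <= b) with a_i ~ N(m_i, sigma_i)
random.seed(0)
trials = 20000
hits = sum(
    sum(random.gauss(mi, si) * xi for mi, si, xi in zip(m, sigma, x)) <= b
    for _ in range(trials))
print(hits / trials)  # should be close to 1 - eps = 0.9
```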
Example (cont.)

(b − ∑_{i=1}^n m_i x_i) / √(∑_{i=1}^n σ_i² x_i²) ≥ Φ⁻¹(1 − ε)

If 1 − ε > 0.5, then Φ⁻¹(1 − ε) > 0 and the chance-constrained knapsack can be reformulated as

max  ∑_{i=1}^n c_i x_i
s.t. Φ⁻¹(1 − ε) √(∑_{i=1}^n σ_i² x_i²) + ∑_{i=1}^n m_i x_i ≤ b
     x ∈ {0, 1}^n

After relaxing the integrality of x, a second-order cone problem remains, which can be solved in polynomial time.
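For small n, the reformulated chance-constrained knapsack can be solved by plain enumeration. A minimal sketch (Python standard library; the profits, means, deviations, capacity, and ε are invented illustration data):

```python
import math
from itertools import product
from statistics import NormalDist

def cc_knapsack(c, m, sigma, b, eps):
    """Brute-force the chance-constrained knapsack via the deterministic
    reformulation  Phi^{-1}(1-eps)*sqrt(sum sigma_i^2 x_i^2) + sum m_i x_i <= b."""
    q = NormalDist().inv_cdf(1 - eps)  # Phi^{-1}(1 - eps)
    best_val, best_x = None, None
    for xs in product((0, 1), repeat=len(c)):
        lhs = q * math.sqrt(sum((s * x) ** 2 for s, x in zip(sigma, xs))) \
              + sum(mi * x for mi, x in zip(m, xs))
        if lhs <= b:
            val = sum(ci * x for ci, x in zip(c, xs))
            if best_val is None or val > best_val:
                best_val, best_x = val, xs
    return best_val, best_x

# Invented instance: 3 items, capacity 8, violation probability eps = 5%
print(cc_knapsack(c=[10, 7, 5], m=[4, 3, 2], sigma=[1, 0.5, 0.5], b=8, eps=0.05))
# -> (15, (1, 0, 1)): items 1 and 3 fit with 95% confidence
```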
Uncertain LPs
Observation
In the example, a normal distribution of the weights was assumed. What if the weights are distributed differently, or the distribution is unknown?

Uncertain Linear Program
An Uncertain Linear Optimization problem (ULO) is a collection of linear optimization problems (instances)

{ min{ c^T x : A x ≤ b } }_{(c, A, b) ∈ U}

where all input data stems from an uncertainty set U ⊂ R^n × R^{m×n} × R^m.
Robust Counterpart

ULO: { min{ c^T x : A x ≤ b } }_{(c, A, b) ∈ U}

Robust feasible solution
A vector x ∈ R^n is robust feasible for the ULO if

A x ≤ b   ∀ (c, A, b) ∈ U

Robust solution value
Given a vector x ∈ R^n, the robust solution value c(x) is defined as

c(x) := sup_{(c, A, b) ∈ U} c^T x

Robust Counterpart
The robust counterpart of a ULO is the optimization problem

min { c(x) : x is robust feasible }
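When the uncertainty set is a finite list of scenarios, robust feasibility and the robust solution value c(x) can be evaluated directly from the definitions above. A sketch with a made-up two-scenario set U:

```python
def robust_value(x, scenarios):
    """Return c(x) = max over (c, A, b) in U of c^T x if x is robust
    feasible (A x <= b in every scenario), else None."""
    for c, A, b in scenarios:
        for row, bi in zip(A, b):
            if sum(aij * xj for aij, xj in zip(row, x)) > bi:
                return None  # x violates this scenario: not robust feasible
    return max(sum(ci * xi for ci, xi in zip(c, x)) for c, _, _ in scenarios)

# Two invented scenarios (c, A, b)
U = [([1, 2], [[1, 1]], [3]),
     ([2, 1], [[1, 2]], [4])]
print(robust_value([1, 1], U))  # -> 3: feasible in both scenarios, worst value 3
print(robust_value([2, 2], U))  # -> None: violates the first scenario
```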
Example

Let { min{ c^T x : A x ≤ b, x ≥ 0 } }_{(c, A, b) ∈ U} be a ULO with

- uncertain right-hand side b ∈ [b̄, b̄ + b̂],
- uncertain matrix A with a_ij ∈ [ā_ij, ā_ij + â_ij],
- but a certain objective vector c.

The robust counterpart can be written as

min{ c^T x : (Ā + Â) x ≤ b̄, x ≥ 0 }
Robust Counterpart
Observation
If the objective is certain, the robust counterpart can be constructed row-wise, i.e.,

- keep the objective,
- replace every constraint a_i^T x ≤ b_i by its robust counterpart

  a_i^T x ≤ b_i   ∀ (a_i, b_i) ∈ U_i

where

U_i := { (a_i, b_i) ∈ R^{n+1} : ∃ (A, b) ∈ U with A_i. = a_i, b_i = b_i }

Note: the robust counterpart does not change if U_1 × U_2 × … × U_m is used instead of U.
Robust Counterpart

Corollary
If only the right-hand side b is uncertain, the robust counterpart reads

A x ≤ b̲

with b̲_i = min{ b_i : (A, b, c) ∈ U }.

Max-flow with uncertain capacities: take the minimum capacity on every arc, and solve the max-flow problem.

Min-cut with uncertain capacities: the objective vector c is uncertain! Requires solving a new problem.

Corollary: Robust Max-Flow ≠ Robust Min-Cut
Robust Counterparts – Limitations

Corollary: Robust Max-Flow ≠ Robust Min-Cut

Challenge: find another way to handle right-hand-side uncertainty.

Minoux [16] considers

max_{b ∈ U} min{ c^T x(b) : A x(b) ≤ b, x ≥ 0 }

instead of

min{ c^T x : A x ≤ b ∀ b ∈ U, x ≥ 0 }

- The problem is NP-hard for commonly used uncertainty sets [17, 18].
- Intermediate solutions are required!
Uncertainty Sets
How to define the uncertainty sets?

- The uncertainty set is an ellipsoid [4], e.g.,

  U_i = { (a, b) ∈ R^{n+1} : ‖(a, b) − (ā, b̄)‖ < κ }

- The uncertainty set is a polyhedron, e.g.,

  U_i = { (a, b) ∈ R^{n+1} : D · (a, b) ≤ d }

  with D ∈ R^{k×(n+1)}, d ∈ R^k for some k ∈ N [1].
  - equivalent: a set of discrete scenarios (the extreme points of the polyhedron)
  - special case: Γ-robustness
Robust Optimization Approach

Simplifying assumption: b and c are certain.

Uncertainty set by Bertsimas & Sim [5, 6]: let ā_ij ∈ R, â_ij ≥ 0 be given, and Γ ∈ R_+ a parameter.

U_i(Γ) = { a_i ∈ R^n : a_ij = ā_ij + â_ij z_ij ∀ j = 1, …, n, z_i ∈ Z_i(Γ) }

with

Z_i(Γ) = { z_i ∈ R^n : |z_ij| ≤ 1 ∀ j = 1, …, n, ∑_{j=1}^n |z_ij| ≤ Γ }

Stated otherwise:

- nominal values ā_ij and deviations â_ij
- a_ij ∈ [ā_ij − â_ij, ā_ij + â_ij]
- the sum of relative deviations from the nominal values is bounded by Γ
- at most Γ entries might deviate from their nominal value
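Membership in U_i(Γ) can be checked directly from the definition: a realization corresponds to a scaling vector z with |z_j| ≤ 1 for every j and ∑ |z_j| ≤ Γ. A sketch using the deck's example data ā = (2, 5, 5), â = (1, 2, 1):

```python
def in_Z(z, Gamma):
    """z lies in Z(Gamma): every |z_j| <= 1 and sum |z_j| <= Gamma."""
    return all(abs(zj) <= 1 for zj in z) and sum(abs(zj) for zj in z) <= Gamma

def realize(a_bar, a_hat, z):
    """Coefficient vector a = a_bar + a_hat * z for a scaling vector z."""
    return [ab + ah * zj for ab, ah, zj in zip(a_bar, a_hat, z)]

a_bar, a_hat = [2, 5, 5], [1, 2, 1]
print(in_Z([1, -1, 0], 2))                # -> True: two entries deviate fully
print(in_Z([1, 1, 1], 2))                 # -> False: sum of |z_j| is 3 > Gamma
print(realize(a_bar, a_hat, [1, -1, 0]))  # -> [3, 3, 5]
```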
Example

ā = (2, 5, 5),  â = (1, 2, 1),  Γ = 2

[Figure: the resulting uncertainty set in (a_1, a_2, a_3)-space. Provided by Manuel Kutschka]
Γ-Robust Counterpart

Robust counterpart:

min  ∑_{i=1}^n c_i x_i
s.t. ∑_{j=1}^n ā_ij x_j + max_{z_i ∈ Z_i(Γ)} ∑_{j=1}^n â_ij z_ij x_j ≤ b_i   i = 1, …, m
     x ≥ 0

Observation
Since Z_i defines a (bounded) polyhedron, only the extreme points have to be treated. For Γ ∈ Z_+:

∑_{j=1}^n ā_ij x_j + max_{S ⊆ {1, …, n} : |S| ≤ Γ} ∑_{j ∈ S} â_ij x_j ≤ b_i   (1)
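For Γ ∈ Z_+ and x ≥ 0, the inner maximum in (1) is attained by simply picking the Γ largest deviation terms â_ij x_j, so the robust left-hand side can be evaluated by sorting. A sketch (the row data and x are made-up illustration values):

```python
def gamma_robust_lhs(a_bar, a_hat, x, Gamma):
    """Left-hand side of (1): nominal part plus the sum of the Gamma
    largest deviation terms a_hat_j * x_j (x >= 0 assumed)."""
    nominal = sum(aj * xj for aj, xj in zip(a_bar, x))
    devs = sorted((aj * xj for aj, xj in zip(a_hat, x)), reverse=True)
    return nominal + sum(devs[:Gamma])

# Invented row: nominal coefficients, deviations, a solution x, Gamma = 2
print(gamma_robust_lhs([2, 5, 5], [1, 2, 1], [1, 1, 1], 2))
# -> 15: nominal 12 plus the two largest deviations 2 + 1
```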
Γ-Robustness – Theory

Theorem 1 (Bertsimas & Sim [6])
Let x* be an optimal solution of the Γ-robust counterpart. If the a_ij, j = 1, …, n, are independent and symmetrically distributed random variables in [ā_ij − â_ij, ā_ij + â_ij], then

P( ∑_{i=1}^n a_i x*_i > b ) ≤ B(n, Γ)

with

lim_{n → ∞} B(n, Γ) = 1 − Φ( Γ / √n )

where Φ(·) is the CDF of the standard normal distribution.

Instead of the limit: B(n, Γ) ≈ 1 − Φ( (Γ − 1) / √n ).
Choice of Γ

Choice of Γ as a function of n so that the probability of constraint violation is less than p%:

   n | p = 5 | p = 2 | p = 1 | p = 0.5 | p = 0.1
-----+-------+-------+-------+---------+--------
   5 |   4.7 |   5.0 |   5.0 |     5.0 |     5.0
  10 |   6.2 |   7.5 |   8.4 |     9.1 |    10.0
  20 |   8.4 |  10.2 |  11.4 |    12.5 |    14.8
  50 |  12.6 |  15.5 |  17.4 |    19.2 |    22.9
 100 |  17.4 |  21.5 |  24.3 |    26.8 |    31.9
 200 |  24.3 |  30.0 |  33.9 |    37.4 |    44.7
1000 |  53.0 |  65.9 |  74.6 |    82.5 |    98.7
2000 |  74.6 |  92.8 | 105.0 |   116.2 |   139.2

Note: the result is independent of the actual distribution of the random variables a_ij; only symmetry and independence are required.
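The table appears consistent with the approximation B(n, Γ) ≈ 1 − Φ((Γ − 1)/√n): setting the right-hand side equal to p and solving for Γ gives Γ = 1 + √n · Φ⁻¹(1 − p). A sketch (an assumption about how the table was generated, not stated on the slide) that reproduces two of its entries:

```python
import math
from statistics import NormalDist

def gamma_for(n, p):
    """Gamma with approximate violation probability p, assuming
    B(n, Gamma) ~ 1 - Phi((Gamma - 1) / sqrt(n))."""
    return 1 + math.sqrt(n) * NormalDist().inv_cdf(1 - p)

print(round(gamma_for(100, 0.05), 1))  # -> 17.4 (table row n = 100, p = 5)
print(round(gamma_for(200, 0.01), 1))  # -> 33.9 (table row n = 200, p = 1)
```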
Solving the robust counterpart

∑_{j=1}^n ā_ij x_j + max_{S ⊆ {1, …, n} : |S| ≤ Γ} ∑_{j ∈ S} â_ij x_j ≤ b_i

Observations:

- Inequality (1) can be linearized by

  ∑_{j ∉ S} ā_ij x_j + ∑_{j ∈ S} (ā_ij + â_ij) x_j ≤ b_i   ∀ S ⊆ {1, …, n}, |S| ≤ Γ   (2)

- The number of these inequalities is exponential if Γ = O(n).
- Separation can be done in polynomial time.
- Alternatively, a compact formulation can be obtained via dualization.
Separation of robust inequalities

Given x*, find a subset S ⊆ {1, …, n} with |S| ≤ Γ such that

∑_{j ∈ S} â_ij x*_j > b_i − ∑_{j=1}^n ā_ij x*_j

Separation problem:

Z_SEP = max  ∑_{j=1}^n â_ij x*_j z_j
        s.t. ∑_{j=1}^n z_j ≤ Γ
             0 ≤ z_j ≤ 1

If Z_SEP > b_i − ∑_{j=1}^n ā_ij x*_j, add the robust inequality (2) for S = {j : z_j = 1}.
Optimization = separation implies polynomial solvability of the LP.
Compact formulation

Let β_i(x, Γ) = max_{S ⊆ {1, …, n} : |S| ≤ Γ} ∑_{j ∈ S} â_ij x_j, so that (1) reads

∑_{j=1}^n ā_ij x_j + β_i(x, Γ) ≤ b_i

Given x*, β_i(x*, Γ) is the optimization problem

β_i(x*, Γ) = max  ∑_{j=1}^n â_ij x*_j z_j      =  min  Γ π_i + ∑_{j=1}^n ρ_ij
             s.t. ∑_{j=1}^n z_j ≤ Γ               s.t. π_i + ρ_ij ≥ â_ij x*_j  ∀ j = 1, …, n
                  0 ≤ z_j ≤ 1                          π_i, ρ_ij ≥ 0

where the equality follows from LP duality.
Arie Koster – RWTH Aachen University 27 / 48
Compact formulation
Let βi (x , Γ) = maxS⊆{1,...,n}:|S|≤Γ
∑j∈S
aijxj
, and hence (1) reads
n∑j=1
aijxj + βi (x , Γ) ≤ bi
Given x?, βi (x?, Γ) is the optimization problem
βi (x?, Γ) = max
n∑j=1
aijx?j zj
= minΓπi +n∑
j=1
ρij
s.t.n∑
j=1
zj ≤ Γ
s.t.πi + ρij ≥ aijx?j ∀j = 1, . . . , n
0 ≤ zj ≤ 1
πi , ρij ≥ 0
Arie Koster – RWTH Aachen University 27 / 48
Compact formulation
Let βi (x , Γ) = maxS⊆{1,...,n}:|S|≤Γ
∑j∈S
aijxj
, and hence (1) reads
n∑j=1
aijxj + βi (x , Γ) ≤ bi
Given x?, βi (x?, Γ) is the optimization problem
βi (x?, Γ) = max
n∑j=1
aijx?j zj = minΓπi +
n∑j=1
ρij
s.t.n∑
j=1
zj ≤ Γ
s.t.πi + ρij ≥ aijx?j ∀j = 1, . . . , n
0 ≤ zj ≤ 1
πi , ρij ≥ 0
Arie Koster – RWTH Aachen University 27 / 48
Compact formulation
Let βi (x , Γ) = maxS⊆{1,...,n}:|S|≤Γ
∑j∈S
aijxj
, and hence (1) reads
n∑j=1
aijxj + βi (x , Γ) ≤ bi
Given x?, βi (x?, Γ) is the optimization problem
βi (x?, Γ) = max
n∑j=1
aijx?j zj = minΓπi +
n∑j=1
ρij
s.t.n∑
j=1
zj ≤ Γ s.t.πi + ρij ≥ aijx?j ∀j = 1, . . . , n
0 ≤ zj ≤ 1 πi , ρij ≥ 0
Arie Koster – RWTH Aachen University 27 / 48
Compact formulation

Let β_i(x, Γ) = max_{S ⊆ {1,…,n} : |S| ≤ Γ} ∑_{j∈S} â_{ij} x_j, and hence (1) reads

    ∑_{j=1}^{n} a_{ij} x_j + β_i(x, Γ) ≤ b_i

Given x*, β_i(x*, Γ) is the optimal value of the linear program below (left), shown with its LP dual (right):

    β_i(x*, Γ) = max  ∑_{j=1}^{n} â_{ij} x*_j z_j     =  min  Γ π_i + ∑_{j=1}^{n} ρ_{ij}
                 s.t. ∑_{j=1}^{n} z_j ≤ Γ                s.t. π_i + ρ_{ij} ≥ â_{ij} x*_j  ∀ j = 1, …, n
                      0 ≤ z_j ≤ 1                             π_i, ρ_{ij} ≥ 0

Arie Koster – RWTH Aachen University 27 / 48
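The primal/dual pair above can be checked numerically: the primal optimum sums the Γ largest terms, while in the dual, for fixed π_i, the best choice is ρ_{ij} = max(0, â_{ij} x*_j − π_i), and an optimal π_i lies at one of the term values (or 0). A small sketch with made-up values:

```python
def beta_primal(dev_terms, gamma):
    # beta_i(x*, Gamma): sum of the Gamma largest terms \u00e2_ij x*_j
    return sum(sorted(dev_terms, reverse=True)[:gamma])

def beta_dual(dev_terms, gamma):
    # Dual: min over pi >= 0 of  Gamma*pi + sum_j max(0, term_j - pi).
    # The objective is piecewise linear in pi, so an optimal pi is
    # attained at one of the term values (or at 0); enumerating these
    # candidates is enough.
    candidates = set(dev_terms) | {0.0}
    return min(gamma * pi + sum(max(0.0, t - pi) for t in dev_terms)
               for pi in candidates)
```

For terms (5, 3, 2, 0.5) and Γ = 2, both routes give 8, illustrating why the inner maximization can be replaced by its dual.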
Compact Reformulation

Thus, (1) now reads

    ∑_{j=1}^{n} a_{ij} x_j + min { Γ π_i + ∑_{j=1}^{n} ρ_{ij} : π_i + ρ_{ij} ≥ â_{ij} x_j ∀j, π_i ≥ 0, ρ_{ij} ≥ 0 } ≤ b_i

or equivalently

    ∑_{j=1}^{n} a_{ij} x_j + Γ π_i + ∑_{j=1}^{n} ρ_{ij} ≤ b_i
    π_i + ρ_{ij} ≥ â_{ij} x_j   ∀ j = 1, …, n
    π_i ≥ 0, ρ_{ij} ≥ 0

Arie Koster – RWTH Aachen University 28 / 48
Further results

Theorem 2 (Bertsimas & Sim [6])
If only the objective is uncertain and x ∈ {0, 1}^n, then the robust counterpart can be solved by solving n + 1 nominal problems of the same type.

Corollary 3
The knapsack problem with uncertain objective can be solved in O(n²B).

Theorem 4 (Pferschy et al., 2012)
The knapsack problem with uncertain weights can be solved in O(nΓB).

Arie Koster – RWTH Aachen University 29 / 48
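Theorem 2 can be illustrated on the knapsack problem with uncertain profits: the worst case subtracts the Γ largest profit deviations among the chosen items, and dualizing that inner maximization leaves one nominal knapsack per candidate threshold π ∈ {p̂_1, …, p̂_n, 0}. A sketch under these assumptions (the instance data below is made up):

```python
def knapsack(profits, weights, cap):
    """Nominal 0/1 knapsack by DP over capacities (integer weights)."""
    dp = [0.0] * (cap + 1)
    for p, w in zip(profits, weights):
        for c in range(cap, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + p)
    return dp[cap]

def robust_knapsack_objective(p, p_hat, w, cap, gamma):
    """Gamma-robust knapsack with uncertain profits via Theorem 2:
    one nominal knapsack per threshold pi in {p_hat_j} U {0}.
    For fixed pi the worst case costs Gamma*pi plus an item-wise
    penalty max(0, p_hat_j - pi), absorbed into the nominal profits."""
    best = float('-inf')
    for pi in set(p_hat) | {0}:
        adj = [p[j] - max(0, p_hat[j] - pi) for j in range(len(p))]
        best = max(best, -gamma * pi + knapsack(adj, w, cap))
    return best
```

With p = (10, 8), p̂ = (6, 1), weights (1, 1), capacity 2 and Γ = 1, both items fit and the adversary removes the larger deviation 6, giving a robust profit of 12.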
Robust Optimization Revisited

Advantages of robust optimization:
- Only a marginal complexity increase (compared to the deterministic case)
- Trade-off between level of robustness and cost of the solution via the parameter Γ
- Optimizes the result in the worst case (in advance)
- No information on the probability distribution needed

Disadvantages of robust optimization:
- Right-hand-side uncertainty is not handled satisfactorily
- A single solution without any flexibility! The almost-always-optimal solution might be infeasible
  ⇒ Two-stage robustness concepts
- Very imprecise description of the uncertainty (only two values)
  ⇒ More detailed description of the uncertainty

Arie Koster – RWTH Aachen University 30 / 48
Outline

1 Motivation
2 Chance-Constrained Programming
3 Robust Optimization
4 Γ-Robust Optimization
5 Recoverable Robustness
6 Robust Network Design with Affine Recourse
7 Multi-Band Robustness
8 Conclusions

Arie Koster – RWTH Aachen University 31 / 48
Recoverable Robustness

Recoverable robustness [14, 7] models uncertainty as a two-stage process:
- 1st stage: a-priori decision
- 2nd stage: recovery, i.e. a limited change of the first-stage decision after the realization of the uncertainty is known
- optimize the worst case w.r.t. recovery

Examples:
- Recoverable Robust Knapsack Problem (RRKP) with
  - Discrete Scenarios [9]
  - Γ Scenarios [8]
- Recoverable Robust Network Topology Design (discrete scenarios) [2]

Arie Koster – RWTH Aachen University 32 / 48
(k, ℓ)-RRKP with Discrete Scenarios

Given: items N = {1, …, n},
- first stage: profits p^0, weights w^0, capacity c^0,
- scenarios S ∈ S_D with profits p^S, weights w^S, capacities c^S,
- recovery set X(X): delete ≤ k items, add ≤ ℓ items.

Find a subset X ⊆ N such that
- w^0(X) ≤ c^0,
- for all S ∈ S_D there exists X^S ∈ X(X) with w^S(X^S) ≤ c^S,
- the total profit

      p_T(X) = p^0(X) + min_{S ∈ S_D} max_{X^S ∈ X(X)} p^S(X^S)

  is maximized.

Arie Koster – RWTH Aachen University 33 / 48
k-RRKP with Γ Scenarios

Given: items N = {1, …, n},
- first stage: profits p^0, weights w^0, capacity c^0,
- Γ-scenarios: weights in [w̄_i, w̄_i + ŵ_i], capacity c, Γ ∈ N,
- recovery set X(X): delete ≤ k items from X ⊆ N.

Find a subset X ⊆ N such that
- w^0(X) ≤ c^0,
- for all S ∈ S_Γ there exists X^S ∈ X(X) with w^S(X^S) ≤ c,
- the total profit p^0(X) is maximized.

Arie Koster – RWTH Aachen University 34 / 48
RRKP with Γ scenarios - MP

Mathematical programming formulation:

    max  ∑_{i∈N} p^0_i x_i
    s.t. ∑_{i∈N} w^0_i x_i ≤ c^0
         ∑_{i∈N} w̄_i x_i + max_{X⊆N, |X|≤Γ} [ ∑_{i∈X} ŵ_i x_i − max_{Y⊆N, |Y|≤k} ( ∑_{i∈Y} w̄_i x_i + ∑_{i∈X∩Y} ŵ_i x_i ) ] ≤ c
         x_i ∈ {0, 1}

Question: Compact linear reformulation?
Answer: LP duality and enumeration of solution values!

Arie Koster – RWTH Aachen University 35 / 48
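For a fixed first-stage solution x, the inner max–min in the capacity constraint above can be evaluated by full enumeration on toy instances, which is handy for sanity-checking any compact reformulation. A brute-force sketch (exponential; names are illustrative):

```python
from itertools import combinations

def worst_case_recoverable_load(w_bar, w_hat, x, gamma, k):
    """Value of the inner expression of the k-RRKP capacity constraint:
    sum w_bar_i x_i + max_{|X|<=Gamma} [ sum_{i in X} w_hat_i x_i
      - max_{|Y|<=k} ( sum_{i in Y} w_bar_i x_i
                       + sum_{i in X cap Y} w_hat_i x_i ) ]
    by enumerating adversary sets X and recovery sets Y.
    """
    sel = [i for i, xi in enumerate(x) if xi]          # chosen items
    base = sum(w_bar[i] for i in sel)
    worst = float('-inf')
    for gx in range(min(gamma, len(sel)) + 1):
        for X in combinations(sel, gx):                # deviated items
            load = sum(w_hat[i] for i in X)
            saved = 0.0
            for ky in range(min(k, len(sel)) + 1):
                for Y in combinations(sel, ky):        # removed items
                    saved = max(saved,
                                sum(w_bar[i] for i in Y)
                                + sum(w_hat[i] for i in Y if i in X))
            worst = max(worst, load - saved)
    return base + worst
```

With w̄ = (4, 3), ŵ = (2, 5), x = (1, 1), Γ = 1: without recovery (k = 0) the worst load is 7 + 5 = 12; with k = 1 the best recovery removes the deviated item, leaving 4.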
Outline

1 Motivation
2 Chance-Constrained Programming
3 Robust Optimization
4 Γ-Robust Optimization
5 Recoverable Robustness
6 Robust Network Design with Affine Recourse
7 Multi-Band Robustness
8 Conclusions

Arie Koster – RWTH Aachen University 36 / 48
Static vs. Dynamic Routing

Static routing:
- Capacities have to be installed in integer amounts
- A routing template fixes the percentage distribution of the traffic volume along the paths

Dynamic routing:
- Capacities have to be installed in integer amounts
- The routing can be adapted to the actual traffic volumes (realization from the uncertainty set)

Arie Koster – RWTH Aachen University 37 / 48
Robust Network Design with Dynamic Routing

y^k_ij(d) = fraction of demand k ∈ K routed along arc (i, j) ∈ A for realization d ∈ D.
x_e = number of link capacity modules to be installed on link e ∈ E.

Integer linear programming formulation:

    min  ∑_{e∈E} κ_e x_e
    s.t. ∑_{j∈N(i)} (y^k_ij(d) − y^k_ji(d)) =   d_k(d)  if i = s(k)
                                               −d_k(d)  if i = t(k)
                                                0       else          ∀ d ∈ D, i ∈ V, k ∈ K
         ∑_{k∈K} y^k_e(d) ≤ C x_e                                     ∀ d ∈ D, e ∈ E
         y(d) ≥ 0, x ∈ Z^{|E|}_+

Theorem (Mattia [15])
The vector x ∈ P_x if and only if for all length functions ℓ : E → R_+,

    ∑_{e∈E} ℓ(e) x_e ≥ max_{d∈D} { ∑_{k∈K} d_k(d) ℓ(s_k, t_k) }

where ℓ(s_k, t_k) denotes the length of a shortest (s_k, t_k)-path w.r.t. ℓ.

Arie Koster – RWTH Aachen University 38 / 48
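For a fixed length function ℓ, the condition of Mattia's theorem can be checked by computing the shortest-path distances ℓ(s_k, t_k) and maximizing the right-hand side over the demands. A sketch assuming D is given as a finite list of scenarios and module capacity is folded into x (all names are illustrative):

```python
def violates_metric_inequality(n, edges, x, lengths, scenarios):
    """Check the metric inequality for one length function l.

    n         : number of nodes 0..n-1
    edges     : undirected edges (u, v), same order as x and lengths
    x         : installed capacity per edge
    lengths   : l(e) per edge
    scenarios : finite list of demand scenarios, each a list of
                (s_k, t_k, value) triples (a finite stand-in for D)

    Returns True if the inequality proves x infeasible, i.e.
    sum_e l(e) x_e < max over scenarios of sum_k d_k * dist(s_k, t_k).
    """
    INF = float('inf')
    dist = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for (u, v), le in zip(edges, lengths):
        dist[u][v] = min(dist[u][v], le)
        dist[v][u] = min(dist[v][u], le)
    for m in range(n):                     # Floyd-Warshall
        for i in range(n):
            for j in range(n):
                dist[i][j] = min(dist[i][j], dist[i][m] + dist[m][j])
    capacity_side = sum(le * xe for le, xe in zip(lengths, x))
    demand_side = max(sum(val * dist[s][t] for s, t, val in sc)
                      for sc in scenarios)
    return demand_side > capacity_side
```

On a unit-length triangle with one module per edge, a demand of 2 between two nodes is fine (2 ≤ 3), while a demand of 4 yields a violated metric inequality.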
Affine Routing Function

Robust network design with affine routing:
- Capacities have to be installed in integer amounts
- The routing follows an affine function of all traffic values:

      y^k_ij(d) := h^{k,0}_ij + ∑_{k'∈K} h^{k,k'}_ij d_{k'}

  where h^{k,0}_ij, h^{k,k'}_ij ∈ R for all ij ∈ A and k, k' ∈ K.

Theorem (Poss & Raack [19])
Let D be an arbitrary demand uncertainty set. Then

    OPT_dyn(D) ≤ OPT_aff(D) ≤ OPT_stat(D)

Arie Koster – RWTH Aachen University 39 / 48
Outline

1 Motivation
2 Chance-Constrained Programming
3 Robust Optimization
4 Γ-Robust Optimization
5 Recoverable Robustness
6 Robust Network Design with Affine Recourse
7 Multi-Band Robustness
8 Conclusions

Arie Koster – RWTH Aachen University 40 / 48
Multi-band robustness

Idea: refinement of the Γ-robustness approach.

[Figure: probability distribution of a demand value — a single deviation band in Γ-robustness vs. several bands d̂_1, d̂_2, d̂_3, … in multi-band robustness]

Γ-robustness:
- nominal value d̄_k ≥ 0
- deviation d̂_k ≥ 0
- single band [d̄_k, d̄_k + d̂_k]
- budget Γ ∈ N

Multi-band robustness:
- nominal value d̄_k ≥ 0
- breakpoints 0 = d̂_k^0 ≤ d̂_k^1 ≤ … ≤ d̂_k^{|B|} = d̂_k
- bands [d̄_k + d̂_k^{b−1}, d̄_k + d̂_k^b] for all b ∈ B
- band limits u_0, u_1, …, u_{|B|} ∈ N

Γ-robustness ≡ multi-band robustness with B = {1}, u_0 = |K|, u_1 = Γ

Work by Büsing and D'Andreagiovanni [10]; based on Bienstock.

Arie Koster – RWTH Aachen University 41 / 48
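The worst case under multi-band uncertainty can be evaluated on tiny instances by enumerating which band each demand falls into, respecting the limits u_b. A simplified sketch (the exact band semantics follow [10]; here the adversary takes the upper endpoint of each assigned band, and all data is made up):

```python
from itertools import product

def worst_case_multiband_deviation(d_hat, u):
    """Worst-case total deviation under multi-band uncertainty,
    by enumerating band assignments (tiny instances only).

    d_hat : d_hat[k][b] = upper deviation of demand k in band b,
            with d_hat[k][0] = 0 (the nominal band)
    u     : u[b] = max number of demands allowed to fall in band b

    Each demand k is placed in exactly one band b(k); at most u[b]
    demands may lie in band b; the adversary maximizes the total
    deviation sum_k d_hat[k][b(k)].
    """
    n_bands = len(u)
    best = float('-inf')
    for assign in product(range(n_bands), repeat=len(d_hat)):
        if all(assign.count(b) <= u[b] for b in range(n_bands)):
            best = max(best,
                       sum(d_hat[k][b] for k, b in enumerate(assign)))
    return best
```

The Γ-robust special case from the slide corresponds to two bands with u = (|K|, Γ): with deviations (3, 5, 2) and Γ = 1 the worst case is the single largest deviation, 5.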
Multi-band RND

Compact ILP formulation:

    min  ∑_{e∈E} c_e x_e
    s.t. ∑_{j∈V: ij∈E} f^k_ij − ∑_{j∈V: ji∈E} f^k_ji =   1  if i = s_k
                                                        −1  if i = t_k
                                                         0  otherwise     ∀ i, k
         ∑_{k∈K} d̄_k f^k_e + ∑_{b∈B} u_b w_{e,b} + ∑_{k∈K} z^k_e ≤ C x_e  ∀ e
         w_{e,b} + z^k_e ≥ d̂^k_b f^k_e                                    ∀ e, b, k
         x_e ∈ Z_+, f^k_ij ∈ [0, 1], w_{e,b} ≥ 0, z^k_e ≥ 0               ∀ e, ij, b, k

Arie Koster – RWTH Aachen University 42 / 48
Outline

1 Motivation
2 Chance-Constrained Programming
3 Robust Optimization
4 Γ-Robust Optimization
5 Recoverable Robustness
6 Robust Network Design with Affine Recourse
7 Multi-Band Robustness
8 Conclusions

Arie Koster – RWTH Aachen University 43 / 48
Concluding Remarks

- Robust optimization is an emerging field in mathematical optimization, but it requires additional effort to solve robust problems
- Correct modelling of the uncertainties in network applications is required
- Integration of other (real) networking aspects (survivability, multi-layer)
  ⇒ Real applications [3, 12, 11]
  ⇒ Robustness & 1+1 protection [13]
- Recoverable robustness allows a two-stage approach
  ⇒ What recovery action is possible?
  ⇒ Algorithmic implications?
- The quality of a robust approach has to be evaluated
  ⇒ Which value of Γ is enough to obtain robust designs?

Arie Koster – RWTH Aachen University 44 / 48
A Crash Course in Robust Optimization
Arie M.C.A. [email protected]
INRIA – Project COATI – 12 February 2013
Lehrstuhl II für Mathematik
References I
[1] A. Altin, H. Yaman, and M. C. Pinar. The Network Loading Problem under Hose Demand Uncertainty: Formulation, Polyhedral Analysis, and Computations. INFORMS Journal on Computing, 23:75–89, 2011.

[2] E. Alvarez-Miranda, I. Ljubic, S. Raghavan, and P. Toth. The recoverable robust two-level network design problem. Technical report, Vienna University, 2012. http://homepage.univie.ac.at/ivana.ljubic/publications.html.

[3] P. Belotti, K. Kompella, and L. Noronha. A comparison of OTN and MPLS networks under traffic uncertainty. Working paper, 2011. http://myweb.clemson.edu/~pbelott/papers/robust-opt-network-design.pdf.

[4] A. Ben-Tal, L. El Ghaoui, and A. Nemirovski. Robust optimization. Princeton University Press, 2009.

[5] D. Bertsimas and M. Sim. Robust discrete optimization and network flows. Mathematical Programming, Ser. B, 98:49–71, 2003.

[6] D. Bertsimas and M. Sim. The Price of Robustness. Operations Research, 52(1):35–53, 2004.

[7] C. Büsing. Recoverable Robustness in Combinatorial Optimization. PhD thesis, Technische Universität Berlin, 2011.
Arie Koster – RWTH Aachen University 46 / 48
References II
[8] C. Büsing, A. M. C. A. Koster, and M. Kutschka. Recoverable robust knapsacks: Γ-scenarios. In Proceedings of INOC 2011, International Network Optimization Conference, volume 6701 of Lecture Notes in Computer Science, pages 583–588, 2011.

[9] C. Büsing, A. M. C. A. Koster, and M. Kutschka. Recoverable robust knapsacks: the discrete scenario case. Optimization Letters, 5(3):379–392, 2011.

[10] C. Büsing and F. D'Andreagiovanni. New results about multi-band uncertainty in robust optimization. In Proceedings of SEA 2012, 11th Symposium on Experimental Algorithms, volume 7276 of Lecture Notes in Computer Science, pages 63–74, 2012.

[11] G. Claßen, A. M. C. A. Koster, and A. Schmeink. A robust optimisation model and cutting planes for the planning of energy-efficient wireless networks. Computers & Operations Research, 40(1):80–90, 2013.

[12] S. Duhovniko, A. M. C. A. Koster, M. Kutschka, F. Rambach, and D. Schupke. Γ-robust network design for mixed-line-rate-planning of optical networks. In Proc. Optical Fiber Communication - National Fiber Optic Engineers Conference (OFC/NFOEC), 2013.

[13] A. M. C. A. Koster and M. Kutschka. An integrated model for survivable network design under demand uncertainty. In Proceedings of the 8th International Workshop on the Design of Reliable Communication Networks (DRCN 2011), pages 54–61, 2011.

[14] C. Liebchen, M. E. Lübbecke, R. H. Möhring, and S. Stiller. The concept of recoverable robustness, linear programming recovery, and railway applications. In R. Ahuja, R. Möhring, and C. Zaroliagis, editors, Robust and Online Large-Scale Optimization, volume 5868 of Lecture Notes in Computer Science, pages 1–27, 2009.
Arie Koster – RWTH Aachen University 47 / 48
References III
[15] S. Mattia. The robust network loading problem with dynamic routing. Computational Optimization and Applications, August 2012.

[16] M. Minoux. On robust maximum flow with polyhedral uncertainty sets. Optimization Letters, 3:367–376, 2009.

[17] M. Minoux. On 2-stage robust LP with RHS uncertainty: complexity results and applications. Journal of Global Optimization, 49:521–537, 2011.

[18] M. Minoux. Two-stage robust LP with ellipsoidal right-hand side uncertainty is NP-hard. Optimization Letters, 6:1463–1475, 2012.

[19] M. Poss and C. Raack. Affine recourse for the robust network design problem: between static and dynamic routing. Networks, to appear, 2013.
Arie Koster – RWTH Aachen University 48 / 48