Theoretical and computational issues for improving the performance of linear optimization methods
Pedro Munari
Advisor: Marcos Nereu Arenales (ICMC/USP)
Co-advisor: Jacek Gondzio (University of Edinburgh)
Pedro Munari [[email protected]] - XXXV Congresso Nacional de Matemática Aplicada e Computacional (CNMAC 2014)
Applied Mathematics: What we actually do
- We are interested in solving complex and difficult problems which are relevant in our day-to-day lives!
- Propose and improve models for real-life problems!
- We want to apply, improve and propose theoretical and computational tools to analyse and solve the models!
- A formal, reliable, safe, effective and efficient way to support decision-making!
- We cannot stop until the problem is completely under our control!
Optimization
- One of the tools in Applied Mathematics;
- Get the best solution from a set of possible ones;
- A natural idea: we are trying to optimize all the time;
- Mathematical formulation:

\[
\min \ (\text{or } \max) \ f(x) \quad \text{subject to} \quad x \in \mathcal{X}.
\]
Linear Optimization
- f(x) is a linear function;
- X is described by a set of linear equalities/inequalities;
- X may be continuous or discrete;
- Linear Programming: continuous linear formulations;
- Integer Programming: discrete linear formulations;
- Combinatorial Optimization problems.
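To make "get the best solution from a set of possible ones" concrete, here is a minimal sketch (not from the slides; the two-variable instance is invented for illustration) that solves a tiny linear program by enumerating the vertices of its feasible polyhedron, where an optimal LP solution must lie:

```python
from itertools import combinations

# Hypothetical LP: max 3x + 2y  s.t.  x + y <= 4,  x <= 3,  x >= 0,  y >= 0.
# Every constraint is stored as (a, b, c) meaning a*x + b*y <= c.
cons = [(1, 1, 4), (1, 0, 3), (-1, 0, 0), (0, -1, 0)]

def feasible(x, y, tol=1e-9):
    return all(a * x + b * y <= c + tol for a, b, c in cons)

# A vertex is a feasible intersection of two constraint boundaries.
vertices = []
for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        continue  # parallel boundaries: no intersection point
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    if feasible(x, y):
        vertices.append((x, y))

best = max(vertices, key=lambda v: 3 * v[0] + 2 * v[1])
print(best)  # (3.0, 1.0), objective value 11
```

Real solvers (the simplex method among them) move between vertices far more cleverly, but the geometric picture is the same.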
Linear Optimization
- Although these formulations seem much simpler than nonlinear ones, they are able to model very complex situations, from our day-to-day lives and those faced by many companies around the world;
- In addition, the solution methods work relatively well in practice.
Production planning
Lot sizing problem
[Figure: demand of products 1 and 2 for the next twelve months]
Production planning
Lot sizing problem

\begin{align*}
\min\ & \sum_{i=1}^{n}\sum_{t=1}^{T} c_{it}x_{it} + \sum_{i=1}^{n}\sum_{t=1}^{T} h_{it}I_{it} + \sum_{i=1}^{n}\sum_{t=1}^{T} s_{i}y_{it} \\
\text{s.t.}\ & x_{it} + I_{i,t-1} = d_{it} + I_{it}, & i = 1,\dots,n;\ t = 1,\dots,T, \\
& \sum_{i=1}^{n} \left( a_{i}x_{it} + st_{i}y_{it} \right) \le b_{t}, & t = 1,\dots,T, \\
& x_{it} \le C y_{it}, & i = 1,\dots,n;\ t = 1,\dots,T, \\
& I_{i0} = 0, & i = 1,\dots,n, \\
& x_{it} \ge 0,\ I_{it} \ge 0,\ y_{it} \in \{0,1\}, & i = 1,\dots,n;\ t = 1,\dots,T.
\end{align*}
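The trade-off the model captures (setup cost versus holding cost) can be seen on a toy instance. The sketch below is not from the thesis: it brute-forces a single-item, uncapacitated version with invented data, enumerating the setup pattern y_t and serving each demand from the most recent setup:

```python
from itertools import product

# Hypothetical single-item instance: demand d_t, setup cost s, unit holding
# cost h per period; the production cost c_it is omitted for simplicity.
d = [60, 100, 140]
s, h = 150.0, 1.0

best_cost, best_plan = float("inf"), None
for y in product((0, 1), repeat=len(d)):  # enumerate setup patterns y_t
    plan = [0] * len(d)
    cost = s * sum(y)
    feasible = True
    last = -1  # most recent period with a setup
    for t in range(len(d)):
        if y[t]:
            last = t
        if last < 0 and d[t] > 0:
            feasible = False  # demand occurs before the first setup
            break
        plan[last] += d[t]
        cost += h * d[t] * (t - last)  # hold from production period to use
    if feasible and cost < best_cost:
        best_cost, best_plan = cost, (y, tuple(plan))

print(best_cost, best_plan)  # 400.0 ((1, 0, 1), (160, 0, 140))
```

With these numbers it pays to batch periods 1 and 2 together and set up again in period 3; the enumeration is exponential in T, which is exactly why the integer programming machinery discussed next matters.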
Logistics
Vehicle routing problem
Logistics
Vehicle routing problem
[Figure: routing network with vehicle capacities Q1, Q2, Q3, customer demands d1, ..., d7, and travel costs c01, c02, c03, c12, c13, c23]
Logistics
Vehicle routing problem
- Find a set of routes that visit all the customers in order to meet the demand, while minimizing the total travel cost.
- Decision variables:

\[
x_{ijk} =
\begin{cases}
1, & \text{if vehicle } k \text{ visits } i \text{ and goes immediately to } j,\\
0, & \text{otherwise,}
\end{cases}
\qquad i, j = 0,\dots,n+1;\ k = 1,\dots,K.
\]

\[
w_{ik}: \text{time instant at which vehicle } k \text{ starts servicing customer } i, \qquad i = 0,\dots,n+1;\ k = 1,\dots,K.
\]
\begin{align*}
\min\ & \sum_{k=1}^{K}\sum_{i=0}^{n+1}\sum_{j=0}^{n+1} c_{ij}x_{ijk} \\
\text{s.t.}\ & \sum_{k=1}^{K}\sum_{\substack{j=1\\ j\ne i}}^{n+1} x_{ijk} = 1, & i = 1,\dots,n, \\
& \sum_{\substack{i=0\\ i\ne h}}^{n} x_{ihk} = \sum_{\substack{j=1\\ j\ne h}}^{n+1} x_{hjk}, & h = 1,\dots,n;\ k = 1,\dots,K, \\
& \sum_{i=1}^{n} d_{i} \sum_{\substack{j=1\\ j\ne i}}^{n+1} x_{ijk} \le Q_{k}, & k = 1,\dots,K, \\
& w_{jk} \ge w_{ik} + (s_{i} + t_{ij})x_{ijk} + M_{ij}(x_{ijk} - 1), & i = 0,\dots,n;\ j = 1,\dots,n+1;\ k = 1,\dots,K, \\
& \sum_{j=1}^{n+1} x_{0jk} = 1, & k = 1,\dots,K, \\
& \sum_{i=0}^{n} x_{i,n+1,k} = 1, & k = 1,\dots,K, \\
& a_{i} \le w_{ik} \le b_{i}, & i = 1,\dots,n+1;\ k = 1,\dots,K, \\
& x_{ijk} \in \{0,1\}, & i, j = 0,\dots,n+1;\ k = 1,\dots,K.
\end{align*}
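To see what the formulation computes, the sketch below brute-forces a tiny capacity-only instance (hypothetical data; the time-window constraints on w_ik are omitted to keep it short). It enumerates every assignment of customers to vehicles and every visiting order per route:

```python
from itertools import permutations, product

# Hypothetical instance: depot 0, customers 1..3, two identical vehicles.
demand = {1: 4, 2: 3, 3: 3}
Q = 6  # vehicle capacity
# symmetric travel costs, stored once per unordered pair
c = {(0, 1): 2, (0, 2): 4, (0, 3): 3, (1, 2): 3, (1, 3): 4, (2, 3): 2}
cost = lambda i, j: 0 if i == j else c[(min(i, j), max(i, j))]

def route_cost(route):
    stops = [0, *route, 0]  # leave the depot and return to it
    return sum(cost(a, b) for a, b in zip(stops, stops[1:]))

best = (float("inf"), None)
# assign each customer to vehicle 0 or 1, then try every visiting order
for assign in product((0, 1), repeat=3):
    routes = [[i + 1 for i in range(3) if assign[i] == k] for k in range(2)]
    if any(sum(demand[i] for i in r) > Q for r in routes):
        continue  # capacity violated
    total = sum(
        min(route_cost(p) for p in permutations(r)) if r else 0
        for r in routes
    )
    if total < best[0]:
        best = (total, routes)

print(best)  # total cost 13: one vehicle serves {1}, the other {2, 3}
```

The search space grows as K^n times the route permutations, which is why real instances need the decomposition and pricing techniques discussed below rather than enumeration.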
Logistics
Vehicle routing problem
- Although this formulation is correct, current methods may take hours, days or even longer to solve real-life instances;
- By decomposing the problem we can obtain a more efficient strategy, but solving the problem may still require a long time;
- One way to overcome this is to resort to approximate solutions (heuristics);
- Or, we can try to improve the current methods and propose new solution strategies!
Thesis
Purpose
- Study the state-of-the-art methodologies used to solve linear optimization problems and investigate how they can be improved;
- Simplex method, column generation, branch-price-and-cut;
- Why not use the advantages offered by interior point methods?
Special structure of the model
[Figure: structure of the constraint matrix over Z^n: the linking constraints form the master problem, while the independent blocks, relaxed over R^n, form the subproblems]
Dantzig-Wolfe Decomposition
- Compact formulation:

\begin{align*}
\min\ & c^{T}x, \\
\text{s.t.}\ & Ax = b, & \text{(linking constraints)} \\
& x \in \mathcal{X},
\end{align*}

- X = {x ∈ Z^n_+ : Dx = d} is a discrete set (the special structure);
- x ∈ Z^n_+: vector of decision variables;
- c ∈ R^n, A ∈ R^{m×n}, b ∈ R^m, D ∈ R^{h×n}, d ∈ R^h.
Dantzig-Wolfe Decomposition

Resolution Theorem. Let X = {x ∈ R^n | Ax ≥ b} be a nonempty polyhedron with at least one extreme point. Let {p_q}_{q∈Q} be the extreme points and {p_r}_{r∈R} a complete set of extreme rays of X, where Q and R are the respective index sets. Let

\[
C = \Big\{ \sum_{q \in Q} \lambda_{q} p_{q} + \sum_{r \in R} \mu_{r} p_{r} \ \Big|\ \sum_{q \in Q} \lambda_{q} = 1,\ \lambda_{q} \ge 0,\ \mu_{r} \ge 0 \Big\}.
\]

Then C = X.

Also known as the Representation Theorem or Carathéodory's Theorem (see e.g. Bertsimas and Tsitsiklis, 1997).
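A quick numerical sanity check of the theorem on a bounded polyhedron, not from the slides: the unit square has four extreme points and no extreme rays, so every point in it is a convex combination of those four. The bilinear weights below are one valid choice of the multipliers λ_q:

```python
# Extreme points of the unit square [0, 1] x [0, 1].
pts = [(0, 0), (1, 0), (0, 1), (1, 1)]

def weights(x, y):
    # Bilinear weights: nonnegative on the square and summing to 1,
    # hence a valid set of convex-combination multipliers lambda_q.
    return [(1 - x) * (1 - y), x * (1 - y), (1 - x) * y, x * y]

x, y = 0.3, 0.8
lam = weights(x, y)
assert abs(sum(lam) - 1) < 1e-12 and all(l >= 0 for l in lam)

# Reconstruct the point from the extreme points.
rx = sum(l * p[0] for l, p in zip(lam, pts))
ry = sum(l * p[1] for l, p in zip(lam, pts))
print(rx, ry)  # recovers (0.3, 0.8) up to rounding
```

For an unbounded polyhedron the same reconstruction would need the extra μ_r p_r terms along the extreme rays.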
Dantzig-Wolfe Decomposition
- Convex hull of X: C = conv(X);
- By the Resolution Theorem, any x ∈ C can be written as a convex combination of the extreme points plus a nonnegative combination of the extreme rays of C:

\[
x = \sum_{q \in Q} \lambda_{q} p_{q} + \sum_{r \in R} \mu_{r} p_{r}, \qquad \sum_{q \in Q} \lambda_{q} = 1,\ \lambda_{q} \ge 0,\ \mu_{r} \ge 0.
\]

We can use this to replace x in the original problem:

\begin{align*}
\min\ & c^{T}x, \\
\text{s.t.}\ & Ax = b, \\
& x \in \mathcal{X}.
\end{align*}
Dantzig-Wolfe Decomposition
I We obtain the equivalent problem:
min∑q∈Q
cqλq +∑r∈R
crµr
s.t.∑q∈Q
aqλq +∑r∈R
arµr = b,
∑q∈Q
λq = 1,
λq ≥ 0, µr ≥ 0, ∀q ∈ Q,∀r ∈ R,
x =∑q∈Q
λqpq +∑r∈R
µrpr,
x ∈ Zn+,
with cj = cT pj and aj = Apj , ∀j ∈ Q and ∀j ∈ R.
Theoretical and computational issues for improving the performance of linear optimization methodsPedro Munari [[email protected]] - XXXV Congresso Nacional de Matematica Aplicada e Computacional (CNMAC 2014)
Dantzig-Wolfe Decomposition

I We obtain the equivalent problem:

min ∑_{q∈Q} c_q λ_q + ∑_{r∈R} c_r µ_r
s.t. ∑_{q∈Q} a_q λ_q + ∑_{r∈R} a_r µ_r = b,
∑_{q∈Q} λ_q = 1,
λ_q ≥ 0, µ_r ≥ 0, ∀q ∈ Q, ∀r ∈ R,
x = ∑_{q∈Q} λ_q p_q + ∑_{r∈R} µ_r p_r,
x ∈ Z^n_+,

with c_j = c^T p_j and a_j = A p_j, for all j ∈ Q ∪ R.

I A huge number of variables! (one for each extreme point and ray)
Dantzig-Wolfe Decomposition

I Linear relaxation of the problem:

min ∑_{q∈Q} c_q λ_q + ∑_{r∈R} c_r µ_r
s.t. ∑_{q∈Q} a_q λ_q + ∑_{r∈R} a_r µ_r = b,
∑_{q∈Q} λ_q = 1,
λ_q ≥ 0, µ_r ≥ 0, ∀q ∈ Q, ∀r ∈ R,

where c_j = c^T p_j and a_j = A p_j, for all j ∈ Q ∪ R.

I Although we have a huge number of columns, we know how to generate them!
Column generation technique

First paper: Ford and Fulkerson (1958)

I Multicommodity network flow problem;

I Number of variables too large to be dealt with explicitly;

I Idea: change the pricing operation in the simplex method;

I “Treat non-basic variables implicitly”.
Column generation

I We are interested in solving a linear programming problem, called the Master Problem (MP):

z* := min ∑_{j∈N} c_j λ_j,
s.t. ∑_{j∈N} a_j λ_j = b,
λ_j ≥ 0, ∀j ∈ N.

I N is too big;

I The columns (c_j, a_j) are not known explicitly;

I We know how to generate them!
Column generation

I The Restricted Master Problem (RMP):

z_RMP := min ∑_{j∈N̄} c_j λ_j,
s.t. ∑_{j∈N̄} a_j λ_j = b,   (u)
λ_j ≥ 0, ∀j ∈ N̄,

I with N̄ ⊂ N;

I Let (λ̄, ū) be a primal-dual optimal solution of the RMP.
Column generation

I A solution λ̄ of the RMP ⇒ a solution λ of the MP:

I λ_j = λ̄_j, if j ∈ N̄;
I λ_j = 0, otherwise;
I c^T λ ≥ c^T λ*;

I How do we know whether λ is optimal in the MP?

I Call the pricing subproblem (oracle):

z_SP := min{0, c_j − ū^T a_j : (c_j, a_j) ∈ A}.

I (c_j, a_j) are the variables in the subproblem;

I If z_SP < 0, then new columns are generated;

I Otherwise, an optimal solution of the MP has been found!
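As an illustration of the whole loop (a sketch, not code from the talk), the snippet below applies column generation to a small cutting stock instance: SciPy's linprog (HiGHS) solves the RMP over the current patterns, the duals u feed an unbounded-knapsack oracle solved by dynamic programming, and columns with negative reduced cost 1 − ū^T a are added until z_SP ≥ 0. The instance data and tolerances are invented for the example:

```python
import numpy as np
from scipy.optimize import linprog

W = 10                      # roll width
w = [3, 5, 7]               # piece widths
d = [25, 20, 18]            # piece demands
m = len(w)

def solve_rmp(patterns):
    """RMP: min sum(lambda_p)  s.t.  sum_p a_p * lambda_p >= d, lambda >= 0."""
    A = np.array(patterns, dtype=float).T        # one column per pattern
    res = linprog(c=np.ones(len(patterns)),
                  A_ub=-A, b_ub=-np.array(d, dtype=float), method="highs")
    u = -res.ineqlin.marginals                   # duals of the >= rows
    return res.x, res.fun, u

def price(u):
    """Oracle: unbounded knapsack  max u^T a  s.t.  w^T a <= W, a integer >= 0."""
    best = [0.0] * (W + 1)
    pick = [-1] * (W + 1)
    for cap in range(1, W + 1):
        for i in range(m):
            if w[i] <= cap and best[cap - w[i]] + u[i] > best[cap]:
                best[cap] = best[cap - w[i]] + u[i]
                pick[cap] = i
    a, cap = [0] * m, W                          # rebuild the best pattern
    while cap > 0 and pick[cap] >= 0:
        a[pick[cap]] += 1
        cap -= w[pick[cap]]
    return a, 1.0 - best[W]                      # new column, reduced cost

# initial columns: one single-width pattern per piece
patterns = [[W // w[j] if i == j else 0 for i in range(m)] for j in range(m)]
while True:
    lam, z, u = solve_rmp(patterns)
    a, red_cost = price(u)
    if red_cost > -1e-9:                         # z_SP >= 0: relaxation optimal
        break
    patterns.append(a)

print(round(z, 4))                               # → 30.3333
```

Note that the RMP never sees all feasible patterns: the knapsack oracle generates only the columns that actually improve the objective.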
Column generation
[Flowchart: an initial solution provides the first column(s) of the RMP; each RMP solve sends an upper bound and dual variable(s) to the oracle; the oracle returns a lower bound and new column(s); when the stopping criterion is met, the loop ends with an optimal solution of the master problem.]
Theoretical and computational issues for improving the performance of linear optimization methods — Pedro Munari — CNMAC 2014

Column generation

I To simplify notation, we assume from here on that X is bounded, so the MP contains only the λ variables (one per extreme point).
Column generation

I To obtain integer solutions, we typically need to combine column generation with the branch-and-bound method;

I This results in the branch-and-price method;

I If, in addition, we add valid inequalities (cuts) to the MP, we obtain a branch-price-and-cut method.
Branch-and-price and branch-price-and-cut

[Figure series: step-by-step illustration of the branch-and-price / branch-price-and-cut search (image-only slides).]
In summary
To solve difficult optimization problems, we typically need:
I Dantzig-Wolfe decomposition: obtain a formulation with a huge number
of variables;
I Column generation: solve the linear relaxation treating variables
(columns) implicitly;
I Start with a subset (RMP) and generate more (Subproblem);
I Branch-and-price: column generation within branch-and-bound in order to
find an integer solution for the original problem;
I Branch-price-and-cut: column and cut generation within
branch-and-bound.
Right! But where are the gaps?
Standard column generation

I Optimal solutions obtained by the simplex method are extreme points of the RMP;

I The dual solutions oscillate heavily between consecutive iterations:
⇒ u^{j+1} is typically far from u^j;

I This causes instability and slow convergence of the method.
Standard column generation

[Figure (a)–(d): dual feasible set of the RMP; the extreme dual solutions returned by the simplex method jump between distant vertices from one iteration to the next.]
Oscillation in a real instance
‖u^j − u^{j+1}‖_2, for each iteration j:

[Plot: norm of the difference between consecutive dual solutions at each iteration, oscillating strongly over the course of the run.]
Column generation variants

I Stabilization techniques: avoid extreme solutions!
⇒ use a point in the interior of the feasible set;

I Most of them modify the master problem:
I adding variables, bounds, constraints, penalties, ...

⇒ The master problem may become more difficult to solve;
⇒ Some of them may be difficult to implement;
⇒ Several parameters to set.
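One of the simplest such devices (among many) is dual-price smoothing in the spirit of Wentges: query the oracle not at the latest extreme dual point, but at a convex combination of it with a stability center. A minimal sketch, with made-up numbers and a hypothetical weight α:

```python
# Dual-price smoothing: the oracle is called at a convex combination of the
# stability center (best dual point seen so far) and the latest RMP dual.
def smoothed_dual(u_center, u_new, alpha=0.7):
    """Return alpha*u_center + (1-alpha)*u_new, componentwise."""
    return [alpha * c + (1 - alpha) * n for c, n in zip(u_center, u_new)]

u_center = [1.0, 2.0, 0.0]    # stability center
u_new    = [5.0, -2.0, 4.0]   # oscillating extreme dual from the RMP
u_sm = smoothed_dual(u_center, u_new)
print([round(v, 6) for v in u_sm])   # → [2.2, 0.8, 1.2]
```

The smoothed point stays closer to the stability center, which damps the dual oscillation at the cost of one extra parameter (α) to tune.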
Column generation variants

I Stability center: prohibit the next dual solution from moving far from it;

[Figure: dual feasible set of the RMP with a stability center.]
Column generation variants

I Column generation is more efficient when based on well-centered interior points;

I So, why not use an interior point method?

I This is straightforward: it requires no changes to the RMP and no parameter adjustments;

I Interior point methods naturally provide well-centered, stable solutions.
Interior point method
Optimality conditions

KKT conditions:

b − Ax = 0,
c − A^T u − s = 0,
x_i s_i = 0, ∀i = 1, …, n,
x ≥ 0, s ≥ 0.
Primal-dual interior point method

Perturbed KKT conditions:

b − Ax = 0,
c − A^T u − s = 0,
x_i s_i = µ, ∀i = 1, …, n,
x ≥ 0, s ≥ 0.

I µ → 0 iteratively;

I Instead of strictly satisfying these conditions, the iterates belong to a safe neighbourhood of the central path, e.g.:

N_2(θ) = {(x, u, s) ∈ F^0 : ‖XSe − µe‖_2 ≤ θµ}; or
N_s(γ) = {(x, u, s) ∈ F^0 : γµ ≤ x_i s_i ≤ (1/γ)µ, ∀i = 1, …, n}, γ ∈ (0, 1).
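A compact sketch of how the perturbed conditions drive an algorithm: each iteration takes one damped Newton step on the system above, targeting x_i s_i = σµ, so that µ shrinks toward zero. This is an illustrative toy (infeasible-start variant, dense NumPy linear algebra, invented LP data and parameter values), not a production solver:

```python
# Sketch of an infeasible primal-dual path-following interior point method
# for  min c^T x  s.t.  Ax = b, x >= 0.  Toy 2-variable LP with optimum
# x = (1, 0) and value 1, chosen for illustration.
import numpy as np

def max_step(v, dv):
    """Largest alpha in (0, 1] keeping v + alpha*dv strictly positive."""
    neg = dv < 0
    if not neg.any():
        return 1.0
    return min(1.0, 0.9995 * float(np.min(-v[neg] / dv[neg])))

def ipm(A, b, c, sigma=0.1, tol=1e-8, max_iter=60):
    m, n = A.shape
    x, u, s = np.ones(n), np.zeros(m), np.ones(n)
    for _ in range(max_iter):
        mu = x @ s / n                       # duality measure
        r_p = b - A @ x                      # primal residual
        r_d = c - A.T @ u - s                # dual residual
        if mu < tol and np.linalg.norm(r_p) < tol and np.linalg.norm(r_d) < tol:
            break
        r_c = sigma * mu - x * s             # target: x_i s_i = sigma * mu
        # Newton step via normal equations: (A D A^T) du = r_p + A D (r_d - X^{-1} r_c)
        D = x / s                            # diagonal of X S^{-1}, as a vector
        du = np.linalg.solve((A * D) @ A.T, r_p + A @ (D * (r_d - r_c / x)))
        dx = D * (A.T @ du + r_c / x - r_d)
        ds = (r_c - s * dx) / x
        alpha = min(max_step(x, dx), max_step(s, ds))   # stay strictly interior
        x, u, s = x + alpha * dx, u + alpha * du, s + alpha * ds
    return x, u, s

A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0])
x, u, s = ipm(A, b, c)
print(np.round(x, 6), round(float(c @ x), 6))   # → [1. 0.] 1.0
```

The damped step length and the centering parameter σ are what keep the iterates inside a neighbourhood such as N_2(θ) or N_s(γ) rather than on the boundary.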
Interior point method vs. simplex method

(Figure slides contrasting iterates of the two methods; images omitted.)
Non-optimal solutions from interior point method

Advantages:

I We save time, as we stop early;
I The solution is well-centered in the feasible set;
I Early termination gives good sub-optimal solutions;
I The generated column corresponds to a deeper cut in the dual space.
Non-optimal solutions from interior point method

(Figure slides, panels (b) and (c); images omitted.)
Primal-dual column generation method (PDCGM)

I Primal-dual interior point method to get primal-dual solutions;
I Suboptimal solution (λ, u) (ε-optimal solution): we stop the interior point method with optimality tolerance ε;
I The distance to optimality ε is dynamically adjusted according to the relative gap:

ε = min{ε_max, gap/D}

I gap = (UB − LB)/(1 + |UB|);
I D > 1: degree of optimality (fixed).
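The gap-driven tolerance update can be written directly from these formulas. A small sketch (function names and the default values of ε_max and D are illustrative):

```python
def relative_gap(ub, lb):
    """Relative gap: (UB - LB) / (1 + |UB|)."""
    return (ub - lb) / (1.0 + abs(ub))

def next_tolerance(ub, lb, eps_max=0.5, degree=10.0):
    """eps = min{eps_max, gap / D}, with D > 1 the fixed degree of optimality."""
    return min(eps_max, relative_gap(ub, lb) / degree)
```

Early on, when the gap is large, the tolerance is capped at ε_max and the RMP is solved only loosely; as the bounds close in, ε shrinks proportionally to the gap.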
Primal-dual column generation method (PDCGM)

(Illustrative figure slides; images omitted.)
Well-centred solution

I (λ, u) should be well-centred in the feasible set:

γµ ≤ (c_j − uᵀa_j)λ_j ≤ (1/γ)µ, ∀ j ∈ N,

for some γ ∈ (0, 1), where µ = (1/|N|)(cᵀ − uᵀA)λ;

I Natural way of stabilizing dual solutions if a primal-dual interior point method is used.
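This centrality test on the RMP iterate mirrors the symmetric neighbourhood Ns above, with the products x_i s_i replaced by (c_j − uᵀa_j)λ_j. A minimal sketch (names illustrative; the reduced costs c_j − uᵀa_j are assumed precomputed):

```python
def is_well_centred(lam, red_costs, gamma):
    """Check gamma*mu <= (c_j - u^T a_j)*lambda_j <= mu/gamma for all j,
    where mu = (1/|N|) * sum_j (c_j - u^T a_j) * lambda_j."""
    m = sum(d * l for d, l in zip(red_costs, lam)) / len(lam)
    return all(gamma * m <= d * l <= m / gamma
               for d, l in zip(red_costs, lam))
```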
Answer from the oracle

I The oracle (subproblem) is called with u:

z_SP(u) := min{ c_j − uᵀa_j : (c_j, a_j) ∈ A }.

I Two cases:
  I z_SP(u) < 0, and new columns are generated;
  I z_SP(u) = 0, and no columns are generated.
I Is the lower bound provided by a suboptimal solution still valid?
I If z_SP(u) = 0, does the method terminate?
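With the column set given explicitly, the oracle is a plain minimization of reduced costs. A toy sketch (in practice A is described implicitly, e.g. by a knapsack or shortest-path subproblem, so this enumeration is for illustration only):

```python
def oracle(u, columns):
    """Pricing: return (z_SP(u), argmin column), where
    z_SP(u) = min_j c_j - u^T a_j over columns given as pairs (c_j, a_j)."""
    best_val, best_col = float("inf"), None
    for c, a in columns:
        red_cost = c - sum(ui * ai for ui, ai in zip(u, a))
        if red_cost < best_val:
            best_val, best_col = red_cost, (c, a)
    return best_val, best_col
```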
Convergence

Lemma
Let z_SP = z_SP(u) be the value of the oracle corresponding to the suboptimal solution (λ, u). Then, κ z_SP + bᵀu ≤ z⋆.

Lemma
Let (λ, u) be the suboptimal solution of the RMP found at iteration k with tolerance ε_k > 0. If z_SP = 0, then the new relative gap is strictly smaller than the previous one, i.e., gap_k < gap_{k−1}.
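The first lemma is the standard Lagrangian bound. A one-line derivation, assuming the master problem constraints Aλ = b, eᵀλ ≤ κ, λ ≥ 0, and z_SP ≤ 0:

```latex
z^\star = c^T\lambda^\star
        = u^T A\lambda^\star + \sum_{j}\bigl(c_j - u^T a_j\bigr)\lambda_j^\star
        \;\ge\; b^T u + z_{SP}\sum_{j}\lambda_j^\star
        \;\ge\; b^T u + \kappa\, z_{SP},
```

using that every reduced cost satisfies c_j − uᵀa_j ≥ z_SP and that λ⋆ ≥ 0.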
PDCGM: Algorithm

1. Input: initial RMP; parameters κ, ε_max, D > 1, δ > 0.
2. set LB = −∞, UB = ∞, gap = ∞, ε = 0.5;
3. while (gap > δ) do
4.   find a well-centered ε-optimal solution (λ, u) of the RMP;
5.   UB = min{UB, z_RMP};
6.   call the oracle with the query point u;
7.   LB = max{LB, κ z_SP + bᵀu};
8.   gap = (UB − LB)/(1 + |UB|);
9.   ε = min{ε_max, gap/D};
10.  if (z_SP < 0) then add the new columns into the RMP;
11. end(while)
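The loop above fits in a few lines once the RMP solver and the oracle are abstracted away. A sketch, assuming `solve_rmp(eps)` returns a well-centred ε-optimal pair (z_RMP, u) and `oracle(u)` returns (z_SP, new_columns); all names are illustrative, and the actual distributed implementation is in C on top of HOPDM:

```python
def pdcgm(solve_rmp, oracle, add_columns, b, kappa,
          eps_max=0.5, degree=10.0, delta=1e-6, max_iters=1000):
    """Primal-dual column generation loop (steps 2-11 of the algorithm)."""
    lb, ub, gap, eps = float("-inf"), float("inf"), float("inf"), 0.5
    for _ in range(max_iters):
        if gap <= delta:
            break
        z_rmp, u = solve_rmp(eps)            # well-centred eps-optimal RMP point
        ub = min(ub, z_rmp)                  # best upper bound so far
        z_sp, cols = oracle(u)               # pricing subproblem at query point u
        lb = max(lb, kappa * z_sp + sum(bi * ui for bi, ui in zip(b, u)))
        gap = (ub - lb) / (1.0 + abs(ub))    # relative gap
        eps = min(eps_max, gap / degree)     # tighten the RMP tolerance
        if z_sp < 0:
            add_columns(cols)                # negative reduced cost: new columns
    return ub, lb
```

When the oracle reports z_SP = 0, no columns are added, but the tighter ε forces the next RMP solve to reduce the gap, which is exactly the mechanism behind the second lemma.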
PDCGM: Proof of convergence

Theorem
Let z⋆ be the optimal value of the MP. Given the optimality tolerance δ > 0, the primal-dual column generation method converges in a finite number of steps to a primal feasible solution λ of the MP whose objective value z satisfies

z − z⋆ < δ(1 + |z|).

Idea of the proof: using the previous lemmas, we show that at each iteration either new columns are generated or the relative gap is strictly reduced, until it falls below the optimality tolerance.
PDCGM: source code
I Implementation in C, using interior point solver HOPDM;
I Publicly available code
http://www.maths.ed.ac.uk/~gondzio/software/pdcgm.html
I Source-code examples are provided for 6 different applications:
I Cutting stock problem;
I Vehicle routing problem;
I Capacitated lot sizing problem;
I Multiple kernel learning;
I Two-stage stochastic programming;
I Multicommodity network flow.
Number of iterations

(Bar charts of outer iterations on CSP, VRPTW and CLSPST instances; images omitted.)

Outer iterations relative to PDCGM:

                 CSP   VRPTW   CLSPST
  SCGM          1.52    1.33     1.60
  ACCPM         2.41    4.86     1.26
CPU time (s)

(Bar charts of CPU times on CSP, VRPTW and CLSPST instances; images omitted.)

CPU time relative to PDCGM:

                 CSP   VRPTW   CLSPST
  SCGM          3.50    1.95     1.26
  ACCPM         8.97    4.01     1.27
Oscillation in a VRPTW instance (Solomon C207)

(Plot of ‖u_j − u_{j+1}‖2 for each iteration j; image omitted.)
PDCGM: Remarks
I Using well-centered, suboptimal solutions is beneficial for column
generation;
I The computational study was based on linear relaxations;
I Next step: branch-and-price method!
Branch-and-price and Branch-price-and-cut

(Sequence of illustrative figure slides; images omitted.)
Interior point branch-price-and-cut (IPBPC)

I Several challenging issues arise when using this algorithm within a branch-price-and-cut framework;
I It is not just a matter of replacing a simplex-type method;
I Change of strategy!
I Rethink every piece of a standard BPC: column generation, valid inequalities, branching, ...
Interior point branch-price-and-cut (IPBPC)

I The primal-dual interior point algorithm will be used to provide well-centred, suboptimal solutions for:
  I Column generation;
  I Valid inequalities;
  I Branching.
I More stable primal and dual solutions;
I Deeper columns and cuts;
I Shorter solution times.
Computational experiments

I Vehicle Routing Problem with Time Windows (VRPTW);
I The IPBPC performance was compared to the best results available in the literature for a simplex-based BPC:
  I Desaulniers, Lessard and Hadjar, Transportation Science, 2008.
(Bar chart: number of branch-and-bound nodes for the 100-customer Solomon instances (C1, RC1 and R1 series), DLH08 vs IPBPC; image omitted.)
Comparing to a simplex-based BPC

Number of nodes:

           DLH08   IPBPC   Ratio
  C1           9       9    1.00
  RC1        104      78    1.33
  R1         239     182    1.31
  Total      352     269    1.31
(Bar chart: number of valid inequalities for the 100-customer instances, DLH08 vs IPBPC; image omitted.)
Comparing to a simplex-based BPC

Number of valid inequalities:

           DLH08   IPBPC   Ratio
  C1           0       0    1.00
  RC1       2199    1191    1.85
  R1        3391    2140    1.58
  Total     5590    3331    1.68
(Bar chart: CPU time in seconds for the 100-customer instances, DLH08 vs IPBPC; image omitted.)
Comparing to a simplex-based BPC

CPU time (sec):

           DLH08   IPBPC   Ratio
  C1         158      28    5.69
  RC1      17198    3472    4.95
  R1       27928    4621    6.04
  Total    45284    8121    5.58
Comparing to a simplex-based BPC
200-series instances:
I Wider time windows and larger vehicle capacity;
I More difficult subproblems!
(Bar chart: number of branch-and-bound nodes for the 200-series Solomon instances (C2, RC2 and R2), DLH08 vs IPBPC; image omitted.)
Comparing to a simplex-based BPC
Number of nodes

Class   DLH08   IPBPC   Ratio
C2          8       8    1.00
RC2        12      10    1.20
R2         38      42    0.90
Total      58      60    0.97
[Figure: Number of valid inequalities per 200-series instance, comparing DLH08 and IPBPC.]
Comparing to a simplex-based BPC
Number of valid inequalities

Class   DLH08   IPBPC   Ratio
C2          0       0    1.00
RC2       456     166    2.75
R2       1336     525    2.54
Total    1792     691    2.59
[Figure: CPU time (seconds) per 200-series instance, comparing DLH08 and IPBPC; DLH08 peaks at 400904 seconds on one instance.]
Comparing to a simplex-based BPC
CPU time (sec)

Class     DLH08   IPBPC   Ratio
C2        16745     865   19.35
RC2       92365    3250   28.42
R2       504540   27166   18.57
Total    613650   31281   19.62
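As a sanity check on the reported aggregates, the overall speedup of IPBPC over DLH08 can be recomputed from the class totals in the two CPU-time tables. A minimal Python sketch (the numbers are copied from the tables; the dictionary and function names are illustrative, not part of the original experiments):

```python
# CPU times in seconds, per instance class: (DLH08, IPBPC).
cpu_100 = {"C1": (158, 28), "RC1": (17198, 3472), "R1": (27928, 4621)}
cpu_200 = {"C2": (16745, 865), "RC2": (92365, 3250), "R2": (504540, 27166)}

def aggregate_speedup(times):
    """Ratio of total simplex-based (DLH08) time to total interior-point (IPBPC) time."""
    total_dlh08 = sum(d for d, _ in times.values())
    total_ipbpc = sum(i for _, i in times.values())
    return total_dlh08 / total_ipbpc

print(round(aggregate_speedup(cpu_100), 2))  # prints 5.58
print(round(aggregate_speedup(cpu_200), 2))  # prints 19.62
```

The totals reproduce the reported overall ratios (5.58 and 19.62); the per-class ratios in the tables differ slightly from dividing the rounded sums, consistent with rounding of the underlying per-instance times.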
Publications

I Gondzio, J., González-Brevis, P., Munari, P. New developments in the primal–dual column generation technique. European Journal of Operational Research 224 (2013) 41–51.

I Munari, P., Gondzio, J. Using the primal-dual interior point algorithm within the branch-price-and-cut method. Computers & Operations Research 40 (2013) 2026–2036.
Thesis
I Linear optimization is a very rich research area;
I Strong theoretical development and powerful methodologies;
I Still, many gaps to be closed...
I I focused on new ideas for the simplex method, column generation
and branch-and-price, applied to real-life problems.
I Using the advantages offered by interior point methods was essential
for column generation and branch-and-price;
I Combining these methods was a challenge!
I Still, many gaps to be closed...
Currently
I Professor at UFSCar;
I Production Engineering Department;
I Projects on interior point branch-price-and-cut and variants of
vehicle routing problems;
I VRP with multiple deliverymen;
I VRP under uncertainty (in collaboration with the University of Edinburgh);
I Aircraft assignment for air-taxi services;
I Crop rotation scheduling with sustainability constraints.
I I need soldiers :)
Acknowledgements
I Advisor: Prof. Dr. Marcos Arenales;
I Co-advisor: Prof. Dr. Jacek Gondzio;
I Earlier advisor: Prof. Dr. Geraldo Nunes Silva;
I Colleagues and professors at ICMC/USP;
I SBMAC;
I CNMAC organizers;
I Examination committee of the “Prêmio Odelar Leite Linhares 2014”.
I Thank you!
I Questions?
I Funding