OPTIMIZATION METHODS

14-01 - Optimization and Decision Making



DECISION MAKING

Examples:

• determining which ingredients, and in what quantities, to add to a mixture being made so that it will meet specifications on its composition
• allocating available funds among various competing agencies
• deciding which route to take to go to a new location in the city

Decision making always involves making a choice between various possible alternatives.


CATEGORIES OF DECISION MAKING PROBLEMS

Category 1: The set of possible alternatives for the decision is a finite discrete set, typically consisting of a small number of elements.
Example: "A teenage girl knows four boys, all of whom she likes, and has to decide who among them to go steady with."

Solution: scoring methods

Category 2: The number of possible alternatives is either infinite, or finite but very large, and the decision may be required to satisfy some restrictions and constraints.

Solution: unconstrained and constrained optimization methods


THE SCORING METHOD – AN EXAMPLE

Rita has been dating 4 boys off and on over the last 3 years, and has come to know each of them very well. Who among the four boys would be her best choice?
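A scoring method rates each alternative on several criteria, weights the criteria by importance, and picks the highest total. A minimal sketch of such a weighted scoring table; the names, criteria, weights, and ratings below are hypothetical, not from the slides:

```python
# Weighted scoring: rate each alternative on each criterion (1-10),
# multiply by the criterion weight, and sum; the highest total wins.
weights = {"kindness": 0.4, "humor": 0.3, "ambition": 0.3}   # hypothetical
ratings = {                                                  # hypothetical
    "Adam": {"kindness": 8, "humor": 6, "ambition": 7},
    "Ben":  {"kindness": 6, "humor": 9, "ambition": 5},
    "Carl": {"kindness": 7, "humor": 7, "ambition": 8},
    "Dave": {"kindness": 9, "humor": 5, "ambition": 6},
}

scores = {name: sum(weights[c] * r[c] for c in weights)
          for name, r in ratings.items()}
print(scores, "->", max(scores, key=scores.get))
```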


CATEGORY 2 DECISION PROBLEMS

1. Get a precise definition of the problem and all relevant data and information on it.
   • Uncontrollable factors (random variables)
   • Controllable inputs (decision variables)

2. Construct a mathematical (optimization) model of the problem.
   • Build objective functions and constraints

3. Solve the model.
   • Apply the most appropriate algorithms for the given problem

4. Implement the solution.


OPTIMIZATION MODELS

Single-objective vs. multiobjective models

Static vs. dynamic models

Deterministic vs. stochastic models


PROBLEM SPECIFICATION

Suppose we have a cost function (or objective function)

f(x), x = (x_1, …, x_N)^T.

Our aim is to find values of the parameters (decision variables) x that minimize this function, subject to the following constraints:

equality: a_i(x) = 0, i = 1, …, p
inequality: c_j(x) ≥ 0, j = 1, …, m

If we seek a maximum of f(x) (a profit function), it is equivalent to seeking a minimum of −f(x).
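As an illustration of this formulation (not part of the original slides), one way to pose a small constrained problem with SciPy's minimize; the objective and constraints here are made up:

```python
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 1)**2 + (x[1] - 2)**2            # cost function (made up)
cons = [
    {"type": "eq",   "fun": lambda x: x[0] + x[1] - 2},  # a(x) = 0
    {"type": "ineq", "fun": lambda x: x[0]},             # c(x) >= 0
]
res = minimize(f, x0=np.zeros(2), constraints=cons)
print(res.x, res.fun)
```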


BOOKS TO READ

Practical Optimization
Philip E. Gill, Walter Murray, and Margaret H. Wright, Academic Press, 1981

Practical Optimization: Algorithms and Engineering Applications
Andreas Antoniou and Wu-Sheng Lu, 2007

Both cover unconstrained and constrained optimization. Very clear and comprehensive.


FURTHER READING AND WEB RESOURCES

Numerical Recipes in C (or C++): The Art of Scientific Computing
William H. Press, Brian P. Flannery, Saul A. Teukolsky, William T. Vetterling
Good chapter on optimization
Available online at:
(1992 ed.) www.nrbook.com/a/bookcpdf.php
(2007 ed.) www.nrbook.com

NEOS Guide
www-fp.mcs.anl.gov/OTC/Guide/

This PowerPoint presentation
www.utia.cas.cz


TYPES OF MINIMA

Which of the minima is found depends on the starting point.
Such minima often occur in real applications.

[Figure: a 1-D function f(x) over a feasible region, with a weak local minimum, strong local minima, and the strong global minimum marked.]


UNCONSTRAINED UNIVARIATE OPTIMIZATION

Assume we can start close to the global minimum.

How to determine the minimum?
• Search methods (Dichotomous, Fibonacci, Golden-Section)
• Approximation methods
  1. Polynomial interpolation
  2. Newton method
• Combination of both (algorithm of Davies, Swann, and Campey)


SEARCH METHODS

Start with the interval ("bracket") [x_L, x_U] such that the minimum x* lies inside.
Evaluate f(x) at two points inside the bracket.
Reduce the bracket.
Repeat the process.

• Can be applied to any function; differentiability is not essential.
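A minimal sketch of one such bracketing scheme, golden-section search (my illustration, not code from the slides):

```python
import math

def golden_section(f, xL, xU, tol=1e-6):
    """Shrink the bracket [xL, xU] by the golden ratio until it is
    narrower than tol; assumes f is unimodal on the bracket."""
    invphi = (math.sqrt(5) - 1) / 2                 # 1/phi ~ 0.618
    a = xL + (1 - invphi) * (xU - xL)
    b = xL + invphi * (xU - xL)
    fa, fb = f(a), f(b)
    while xU - xL > tol:
        if fa < fb:                                  # minimum lies in [xL, b]
            xU, b, fb = b, a, fa
            a = xL + (1 - invphi) * (xU - xL)
            fa = f(a)
        else:                                        # minimum lies in [a, xU]
            xL, a, fa = a, b, fb
            b = xL + invphi * (xU - xL)
            fb = f(b)
    return 0.5 * (xL + xU)

print(golden_section(lambda x: (x - 2)**2, 0.0, 5.0))   # ~2.0
```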


SEARCH METHODS

"+

"

"+

"

"+

"

"+

"

"+

""+

"

"+

"

# ' . /

"+

"

# ' . /

"+ "

# ' . /

"+ "

# ' . /

"+"

# ' . /

Dichotomous

Fibonacci0 # # ' . / 1


1D FUNCTION

As an example, consider the function shown in the figure (assume we do not know the actual function expression from now on).


GRADIENT DESCENT

Given a starting location x_0, examine df/dx and move in the downhill direction to generate a new estimate, x_1 = x_0 + δx.

How to determine the step size δx?
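A minimal sketch of the idea with a fixed step size (my illustration; the slides leave the step-size rule open):

```python
def gradient_descent_1d(df, x0, step=0.1, iters=100):
    """Move against the derivative sign; a fixed step is the simplest
    (and crudest) answer to 'how big should delta-x be?'."""
    x = x0
    for _ in range(iters):
        x -= step * df(x)            # delta-x = -step * f'(x)
    return x

# f(x) = (x - 2)^2, f'(x) = 2(x - 2); converges toward x* = 2
print(gradient_descent_1d(lambda x: 2 * (x - 2), x0=0.0))
```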


POLYNOMIAL INTERPOLATION

Bracket the minimum.
Fit a quadratic or cubic polynomial which interpolates f(x) at some points in the interval.
Jump to the (easily obtained) minimum of the polynomial.
Throw away the worst point and repeat the process.
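For three bracketing points the minimizer of the interpolating parabola has a closed form; a standard expression (not printed on the slide, with f_i = f(x_i)) is:

$$
x_{\min} = \frac{1}{2}\,
\frac{(x_2^2 - x_3^2)\,f_1 + (x_3^2 - x_1^2)\,f_2 + (x_1^2 - x_2^2)\,f_3}
     {(x_2 - x_3)\,f_1 + (x_3 - x_1)\,f_2 + (x_1 - x_2)\,f_3}.
$$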


POLYNOMIAL INTERPOLATION

Quadratic interpolation using 3 points, 2 iterations.

Other methods to interpolate?
• 2 points and one gradient
• Cubic interpolation


NEWTON METHOD

Expand f(x) locally using a Taylor series:

f(x + δx) ≈ f(x) + f'(x) δx + (1/2) f''(x) δx².

Find the δx which minimizes this local quadratic approximation:

δx = − f'(x) / f''(x).

Update x.

This fits a quadratic approximation to f(x) using both gradient and curvature information at x.
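A minimal sketch of the resulting 1-D iteration (my illustration):

```python
def newton_1d(df, d2f, x0, iters=10):
    """Minimize f by repeatedly jumping to the minimum of the local
    quadratic model: x <- x - f'(x)/f''(x)."""
    x = x0
    for _ in range(iters):
        x -= df(x) / d2f(x)
    return x

# f(x) = x^4: f'(x) = 4x^3, f''(x) = 12x^2; minimum at x = 0
print(newton_1d(lambda x: 4 * x**3, lambda x: 12 * x**2, x0=1.0))
```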


NEWTON METHOD

• avoids the need to bracket the root
• quadratic convergence (decimal accuracy doubles at every iteration)


NEWTON METHOD

Global convergence of Newton's method is poor.
It often fails if the starting point is too far from the minimum.
In practice, it must be used with a globalization strategy which reduces the step length until a function decrease is assured.


EXTENSION TO N (MULTIVARIATE) DIMENSIONS

How big can N be? Problem sizes vary from a handful of parameters to many thousands.

We will consider examples for N = 2, so that cost function surfaces can be visualized.


AN OPTIMIZATION ALGORITHM

Start at x_0, k = 0.
1. Compute a search direction p_k.
2. Compute a step length α_k, such that f(x_k + α_k p_k) < f(x_k).
3. Update x_{k+1} = x_k + α_k p_k.
4. Check for convergence (stopping criteria), e.g. df/dx = 0; otherwise set k = k + 1 and go to 1.

This reduces optimization in N dimensions to a series of (1-D) line minimizations.
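A minimal sketch of this template, with steepest-descent directions and a crude backtracking step length (all choices below are my own, for illustration):

```python
import numpy as np

def descent(f, grad, x0, tol=1e-8, max_iter=1000):
    """Generic descent: direction p_k, step length alpha_k, update, test."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:          # 4. stopping criterion: df/dx ~ 0
            break
        p = -g                               # 1. search direction (steepest descent)
        alpha = 1.0
        while f(x + alpha * p) >= f(x):      # 2. backtrack until f decreases
            alpha *= 0.5
        x = x + alpha * p                    # 3. update
    return x

f = lambda x: x[0]**2 + 10 * x[1]**2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
print(descent(f, grad, np.array([1.0, 1.0])))
```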


TAYLOR EXPANSION

A function may be approximated locally by its Taylor series expansion about a point x*:

f(x* + p) ≈ f(x*) + g(x*)^T p + (1/2) p^T H(x*) p,

where the gradient g(x*) is the vector of first partial derivatives,

g(x*) = ∇f(x*) = ( ∂f/∂x_1, …, ∂f/∂x_N )^T,

and the Hessian H(x*) is the symmetric matrix of second partial derivatives, H_ij(x*) = ∂²f/∂x_i ∂x_j, evaluated at x*.


QUADRATIC FUNCTIONS

For a quadratic function

f(x) = a + g^T x + (1/2) x^T H x,

the vector g and the Hessian H are constant. The second-order Taylor approximation of any function is therefore a quadratic function.

We will assume only quadratic functions for a while.


NECESSARY CONDITIONS FOR A MINIMUM

Expand f(x) about a stationary point x* in direction p:

f(x* + p) ≈ f(x*) + (1/2) p^T H(x*) p,

since g(x*) = 0 at a stationary point.

At a stationary point the behavior is determined by H.


 

H is a symmetric matrix, and so has real eigenvalues and orthogonal eigenvectors:

H u_i = λ_i u_i.

Along an eigenvector, f(x* + α u_i) = f(x*) + (1/2) α² λ_i.
As |α| increases, f(x* + α u_i) increases, decreases, or is unchanging according to whether λ_i is positive, negative, or zero.
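A small numerical check of this classification with NumPy (my illustration):

```python
import numpy as np

H = np.array([[2.0, 0.0],
              [0.0, -1.0]])        # one positive, one negative eigenvalue
lams, U = np.linalg.eigh(H)        # eigh: eigendecomposition for symmetric H
print(lams)                        # [-1, 2] -> indefinite: saddle point

# f(x* + a*u_i) - f(x*) = 0.5 * a^2 * lambda_i along eigenvector u_i
a = 0.1
for lam, u in zip(lams, U.T):
    print(lam, 0.5 * a**2 * lam)   # sign of the change matches sign of lambda
```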


EXAMPLES OF QUADRATIC FUNCTIONS

Case 1: both eigenvalues positive; H is positive definite and the function has a unique minimum.


EXAMPLES OF QUADRATIC FUNCTIONS

Case 2: the eigenvalues have different signs; H is indefinite and the stationary point is a saddle point.


EXAMPLES OF QUADRATIC FUNCTIONS

Case 3: one eigenvalue is zero; H is positive semidefinite and the surface is a parabolic cylinder.


OPTIMIZATION FOR QUADRATIC FUNCTIONS

Assume that H is positive definite. There is a unique minimum at

x* = −H⁻¹ g.

If N is large, it is not feasible to perform this inversion directly.
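Even for moderate N one solves the linear system rather than forming H⁻¹; a quick sketch (my illustration):

```python
import numpy as np

# f(x) = a + g.x + 0.5 x.H.x with H positive definite
H = np.array([[4.0, 1.0],
              [1.0, 3.0]])
g = np.array([1.0, 2.0])

x_star = np.linalg.solve(H, -g)   # solve H x = -g; cheaper and stabler than inv(H)
print(x_star, H @ x_star + g)     # gradient g + H x* should be ~0
```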


STEEPEST DESCENT

The basic principle is to minimize the N-dimensional function by a series of 1-D line minimizations:

x_{k+1} = x_k + α_k p_k.

The steepest descent method chooses p_k to be parallel to the negative gradient:

p_k = −g_k = −∇f(x_k).

The step size α_k is chosen to minimize f(x_k + α_k p_k). For quadratic forms there is a closed-form solution:

α_k = (g_k^T g_k) / (g_k^T H g_k).

Prove it!
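One way to carry out the proof, using the quadratic form f(x) = a + gᵀx + ½xᵀHx introduced earlier; set the derivative along the line to zero:

$$
0 = \frac{d}{d\alpha} f(x_k + \alpha p_k) = p_k^T\bigl(g_k + \alpha H p_k\bigr)
\;\Longrightarrow\;
\alpha_k = -\frac{p_k^T g_k}{p_k^T H p_k} = \frac{g_k^T g_k}{g_k^T H g_k}
\quad (p_k = -g_k).
$$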


STEEPEST DESCENT

The gradient is everywhere perpendicular to the contour lines.
After each line minimization, the new gradient is always orthogonal to the previous step direction (true of any line minimization).
Consequently, the iterates tend to zig-zag down the valley in a very inefficient manner.


CONJUGATE GRADIENT

Each p_k is chosen to be conjugate to all previous search directions with respect to the Hessian H:

p_i^T H p_j = 0, i ≠ j.

The resulting search directions are mutually linearly independent.

Remarkably, p_k can be chosen using only knowledge of p_{k−1}, g_{k−1}, and g_k; in the Fletcher-Reeves form,

p_k = −g_k + (g_k^T g_k / g_{k−1}^T g_{k−1}) p_{k−1}.

Prove it!
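A compact sketch of the resulting iteration for a quadratic f(x) = ½xᵀHx − bᵀx, i.e. solving Hx = b (my illustration):

```python
import numpy as np

def conjugate_gradient(H, b, x0, tol=1e-10):
    """Minimize 0.5 x.H.x - b.x by H-conjugate search directions."""
    x = x0
    g = H @ x - b                          # gradient
    p = -g
    while np.linalg.norm(g) > tol:
        alpha = (g @ g) / (p @ H @ p)      # exact line minimization
        x = x + alpha * p
        g_new = H @ x - b
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        p = -g_new + beta * p
        g = g_new
    return x

H = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(H, b, np.zeros(2)))   # converges in <= 2 steps
```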


CONJUGATE GRADIENT

An N-dimensional quadratic form can be minimized in at most N conjugate descent steps.

[Figure: 3 different starting points; for N = 2 the minimum is reached in exactly 2 steps.]


OPTIMIZATION FOR GENERAL FUNCTIONS

Apply the methods developed using the quadratic Taylor series expansion.


ROSENBROCK'S FUNCTION

f(x, y) = 100 (y − x²)² + (1 − x)²

Minimum at (1, 1).
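For later experiments, a definition in NumPy (my illustration):

```python
import numpy as np

def rosenbrock(x):
    """f(x, y) = 100 (y - x^2)^2 + (1 - x)^2, minimum f = 0 at (1, 1)."""
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

print(rosenbrock(np.array([1.0, 1.0])))   # 0.0 at the minimum
```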


STEEPEST DESCENT

The 1-D line minimization must be performed using one of the earlier methods (usually cubic polynomial interpolation).
The zig-zag behaviour is clear in the zoomed view.
The algorithm crawls down the valley.


CONJUGATE GRADIENT

Again, an explicit line minimization must be used at every step.
The algorithm converges in 98 iterations.
Far superior to steepest descent.


NEWTON METHOD

Expand f(x) by its Taylor series about the point x_k:

f(x_k + δx) ≈ f(x_k) + g_k^T δx + (1/2) δx^T H_k δx,

where the gradient is the vector g_k = ∇f(x_k) and the Hessian is the symmetric matrix H_k = H(x_k).


NEWTON METHOD

For a minimum we require that ∇f(x) = 0, and so

g_k + H_k δx = 0,

with solution δx = −H_k⁻¹ g_k. This gives the iterative update

x_{k+1} = x_k − H_k⁻¹ g_k.

• If f(x) is quadratic, then the solution is found in one step.
• The method has quadratic convergence (as in the 1-D case).
• The solution δx is guaranteed to be a downhill direction when H_k is positive definite.
• Rather than jump straight to the predicted minimum, it is better to perform a line minimization, x_{k+1} = x_k − α_k H_k⁻¹ g_k, which ensures global convergence.
• If H_k = I this reduces to steepest descent.


NEWTON METHOD - EXAMPLE

The algorithm converges in only 18 iterations, compared to the 98 for conjugate gradients.
However, the method requires computing the Hessian matrix at each iteration – this is not always feasible.


SUMMARY OF THE 1ST LECTURE

Minimization of 1-D functions
• Search methods
• Approximation methods

N-D functions -> finding the descent direction
• Taylor series -> quadratic functions
• Steepest descent
• Conjugate gradient
• Newton method


QUASI-NEWTON METHODS

If the problem size is large and the Hessian matrix is dense, then it may be infeasible or inconvenient to compute it directly.
Quasi-Newton methods avoid this problem by keeping a "rolling estimate" of H(x), updated at each iteration using new gradient information.
Common schemes are due to Broyden, Fletcher, Goldfarb, and Shanno (BFGS), and also Davidon, Fletcher, and Powell (DFP).

The idea is based on the fact that for quadratic functions

g_{k+1} − g_k = H (x_{k+1} − x_k),

so by accumulating the g_k's and x_k's we can estimate H.


QUASI-NEWTON BFGS METHOD

Set H_0 = I. Update according to

H_{k+1} = H_k + (γ_k γ_k^T)/(γ_k^T δ_k) − (H_k δ_k δ_k^T H_k)/(δ_k^T H_k δ_k),

where

δ_k = x_{k+1} − x_k,  γ_k = g_{k+1} − g_k.

• The matrix inverse can also be computed in this way.
• The directions δ_k form a conjugate set.
• H_{k+1} is positive definite if H_k is positive definite.
• The estimate H_k is used to form a local quadratic approximation as before.
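In practice one rarely hand-rolls the update; a quick check with SciPy's built-in BFGS on the Rosenbrock function (my illustration):

```python
import numpy as np
from scipy.optimize import minimize

rosen = lambda x: 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2
res = minimize(rosen, x0=np.array([-1.0, 1.0]), method="BFGS")
print(res.x, res.nit)   # converges near (1, 1)
```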


BFGS EXAMPLE

The method converges in 34 iterations, compared to 18 for the full Newton method.


NON-LINEAR LEAST SQUARES

It is very common in applications for a cost function f(x) to be the sum of a large number of squared residuals:

f(x) = Σ_{i=1}^{M} r_i(x)².

If each residual depends non-linearly on the parameters x, then the minimization of f(x) is a non-linear least squares problem.


NON-LINEAR LEAST SQUARES

The M × N Jacobian of the vector of residuals r is defined as

J(x) = [ ∂r_i/∂x_j ], i = 1, …, M, j = 1, …, N.

Consider

f(x) = Σ_i r_i(x)² = r(x)^T r(x).

Hence

∇f(x) = 2 J(x)^T r(x).


NON-LINEAR LEAST SQUARES

For the Hessian it holds that

∇²f(x) = 2 J(x)^T J(x) + 2 Σ_i r_i(x) ∇²r_i(x).

Note that the second-order term in the Hessian is multiplied by the residuals r_i.
In most problems the residuals will typically be small; also, at the minimum, the residuals will typically be distributed with mean 0. For these reasons, the second-order term is often ignored:

∇²f(x) ≈ 2 J(x)^T J(x)   (the Gauss-Newton approximation).

Hence, explicit computation of the full Hessian can again be avoided.


GAUSS-NEWTON EXAMPLE

The minimization of the Rosenbrock function

f(x) = 100 (x_2 − x_1²)² + (1 − x_1)²

can be written as a least-squares problem with residual vector

r(x) = ( 10 (x_2 − x_1²), 1 − x_1 )^T.
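A minimal Gauss-Newton iteration for this residual vector (my illustration; step control such as the line search mentioned on the next slide is omitted):

```python
import numpy as np

def residuals(x):
    return np.array([10 * (x[1] - x[0]**2), 1 - x[0]])

def jacobian(x):                       # M x N = 2 x 2
    return np.array([[-20 * x[0], 10.0],
                     [-1.0,        0.0]])

x = np.array([-1.0, 1.0])
for _ in range(20):
    r, J = residuals(x), jacobian(x)
    # Gauss-Newton step: solve (J^T J) d = -J^T r
    d = np.linalg.solve(J.T @ J, -J.T @ r)
    x = x + d
print(x)    # -> [1, 1]
```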


GAUSS-NEWTON EXAMPLE

Minimization with the Gauss-Newton approximation with line search takes only 11 iterations.


COMPARISON

[Figure: side-by-side convergence paths on the Rosenbrock function for CG, Newton, Quasi-Newton, and Gauss-Newton.]


SIMPLEX

CONSTRAINED OPTIMIZATION


CONSTRAINED OPTIMIZATION

Minimize f(x) subject to:

equality constraints: a_i(x) = 0, i = 1, …, p
inequality constraints: c_j(x) ≥ 0, j = 1, …, m

The constraints define a feasible region, which is assumed nonempty.
The idea is to convert the problem to an unconstrained optimization.


EQUALITY CONSTRAINTS

Minimize f(x) subject to a_i(x) = 0 for i = 1, …, p.

The gradient of f(x) at a local minimizer x* is a linear combination of the gradients of the a_i(x), with Lagrange multipliers λ_i* as the coefficients:

∇f(x*) = Σ_i λ_i* ∇a_i(x*).


[Figure: contour lines f_3 > f_2 > f_1 with one equality constraint, four cases: x is not a minimizer where ∇f and ∇a are not parallel; x* is a minimizer with λ* > 0; x* is a minimizer with λ* < 0; x* is not a minimizer.]


3D EXAMPLE



3D EXAMPLE

At f(x) = 3, the gradients of the constraints and of the objective function are linearly independent.


3D EXAMPLE

At f(x) = 1, the gradients of the constraints and of the objective function are linearly dependent.


INEQUALITY CONSTRAINTS

Minimize f(x) subject to c_j(x) ≥ 0 for j = 1, …, m.

The gradient of f(x) at a local minimizer x* is a linear combination of the gradients of the active constraints c_j(x) (those with c_j(x*) = 0),

∇f(x*) = Σ_{j ∈ active} μ_j* ∇c_j(x*),

and the Lagrange multipliers must be positive: μ_j* > 0.


[Figure: contour lines f_3 > f_2 > f_1 with an inequality constraint, three cases: no active constraints at x* (the unconstrained minimum is feasible); x* is not a minimizer when μ < 0; x* is a minimizer when μ > 0.]


LAGRANGIAN

We can introduce the Lagrangian function

L(x, λ, μ) = f(x) − Σ_i λ_i a_i(x) − Σ_j μ_j c_j(x).

The necessary conditions for a local minimizer are

∇_x L(x, λ, μ) = 0,  μ_j ≥ 0,  μ_j c_j(x) = 0,

and x must be a feasible point (i.e., the constraints are satisfied).
These are the Karush-Kuhn-Tucker (KKT) conditions.


QUADRATIC PROGRAMMING (QP)

Like in the unconstrained case, it is important to study quadratic functions. Why?
Because general nonlinear problems are solved as a sequence of minimizations of their quadratic approximations.

QP with constraints:

Minimize (1/2) x^T H x + g^T x
subject to linear constraints.

H is symmetric and positive semidefinite.


QP WITH EQUALITY CONSTRAINTS

Minimize (1/2) x^T H x + g^T x
subject to: A x = b.

Assumption: A is p × N and has full row rank (p < N).

Convert to an unconstrained problem by variable elimination: parameterize the feasible set as

x = A⁺ b + Z φ,

where A⁺ is the pseudo-inverse of A and the columns of Z span the null space of A, and minimize the resulting quadratic over φ.

This quadratic unconstrained problem can be solved, e.g., by the Newton method.
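Equivalently, one can solve the KKT linear system of this problem directly; a small sketch (my illustration, not the slides' elimination method):

```python
import numpy as np

# minimize 0.5 x.H.x + g.x  subject to  A x = b
H = np.array([[4.0, 1.0], [1.0, 3.0]])
g = np.array([1.0, 1.0])
A = np.array([[1.0, 1.0]])        # p = 1, N = 2, full row rank
b = np.array([1.0])

# KKT system:  [H  A^T] [x     ]   [-g]
#              [A  0  ] [lambda] = [ b]
p = A.shape[0]
K = np.block([[H, A.T], [A, np.zeros((p, p))]])
sol = np.linalg.solve(K, np.concatenate([-g, b]))
x, lam = sol[:2], sol[2:]
print(x, lam, A @ x - b)          # constraint residual ~ 0
```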


QP WITH INEQUALITY CONSTRAINTS

Minimize (1/2) x^T H x + g^T x
subject to linear inequality constraints.

First we check whether the unconstrained minimizer x* = −H⁻¹ g is feasible.
If yes, we are done.
If not, we know that the minimizer must be on the boundary, and we proceed with an active-set method:

• x_k is the current feasible point
• A_k is the index set of the active constraints at x_k
• the next iterate is given by x_{k+1} = x_k + d_k


ACTIVE-SET METHOD

How to find d_k?

For the active constraints to remain active, a_i^T (x_k + d_k) = b_i, and thus a_i^T d_k = 0 for i ∈ A_k.

The objective function at x_k + d becomes

f(x_k + d) = (1/2) d^T H d + g_k^T d + f(x_k), where g_k = H x_k + g.

The major step is a QP sub-problem:

minimize (1/2) d^T H d + g_k^T d subject to: a_i^T d = 0, i ∈ A_k.

Two situations may occur: d_k = 0 or d_k ≠ 0.


ACTIVE-SET METHOD

If d_k = 0:
We check whether the KKT conditions are satisfied, i.e. μ_j ≥ 0 for all active constraints.
If YES, we are done.
If NO, we remove from the active set the constraint with the most negative μ_j and solve the QP sub-problem again, this time with fewer active constraints.

If d_k ≠ 0:
We can move to x_k + d_k, but some inactive constraints may be violated on the way.
In this case, we move by α_k d_k until the first inactive constraint becomes active, update A_k, and solve the QP sub-problem again, this time with more active constraints.


GENERAL NONLINEAR OPTIMIZATION

Minimize f(x)
subject to: a_i(x) = 0, c_j(x) ≥ 0,
where the objective function and constraints are nonlinear.

1. For a given iterate, approximate the Lagrangian by its Taylor series → QP problem.
2. Solve the QP → descent direction d_k.
3. Perform a line search in the direction d_k → x_{k+1}.
4. Update the Lagrange multipliers → (λ_{k+1}, μ_{k+1}).
5. Repeat from Step 1.


GENERAL NONLINEAR OPTIMIZATION

Lagrangian:

L(x, λ, μ) = f(x) − Σ_i λ_i a_i(x) − Σ_j μ_j c_j(x).

At the k-th iterate we have (x_k, λ_k, μ_k), and we want to compute a set of increments (δx, δλ, δμ).

A first-order approximation of ∇_x L and of the constraints around the iterate yields a linear system in the increments.

These approximate KKT conditions correspond to a QP program.


SQP EXAMPLE

Minimize

subject to:



LINEAR PROGRAMMING (LP)

LP is common in economics and is meaningful only with constraints. Two forms:

1. Minimize c^T x
   subject to: A x = b, x ≥ 0
   (A is p × N and has full row rank, p < N)

2. Minimize c^T x
   subject to: A x ≥ b

QP can solve LP.

If the LP minimizer exists, it must be one of the vertices of the feasible region. Prove it!

A fast method that considers vertices is the Simplex method.
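A quick way to experiment with the vertex claim is SciPy's linprog (my illustration; the data below are made up):

```python
import numpy as np
from scipy.optimize import linprog

# minimize c.x subject to A_ub x <= b_ub, x >= 0
c = np.array([-1.0, -2.0])                 # i.e. maximize x1 + 2 x2
A_ub = np.array([[1.0, 1.0], [1.0, 3.0]])
b_ub = np.array([4.0, 6.0])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, res.fun)   # optimum (3, 1) sits at a vertex of the feasible polygon
```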