2. Classical Optimization Technique


    Classical Optimization Techniques

    Prof. Keyur P Hirpara

    Assistant Professor

    [email protected]

    Relative and Global Optimum

A function f(x) is said to have a relative or local minimum at x = x* if f(x*) ≤ f(x* + h) for all sufficiently small positive and negative values of h, i.e. in the near vicinity of the point x*.

Similarly, a point x* is called a relative or local maximum if f(x*) ≥ f(x* + h) for all values of h sufficiently close to zero.

A function is said to have a global or absolute minimum at x = x* if f(x*) ≤ f(x) for all x in the domain over which f(x) is defined.

Similarly, a function is said to have a global or absolute maximum at x = x* if f(x*) ≥ f(x) for all x in the domain over which f(x) is defined.
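As a quick numerical illustration (not taken from the slides), the sketch below scans an assumed test function on a grid, flags points that are no larger than their immediate neighbours as relative minima, and takes the smallest sampled value as the global minimum; the function f and the interval are my own choices.

```python
import numpy as np

# Illustrative function (my own choice): a local minimum near x = 1
# and a deeper, global minimum near x = -1 on the interval [-2, 2].
def f(x):
    return (x**2 - 1)**2 + 0.3 * x

x = np.linspace(-2.0, 2.0, 4001)
y = f(x)

# A grid point counts as a relative (local) minimum if it is no larger
# than its immediate neighbours, mirroring f(x*) <= f(x* + h) for small h.
local_min_mask = (y[1:-1] <= y[:-2]) & (y[1:-1] <= y[2:])
local_minima = x[1:-1][local_min_mask]

# The global (absolute) minimum is the smallest value over the whole domain.
x_global = x[np.argmin(y)]

print("approximate local minima:", local_minima)
print("approximate global minimum at x = %.3f, f = %.3f" % (x_global, f(x_global)))
```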


    Functions of a single variable

Necessary condition:

For a single-variable function f(x) defined for x ∈ [a, b] that has a relative maximum at x = x*, with x* ∈ [a, b]: if the derivative f'(x) = df(x)/dx exists as a finite number at x = x*, then f'(x*) = 0.

Keep in mind that the above theorem holds for a relative minimum as well.

The theorem only considers a domain over which the function is continuous.


    Functions of a single variable

The theorem does not say what happens if a minimum or maximum occurs at a point x* where the derivative fails to exist.

The theorem does not say what happens if a minimum or maximum occurs at an endpoint of the interval of definition of the function. In this case the limit defining the derivative exists for positive values of h only or for negative values of h only, and hence the derivative is not defined at the endpoints.

The theorem does not say that the function necessarily will have a minimum or maximum at every point where the derivative is zero. The derivative may vanish at a point that is neither a minimum nor a maximum, i.e. an inflection point (see the figure of stationary points below).


    Stationary points

    Figure showing the three types of stationary points

    (a) minimum (b) maximum (c) inflection point

In general, a point x* at which f'(x*) = 0 is called a stationary point.


    Functions of a single variable

    Sufficient condition:

For the same function stated above, let f'(x*) = f''(x*) = · · · = f^(n-1)(x*) = 0, but f^(n)(x*) ≠ 0. Then it can be said that f(x*) is

a minimum value of f(x) if f^(n)(x*) > 0 and n is even;

a maximum value of f(x) if f^(n)(x*) < 0 and n is even;

neither a maximum nor a minimum if n is odd.
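The n-th derivative test above is easy to automate; the sketch below is my own addition (the helper name classify_stationary_point is an invented convenience). It checks successive derivatives at a stationary point until one is nonzero.

```python
import sympy as sp

x = sp.symbols('x')

def classify_stationary_point(f, x_star, max_order=8):
    """Higher-order derivative test at a stationary point x*.

    Finds the smallest n >= 2 with f^(n)(x*) != 0; even n with a positive
    (negative) derivative means a relative minimum (maximum), odd n means
    neither (an inflection point).
    """
    for n in range(2, max_order + 1):
        dn = sp.diff(f, x, n).subs(x, x_star)
        if dn != 0:
            if n % 2 == 1:
                return f"neither (inflection point), n = {n}"
            return ("relative minimum" if dn > 0 else "relative maximum") + f", n = {n}"
    return f"inconclusive up to order {max_order}"

print(classify_stationary_point(x**4, 0))   # relative minimum, n = 4
print(classify_stationary_point(x**3, 0))   # neither (inflection point), n = 3
print(classify_stationary_point(-x**6, 0))  # relative maximum, n = 6
```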


    Example

    Determine the maximum and minimum values of the function

f(x) = 12x^5 - 45x^4 + 40x^3 + 5

f'(x) = 60(x^4 - 3x^3 + 2x^2)

      = 60x^2 (x - 1)(x - 2)

f'(x) = 0 at x = 0, x = 1, and x = 2.

The second derivative is

f''(x) = 60(4x^3 - 9x^2 + 4x)

At x = 1, f''(x) = -60.

Since the second derivative is negative, x = 1 is a relative maximum.

f_max = f(x = 1) = 12

At x = 2, f''(x) = 240.

Since the second derivative is positive, x = 2 is a relative minimum.

f_min = f(x = 2) = -11

At x = 0, f''(x) = 0, so we must investigate the next derivative:

f'''(x) = 60(12x^2 - 18x + 4) = 240 at x = 0

Since f'''(0) = 240 ≠ 0 and the first nonvanishing derivative is of order n = 3, which is odd, x = 0 is neither a maximum nor a minimum; it is an inflection point.
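To double-check the arithmetic above, here is a short SymPy sketch (my own addition) that recovers the stationary points, the derivative values, and the function values used in this example:

```python
import sympy as sp

x = sp.symbols('x')
f = 12*x**5 - 45*x**4 + 40*x**3 + 5

f1, f2, f3 = sp.diff(f, x), sp.diff(f, x, 2), sp.diff(f, x, 3)

# Stationary points: roots of f'(x) = 0
stationary = sorted(sp.solve(sp.Eq(f1, 0), x))
print("stationary points:", stationary)          # [0, 1, 2]

for p in stationary:
    print("x =", p,
          " f'' =", f2.subs(x, p),               # 0 at x=0, -60 at x=1, 240 at x=2
          " f''' =", f3.subs(x, p),              # 240 at x=0 -> inflection point
          " f =", f.subs(x, p))                  # f(1) = 12, f(2) = -11
```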


    Example

Find the optimum value of the function f(x) = x^2 + 3x - 5 and

    also state if the function attains a maximum or a minimum.

    Solution:

f'(x) = 2x + 3 = 0 for a maximum or a minimum,

which gives x* = -3/2.

f''(x) = 2

which is positive; hence the point x* = -3/2 is a point of minimum, and the function attains its minimum value f(x*) = 9/4 - 9/2 - 5 = -29/4 at this point.


    Multivariable with No constraint

    Necessary conditions

If f(X) has an extreme point (maximum or minimum) at X = X* and if the first partial derivatives of f(X) exist at X*, then

∂f/∂x1 (X*) = ∂f/∂x2 (X*) = · · · = ∂f/∂xn (X*) = 0
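A minimal sketch of these conditions (the quadratic f below is my own illustrative choice, not from the slides): set every first partial derivative to zero and solve for the stationary point.

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')

# Illustrative two-variable function (my own choice, not from the slides)
f = x1**2 + x2**2 - 4*x1 - 6*x2 + 13

# Necessary condition: all first partial derivatives vanish at X = X*
grad = [sp.diff(f, v) for v in (x1, x2)]
stationary = sp.solve(grad, (x1, x2), dict=True)
print("stationary point(s):", stationary)   # [{x1: 2, x2: 3}]
```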


    Multivariable with no constraint

    Sufficient Condition:

A sufficient condition for a stationary point X* to be an extreme point is that the matrix of second partial derivatives (Hessian matrix) of f(X) evaluated at X* is

positive definite when X* is a relative minimum point, and

negative definite when X* is a relative maximum point.

The Hessian matrix collects the second-order partial derivatives, H(X) = [∂²f/∂x_i ∂x_j].


    Multivariable with no constraint

A square matrix is positive definite if all its eigenvalues are positive, and it is negative definite if all its eigenvalues are negative. If some of the eigenvalues are positive and some are negative, then the matrix is neither positive definite nor negative definite.

To calculate the eigenvalues λ of a square matrix A, the following equation is solved:

|A - λI| = 0
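A small NumPy sketch of this eigenvalue test (the helper name definiteness is my own); it is applied to the Hessian of the illustrative quadratic used above, which is constant:

```python
import numpy as np

def definiteness(H, tol=1e-12):
    """Classify a symmetric matrix via its eigenvalues (roots of |H - lam*I| = 0)."""
    eig = np.linalg.eigvalsh(H)          # eigenvalues of a symmetric matrix
    if np.all(eig > tol):
        return "positive definite -> relative minimum"
    if np.all(eig < -tol):
        return "negative definite -> relative maximum"
    if np.any(eig > tol) and np.any(eig < -tol):
        return "indefinite -> saddle point"
    return "semidefinite -> test inconclusive"

# Hessian of the illustrative function f = x1^2 + x2^2 - 4*x1 - 6*x2 + 13
H = np.array([[2.0, 0.0],
              [0.0, 2.0]])
print(definiteness(H))   # positive definite -> relative minimum
```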


    Sufficient condition

Another test that can be used to check the positive definiteness of a matrix A of order n involves evaluating the determinants of its leading principal submatrices,

A1 = |a11|,  A2 = |a11 a12; a21 a22|,  . . . ,  An = |A|

The matrix A is positive definite if and only if all of these determinants are positive.
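A minimal sketch of this determinant test (my own, on an assumed symmetric 2×2 matrix):

```python
import numpy as np

def leading_principal_minors(A):
    """Determinants A1, A2, ..., An of the upper-left k x k submatrices."""
    return [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
minors = leading_principal_minors(A)
print("leading principal minors:", minors)            # approx [4.0, 11.0]
print("positive definite:", all(m > 0 for m in minors))
```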


    Nature of the extreme points

Point X = X*   Sign of a11   Sign of |H(X*)|   Nature of H(X*)               Nature of X*       f(X*)
(x1, x2)       +ve           +ve               Positive definite (convex)    Relative minimum   Value of the function at its minimum
(x1, x2)       +ve           -ve               Indefinite                    Saddle point       Value of the function at (x1, x2)
(x1, x2)       -ve           -ve               Indefinite                    Saddle point       Value of the function at (x1, x2)
(x1, x2)       -ve           +ve               Negative definite (concave)   Relative maximum   Value of the function at its maximum


    Nature of the extreme points

In the case of a function of two variables, f(x, y), the Hessian matrix may be neither positive definite nor negative definite at a point (x*, y*) at which

∂f/∂x = ∂f/∂y = 0

In such a case, the point (x*, y*) is called a saddle point. It corresponds to a relative minimum or maximum of f(x, y) with respect to one variable, say x (the other variable being fixed at y = y*), and a relative maximum or minimum of f(x, y) with respect to the second variable y (the other variable being fixed at x = x*).

As an example, consider the function f(x, y) = x^2 - y^2. For this function,

∂f/∂x = 2x and ∂f/∂y = -2y

These first derivatives are zero at x* = 0 and y* = 0. The Hessian matrix of f at (x*, y*) is given by

H = [[2, 0], [0, -2]]


Since this matrix is neither positive definite nor negative definite, the point (x* = 0, y* = 0) is a saddle point.

The function is shown graphically in the figure.

It can be seen that f(x, 0) = x^2 has a relative minimum and f(0, y) = -y^2 has a relative maximum at the saddle point (x*, y*).

    Saddle points may exist for functions of more than two variables also.
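The same eigenvalue check, applied to this Hessian (a short sketch, not part of the slides):

```python
import numpy as np

# Hessian of f(x, y) = x**2 - y**2 at the stationary point (0, 0)
H = np.array([[ 2.0,  0.0],
              [ 0.0, -2.0]])

eig = np.linalg.eigvalsh(H)
print("eigenvalues:", eig)                     # [-2.  2.] -> one negative, one positive
print("saddle point:", eig.min() < 0 < eig.max())
```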


    Example

    Figure shows two frictionless rigid bodies (carts) A and B connected

    by three linear elastic springs having spring constants k1, k2, and k3.

The springs are at their natural positions when the applied force P is zero. Find the displacements x1 and x2 under the force P by using

    the principle of minimum potential energy.


    The potential energy of the system is given by

potential energy (U) = strain energy of the springs - work done by the external force

The necessary conditions for the minimum of U are

∂U/∂x1 = 0 and ∂U/∂x2 = 0

The values of x1 and x2 corresponding to the equilibrium state are obtained by solving these two equations.


The sufficiency conditions for the minimum at (x1*, x2*) can also be verified by testing the positive definiteness of the Hessian matrix of U evaluated at (x1*, x2*).

The determinants of the leading square submatrices of this Hessian must all be positive.
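The explicit equations depend on the spring layout in the figure, which is not reproduced here. Purely as a sketch, assume the common arrangement in which spring k1 connects cart A to a wall, k2 connects the two carts, and k3 connects cart B to the opposite wall, with P applied to cart B; the potential energy U = ½k1x1² + ½k2(x2 - x1)² + ½k3x2² - Px2 used below is my assumption, not stated in the text.

```python
import sympy as sp

x1, x2, k1, k2, k3, P = sp.symbols('x1 x2 k1 k2 k3 P', positive=True)

# Assumed potential energy of the two-cart, three-spring system (see lead-in)
U = sp.Rational(1, 2)*(k1*x1**2 + k2*(x2 - x1)**2 + k3*x2**2) - P*x2

# Necessary conditions for the minimum of U: dU/dx1 = 0 and dU/dx2 = 0
eqs = [sp.diff(U, x1), sp.diff(U, x2)]
sol = sp.solve(eqs, (x1, x2), dict=True)[0]
print("x1* =", sp.simplify(sol[x1]))
print("x2* =", sp.simplify(sol[x2]))

# Sufficiency: the Hessian of U must be positive definite
H = sp.hessian(U, (x1, x2))
print("Hessian:", H)                               # Matrix([[k1 + k2, -k2], [-k2, k2 + k3]])
print("leading minors:", H[0, 0], sp.expand(H.det()))
```

With this assumed energy the Hessian is constant, and both leading minors are positive whenever the spring constants are positive, so the stationary point is indeed the minimum of U.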


    Example

    Find the extreme points of the function

The necessary conditions for the existence of an extreme point are ∂f/∂x1 = 0 and ∂f/∂x2 = 0.

    These equations are satisfied at the points

    To find the nature of these extreme points, we have to use the sufficiency

conditions. The second-order partial derivatives of f are given by


    The Hessian matrix of f is given by

The nature of each extreme point is as given below.


    Multivariable with Equality constraint

    Direct Substitution method

    Example: Find the dimensions of a box of largest volume that can

    be inscribed in a sphere of unit radius.

Let the origin of the Cartesian coordinate system x1, x2, x3 be at

    the center of the sphere and the sides of the box be 2x1, 2x2,

    and 2x3. The volume of the box is given by:

    f(x1, x2, x3) = 8x1x2x3


Since the corners of the box lie on the surface of the sphere of unit radius, x1, x2, and x3 have to satisfy the constraint

x1^2 + x2^2 + x3^2 = 1

This problem has three design variables and one equality constraint. Hence the equality constraint can be used to eliminate any one of the design variables from the objective function. If we choose to eliminate x3,

x3 = (1 - x1^2 - x2^2)^(1/2)

Thus the objective function becomes

f(x1, x2) = 8 x1 x2 (1 - x1^2 - x2^2)^(1/2)

which can be maximized as an unconstrained function in two variables.


The necessary conditions for the maximum of f, ∂f/∂x1 = 0 and ∂f/∂x2 = 0, give x1* = x2* = 1/√3, and hence x3* = 1/√3.


To find whether the solution found corresponds to a maximum or a minimum, we apply the sufficiency conditions to f(x1, x2).

Since the Hessian matrix of f(x1, x2) evaluated at (x1*, x2*) is negative definite,

the point (x1*, x2*) corresponds to the maximum of f.
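A short SymPy check of the direct-substitution solution (my own sketch; the polynomial form of the stationarity conditions is an intermediate simplification not shown on the slides):

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2', positive=True)

# Objective after eliminating x3 via the constraint x1^2 + x2^2 + x3^2 = 1
f = 8*x1*x2*sp.sqrt(1 - x1**2 - x2**2)

# The conditions df/dx1 = 0 and df/dx2 = 0 reduce (after clearing the
# square root) to two polynomial equations:
eqs = [1 - 2*x1**2 - x2**2, 1 - x1**2 - 2*x2**2]
sol = sp.solve(eqs, (x1, x2), dict=True)[0]
print("stationary point:", sol)                       # x1 = x2 = sqrt(3)/3

# Sufficiency: the Hessian at this point should be negative definite
H = sp.hessian(f, (x1, x2)).subs(sol)
print("a11 =", sp.simplify(H[0, 0]), " det H =", sp.simplify(H.det()))
print("maximum volume:", sp.simplify(f.subs(sol)))    # 8*sqrt(3)/9
```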


    Solution by Constrained Variation Method

    Variations about A

In the above figure, PQ indicates the curve along which the constraint is satisfied at every point. If A is taken as the base point (x1*, x2*), the variations in x1 and x2 leading to points B and C are called admissible variations. On the other hand, the variations in x1 and x2 representing point D are not admissible, since point D does not lie on the constraint curve g(x1, x2) = 0.

Necessary condition: for the problem of minimizing f(x1, x2) subject to g(x1, x2) = 0, the variation of f restricted to the constraint curve must vanish at (x1*, x2*), which gives

(∂f/∂x1)(∂g/∂x2) - (∂f/∂x2)(∂g/∂x1) = 0 at (x1*, x2*)


    Example

    A beam of uniform rectangular cross section is to be cut from a log having a

    circular cross section of diameter 2a. The beam has to be used as a

cantilever beam (the length is fixed) to carry a concentrated load at the free end. Find the dimensions of the beam that correspond to the maximum

    tensile (bending) stress carrying capacity.


We know that the tensile stress (σ) induced in a rectangular beam at any fiber located a distance y from the neutral axis is given by

σ = M y / I

where M is the bending moment acting and I is the moment of inertia of the cross section about the x axis.

If the width and depth of the rectangular beam shown in the figure are 2x and 2y, respectively, then I = (1/12)(2x)(2y)^3 and the maximum tensile stress induced is given by

σ_max = 3M / (4 x y^2)

subject to the constraint

x^2 + y^2 = a^2

This problem has two variables and one constraint:

f = k x^(-1) y^(-2),   g = x^2 + y^2 - a^2   (where k = 3M/4)


Applying the constrained-variation condition (∂f/∂x)(∂g/∂y) - (∂f/∂y)(∂g/∂x) = 0 to f = k x^(-1) y^(-2) and g = x^2 + y^2 - a^2 gives

y^2 = 2 x^2

Substituting this value into the constraint x^2 + y^2 = a^2 gives

x* = a/√3,   y* = √2 a/√3

so the optimum width and depth of the beam are 2x* = 2a/√3 and 2y* = 2√2 a/√3.
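A compact SymPy sketch of the constrained-variation step, using the reconstructed f and g from above (my own addition):

```python
import sympy as sp

x, y, a, k = sp.symbols('x y a k', positive=True)

f = k / (x * y**2)            # maximum induced bending stress, k = 3M/4
g = x**2 + y**2 - a**2        # corner of the rectangle must lie on the circle: g = 0

# Constrained-variation necessary condition:
#   (df/dx)(dg/dy) - (df/dy)(dg/dx) = 0
condition = sp.diff(f, x)*sp.diff(g, y) - sp.diff(f, y)*sp.diff(g, x)

# Clear denominators to get a polynomial condition: 2*k*(2*x**2 - y**2) = 0
poly_condition = sp.numer(sp.together(condition))

sol = sp.solve([poly_condition, g], (x, y), dict=True)
print(sol)   # x = sqrt(3)*a/3, y = sqrt(6)*a/3, i.e. y**2 = 2*x**2
```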


Solution by the Lagrange Multipliers method

Take the example of two variables and one constraint:

Minimize f(x1, x2)

subject to g(x1, x2) = 0

The necessary condition for the existence of an extreme point at X = X* was found in the previous section to be

∂f/∂x1 - (∂f/∂x2)/(∂g/∂x2) · ∂g/∂x1 = 0    (Eq-1)

By defining a quantity λ, called the Lagrange multiplier, as

λ = - (∂f/∂x2)/(∂g/∂x2), evaluated at (x1*, x2*)    (Eq-2)

Eq-1 can be expressed as

∂f/∂x1 + λ ∂g/∂x1 = 0, evaluated at (x1*, x2*)    (Eq-3)


Eq-2 can be expressed as

∂f/∂x2 + λ ∂g/∂x2 = 0, evaluated at (x1*, x2*)    (Eq-4)

In addition, the constraint g(x1*, x2*) = 0 must itself be satisfied. Hence Eq-3 and Eq-4, together with the constraint, represent the necessary conditions for the point (x1*, x2*) to be an extreme point.

These necessary conditions require that at least one of the partial derivatives of g(x1, x2) be non-zero at the extreme point, so that λ is well defined.

The necessary conditions given by Eq-3 and Eq-4 are more commonly generated by constructing a function L, known as the Lagrange function, as

L(x1, x2, λ) = f(x1, x2) + λ g(x1, x2)


By treating L as a function of the three variables x1, x2, and λ, the necessary conditions for its extremum are given by

∂L/∂x1 = ∂f/∂x1 + λ ∂g/∂x1 = 0

∂L/∂x2 = ∂f/∂x2 + λ ∂g/∂x2 = 0

∂L/∂λ = g(x1, x2) = 0
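A minimal SymPy sketch of the Lagrange-multiplier conditions on a toy problem of my own choosing (minimize x1² + x2² subject to x1 + x2 = 2):

```python
import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lam')

# Illustrative problem (my own choice): minimize f subject to g = 0
f = x1**2 + x2**2
g = x1 + x2 - 2

# Lagrange function L = f + lam * g
L = f + lam * g

# Necessary conditions: dL/dx1 = dL/dx2 = dL/dlam = 0
conditions = [sp.diff(L, v) for v in (x1, x2, lam)]
print(sp.solve(conditions, (x1, x2, lam), dict=True))   # [{x1: 1, x2: 1, lam: -2}]
```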


Find the solution of the previous example using the Lagrange multiplier

    method:

The necessary conditions for the minimum of f(x, y) are ∂L/∂x = 0, ∂L/∂y = 0, and ∂L/∂λ = 0, where L = k x^(-1) y^(-2) + λ(x^2 + y^2 - a^2).

Solving for λ from each of the first two equations, equating the two expressions, and substituting into the third equation recovers the earlier result: y^2 = 2x^2, so x* = a/√3 and y* = √2 a/√3.


Necessary Condition for a General Problem

The Lagrange function L in this case is defined by introducing one Lagrange multiplier λj for each constraint gj(X) as

L(x1, x2, . . . , xn, λ1, λ2, . . . , λm) = f(X) + λ1 g1(X) + λ2 g2(X) + · · · + λm gm(X)

The necessary conditions for an extremum are then ∂L/∂xi = 0 (i = 1, . . . , n) and ∂L/∂λj = gj(X) = 0 (j = 1, . . . , m).


Sufficient Condition for a General Problem

A sufficient condition for f(X) to have a relative extremum at X* is based on the roots z of the determinantal equation

det( [ L - zI   Gᵀ ;  G   0 ] ) = 0

where L = [∂²L/∂x_i ∂x_j] is the Hessian of the Lagrange function with respect to X, G = [∂g_j/∂x_i] is the matrix of constraint gradients, and I is the identity matrix, all evaluated at (X*, λ*).

If every root z of this polynomial equation is positive, the function has a relative minimum at X*.

If every root z is negative, the function has a relative maximum at X*.

If some of the roots of this polynomial are positive while the others are negative, the point X* is not an extreme point.
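A small SymPy sketch of this root test, applied to an assumed toy problem of mine (minimize x1² + x2² subject to x1 + x2 - 2 = 0, for which the Hessian of L is constant):

```python
import sympy as sp

z = sp.symbols('z')

# Assumed toy problem: minimize f = x1^2 + x2^2 subject to g = x1 + x2 - 2 = 0.
# Hessian of the Lagrange function w.r.t. (x1, x2) and the constraint gradient:
L_hess = sp.Matrix([[2, 0],
                    [0, 2]])
G = sp.Matrix([[1, 1]])                      # [dg/dx1, dg/dx2]

# Bordered determinantal equation: det([[L - z*I, G^T], [G, 0]]) = 0
M = sp.BlockMatrix([[L_hess - z*sp.eye(2), G.T],
                    [G, sp.zeros(1, 1)]]).as_explicit()
roots = sp.solve(sp.Eq(M.det(), 0), z)
print("roots z:", roots)                     # [2] -> all positive, so X* is a relative minimum
```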


Multivariable optimization with inequality constraint