Backpropagation




ENGM 646 II. Unconstrained Optimization

6. Backpropagation training algorithm: Consider a three-layer neural network with the input layer, the hidden layer, and the output layer shown in Figure 13.6. There are $n$ inputs, $m$ outputs, and $l$ neurons in the hidden layer.

Input: $x_1, x_2, \ldots, x_n$

Input to the hidden layer: $v_j$ for $j = 1, 2, \ldots, l$

Output: $y_1, y_2, \ldots, y_m$

Output from the hidden layer: $z_j$ for $j = 1, 2, \ldots, l$

Connection weights to the hidden layer: $w_{ji}^h$ for $j = 1, 2, \ldots, l$ and $i = 1, 2, \ldots, n$

Connection weights to the output layer: $w_{kj}^o$ for $j = 1, 2, \ldots, l$ and $k = 1, 2, \ldots, m$

Activation functions: $f_j^h$ for $j = 1, 2, \ldots, l$ and $f_s^o$ for $s = 1, 2, \ldots, m$


$$v_j = \sum_{i=1}^{n} w_{ji}^h x_i, \qquad j = 1, 2, \ldots, l$$

$$z_j = f_j^h(v_j), \qquad j = 1, 2, \ldots, l$$

$$y_s = f_s^o\left( \sum_{j=1}^{l} w_{sj}^o z_j \right), \qquad s = 1, 2, \ldots, m$$

Substituting the first two equations into the third,

$$y_s = f_s^o\left( \sum_{j=1}^{l} w_{sj}^o z_j \right) = f_s^o\left( \sum_{j=1}^{l} w_{sj}^o f_j^h(v_j) \right) = f_s^o\left( \sum_{j=1}^{l} w_{sj}^o f_j^h\left( \sum_{i=1}^{n} w_{ji}^h x_i \right) \right) = F_s(x_1, \ldots, x_n)$$
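To make the forward pass concrete, here is a minimal sketch in Python/NumPy. The array names (W_h, W_o) and the choice of the logistic activation are illustrative assumptions; the notes leave the activation functions $f_j^h$ and $f_s^o$ general.

```python
import numpy as np

def sigmoid(v):
    # One possible activation choice; the notes keep f general.
    return 1.0 / (1.0 + np.exp(-v))

def forward(x, W_h, W_o, f_h=sigmoid, f_o=sigmoid):
    """Forward pass of the three-layer network.

    x   : input vector, shape (n,)
    W_h : hidden-layer weights w^h_{ji}, shape (l, n)
    W_o : output-layer weights w^o_{sj}, shape (m, l)
    Returns (v, z, y) so the intermediate values can be reused in training.
    """
    v = W_h @ x        # v_j = sum_i w^h_{ji} x_i
    z = f_h(v)         # z_j = f^h_j(v_j)
    y = f_o(W_o @ z)   # y_s = f^o_s(sum_j w^o_{sj} z_j)
    return v, z, y
```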

First consider a single training data point $(x_d, y_d)$, where $x_d \in \mathbb{R}^n$ and $y_d \in \mathbb{R}^m$. We need to find the weights $w_{ji}^h$ for $j = 1, 2, \ldots, l$ and $i = 1, 2, \ldots, n$ and $w_{kj}^o$ for $j = 1, 2, \ldots, l$ and $k = 1, 2, \ldots, m$ such that the following objective function is minimized:

$$\text{Minimize } E(w) = \frac{1}{2} \sum_{s=1}^{m} (y_{ds} - y_s)^2$$

where $y_s$, whose equation is given earlier, is a function of the input data $x_d$ and the unknown weights to be optimized. To solve this

    unconstrained optimization problem, we may use a gradient

    method with a fixed step size. An iterative procedure is needed

    with a proper stopping criterion. We need a starting point, that is,

    initial guesses of the weights of the neural network.
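As a sketch of such a procedure, the loop below implements a fixed step-size gradient method with a gradient-norm stopping criterion. The function grad_E, the tolerance tol, and the iteration cap max_iter are hypothetical placeholders, not quantities from the notes.

```python
import numpy as np

def gradient_descent(w0, grad_E, eta, tol=1e-6, max_iter=10000):
    """Fixed step-size gradient method: w(k+1) = w(k) - eta * grad E(w(k)).

    w0     : initial guess for the weights (flattened into one vector)
    grad_E : hypothetical function returning the gradient of E at w
    eta    : fixed step size (learning rate)
    Stops when the gradient norm drops below tol or max_iter is reached.
    """
    w = np.asarray(w0, dtype=float)
    for k in range(max_iter):
        g = grad_E(w)
        if np.linalg.norm(g) < tol:  # stopping criterion
            break
        w = w - eta * g
    return w
```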


Defining

$$\delta_s = (y_{ds} - y_s)\, {f_s^o}'\left( \sum_{q=1}^{l} w_{sq}^o z_q \right), \qquad s = 1, 2, \ldots, m,$$

we can express the gradient $\nabla E(w)$ (with respect to $w_{ji}^h$ and $w_{sj}^o$) as follows:

$$\frac{\partial E(w)}{\partial w_{ji}^h} = -\left( \sum_{p=1}^{m} \delta_p w_{pj}^o \right) {f_j^h}'(v_j)\, x_{di}$$

$$\frac{\partial E(w)}{\partial w_{sj}^o} = -\delta_s z_j$$
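As a concrete sketch of these formulas, the function below computes both gradient components for a single training pair, assuming the logistic activation $f(u) = 1/(1 + e^{-u})$, whose derivative is $f'(u) = f(u)(1 - f(u))$; the array names and shapes are illustrative.

```python
import numpy as np

def gradients(x_d, y_d, W_h, W_o):
    """Components of grad E(w) for a single training pair (x_d, y_d).

    Assumes the logistic activation, so f'(u) = f(u)(1 - f(u)).
    Returns (dE_dWh, dE_dWo) with the same shapes as W_h and W_o.
    """
    # Forward pass.
    v = W_h @ x_d
    z = 1.0 / (1.0 + np.exp(-v))
    y = 1.0 / (1.0 + np.exp(-(W_o @ z)))

    # delta_s = (y_ds - y_s) f^o_s'(sum_q w^o_{sq} z_q); for the logistic
    # activation this derivative equals y_s (1 - y_s).
    delta = (y_d - y) * y * (1.0 - y)              # shape (m,)

    dE_dWo = -np.outer(delta, z)                   # dE/dw^o_{sj} = -delta_s z_j
    back = W_o.T @ delta                           # sum_p delta_p w^o_{pj}
    dE_dWh = -np.outer(back * z * (1.0 - z), x_d)  # dE/dw^h_{ji}
    return dE_dWh, dE_dWo
```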

The fixed step-size gradient method uses the following iterative equation:

$$w^{(k+1)} = w^{(k)} - \eta\, \nabla E(w^{(k)}), \qquad k = 0, 1, 2, \ldots$$

where $\eta$ is called the learning rate. Explicitly, we have

$$w_{ji}^{h(k+1)} = w_{ji}^{h(k)} + \eta \left( \sum_{p=1}^{m} \delta_p^{(k)} w_{pj}^{o(k)} \right) {f_j^h}'(v_j^{(k)})\, x_{di}$$

$$w_{sj}^{o(k+1)} = w_{sj}^{o(k)} + \eta\, \delta_s^{(k)} z_j^{(k)}$$

The update equation for the weights $w_{sj}^o$ of the output layer is illustrated in Figure 13.7. The update equation for the weights $w_{ji}^h$ of the hidden layer is illustrated in Figure 13.8.
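Combining the gradient computation with the fixed-step update gives one backpropagation iteration. This is a minimal sketch that reuses the hypothetical gradients function defined above.

```python
def backprop_step(x_d, y_d, W_h, W_o, eta):
    """One iteration w(k+1) = w(k) - eta * grad E(w(k)) for both layers.

    gradients(...) is the sketch defined earlier in these notes.
    """
    dE_dWh, dE_dWo = gradients(x_d, y_d, W_h, W_o)
    return W_h - eta * dE_dWh, W_o - eta * dE_dWo
```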


This algorithm is called the backpropagation algorithm because the output errors $\delta_1, \delta_2, \ldots, \delta_m$ are propagated back from the output layer to the other layers and are used to update the weights in these layers.


Example II.19: Consider a neural network with 2 inputs, 2 hidden neurons, and 1 output neuron. The activation function for all neurons is given by $f(v) = 1/(1 + e^{-v})$. The starting point is $(w_{11}^{h(0)}, w_{12}^{h(0)}, w_{21}^{h(0)}, w_{22}^{h(0)}, w_{11}^{o(0)}, w_{12}^{o(0)}) = (0.1, 0.3, 0.3, 0.4, 0.4, 0.6)$. The learning rate is $\eta = 10$. Consider a single training input-output pair with $x = (0.2, 0.6)^T$ and $y = 0.7$. See Figure 13.9. The results of 21 iterations of the backpropagation algorithm are given in the attached table.
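Assuming the logistic activation, the example setup can be run with the sketches above; the printed output is computed by this code, not copied from the attached table.

```python
import numpy as np

# Example II.19 setup: 2 inputs, 2 hidden neurons, 1 output neuron.
W_h = np.array([[0.1, 0.3],    # w^h_11, w^h_12
                [0.3, 0.4]])   # w^h_21, w^h_22
W_o = np.array([[0.4, 0.6]])   # w^o_11, w^o_12
eta = 10.0
x_d = np.array([0.2, 0.6])
y_d = np.array([0.7])

for k in range(21):            # 21 iterations, as in the example
    W_h, W_o = backprop_step(x_d, y_d, W_h, W_o, eta)

_, _, y = forward(x_d, W_h, W_o)  # forward pass from the first sketch
print(y)                          # trained output, driven toward y = 0.7
```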
