Neural Network and Fuzzy Logic (Lec 2)



    Neural Network and Fuzzy Logic EC5245

Lecture (2)

    Dr. Tahani Abdalla Attia


    Architectures of Artificial Neural Networks:

Artificial Neural Networks (ANNs) have grown in popularity over the last ten years as novel architectures and algorithms have been developed for solving a range of different problems. Many of the applications fall into one of two groupings: those which involve the allocation of patterns to known classes (pattern classification, or supervised learning) and those which involve the clustering of patterns into similar groups (unsupervised learning). General schematic diagrams for the architecture of a neural network are shown in the following figures:


    Architectures of Artificial Neural Networks:

Three different classes of network architectures:

- single-layer feed-forward
- multi-layer feed-forward
- recurrent

In the two feed-forward classes, the neurons are organized in acyclic layers.

The architecture of a neural network is linked with the learning algorithm used to train it.


    Single Layer Feed-forward

(Figure: an input layer of source nodes projecting onto an output layer of neurons.)



Multi-layer feed-forward

(Figure: a 3-4-2 network with an input layer, a hidden layer, and an output layer.)


    Feedforward Neural Network


- The neurons are arranged in separate layers.
- There is no connection between the neurons in the same layer.
- The neurons in one layer receive inputs from the previous layer.
- The neurons in one layer deliver their output to the next layer.
- The connections are unidirectional (hierarchical).
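To make this layer-to-layer flow concrete, the following Python sketch (not part of the lecture) pushes an input through a small feed-forward network; the layer sizes and the tanh activation are illustrative assumptions.

# Minimal sketch of a strictly layered, unidirectional forward pass.
# Layer sizes and the tanh activation are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
layer_sizes = [3, 4, 2]                      # e.g. the 3-4-2 network shown above

# one weight matrix and one bias vector per pair of consecutive layers
weights = [rng.standard_normal((layer_sizes[i + 1], layer_sizes[i]))
           for i in range(len(layer_sizes) - 1)]
biases = [np.zeros(layer_sizes[i + 1]) for i in range(len(layer_sizes) - 1)]

def forward(p):
    # each layer receives input only from the previous layer and
    # delivers its output only to the next layer
    a = np.asarray(p, dtype=float)
    for W, b in zip(weights, biases):
        a = np.tanh(W @ a + b)
    return a

print(forward([1.0, 0.5, -0.3]))             # two-element output vector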



    Fully Connected Feedforward Multilayer Perceptron With Biases


Neural Network With Feedback (Recurrent)

Some connections are present from a layer back to the previous layers.
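The following Python sketch (not part of the lecture) illustrates such a feedback connection: the layer's previous output is fed back into its own input at the next time step. The sizes, the tanh activation, and the input sequence are assumptions.

# Minimal sketch of a recurrent layer: its own previous output is fed back
# through feedback weights at every step (sizes and tanh are assumptions).
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hidden = 3, 4
W_in = rng.standard_normal((n_hidden, n_in))       # forward connections
W_fb = rng.standard_normal((n_hidden, n_hidden))   # feedback connections

def run(input_sequence):
    a = np.zeros(n_hidden)                          # state that is fed back
    for p in input_sequence:
        a = np.tanh(W_in @ np.asarray(p, dtype=float) + W_fb @ a)
    return a

print(run([[1.0, 0.0, 0.5], [0.2, -1.0, 0.3]]))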


Associative Neural Network

There is no hierarchical arrangement. The connections can be bidirectional.



    Part of a large initially random network


    Attractor Neural Network



    3-8-8-2 Neural Network


Ex.: Advanced System Modeling and Control of Bioregenerative Life Support.


Ex.: Feedforward ANN designed and tested for prediction of tactical air combat maneuvers.


Computational neurobiologists have constructed very elaborate computer models of neurons in order to run detailed simulations of particular circuits in the brain. As computer scientists, we are more interested in the general properties of neural networks, independent of how they are actually "implemented" in the brain. This means that we can use much simpler, abstract "neurons", which (hopefully) capture the essence of neural computation even if they leave out much of the detail of how biological neurons work.



    Neuron Abstraction



    Simple Artificial Neuron


Our basic computational element (model neuron) is often called a node or unit. It receives input from some other units, or perhaps from an external source. Each input has an associated weight w, which can be modified so as to model synaptic learning. The unit computes some function f of the weighted sum of its inputs:

    yi = f( Σj wij xj )

Its output, in turn, can serve as input to other units.



The weighted sum is called the net input to unit i, often written neti. Note that wij refers to the weight from unit j to unit i (not the other way around). The function f is the unit's activation function. In the simplest case, f is the identity function, and the unit's output is just its net input. This is called a linear unit.
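The following Python sketch (not part of the lecture) writes these definitions out directly; the weights and inputs are made-up illustrative values.

# Minimal sketch: net_i = sum over j of w[i][j] * x[j], and a linear unit
# whose activation function f is the identity. Values are illustrative.
import numpy as np

x = np.array([0.5, -1.0, 2.0])          # outputs of the units j feeding unit i
w_i = np.array([0.1, 0.4, -0.2])        # w[i][j] = weight from unit j to unit i

net_i = np.dot(w_i, x)                  # the net input to unit i

def f(net):                             # identity activation -> a linear unit
    return net

y_i = f(net_i)                          # the output is just the net input
print(net_i, y_i)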


    Simple neuron models, with and without bias

a = f(wp + b)


Here the scalar input p is transmitted through a connection that multiplies its strength by the scalar weight w to form the product wp, again a scalar. The weighted input wp is the only argument of the transfer function f, which produces the scalar output a of the neuron on the left. The neuron on the right has a scalar bias b. One may view the bias as simply being added to the product wp, as shown by the summing junction, or as shifting the function f to the left by an amount b. The bias is much like a weight, except that it has a constant input of 1.


The most commonly used transfer functions are the hard-limit transfer function, the linear transfer function, the log-sigmoid transfer function, and the hyperbolic tangent sigmoid transfer function.
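The following Python sketch (not part of the lecture) defines these four transfer functions and applies each one to the biased neuron a = f(wp + b); the function names and numeric values are illustrative assumptions, not prescribed by the lecture.

# Minimal sketch: the four transfer functions named above, applied to the
# scalar neuron a = f(w*p + b). Names and numeric values are assumptions.
import numpy as np

def hardlim(n):                  # hard-limit: 1 if n >= 0, otherwise 0
    return np.where(n >= 0, 1.0, 0.0)

def purelin(n):                  # linear: output equals the net input
    return n

def logsig(n):                   # log-sigmoid: squashes n into (0, 1)
    return 1.0 / (1.0 + np.exp(-n))

def tansig(n):                   # hyperbolic tangent sigmoid: squashes n into (-1, 1)
    return np.tanh(n)

w, p, b = 2.0, 0.5, -1.5
n = w * p + b                    # net input of the neuron with bias
for f in (hardlim, purelin, logsig, tansig):
    print(f.__name__, f(n))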


    Neuron model with vector input


If the input to a neuron is a vector p, the individual element inputs are multiplied by the corresponding weights (a dot product) and the weighted values are fed to the summing junction. The result is then added to the bias and passed to the assigned transfer function. The net input is

    n = w1,1 p1 + w1,2 p2 + ... + w1,R pR + b

and the output is then:

    a = f(Wp + b)
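The following Python sketch (not part of the lecture) carries out this computation for one neuron with a vector input; the weights, bias, and the choice of a log-sigmoid transfer function are assumptions.

# Minimal sketch of the vector-input neuron:
# n = w1,1*p1 + w1,2*p2 + ... + w1,R*pR + b, then a = f(n).
import numpy as np

p = np.array([1.0, -2.0, 0.5])      # input vector with R = 3 elements
w = np.array([0.2, 0.4, -0.1])      # one weight per input element (the row of W)
b = 0.3                             # scalar bias

n = np.dot(w, p) + b                # dot product at the summing junction, plus bias
a = 1.0 / (1.0 + np.exp(-n))        # assumed log-sigmoid transfer function
print(n, a)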


    A layer of neurons:


In this network, each element of the input vector p is connected to each neuron input through the weight matrix W. The ith neuron has a summer that gathers its weighted inputs and bias to form its own scalar output n(i). The various n(i) taken together form an S-element net input vector n. Finally, the neuron layer outputs form a column vector a. The expression for a is as follows:

    a = f(Wp + b)


The input vector elements enter the network through the weight matrix W, where W is represented as in the following equation:

    W = | w1,1  w1,2  ...  w1,R |
        | w2,1  w2,2  ...  w2,R |
        |  ...   ...  ...   ... |
        | wS,1  wS,2  ...  wS,R |
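The following Python sketch (not part of the lecture) builds a layer of S neurons with an S-by-R weight matrix W and computes a = f(Wp + b); the sizes, values, and the log-sigmoid transfer function are assumptions.

# Minimal sketch: a layer of S neurons, a = f(W p + b), with W of shape (S, R).
# Sizes, values, and the log-sigmoid transfer function are assumptions.
import numpy as np

R, S = 3, 4                              # R input elements, S neurons in the layer
rng = np.random.default_rng(2)
W = rng.standard_normal((S, R))          # W[i, j] connects input element j to neuron i
b = np.zeros(S)                          # one bias per neuron
p = np.array([1.0, 0.5, -0.7])           # input vector

n = W @ p + b                            # S-element net input vector
a = 1.0 / (1.0 + np.exp(-n))             # S-element layer output vector (log-sigmoid)
print(a)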


Multiple Layers of Neurons


The above example has R1 inputs, S1 neurons in the first layer, S2 neurons in the second layer, etc. It is common for different layers to have different numbers of neurons. The output of the network is defined by the following equation:

    a3 = f3( LW3,2 f2( LW2,1 f1( IW1,1 p + b1 ) + b2 ) + b3 )

The layers of a multilayer network play different roles. A layer that produces the network output is called an output layer. All other layers are called hidden layers.
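The following Python sketch (not part of the lecture) evaluates this three-layer expression step by step; the layer sizes, the tanh hidden layers, and the linear output layer are assumptions.

# Minimal sketch of a3 = f3( LW3,2 f2( LW2,1 f1( IW1,1 p + b1 ) + b2 ) + b3 ).
# Layer sizes and transfer functions are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
R1, S1, S2, S3 = 3, 8, 8, 2                  # e.g. a 3-8-8-2 network

IW11 = rng.standard_normal((S1, R1))         # input weights into layer 1
LW21 = rng.standard_normal((S2, S1))         # layer weights, layer 2 from layer 1
LW32 = rng.standard_normal((S3, S2))         # layer weights, layer 3 from layer 2
b1, b2, b3 = np.zeros(S1), np.zeros(S2), np.zeros(S3)

f1 = f2 = np.tanh                            # hidden-layer transfer functions (assumed)
f3 = lambda n: n                             # linear output layer (assumed)

p = np.array([1.0, -0.5, 0.25])
a1 = f1(IW11 @ p + b1)                       # first (hidden) layer
a2 = f2(LW21 @ a1 + b2)                      # second (hidden) layer
a3 = f3(LW32 @ a2 + b3)                      # output layer
print(a3)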


    Perceptrons

    One perceptron neuron


The most influential work on neural networks in the 1960s went under the heading of perceptrons, a term coined by Frank Rosenblatt. A perceptron architecture can be a single neuron with a single transfer function, around which the Least Mean Square (LMS) algorithm is built; or a single layer of perceptron neurons connected to the inputs through a set of weights; or it can consist of an input layer, one or more hidden layers of computation nodes, and an output layer. The latter networks are commonly referred to as Multilayer Perceptrons (MLPs).
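As a minimal, runnable illustration (not from the lecture), the following Python sketch implements one perceptron neuron with a hard-limit output and the classic perceptron weight update; the AND-gate training data and the number of passes are assumptions.

# Minimal sketch: one perceptron neuron, a = hardlim(w.p + b), trained with the
# classic perceptron rule w += (t - a) * p, b += (t - a).
# The AND-gate data and the number of passes are illustrative assumptions.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # input patterns
T = np.array([0, 0, 0, 1], dtype=float)                       # AND-gate targets

w = np.zeros(2)
b = 0.0

for _ in range(10):                                           # a few passes over the data
    for p, t in zip(X, T):
        a = 1.0 if np.dot(w, p) + b >= 0 else 0.0             # hard-limit output
        e = t - a                                             # error drives the update
        w += e * p
        b += e

print([1.0 if np.dot(w, p) + b >= 0 else 0.0 for p in X])     # learned AND outputs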


    One Perceptron layer


    A layer of Perceptrons


    Multilayer Perceptron


Multilayer Perceptron (MLP) with sigmoid functions


Summary of Major Neural Network Models

