Development and Implementation of a Parameterized FPGA-Based Neural Network Architecture


Presented by Vishwanath C.

Under the Guidance of Mr. Mohammed Riyaz Ahmed
Asst. Professor, Department of ECE, REVA ITM, Bangalore-64


Simple perceptron neural model


Neural model development for general purpose online applications

• Artificial neural networks (ANNs) are an efficient alternative to:
  • numerical modeling, which is computationally expensive
  • analytical techniques, which are difficult to derive for new devices
  • empirical methods, whose range and accuracy can be limited
• ANNs are widely used in RF and microwave CAD because they can be trained to learn any arbitrary nonlinear input-output relationship from corresponding data
• They generate smooth results when approximating discrete measured and simulated data, as the sketch below illustrates
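A minimal sketch of this smooth-approximation claim in Python, assuming only numpy: a one-hidden-layer MLP is fit by gradient descent to noisy discrete samples of a nonlinear function (sin stands in for measured device data; the network size and learning rate are illustrative).

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(-3, 3, 40).reshape(-1, 1)             # discrete sample points
    t = np.sin(x) + 0.05 * rng.standard_normal(x.shape)   # noisy "measurements"

    H = 10                                                # hidden units
    W1 = rng.standard_normal((1, H)); b1 = np.zeros(H)
    W2 = rng.standard_normal((H, 1)); b2 = np.zeros(1)
    lr = 0.05

    for epoch in range(5000):
        h = np.tanh(x @ W1 + b1)          # hidden layer
        y = h @ W2 + b2                   # linear output
        e = y - t
        gW2 = h.T @ e / len(x); gb2 = e.mean(0)
        dh = (e @ W2.T) * (1 - h**2)      # backpropagated error
        gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

    # The trained network interpolates smoothly between the discrete samples.
    x_fine = np.linspace(-3, 3, 400).reshape(-1, 1)
    y_fine = np.tanh(x_fine @ W1 + b1) @ W2 + b2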


Neural model development: key issues

• Identification of inputs and outputs to describe the model
• Data range and sample distribution
• Data generation and organization
• Data preprocessing (see the sketch after this list)
• Neural network structure
• Neural network training
• Neural network model accuracy
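A minimal Python sketch of two of these steps, assuming numpy arrays of raw device data (all sizes and ranges are illustrative): min-max preprocessing of the inputs and a train/validation split used later to check model accuracy on unseen samples.

    import numpy as np

    def minmax_scale(a, lo=0.0, hi=1.0):
        # Linearly map each column of a into [lo, hi].
        a_min, a_max = a.min(axis=0), a.max(axis=0)
        return lo + (a - a_min) * (hi - lo) / (a_max - a_min)

    rng = np.random.default_rng(1)
    X = rng.uniform(1e9, 10e9, size=(200, 2))   # e.g. raw frequency/bias inputs
    X = minmax_scale(X)                         # preprocessing

    idx = rng.permutation(len(X))               # organization: random split
    train, val = idx[:150], idx[150:]           # training vs. accuracy check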


Sequential flowchart of the various steps in neural model development


Motivation for implementing neural networks on FPGAs

• In the past, the size constraints and high cost of FPGAs, confronted with the high computational and interconnect complexity inherent in ANNs, prevented practical use of the FPGA as a platform for ANNs. Instead, the focus was on developing microprocessor-based software implementations for real-world applications, while FPGA platforms largely remained a topic for further research.

• Despite the prevalence of software-based ANN implementations, FPGAs and, similarly, application-specific integrated circuits (ASICs) have attracted much interest as platforms for ANNs because of the perception that their natural potential for parallelism and entirely hardware-based computation provide better performance than their predominantly sequential software-based counterparts. As a consequence, hardware-based implementations came to be preferred for high-performance ANN applications. While this is broadly assumed, it should be noted that no empirical study has yet confirmed that hardware-based ANN platforms outperform software in all cases.


Multilayer perceptrons (MLPs)


Block diagram of the backpropagation algorithm


The backpropagation algorithm

• The BP algorithm learns the weights for a multilayer network, given a network with a fixed set of units and interconnections. It employs gradient descent to attempt to minimize the squared error between the network output values and the target values for these outputs.

• Because we are considering networks with multiple output units rather than single units as before, we begin by redefining E to sum the errors over all of the network output units:

    E(\vec{w}) = \frac{1}{2} \sum_{d \in D} \sum_{k \in outputs} (t_{kd} - o_{kd})^2

where outputs is the set of output units in the network, and t_{kd} and o_{kd} are the target and output values associated with the k-th output unit and training example d.
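A direct Python transcription of this definition, assuming targets and outputs are nested sequences indexed first by training example d and then by output unit k:

    def network_error(targets, outputs):
        # E(w) = 1/2 * sum over examples d and output units k of (t_kd - o_kd)^2
        return 0.5 * sum(
            (t_kd - o_kd) ** 2
            for t_d, o_d in zip(targets, outputs)
            for t_kd, o_kd in zip(t_d, o_d)
        )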


The backpropagation algorithm (contd.)

• The algorithm applies to layered feedforward networks containing two layers of sigmoid units, with units at each layer connected to all units from the preceding layer.

• This is an incremental gradient descent version of backpropagation.

• The notation is as follows (it reappears in the sketch below):
  • x_ij denotes the input from node i to unit j, and w_ij denotes the corresponding weight.
  • δ_n denotes the error term associated with unit n. It plays a role analogous to the quantity (t − o) in our earlier discussion of the delta training rule.
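A minimal Python sketch of one incremental-gradient-descent step in this notation, assuming numpy, a single hidden layer of sigmoid units, and a learning rate eta (the function and variable names are illustrative):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def bp_step(x, t, W_h, W_o, eta=0.1):
        # forward pass through the two sigmoid layers
        h = sigmoid(W_h @ x)                    # hidden-unit outputs
        o = sigmoid(W_o @ h)                    # output-unit outputs
        # error terms: delta_o for output units, delta_h backpropagated
        delta_o = o * (1 - o) * (t - o)
        delta_h = h * (1 - h) * (W_o.T @ delta_o)
        # weight updates: w_ij += eta * delta_j * x_ij
        W_o += eta * np.outer(delta_o, h)
        W_h += eta * np.outer(delta_h, x)
        return W_h, W_o

Calling bp_step once per training example, cycling repeatedly through the training set, gives the incremental version of BP described above.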


Backpropagation implementation


    MLP neural network structure


Block diagram of a PE (processing element)


Block view of the hardware architecture


Implementation of the hardware architecture

• The functional units consist of signal-processing operations (e.g., multipliers, adders, squashing-function realizations) and storage components (e.g., RAM containing weight values, input buffers).

• Control components consist of state machines generated to match the needs of the network as configured. During design elaboration, functional components matching the provided parameters are automatically generated and connected, and the state machines of the control components are tuned to match the given architecture.

• Each layer additionally generates a teacher, if learning is enabled, along with the number of PEs configured for that layer. Each PE generates a number of MACC blocks equal to the width of the previous layer, as well as a squashing-function block (a behavioral sketch follows).
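A behavioral Python sketch of one PE as just described: one multiply-accumulate per previous-layer output feeding a squashing function. Fixed-point details of the real FPGA datapath are omitted, and tanh is assumed as the squashing function.

    import numpy as np

    class ProcessingElement:
        def __init__(self, weights, bias=0.0):
            self.weights = np.asarray(weights)   # held in weight RAM on the FPGA
            self.bias = bias

        def forward(self, prev_layer_outputs):
            # MACC stage: one multiply-accumulate per previous-layer output
            acc = self.bias
            for w, x in zip(self.weights, prev_layer_outputs):
                acc += w * x
            # squashing-function stage
            return np.tanh(acc)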


State machines for the network controller


State machine for the network controller

• The network controller is a Mealy state machine based on a counter indicating the number of clock cycles that have passed in the current iteration (in the case of an online network) or the total number of clock cycles passed (in the case of an offline network).

• As mentioned above, an online application needs a network with learning capability, and the backpropagation learning algorithm requires the error to be fed back. A Mealy state machine suits this, as it is a finite-state transducer that generates its output from both its current state and its inputs. For the value of the counter to have any meaning, we must be able to precalculate the latency to reach milestones in the forward and backward passes of the network.

• These milestones are calculated during elaboration of the design. Based on them, the state machine outputs a set of enable signals to control the flow of the network (see the sketch below). An offline network's controller, by comparison, works from the total cycle count, as noted above.
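A minimal Python sketch of this counter-plus-milestones idea; the milestone latencies fwd_done and bwd_done are hypothetical placeholders, whereas in the real design they are precalculated during elaboration.

    def controller_outputs(count, learning_enabled, fwd_done=12, bwd_done=20):
        # Return the enable signals for the current clock-cycle count.
        enables = {
            "forward_en": count < fwd_done,
            "backward_en": learning_enabled and fwd_done <= count < bwd_done,
        }
        iteration_done = count >= (bwd_done if learning_enabled else fwd_done)
        return enables, iteration_done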


Graphical user interface for generating networks in the MATLAB neural network toolbox


    Experimental results


Average error of the approximated hyperbolic tangent function using a uniform LUT with linear interpolation
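A Python sketch of the scheme being measured: a uniform lookup table over tanh with linear interpolation between entries. The table size and input range here are illustrative, not the values used in the experiments.

    import numpy as np

    N, LO, HI = 64, -4.0, 4.0
    xs = np.linspace(LO, HI, N)
    lut = np.tanh(xs)                      # precomputed uniform table
    step = (HI - LO) / (N - 1)

    def tanh_lut(x):
        x = min(max(x, LO), HI)            # saturate outside the table range
        i = min(int((x - LO) / step), N - 2)
        frac = (x - LO) / step - i
        return lut[i] + frac * (lut[i + 1] - lut[i])   # linear interpolation

    # average absolute error against the exact function
    grid = np.linspace(LO, HI, 10001)
    avg_err = np.mean([abs(np.tanh(v) - tanh_lut(v)) for v in grid])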


Bit resolution required to resolve the worst-case approximation error


Platforms and implementations


Future directions

• Implementing a momentum factor in the backpropagation algorithm narrows the gap between hardware and software techniques (see the sketch after this list)
• Implementing batch learning in the FPGA along with the momentum factor would make it a substitute for software techniques in implementing neural models
• Implementing real-time input data communication to the FPGA, which must account for the FPGA's dependence on clock frequency, is a major challenge
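For the first item, a one-function Python sketch of the momentum term (eta and alpha are illustrative hyperparameters): each weight update adds a fraction alpha of the previous update, which smooths and accelerates gradient descent.

    import numpy as np

    def momentum_step(w, grad, velocity, eta=0.1, alpha=0.9):
        velocity = alpha * velocity - eta * grad   # remember past updates
        return w + velocity, velocity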


Conclusion

• A survey and study of the development and implementation of a parameterized FPGA-based architecture for feed-forward MLPs with the backpropagation learning algorithm has been carried out.

• The architecture makes native prototyping and design-space exploration in hardware possible. Testing the system with the spectrometry sample application showed that it can reach 530 million connections per second offline and 140 million for online applications.

• It can be further implemented with the MATLAB tool for neural applications.