Major_project Final Report V_3


    VISUAL ODOMETER

A Project Report Submitted in Partial Fulfilment of

Bachelor of Technology

    In

    ELECTRONICS AND COMMUNICATION

    ENGINEERING

    By

    Aravinthan Ramaraj 091114038

    Pratesh Kumar Reddy Rajupalem 091114076

    Hithesh Reddivari 091114115

    Lavudi Prudhvi 091114121

    Annangarachari R 091114148

    Korrapati Chandralohit 091114149

    DEPARTMENT OF ELECTRONICS AND COMMUNICATION

    ENGINEERING

    MAULANA AZAD NATIONAL INSTITUTE OF TECHNOLOGY

    BHOPAL

    INDIA

    April 2013


    DECLARATION BY THE CANDIDATES

We hereby declare that the project work entitled "Visual Odometer" is our own

work, conducted under the guidance of Dr. Arvind Rajawat, Professor, Department

of Electronics and Communication Engineering, MANIT, Bhopal.

We further declare that, to the best of our knowledge, the thesis does not contain any

part of any work that has been submitted for the award of any degree either in this Institute

or in any other University without proper citation.

    Student name Scholar number Signature

    Aravinthan Ramaraj 091114038

    Pratesh Kumar Reddy Rajupalem 091114076

    Hithesh Reddivari 091114115

    Lavudi Prudhvi 091114121

    Annangarachari R 091114148

    Korrapati Chandralohit 091114149

    This is to certify that the declaration made above by the candidate is true to the best

    of my knowledge.

Dr. Arvind Rajawat

    Professor,

    Department of Electronics and Communication Engineering,

Maulana Azad National Institute of Technology, Bhopal (MP)


    DEDICATION

We would like to dedicate our work to the four years of college life that MANIT has

given us. We would also like to dedicate this work to the entire Department of Electronics and

Communication Engineering, that is, the faculty and the students of the 2009-2013 batch.


    ACKNOWLEDGEMENT

We would like to express our gratitude to Dr. Arvind Rajawat for his valuable

guidance. His valuable and candid feedback led to the inclusion of innovative

developments in the project and to its early completion. We are indebted to him,

for we have observed and learnt from him many qualities, such as perseverance and

passion for the subject.


    ABSTRACT

The objective of the project is to demonstrate the working and complete

implementation of a new odometer technology. Unlike conventional odometric

techniques, this project uses and demonstrates Visual Odometry. This objective has

been accomplished using an optical flow sensor, the ADNS-3080, which is essentially the

mouse sensor used in present-day optical mice. The optical flow sensor uses optical flow

techniques to calculate the relative motion vectors in terms of the relative displacement values

delta_x and delta_y. The outputs from the optical flow sensor are given to an Arduino (a

microcontroller-based processing unit), which wirelessly transmits the data to a computer.

The computer plots and logs the data. MATLAB was used to write and develop the plotting

code, build a GUI, and finally deploy a Windows executable (.exe) file. The executable

generates a figure comprising a velocity vs time plot, a distance vs time plot, and an X vs Y plot in

real time. The plot can be saved and used for future reference.


3. PROJECT EXPLANATION

3.1. Block Diagram

3.1.1. Explanation of Block Diagram

3.2. Arrangement and Interconnections of Components

3.2.1. Vehicle Side

3.2.2. User Terminal Side

3.3. Working

3.3.1. Vehicle's Portion

3.3.2. Flow Chart for Arduino Operations

3.3.3. Description of Vehicle Side Flow Chart

3.3.4. User Terminal's Portion

3.3.5. Description of User Terminal Side Flow Chart

4. CALIBRATION AND TESTING

4.1. Calibration Tables

4.2. Conclusions after Calibration

5. RESULTS

5.1. Plots and Their Description

5.1.1. Surface Quality Plot

5.1.2. Distance vs Time Plot

5.1.3. Velocity vs Time Plot

5.1.4. X vs Y Plot

5.2. Error Results in Distance Measurement

6. APPLICATIONS

6.1. In Space Rovers

6.2. Other Important Applications

References


APPENDIX-1: MATLAB code for calibration and testing

APPENDIX-2: Arduino code and explanation

APPENDIX-3: MATLAB main code and explanation

APPENDIX-4: Step-by-step instruction manual for operating the project

APPENDIX-5: UART (Universal Asynchronous Receiver Transmitter)

APPENDIX-6: SPI (Serial Peripheral Interface) communication protocol


    List of Figures

Figure 1: Rotational Encoders

Figure 2: Illustration showing wheel slip

Figure 3: Cameras as odometric sensors

Figure 4: Monocular, stereo and omnidirectional camera examples

Figure 5: Omnidirectional camera on a car for measuring optical flow

Figure 6: Optical flow between two consecutive images

Figure 7: Optical flow sensor

Figure 8: Block Diagram of ADNS-3080

Figure 9: Optical Flow Sensor Pin-out

Figure 10: Arduino UNO

Figure 11: Arduino software snapshot

Figure 12: RxTx module

Figure 13: Transmitter

Figure 14: Receiver

Figure 15: Block Diagram of entire project

Figure 16: Vehicle

Figure 17: Illustration of Sensor and Vehicle Relative Alignment

Figure 18: Circuit diagram at vehicle side

Figure 19: Wireless receiver circuit

Figure 20: Serial to USB converter

Figure 21: Circuit Diagram for the Receiving Circuit

Figure 22: Graphical User Interface

Figure 23: Final result figure

Figure 24: Surface quality plot

Figure 25: Distance vs time plot

Figure 26: Velocity vs time plot

Figure 27: X vs Y plot

Figure 28: Image of Mars Curiosity Rover

Figure 29: Illustration Showing Possible Wheel Slip

Figure 30: US Army Spy bot with camera

Figure 31: SPI Block Diagram

Figure 32: SPI Master-Slave Interconnection


    List of Tables

Table 1: Pin-out of optical flow sensor

Table 2: Transmitter Pin-out

Table 3: Receiver Pin-out

Table 4: Connections between Arduino and Optical flow sensor

Table 5: Experiment values for tar road under sunlight

Table 6: Experiment values for tar road under the shade of a tree

Table 7: Experiment values for tiles in sunlight

Table 8: Experiment values for tiles under the shade of a tree

Table 9: Experiment values for cement slab in sunlight

Table 10: Experiment values for cement slab in shade

Table 11: Results in sunlight

Table 12: Results in shade


    List of Codes

Code 1: Pre-processor code

Code 2: Variable declaration

Code 3: Initial setup

Code 4: Main loop

Code 5: Function for sending motion data

Code 6: MATLAB variable declaration

Code 7: Figure creation

Code 8: Serial port access

Code 9: MATLAB main loop

Code 10: Clear serial port


    1. INTRODUCTION

Among the many factors that have played a remarkable role in the development and

flourishing of the human race, exploration finds a prominent position. Mankind would not have

stood long without our strong desire to explore the world. It is this curiosity to explore new

horizons that made us understand the universe. It is what made us take our first step in

childhood, and what made us different from frogs in a well. We started with a step and

now we are crossing planets. The newly sent Mars rover, aptly named CURIOSITY, has

exemplified this interest in us.

Along with the passion we need a means, a tool, that can make these explorations

successful. It is important that we are not lost during our explorations. This is possible

with the aid of methods by which we can trace our path back to our initial

location while exploring new territories; these are called navigational systems. From

the times of the mariner's compass we have ascended to high-precision GPS and INS

technologies, and these technologies have kept us on track without disappointment. But as

situations take their shape, more precision becomes necessary. There is a lot more

we will be capable of achieving with improved algorithms and mechanisms. So it is

part of our continuous evolution that we aspire to new robust, reliable, and precise

navigational systems.

Tracing the travelled path and locating one's position starts with the most basic step

of measuring the distance travelled. This process of measuring distance is called

ODOMETRY.


    1.1. Odometry

Odometry is the use of data from motion sensors to estimate change in position

over time, and hence a platform's position relative to its starting location. The word

odometry is composed from the Greek words hodos (meaning "travel", "journey") and

metron (meaning "measure"). It requires rapid and accurate data collection, equipment

calibration, and processing for optimum results, and the system must operate in real time

with low delay. When aided with information about the initial position and

velocity, the motion estimates can also be used for navigational purposes.
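As an illustration of this idea (a minimal sketch, not the project's implementation), position can be accumulated from per-report motion estimates by dead reckoning; here each report is assumed to carry a step length and a heading:

```python
import math

def dead_reckon(start, increments):
    """Accumulate odometry increments into a path.

    start: (x, y) initial position; increments: list of
    (step_length, heading_radians) motion reports.
    """
    x, y = start
    path = [(x, y)]
    for step, heading in increments:
        # Each report moves the estimate by one step along the heading.
        x += step * math.cos(heading)
        y += step * math.sin(heading)
        path.append((x, y))
    return path

# Two 1 m steps east, then one 1 m step north
path = dead_reckon((0.0, 0.0), [(1.0, 0.0), (1.0, 0.0), (1.0, math.pi / 2)])
```

Note that errors in each increment accumulate over the path, which is why calibration and accuracy of the individual motion reports matter so much.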

    1.2. Techniques Used in Odometry

    1.2.1. Mechanical Method (Using Rotational Encoders)

This method uses mechanical/electronic encoders which convert the number of

rotations made by the wheels of the moving platform (e.g. a wheeled robot or car) into

analog or digital signals, which in turn drive the meter displays.

    Figure 1: Rotational Encoders

This method is largely used at present in ground vehicles due to its simplicity and cost

effectiveness. However, it lacks precision and is prone to errors due to slippage and sliding of

the wheels, which can happen when the terrain is sandy or smooth, as shown in the figure

below. It also cannot be applied to mobile robots with non-standard locomotion methods, such

as legged robots.


    Figure 2: Illustration showing wheel slip
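The encoder arithmetic is a simple proportionality: distance = (ticks / ticks per revolution) x wheel circumference. A toy conversion (the tick count and wheel size below are made up for illustration):

```python
import math

def encoder_distance(ticks, ticks_per_rev, wheel_radius_m):
    """Convert encoder ticks to distance travelled (metres).

    Ignores wheel slip, which is exactly the error mode that
    makes this method unreliable on loose or smooth terrain.
    """
    revolutions = ticks / ticks_per_rev
    return revolutions * 2 * math.pi * wheel_radius_m

# 3600 ticks at 360 ticks/rev on a 5 cm wheel -> 10 revolutions, about 3.14 m
d = encoder_distance(3600, 360, 0.05)
```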

1.2.2. Visual Methods (Using Camera Sensors)

This system estimates the motion of a platform carrying a stereo head or a single

moving camera based on video input. As motion is estimated from visual input alone, it is

termed Visual Odometry (VO). The term VO was coined in 2004 by Nister in his

landmark paper [2].

    Figure 3: Cameras as odometric sensors

A good amount of precision can be achieved. Even though the robustness of the

system depends on the type of surface and the illumination, it is immune to errors from

slips and slides, and it can be used even for non-conventional locomotion methods. Because of

their moderate cost, such systems are widely used in robotic applications, such as on the Mars

Exploration Rovers.


    Figure 5: Omni directional Camera on a car for measuring optical flow

The tracked features are matched between pairs of images to construct the optical flow

field. This involves:

Using correlation to establish correspondence between two images, with no long-term feature tracking.

Feature extraction and correlation (Lucas-Kanade or Horn-Schunck method).

Constructing the optical flow field.

Checking the flow field vectors for potential tracking errors and removing outliers.

Estimating the camera motion from the optical flow.

Figure 6: Optical flow between two consecutive images
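The correlation step above can be illustrated with a brute-force block match: find the shift of the second image relative to the first that minimises the sum of squared differences over the overlap. This is a toy version of what real optical-flow methods (such as Lucas-Kanade) do far more efficiently:

```python
def best_shift(img1, img2, max_shift):
    """Find the (dx, dy) shift of img2 relative to img1 that
    minimises the mean squared difference over the overlap.
    Images are lists of rows of grey-level values."""
    h, w = len(img1), len(img1[0])
    best = None
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            ssd, n = 0, 0
            for y in range(h):
                for x in range(w):
                    y2, x2 = y + dy, x + dx
                    if 0 <= y2 < h and 0 <= x2 < w:
                        ssd += (img1[y][x] - img2[y2][x2]) ** 2
                        n += 1
            if n == 0:
                continue  # no overlap at this shift
            score = ssd / n
            if best is None or score < best[0]:
                best = (score, dx, dy)
    return best[1], best[2]
```

Exhaustive search like this costs O(shifts x pixels) and is only practical for small windows; a sensor like the one used in this project performs an equivalent computation in dedicated hardware at thousands of frames per second.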

    1.3.2. Optical Flow Methods

    The optical flow methods usually preferred for extracting and matching the features in

    image sequences are


    2. PROJECT DEVICES DESCRIPTION

This section gives a detailed description of all the components used in this project

and their respective functions.

    2.1. Optical Flow Sensor:

    Figure 7: Optical flow sensor

    2.1.1. Description:

This sensor is based on the ADNS-3080 mouse sensor, which is used in present-day

optical mice. It has a camera along with a lens, which captures the images on which the optical

flow algorithms are implemented. The sensor is programmed via registers through a four-

wire serial port. The features of this sensor are summarised below.

Features:

ADNS-3080 mouse sensor

8 mm lens with 11-degree FOV

Standard M12x0.5 lens mount

High-speed motion detection: up to 40 ips and 15g

New architecture for greatly improved optical navigation technology

Programmable frame rate of over 6400 frames per second

Smart Speed self-adjusting frame rate for optimum performance

Serial port burst mode for fast data transfer

400 or 1600 cpi selectable resolution (CPI, counts per inch, indicates how many times per inch of optical-mouse movement the sensor sends an image-sensing report signal to the optical mouse controller)


    Single 3.3 volt power supply

    Four-wire serial port along with Chip Select, Power Down, and Reset pins

The ADNS-3080 is based on a new, faster architecture with improved navigation.

The sensor is capable of sensing high-speed mouse motion, up to 40 inches per second and

acceleration up to 15g, for increased user precision and smoothness [1]. The sensor is

programmed via registers through a four-wire serial port. It is packaged in a 20-pin

staggered dual inline package (DIP).

    2.1.2. Theory of Operation:

The ADNS-3080 is based on Optical Navigation Technology, which measures changes in position by optically acquiring sequential surface images (frames) and

mathematically determining the direction and magnitude of movement. It contains an

Image Acquisition System (IAS), a Digital Signal Processor (DSP), and a four-wire serial

port. The IAS acquires microscopic surface images via the lens and illumination system.

These images are processed by the DSP to determine the direction and distance of motion.

The DSP calculates the x and y relative displacement values. The relative

displacement values returned by the sensor are proportional to real-world distances

provided the sensor's altitude is kept constant [1]. So, in order to get the real-world distances

(in metres), one has to find the multiplying factor.

    Figure 8: Block Diagram of ADNS-3080
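The multiplying factor mentioned above can be found with a one-off calibration run over a known distance. A minimal sketch of that arithmetic (the numbers below are illustrative, not the project's calibration values):

```python
def calibrate(known_distance_m, measured_counts):
    """Derive the metres-per-count factor from a test run over a
    measured distance. Valid only while the sensor's height above
    the surface stays the same as during calibration."""
    return known_distance_m / measured_counts

def counts_to_metres(counts, scale):
    """Convert accumulated sensor counts to metres using the
    calibration scale factor (metres per count)."""
    return counts * scale

scale = calibrate(1.0, 2500)        # e.g. 2500 counts over a 1 m run
d = counts_to_metres(5000, scale)   # twice the calibration distance
```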


2.1.3. Pin-out

Table 1: Pin-out of optical flow sensor

Pin Name Description

1 GND Ground

2 5V 5 V supply voltage

3 3V 3.3 V supply voltage

4 NCS Chip select pin

5 MISO Serial Data Output (Master In/Slave Out)

6 MOSI Serial Data Input (Master Out/Slave In)

7 SCLK Serial clock

8 RST Reset

9 NPD Power Down (active low input)

10 LED Can be connected to an external LED

Figure 9: Optical Flow Sensor Pin-out

2.1.4. Synchronous Serial Port

The synchronous serial port is used to set and read parameters in the ADNS-3080,

and to read out the motion information. The serial port is also used to load SROM data into

the ADNS-3080. The port is a four-wire serial port. The host micro-controller always

initiates communication; the ADNS-3080 never initiates data transfers. The serial port

cannot be activated while the chip is in power-down mode (NPD low) or reset (RESET

high). SCLK, MOSI, and NCS may be driven directly by a 3.3 V output from a micro-controller, or they may be placed in an open-drain configuration by enabling on-chip pull-up current sources. The open-drain drive allows the use of a 5 V micro-controller without

any level-shifting components. The port pins may be shared with other SPI slave devices.

When the NCS pin is high, the inputs are ignored and the output is tri-stated.

The lines which comprise the SPI port are:

SCLK: Clock input. It is always generated by the master (the micro-controller).

    MOSI: Input data (Master Out/Slave In).

    MISO: Output data (Master In/Slave Out).

    NCS: Chip select input (active low).
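A register access over this port can be sketched independently of the hardware: the host clocks out the register address, then clocks one data byte in (read) or out (write). The helper below is a protocol illustration against an abstract byte-transfer function, not working driver code; the most-significant-bit read/write convention is typical for this family of sensors, but the exact bit format and inter-byte timing should be taken from the datasheet [1].

```python
READ_FLAG = 0x00   # assumed: MSB clear in the address byte -> read
WRITE_FLAG = 0x80  # assumed: MSB set in the address byte   -> write

def read_register(transfer, address):
    """Read one register over the four-wire serial port.

    `transfer(byte)` clocks one byte out on MOSI and returns the
    byte seen on MISO. The host sends the address first, then a
    dummy byte while the sensor shifts the register value out.
    """
    transfer(address | READ_FLAG)   # address phase
    return transfer(0x00)           # data phase (dummy write)

def write_register(transfer, address, value):
    """Write one register: address with the write flag set, then the value."""
    transfer(address | WRITE_FLAG)
    transfer(value)
```

On the Arduino side the `transfer` argument would be the SPI byte-exchange routine; here it is left abstract so the framing itself is visible.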

2.1.5. Important Terms and Definitions

Delta_X: X movement counted since the last report. The absolute value is determined by the

resolution.

Delta_Y: Y movement counted since the last report. The absolute value is determined by the

resolution.

Surface_Quality: SQUAL (Surface Quality) is a measure of the number of valid

features visible to the sensor in the current frame. The total number of valid features can be

obtained from the SQUAL register value using the formula given in the datasheet [1].

    The maximum SQUAL register value is 169. Since small changes in the current

    frame can result in changes in SQUAL, variations in SQUAL when looking at a surface are

    expected.
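The delta registers are 8-bit values, so a movement report can be negative. Assuming they use two's-complement encoding, as is typical for mouse sensors (the datasheet [1] gives the exact register format), decoding a raw byte looks like this:

```python
def decode_delta(raw_byte):
    """Interpret a raw 8-bit register value as a signed
    (two's-complement) displacement count."""
    return raw_byte - 256 if raw_byte > 127 else raw_byte

# 0x05 -> +5 counts; 0xFB -> -5 counts
```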

    2.2. Arduino

Arduino is a single-board microcontroller designed to make the process of using

electronics in multidisciplinary projects more accessible. The hardware consists of a simple

open-source hardware board designed around an 8-bit Atmel AVR microcontroller. The

software consists of a standard programming-language compiler and a boot loader that executes on the microcontroller. The board we are using in our project is the Arduino

UNO. The Arduino UNO is a microcontroller board based on the ATmega328. It has 14

digital input/output pins (of which 6 can be used as PWM outputs), 6 analog inputs, a 16

MHz ceramic resonator, a USB connection, a power jack, an ICSP header, and a reset

button. It contains everything needed to support the microcontroller [4].

    Figure 10: Arduino UNO

    Summary:

    Microcontroller ATmega328

    Operating Voltage 5V

    Input Voltage (recommended) 7-12V

    Input Voltage (limits) 6-20V

    Digital I/O Pins 14 (of which 6 provide PWM output)

    Analog Input Pins 6

    DC Current per I/O Pin 40 mA

    DC Current for 3.3V Pin 50 mA

    Flash Memory 32 KB of which 0.5 KB used by boot-loader

    SRAM 2 KB

    EEPROM 1 KB

    Clock Speed 16 MHz


    2.2.1. Programming

The Arduino Uno can be programmed with the Arduino software. The ATmega328

on the Arduino Uno comes pre-burned with a boot loader that allows new code to be

uploaded to it without the use of an external hardware programmer. A screenshot of the

computer interface used to write and upload programs is shown below [4].

    Figure 11: Arduino software snapshot

2.3. Wireless Transmitter-Receiver

A pair of transmitter and receiver modules is used to wirelessly transmit data serially. This

module operates at a baud rate of 2400.

    Figure 12: RxTx module


    2.3.1. Features

Operating frequency: 433 MHz

Data rate: 2400 bps

Operating voltage:

Transmitter: 2-12 V

Receiver: 4.5-5.5 V

    2.3.2. Pin-out

    Transmitter:

    Figure 13: Transmitter

    Table 2: Transmitter Pinout

    Pin Name Description

    1 GND Ground

    2 DATAIN Serial input

    3 VCC Transmitter Power supply

    4 ANT Antenna

    Receiver:

    Figure 14: Receiver


    Table 3: Receiver Pin-out

    Pin Name Description

    1 GND Ground

    2 DATAOUT Serial data output

    3 NC No connect

    4 VCC Receiver Power supply

    5 VCC Receiver Power supply

    6 GND Ground

    7 GND Ground

    8 ANT Antenna

    2.4. Serial to USB Converter

This device converts serial RS232 protocol data into USB protocol data so that it

can be used by present-day laptops, which do not come with an RS232 port.

    2.5. Application Software Description

The serial data received by the computer is processed to produce the required plots. A

Windows .exe file performs all these operations after we provide the current COM port and the

number of iterations the code has to run. The program is written in MATLAB

and then compiled into an executable file using the Deploy Tool in MATLAB.


    3. PROJECT EXPLANATION

This chapter explains the working of the project, both structurally and functionally. We start by describing the block diagram, which illustrates the basic blocks of the project along with their roles.

    3.1. Block diagram

    Figure 15: Block Diagram of entire project



    3.1.1 Explanation of Block Diagram:

    Optical Flow Sensor:

The optical flow sensor senses the motion and outputs the relative displacement values delta_x and delta_y; this data is sent to the processing block.

    Processing Block:

The processing block handles the communication protocol and communicates with the optical flow sensor. The sensor data is then relayed to the computer over a wireless medium.

433 MHz Tx

This is the wireless transmission block, which transmits the data wirelessly at 433 MHz to the computer side.

433 MHz Rx

This is the wireless receiving block, which receives the wireless data sent by the transmission block and then passes it to the Serial to USB converter for further processing.

    Serial to USB Converter:

This block handles the protocol conversion from the UART (Universal Asynchronous Receiver Transmitter) protocol to the USB (Universal Serial Bus) protocol. This block is necessary to connect to present-day computers, where no RS232 port is present.

    Computer/GUI:

The Serial to USB converter appears on the computer as a serial COM port, which supplies data to the MATLAB GUI; the GUI then plots and logs the data.

    Power system

Not shown in the block diagram is the power system, which supplies the power required for the proper functioning of the hardware components used.


Arduino and Transmitter module. As shown in Figure 16, the batteries are placed at the rear part of the vehicle. The optical flow sensor is supplemented with an LED light (not shown in the figure) in order to illuminate the surface in low lighting conditions.

    Figure 17: Illustration of Sensor and Vehicle Relative Alignment


    Circuit Diagram at Vehicle Side:

    Figure 18: Circuit diagram at vehicle side

The pins from the Arduino to the optical flow sensor are connected as shown in the table below.

    Table 4: Connections between Arduino and Optical flow sensor

    Description Arduino Pin Optical Flow sensor pin

    SCK 13 7

    MISO 12 5

    MOSI 11 6

    NCS 10 4


    Circuit diagram at user terminal side (Receiving Circuit):

    Figure 21: Circuit Diagram for the Receiving Circuit

    3.3. Working

The working of the project is described by dividing it into the vehicle side and the user terminal side. Now that the arrangement and interconnections are understood, we describe the functioning of the project along with the role played by each component.

3.3.1. Vehicle's Portion

    At the vehicle side Arduino is used as the control and processing unit. The Arduino

    sends commands to the optical flow sensor requesting data from the sensor. The optical

    flow sensor returns the relative displacement values which are transferred to the

microcontroller on the Arduino board using the SPI communication protocol (Appendix-6). The specific addresses of the registers are given in the sensor's datasheet. The Arduino is


    3.3.3. Description of Vehicle Side flow chart

The first block starts when the power is turned on, i.e., when the batteries are connected to the Arduino board. It is the process of including the header files required for the functioning of the Arduino code.

    The second block is a variable declaration block where the memory registers are

    allocated for the further processes to take place.

    In the third block the initialization of serial port and Optical-flow sensor take place.

    The serial port is set to 2400 baud rate.

    The fourth one is an input block where sensors values are input into the Arduino.

    The Arduino communicates with the sensor using SPI protocol. The details of the protocol

    are described in Appendix-6.

    Framing:

The data to be sent to the user terminal is first formed into frames in order to avoid confusion at the receiver side. Delimiters are inserted between the data fields to distinguish them from each other, as shown in the frame structure below.

    Frame structure:

    Delta_x ; Delta_y ; SQAL ; Delta_t

    Here Delta_x is the relative displacement value of x, delta_y is the relative

    displacement value of y, SQAL is the surface quality, Delta_t is the time taken to measure

    relative displacement values.

The delimiter ; is used to separate the values from each other. After the frame, a carriage return is sent to mark the end of the frame.

    In the last block the frames generated are sent serially to the wireless transmission

    module which relays the data to the user terminal.

3.3.4. User Terminal's Portion

The wireless receiver module receives the data and relays it to the Serial to USB converter via a MAX232 level converter. The data is then relayed to the computer's COM port, from which the MATLAB program grabs it. The communication protocol used is UART,


    which is described in Appendix-5. The flow chart describing MATLAB code operation is

    given in the section below.

    3.3.5. Flow Chart Describing MATLAB Code Operation

    Variable Declaration

    Creating Figure and initialising

    Subplots

    Initialising Serial Port

    Acquire Data from

    Serial Port

    Process the Data by multiplying them

    with their respective multiplying

    factors

    Plot and log the data

    Close Serial port


    3.3.6. Description of User Terminal Side Flow Chart

In the first block, the matrices are created as variables that are used in the later stages of the code.

In the second block, the figure is created along with the plots, and their handles are stored in separate variables. The initial value of each plot is set to (0,0).

In the third block, the serial port object is created, and thus a buffer space is created to accommodate the incoming data from the COM port. The baud rate is set to 2400, equal to the transmitter's baud rate. After the object is created, the port is opened.

In the fourth block, the serial data, which is automatically queued in the buffer, is read into a matrix variable as a character string. Then, using the MATLAB textscan function, the data is separated into its respective values using the delimiter and stored in the respective variable matrices.

In the fifth block, the calibration results are used to get the values of the actual distance, velocity, and angle turned by the vehicle from its initial position. The delta_y value returned by the sensor, which is now stored in a new variable dy, is used to calculate the real distance, i.e., the distance in centimetres. The delta_x value returned by the sensor, which is now stored in a new variable dx, is used to calculate the angle turned by the vehicle from its initial position (refer to Figure 17).

In the last block, the values found above are plotted. Surface quality is plotted with respect to the iteration count, distance with respect to time, and velocity with respect to time. The last plot, which plots the path traversed by the vehicle, assumes the vehicle's initial position to be (0,0), the vehicle's initial forward direction as the x-axis, and the perpendicular to it as the y-axis. As the axes of the sensor change from time to time with respect to our present axes, the vehicle's rotation angle and distance moved are taken as the reference for the path plot. The equations can be easily understood from the MATLAB code given in Appendix-3. The plots and their explanation are given in Chapter-5.


After the code was written, it was compiled into a Graphical User Interface (GUI) and deployed as a Windows executable file, whose screenshot is shown below. A step-by-step instruction manual for running the program is given in Appendix-4.

    Figure 22: Graphical User Interface


    4. CALIBRATION AND TESTING

We have created a separate MATLAB program to calibrate the sensor, from which we get the multiplying factors for converting sensor values to real-world distances and angles. The code is similar to the main code and is given in Appendix-1. The values are tabulated and the final multiplying factor is found.

    4.1. Calibration Tables

Table 5: Experiment values for tar road under sunlight

Average Surface Quality   Actual Length (cm)   Sensor Value   Calibration Factor (cm/pixel)
105                       100                  2377           0.04207
99                        75                   1799           0.04169
100                       50                   1184           0.04223
110                       25                   596            0.04195

Average calibration factor = 0.041985

Table 6: Experiment values for tar road under the shade of a tree

Average Surface Quality   Actual Length (cm)   Sensor Value   Calibration Factor (cm/pixel)
90                        100                  2354           0.04248
100                       75                   1751           0.04283
120                       50                   1194           0.04188
102                       25                   578            0.04325

Average calibration factor = 0.04261

Table 7: Experiment values for tiles in sunlight

Average Surface Quality   Actual Length (cm)   Sensor Value   Calibration Factor (cm/pixel)
65                        100                  2442           0.04095
50                        75                   1817           0.04128
55                        50                   1186           0.04216
50                        25                   583            0.04288

Average calibration factor = 0.04182


    5. RESULTS

After all the above components are connected properly, executing the .exe file brings up a figure similar to the one shown below. The application continuously plots the data in real time as the vehicle is running.

    Figure 23: Final result figure

As shown in the above figure, the application returns four plots, which are described below in detail.

    5.1. Plots and Their Description

    1. Surface Quality Plot

    2. Distance vs Time Plot

    3. Velocity vs Time Plot

    4. X vs Y Plot

    5.1.1. Surface Quality Plot:

This plot shows, in real time, the variations in the quality of the surface on which the vehicle is moving. As described before in equation-1, the value of surface quality denotes one-fourth of the number of features on the surface that the sensor is tracking. This graph is included in order to find out whether the surface is bright enough for visual odometry.

    Figure 24: Surface quality plot

    5.1.2. Distance vs Time Plot:

This plot shows, in real time, the distance moved by the vehicle against the time elapsed since the vehicle started. A negative distance implies that the vehicle has moved in the backward direction. If the bot makes a 180-degree turn and moves backward, that distance will be included in the positive plot.

    Figure 25: Distance vs time plot


    5.1.3. Velocity vs Time Plot:

This plot shows, in real time, the velocity of the vehicle versus the time elapsed since the start. Positive values convey that the bot is moving forward with the velocity given by the y-coordinate at that point; negative values convey that the bot is moving backward with the velocity given by the y-coordinate at that point.

    Figure 26: Velocity vs time plot

    5.1.4. X vs Y Plot:

This plot shows the path of the vehicle from the starting point so that we can track the vehicle on a computer. The plot always starts at coordinate (0,0).

    Figure 27: X vs Y plot


5.2. Error Results in Distance Measurement

Table 11: Results in sunlight

Average Surface Quality   Sensor Value   Calculated Distance (cm)   Actual Distance (cm)   Absolute Error (cm)   Error (%)
50                        500            20.85                      20                     0.85                  4.25
60                        970            40.4781                    40                     0.4781                1.2
55                        1489           62.13                      60                     2.13                  3.55

Table 12: Results in shade

Average Surface Quality   Sensor Value   Calculated Distance (cm)   Actual Distance (cm)   Absolute Error (cm)   Error (%)
55                        440            18.9                       20                     1.1                   5.5
55                        902            38.1278                    40                     1.8722                4.6
100                       1413           59.7278                    60                     0.2722                0.5


    6. APPLICATIONS

The technology demonstrated by this project (visual odometry) finds applications in various fields. Application domains include robotics, augmented reality, and automotive. In GPS-denied environments, such as underwater and aerial ones, VO is of the utmost importance. A detailed description of the applications, with emphasis on the rover application, is given below.

    6.1. In Space Rovers

Autonomous space exploration represents a new frontier in space missions, since it allows exploring unknown planets through the eyes of unmanned robotic systems. Following this direction, many projects have been realised in past years in order to design vehicles able to autonomously explore planet surfaces [7].

    Figure 28: Image of Mars Curiosity Rover

    Due to the distance between Mars and Earth, real time tele-operation is not a

    feasible solution for controlling a planetary exploration vehicle. Consequently, planetary

    rovers must be provided with a certain level of autonomy, which allows them to

    autonomously move in an unknown environment while avoiding obstacles, for reaching the

targets identified by the scientists working at the ground station. The problem of reaching a required target can be split into three different subproblems: the guidance, the control and the

    navigation problems. The guidance and control problems are related to the definition and

    control of the path followed by the rover, while the navigation problem is related to the

    reconstruction of the state of the rover with respect to the external environment. One of the

    fundamental aspects of the rover navigation is the localisation problem, which consists in

    reconstructing the position and the orientation of the vehicle with respect to a global

    reference frame usually defined by the initial conditions of the exploration. The

    localisation problem is typically solved by integrating the measurements of an inertial

    measurement unit or by using odometry on wheels. For the first solution, the slow motion

of the rover forces the inertial measurement units to be very accurate to correctly reconstruct position and velocity. In addition, the drift inherent in the integration of the

    measurements requires a redefinition of the global reference frame after a certain amount

of time, or to reset the actual position of the rover using known landmark position measurements. On the contrary, odometry is a direct measure of displacement; therefore it

    is not affected by drift errors. However, the reconstruction of the global motion from the

measures coming from different wheels can lead to errors in the case of uneven terrain or wheel slippage.

    Figure 29: Illustration Showing Possible Wheel Slip

The decision to develop a visual odometry system for detecting the rover's movements was taken considering the need for a sensing system not affected by the classical problems of drift and slip. In fact, since the proposed system is only based

    on the analysis of sequences of images acquired by the rover cameras, it only depends on

their quality. Moreover, since the movement is directly related to the acquired images, it can be reconstructed in every situation, independently of any sliding problem, as

opposed to other systems (for example, if wheel odometry is used, wheel slippage could lead to a wrong movement estimation) [7].

    6.2. Other Important Applications

Visual Odometry (VO) is also applied on board unmanned aerial vehicles of all sizes, e.g., within the Autonomous Vehicle Aerial Tracking and Reconnaissance

    [8] and Swarm of Micro Flying Robots (SFLY) [9] projects. Within the SFLY project, VO

    was used to perform autonomous take-off, point-to-point navigation, and landing of small-

    scale quadrocopters.

Autonomous underwater vehicles are also a domain where VO plays a big role.

    Underwater vehicles cannot rely on GPS for position estimation; thus, on-board sensors

    need to be used. Cameras provide a cost-effective solution; in addition, the ocean floor

    often provides a texture-rich environment [10], which is ideal for computer vision

methods. Applications range from coral-reef inspection (e.g., the Starbug system [10]) to archaeological surveys [11]. VO also plays a big role in the automotive industry. Driver

    assistance systems (e.g., assisted braking) already rely on computer vision and digital

    cameras. VO for automotive market is in development, and its first demonstrations have

    been successfully shown, e.g., within the Daimler 6-D-Vision system [12] or as part of the

    VisLab autonomous vehicle [13]. Driving the development of this technology is the low

    cost of vision sensors as compared to Lidar sensors, which is an important factor for the

automotive industry. VO does not require the assistance of any outside element (like the satellites in GPS) to trace the travelled path, so there is no chance of hacking or jamming the system. This feature of VO is very useful in military and spying applications.

    Figure 30: US Army Spy bot with camera


    References

1. ADNS-3080 Datasheet. Link: http://www.avagotech.com/docs/AV02-0366EN

2. D. Nister, O. Naroditsky, and J. Bergen, "Visual odometry for ground vehicle applications," J. Field Robot., vol. 23, no. 1, pp. 3-20, 2006.

3. D. Scaramuzza and F. Fraundorfer, "Visual odometry: Part I: The first 30 years and fundamentals," IEEE Robotics & Automation Magazine, pp. 80-92, Dec. 2011.

4. www.arduino.cc

5. en.wikipedia.org/wiki/LucasKanade_method

6. en.wikipedia.org/wiki/HornSchunck_method

7. M. Massari and G. Giardini, "Fast visual odometry system for planetary exploration rover based on discrete stereo optical flow," Aerotecnica Missili & Spazio, The Journal of Aerospace Science, Technology and Systems, pp. 131-142.

8. J. Kelly and G. S. Sukhatme, "An experimental study of aerial stereo visual odometry," in Proc. IFAC Symp. Intelligent Autonomous Vehicles, Toulouse, France, Sept. 2007.

9. S. Weiss, D. Scaramuzza, and R. Siegwart, "Monocular-SLAM-based navigation for autonomous micro helicopters in GPS-denied environments," J. Field Robot., vol. 28, no. 6, pp. 854-874, 2011.

10. M. Dunbabin, J. Roberts, K. Usher, G. Winstanley, and P. Corke, "A hybrid AUV design for shallow water reef navigation," in Proc. IEEE Int. Conf. Robotics and Automation (ICRA), Apr. 2005, pp. 2105-2110.

11. B. P. Foley, K. DellaPorta, D. Sakellariou, B. S. Bingham, R. Camilli, R. M. Eustice, D. Evagelistis, V. L. Ferrini, K. Katsaros, D. Kourkoumelis, A. Mallios, P. Micha, D. A. Mindell, C. Roman, H. Singh, D. S. Switzer, and T. Theodoulou, "The 2005 Chios ancient shipwreck survey: New methods for underwater archaeology," Hesperia, vol. 78, no. 2, pp. 269-305, 2009.

12. Daimler AG. (2011). 6D-Vision. [Online]. Available: http://www.6d-vision.com/

13. M. Bertozzi, A. Broggi, E. Cardarelli, R. Fedriga, L. Mazzei, and P. Porta, "VIAC expedition toward autonomous mobility [From the field]," IEEE Robot. Automat. Mag., vol. 18, no. 3, pp. 120-124, Sept. 2011.


try
    data = fscanf(s);
    C = textscan(data, '%d%d%d%d', 'delimiter', ';');
    [dx dy sq dt] = deal(C{:});
    if (isempty(dx) || isempty(dy) || isempty(sq) || isempty(dt))
        continue
    end
    mock_dx = double(-dy);
    mock_dy = double(dx);
    teta = double(mock_dy*pi/(2*tmf));
    t = prev_t + teta;
    prev_t = t;
    XY_x = prev_XY_x + mock_dx*cos(t);
    prev_XY_x = XY_x;
    XY_y = prev_XY_y + mock_dx*sin(t);
    prev_XY_y = XY_y;
    x = prev_x + dx;
    prev_x = x;
    y = prev_y - dy;
    prev_y = y;
    surfacequality = get(h1,'YData');
    x_disp = get(h2,'YData');
    y_disp = get(h3,'YData');
    %t_diff = get(h4,'YData');
    XY_X = get(h4,'XData');
    XY_Y = get(h4,'YData');
    X = get(h5,'YData');
    Y = get(h6,'YData');
    set(h1,'YData',[surfacequality sq]);
    set(h2,'YData',[x_disp dx]);
    set(h3,'YData',[y_disp dy]);
    set(h4,'YData',[XY_Y XY_y],'XData',[XY_X XY_x]);
    set(h5,'YData',[X x]);
    set(h6,'YData',[Y y]);
    set(f1,'Visible','on');
    prev_dx = dx;
    prev_dy = dy;
catch err
    continue
end
end

%% clear serial port
fclose(s);
delete(s);
clear('s');


    APPENDIX-2

    Arduino code and explanation:

    Initialise the pre-processor code (include header files and define constants)

    Variable declaration

    Initial setup

    #include "SPI.h"

    #include "ADNS3080.h"

    #define AP_SPI_DATAIN 12 //MISO

    #define AP_SPI_DATAOUT 11 //MOSI

    #define AP_SPI_CLOCK 13 //SCK

    #define ADNS3080_CHIP_SELECT 10 //SS

    #define ADNS3080_RESET 9 //RESET

int _cs_pin = ADNS3080_CHIP_SELECT;
int _reset_pin = 1; // set to 1 if you have reset connected
int raw_dx;
int raw_dy;
unsigned int surface_quality;
unsigned long us_prev = 0, us_pres = 0, us_dif = 0;

void setup()
{
  Serial.begin(2400); // initialise the serial port to 2400 baud rate
  delay(100);

  // flowSensor initialization
  if (initOF() == false)
    Serial.println("Failed to initialise ADNS3080");
  delay(100);
}

    Code 3: Initial setup

    Code 2: Variable declaration

    Code 1: Pre-processor code


    APPENDIX-3

    MATLAB main code and explanation

    Variable Declaration

    Figure Creation

    Serial Port Access

time = 0;  prev_y = 0;  prev_t = 0;  prev_XY_x = 0;  prev_XY_y = 0;

f1 = figure('NumberTitle','off','Visible','off','Name','Visual odometry');
hold on;
subplot(221); h1 = plot(0);
title('Surface Quality'); ylabel('Surface Quality');
subplot(222); h2 = plot(0,0);
title('distance vs time'); xlabel('time in s'); ylabel('distance in cm');
subplot(223); h3 = plot(0,0);
title('velocity vs time'); xlabel('time in s'); ylabel('velocity in cm/s');
subplot(224); h4 = plot(0,0);
set(gca,'DataAspectRatio',[1 1 1]);
title('x vs y'); xlabel('x'); ylabel('y');

s = serial(com_port,'BaudRate',2400);
fopen(s);

    Code 6: MATLAB Variable declaration

    Code 7: Figure Creation

    Code 8: Serial Port Access


    Main Loop

    Clear Serial Port

for n = 1:loop
    try
        data = fscanf(s);
        C = textscan(data, '%d%d%d%d', 'delimiter', ';');
        [dx dy sq dt] = deal(C{:});
        if (isempty(dx) || isempty(dy) || isempty(sq) || isempty(dt))
            continue
        end
        DY = ymf*double(dy);
        DT = (1/1000000)*double(dt);

        % plotting x vs y
        mock_dx = double(-DY);
        mock_dy = (dx);
        teta = double(mock_dy)*pi/(2*tmf);
        t = prev_t + teta;
        prev_t = t;
        XY_x = prev_XY_x + mock_dx*cos(t);
        prev_XY_x = XY_x;
        XY_y = prev_XY_y + mock_dx*sin(t);
        prev_XY_y = XY_y;

        % plotting distance
        y = prev_y - DY;
        prev_y = y;
        time = time + DT;

        surfacequality = get(h1,'YData');
        Time = get(h2,'XData');
        Y = get(h2,'YData');
        velocity = get(h3,'YData');
        XY_X = get(h4,'XData');
        XY_Y = get(h4,'YData');

        set(h1,'YData',[surfacequality sq]);
        set(h2,'YData',[Y y],'XData',[Time time]);
        set(h3,'YData',[velocity (-DY/DT)],'XData',[Time time]);
        set(h4,'YData',[XY_Y XY_y],'XData',[XY_X XY_x]);
        set(f1,'Visible','on');
    catch
        continue
    end
end

fclose(s);
delete(s);
clear('s');

    Code 9: MATLAB Main loop

    Code 10: Clear Serial port


    APPENDIX-4

    Step by step instruction manual for operating the project:

1. Check that all connections are correct (do not connect the batteries yet).

    2. Connect the batteries.

    3. Switch on the light at the optical flow sensor.

    4. Double click major_project.exe file and wait until the GUI appears.

    5. Set the loop count to desired value and click run which starts plotting the data.

    6. Now run the bot.

    7. After all the iterations are complete save the figure to the hard disk by going to

    File>Save.


    APPENDIX-5

    UART (Universal Asynchronous Receiver Transmitter)

    The Universal Asynchronous Receiver/Transmitter (UART) controller is the key

    component of the serial communications subsystem of a computer. The UART takes bytes

    of data and transmits the individual bits in a sequential fashion. At the destination, a

    second UART re-assembles the bits into complete bytes.

    Asynchronous transmission allows data to be transmitted without the sender having

    to send a clock signal to the receiver. Instead, the sender and receiver must agree on timing

    parameters in advance and special bits are added to each word which are used to

    synchronize the sending and receiving units.

    When a word is given to the UART for Asynchronous transmissions, a bit called

    the "Start Bit" is added to the beginning of each word that is to be transmitted. The Start

    Bit is used to alert the receiver that a word of data is about to be sent, and to force the clock

    in the receiver into synchronization with the clock in the transmitter. These two clocks

    must be accurate enough to not have the frequency drift by more than 10% during the

    transmission of the remaining bits in the word.

    After the Start Bit, the individual bits of the word of data are sent, with the Least

    Significant Bit (LSB) being sent first. Each bit in the transmission is transmitted for

    exactly the same amount of time as all of the other bits, and the receiver looks at the wire

    at approximately halfway through the period assigned to each bit to determine if the bit is a

    1 or a 0. For example, if it takes two seconds to send each bit, the receiver will examine the

    signal to determine if it is a 1 or a 0 after one second has passed, then it will wait two

    seconds and then examine the value of the next bit, and so on.

    The sender does not know when the receiver has looked at the value of the bit.

    The sender only knows when the clock says to begin transmitting the next bit of the word.

    When the entire data word has been sent, the transmitter may add a Parity Bit that

    the transmitter generates. The Parity Bit may be used by the receiver to perform simple

    error checking. Then at least one Stop Bit is sent by the transmitter.


    When the receiver has received all of the bits in the data word, it may check for the

    Parity Bits (both sender and receiver must agree on whether a Parity Bit is to be used), and

    then the receiver looks for a Stop Bit. If the Stop Bit does not appear when it is supposed

    to, the UART considers the entire word to be garbled and will report a Framing Error to the

    host processor when the data word is read. The usual cause of a Framing Error is that the

    sender and receiver clocks were not running at the same speed, or that the signal was

    interrupted.

    Regardless of whether the data was received correctly or not, the UART

    automatically discards the Start, Parity and Stop bits. If the sender and receiver are

    configured identically, these bits are not passed to the host.

    If another word is ready for transmission, the Start Bit for the new word can be sent

    as soon as the Stop Bit for the previous word has been sent.

    Because asynchronous data is self-synchronizing, if there is no data to transmit,

    the transmission line can be idle.

    Bits, Baud and Symbols:

    Baud is a measurement of transmission speed in asynchronous communication.

    Because of advances in modem communication technology, this term is frequently misused

    when describing the data rates in newer devices.

    Traditionally, a Baud Rate represents the number of bits that are actually being sent

over the media, not the amount of data that is actually moved from one DTE device to the other. The Baud count includes the overhead bits (Start, Stop and Parity) that are generated

    by the sending UART and removed by the receiving UART. This means that seven-bit

    words of data actually take 10 bits to be completely transmitted. Therefore, a modem

    capable of moving 300 bits per second from one place to another can normally only move

    30 7-bit words if Parity is used and one Start and Stop bit are present.

    If 8-bit data words are used and Parity bits are also used, the data rate falls to 27.27

    words per second, because it now takes 11 bits to send the eight-bit words, and the modem

    still only sends 300 bits per second.


    The formula for converting bytes per second into a baud rate and vice versa was

    simple until error-correcting modems came along. These modems receive the serial stream

    of bits from the UART in the host computer (even when internal modems are used the data

    is still frequently serialized) and converts the bits back into bytes. These bytes are then

    combined into packets and sent over the phone line using a Synchronous transmission

    method. This means that the Stop, Start, and Parity bits added by the UART in the DTE

    (the computer) were removed by the modem before transmission by the sending modem.

    When these bytes are received by the remote modem, the remote modem adds Start, Stop

    and Parity bits to the words, converts them to a serial format and then sends them to the

receiving UART in the remote computer, which then strips the Start, Stop and Parity bits.

    The reason all these extra conversions are done is so that the two modems can

    perform error correction, which means that the receiving modem is able to ask the sending

    modem to resend a block of data that was not received with the correct checksum. This

    checking is handled by the modems, and the DTE devices are usually unaware that the

    process is occurring.

By stripping the Start, Stop and Parity bits, the additional bits of data that the two modems must exchange between themselves to perform error correction are mostly concealed from the effective transmission rate seen by the sending and receiving DTE equipment. For

    example, if a modem sends ten 7-bit words to another modem without including the Start,

    Stop and Parity bits, the sending modem will be able to add 30 bits of its own information

    that the receiving modem can use to do error-correction without impacting the transmission

    speed of the real data.

The use of the term Baud is further confused by modems that perform compression. A single 8-bit word passed over the telephone line might represent a dozen words that were transmitted to the sending modem. The receiving modem will expand the data back to its original content and pass that data to the receiving DTE.

    Modern modems also include buffers that allow the rate that bits move across the

    phone line (DCE to DCE) to be a different speed than the speed that the bits move between

    the DTE and DCE on both ends of the conversation. Normally the speed between the DTE

    and DCE is higher than the DCE to DCE speed because of the use of compression by the

    modems.


Because the number of bits needed to describe a byte varies during the trip between the two machines, and because differing bits-per-second speeds are present on the DTE-DCE and DCE-DCE links, using the term Baud to describe the overall communication speed causes problems and can misrepresent the true transmission speed. So Bits Per Second (bps) is the correct term to use to describe the transmission rate seen at the DCE-to-DCE interface, while Baud or Bits Per Second are both acceptable terms when a connection is made between two systems with a wired connection, or if a modem is in use that is not performing error correction or compression.

Modern high-speed modems (2400, 9600, 14,400, and 19,200 bps) in reality still operate at or below 2400 baud, or, more accurately, 2400 symbols per second. High-speed modems are able to encode more bits of data into each symbol using a technique called Constellation Stuffing, which is why the effective bits-per-second rate of the modem is higher, even though the modem continues to operate within the limited audio bandwidth that the telephone system provides. Modems operating at 28,800 bps and higher speeds have variable symbol rates, but the technique is the same.


    APPENDIX-6

    SPI (Serial Peripheral Interface) communication protocol:

    The Serial Peripheral Interface (SPI) allows high-speed synchronous data transfer

    between the ATmega368 and peripheral devices or between several AVR devices. The

    ATmega8 SPI includes the following features:

    Full-duplex, Three-wire Synchronous Data Transfer

    Master or Slave Operation

    LSB First or MSB First Data Transfer

    Seven Programmable Bit Rates

    End of Transmission Interrupt Flag

    Write Collision Flag Protection

    Wake-up from Idle Mode

    Double Speed (CK/2) Master SPI Mode

    Figure 31: SPI Block Diagram


The interconnection between Master and Slave CPUs with SPI is shown in Figure
32. The system consists of two Shift Registers and a Master clock generator. The SPI

    Master initiates the communication cycle when pulling low the Slave Select SS pin of the

    desired Slave. Master and Slave prepare the data to be sent in their respective Shift

    Registers, and the Master generates the required clock pulses on the SCK line to

    interchange data. Data is always shifted from Master to Slave on the Master Out Slave

    In, MOSI, line, and from Slave to Master on the Master In Slave Out, MISO, line. After

    each data packet, the Master will synchronize the Slave by pulling high the Slave Select,

    SS, line.

    When configured as a Master, the SPI interface has no automatic control of the SS

    line. This must be handled by user software before communication can start. When this is

    done, writing a byte to the SPI Data Register starts the SPI clock generator, and the

    hardware shifts the eight bits into the Slave. After shifting one byte, the SPI clock

    generator stops, setting the end of Transmission Flag (SPIF). If the SPI interrupt enable bit

    (SPIE) in the SPCR Register is set, an interrupt is requested. The Master may continue to

    shift the next byte by writing it into SPDR, or signal the end of packet by pulling high the

    Slave Select, SS line. The last incoming byte will be kept in the Buffer Register for later

    use.

    When configured as a Slave, the SPI interface will remain sleeping with MISO tri-

    stated as long as the SS pin is driven high. In this state, software may update the contents

    of the SPI Data Register, SPDR, but the data will not be shifted out by incoming clock

pulses on the SCK pin until the SS pin is driven low. When one byte has been completely
shifted, the End of Transmission Flag, SPIF, is set. If the SPI interrupt enable bit, SPIE, in

    the SPCR Register is set, an interrupt is requested. The Slave may continue to place new

    data to be sent into SPDR before reading the incoming data. The last incoming byte will be

    kept in the Buffer Register for later use.

    The system is single buffered in the transmit direction and double buffered in the

    receive direction. This means that bytes to be transmitted cannot be written to the SPI Data

    Register before the entire shift cycle is completed. When receiving data, however, a

    received character must be read from the SPI Data Register before the next character has

    been completely shifted in. Otherwise, the first byte is lost.


    Figure 32: SPI Master-Slave Interconnection