
Development of a Hybrid Autonomous Underwater Vehicle for Benthic Monitoring

A. Cadena, P. Teran, G. Reyes, J. Lino, V. Yaselga
Marine Science Faculty

Universidad Estatal Península de Santa Elena (UPSE) La Libertad, Ecuador

e-mail: [email protected]

S. Vera
Telecommunications and Systems Faculty

Universidad Estatal Península de Santa Elena (UPSE) La Libertad, Ecuador

e-mail: [email protected]

Abstract—This paper reports the initial development of a man-portable hybrid autonomous underwater vehicle that combines the best characteristics of an AUV and a ROV: good hydrodynamics and high stability in the water column. This architecture allows navigation close to the seabed to acquire images and benthic samples. The electronics are based on FPGA and ARM processor development boards. The algorithms for Guidance, Navigation and Control, Computer Vision, Bayesian Networks and Deep Neural Networks are coded as VHDL blocks and C/C++ scripts that run on an embedded Linux system. The Inertial Navigation System, complemented by GPS, is implemented by an Extended Kalman Filter described in VHDL blocks using 64-bit floating-point arithmetic and the CORDIC algorithm. The main purpose of this hardware-software architecture is to allow the autonomous execution of complex tasks that ROVs normally perform with human operators, such as identifying sites of scientific interest and generating parking strategies to collect underwater samples. Experimental results from sea trials are shown.

Keywords-Autonomous Underwater Vehicles (AUV); Remotely Operated Vehicle (ROV); Field Programmable Gate Array (FPGA); Guidance, Navigation and Control (GNC); Deep Neural Networks (DNN)

I. INTRODUCTION

Autonomous Underwater Vehicles (AUVs) are unmanned, self-propelled underwater robots that can operate independently for periods of a few hours to several days. AUVs carry sophisticated onboard computers and sensors to follow a preprogrammed path and to execute sequential behaviors programmed through a mission planner interface. AUVs are playing a major role in ocean exploration [1], [2]. An important advantage of an AUV with respect to a ROV system is the ability to operate untethered, which allows the exploration of previously inaccessible areas such as under-ice environments in the Polar regions [3]. Navigating near the seabed, at altitudes of less than 1 m, is a challenging task. ROV pilots usually make small path adjustments to avoid potential navigation hazards, especially over irregular terrain. Routine tasks of a ROV are acquiring images of a specified object, identifying biological, archeological or geological features, and generating parking strategies to collect samples from the sea floor. Achieving these abilities with an AUV requires the execution of sophisticated algorithms with high computational cost, such as Machine Learning, Genetic Algorithms, Fuzzy Logic or Neural Networks [4].

Some AUVs navigate using a strapdown Inertial Navigation System, which has a high computational cost, because navigation aids such as the Global Positioning System have limited usability below the sea surface [5]. Navigation near the seabed requires imagery from high-resolution cameras or sonar equipment [6]. These images are processed by a Computer Vision system to identify potential navigation hazards and to perform object identification, characterization and classification. Depending on the relevance of an underwater site of scientific interest, the mission may be reconfigured. Pattern recognition and decision making without human control are typically supported by Machine Learning or Artificial Intelligence to deal with the uncertainties that appear in situ [7]. These algorithms must run in real time or near real time on an onboard computer, whose power consumption and size are key factors in AUV design. Real-time identification and tracking of underwater specimens such as benthos require image segmentation and pattern recognition, which can be performed by Deep Neural Networks (DNN). DNNs have shown satisfactory performance for pattern recognition and strategy generation, even in complex board games such as Go [8]. DNNs are frequently implemented on GPUs (Graphics Processing Units). A GPU-based implementation is not suitable for a small underwater vehicle because of its size and a power consumption that can exceed 100 watts. FPGAs can be a better hardware platform for DNN implementation in small vehicles [9].

This paper reports the initial development of a low-cost, man-portable AUV with a hybrid architecture that provides ROV-like functionality without human control, using Deep Neural Networks to avoid tethered operation. The main role of the AUV is to perform photogrammetric surveys and to acquire specific images of a site of scientific interest, identifying relevant biological, geological or archeological features.

II. MECHANICAL DESIGN

The vehicle is formed by three hulls: a lower hull, where the batteries and main electronics are stored, and two upper hulls with buoyancy modules and additional onboard electronics. Disc-shaped pieces of syntactic foam (density 192 kg/m³) are used as flotation devices. Between the two upper hulls, in the forward section, an acrylic pressure housing is mounted perpendicularly to hold the cameras of the Computer Vision system. The vehicle length is 70 cm and its mass is 15 kg. The hull material is PVC and acrylic, allowing an operating depth of up to 150 m.


The lower hull is a four-inch pressure housing from Blue Robotics. The upper and lower hulls are separated to maximize the distance between the center of mass and the center of buoyancy, generating a large righting torque and therefore good stability in the water column during hovering maneuvers near the seabed. The hulls are linked by an open-frame structure made of aluminum and plastic. Some mechanical elements were manufactured with a 3D printer using ABS filament. Figure 1 shows the developed vehicle. The propulsion system uses T-200 underwater thrusters from Blue Robotics: two thrusters are oriented parallel to the X axis and one thruster parallel to the Z axis. This configuration provides three degrees of freedom: yaw, surge and heave. The dimensions and performance of the vehicle are shown in Table I.
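As a concrete illustration of this three-degree-of-freedom layout, a minimal sketch of thrust allocation is given below: normalized surge, yaw and heave commands are mixed into the three thruster commands and converted to ESC pulse widths. The mixing signs and the 1100-1900 µs pulse range are illustrative assumptions, not values taken from the vehicle.

```cpp
#include <algorithm>
#include <cstdint>

// Illustrative thrust allocation for two horizontal thrusters (X axis) and one
// vertical thruster (Z axis). Inputs are normalized commands in [-1, 1].
struct ThrusterPwm { uint16_t left_us, right_us, vertical_us; };

static uint16_t to_pulse_us(double cmd) {
    cmd = std::clamp(cmd, -1.0, 1.0);
    // Assumed ESC convention: 1500 us neutral, +/-400 us full range.
    return static_cast<uint16_t>(1500.0 + 400.0 * cmd);
}

ThrusterPwm allocate(double surge, double yaw, double heave) {
    // Differential mix of the two horizontal thrusters gives surge and yaw;
    // the single vertical thruster gives heave.
    double left  = std::clamp(surge + yaw, -1.0, 1.0);
    double right = std::clamp(surge - yaw, -1.0, 1.0);
    return { to_pulse_us(left), to_pulse_us(right), to_pulse_us(heave) };
}
```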

Figure 1. Developed AUV at UPSE University.

TABLE I. AUV DIMENSIONS AND PERFORMANCE

Dimensions: 70 cm x 40 cm x 45 cm
Cruise speed: 2 knots
Max depth: 150 m
Range: 8 nautical miles
Total mass: 15 kg
Operating temperature: -10 °C to 40 °C
Navigation: GPS/INS
Endurance: 4 h
Dynamic buoyancy: No
Obstacle avoidance: Yes, passive system based on the CV

III. ELECTRONICS

The AUV is powered by a 16000 mAh 4S (14.8 V) lithium-polymer battery that provides an endurance of up to 4 hours at a speed of 2 knots. The thrusters are controlled by a set of 30 A Electronic Speed Controllers (ESC). The main elements of the electronics are two Terasic FPGA development boards: a DE0-Nano and a DE0-Nano-SoC. The DE0-Nano-SoC contains an FPGA and a dual-core ARM Cortex-A9 Hard Processor System (HPS) on the same chip; the FPGA and the HPS are interconnected by a high-bandwidth backbone. The kinematic sensors are an Inertial Measurement Unit (IMU) with accelerometers, gyroscopes and magnetometers, a GPS receiver, a Blue Robotics 300 m pressure sensor, a pair of 5-megapixel ArduCAM cameras and a 4K GoPro camera controlled via Wi-Fi. These sensors are connected directly to the DE0-Nano board. Communications are supported by a 900 MHz RF transceiver with a range of up to 20 km using a high-gain antenna, a Wi-Fi module and an Iridium RockBLOCK 7 satellite module. The payload is a set of environmental sensors, which can be a CTD (Conductivity, Temperature and Depth) sensor, a pH probe or a mass spectrometer. Data from the sensors, cameras and RF modems, together with the generated commands, are stored on a set of 16 GB SD cards. The DE0-Nano generates the PWM control signals for the ESCs. A 1 TB external hard drive is connected to the DE0-Nano-SoC as a USB 2.0 device; it contains a database of images and DNN configurations. Figure 2 shows the electronics layout of the AUV.
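Depth for the navigation filter is typically derived from the pressure sensor through the hydrostatic relation; a minimal conversion, assuming a fixed seawater density and standard atmospheric pressure at the surface, is sketched below.

```cpp
// Convert absolute pressure (Pa) from the external pressure sensor to depth (m)
// using the hydrostatic relation depth = (P - P_atm) / (rho * g).
// rho = 1025 kg/m^3 (typical seawater) and P_atm = 101325 Pa are assumptions.
double pressure_to_depth(double pressure_pa) {
    const double rho_seawater = 1025.0;   // kg/m^3
    const double g            = 9.80665;  // m/s^2
    const double p_atm        = 101325.0; // Pa
    return (pressure_pa - p_atm) / (rho_seawater * g);
}
```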

Figure 2. AUV electronics layout.

IV. SOFTWARE

The software architecture is divided into two parts: an autopilot system and a Deep Neural Network unit in a client/server configuration. The autopilot runs on the DE0-Nano and the DNN on the DE0-Nano-SoC. When the AUV mission is a photogrammetric survey, the DNN is turned off to save energy. The autopilot is divided into three layers. The low layer implements the communication protocols (I2C, SPI, NMEA 0183), SD card routines, RAM memory management and PWM control signal generation.


The mid layer contains the Guidance, Navigation and Control (GNC) and Computer Vision (CV) blocks. The low and mid layers are coded in VHDL. The upper layer is the Mission Manager, which is written in C, runs on a Nios II 32-bit soft-core embedded processor and communicates with the VHDL blocks through the Avalon bus. Figure 3 shows the software architecture.
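A minimal sketch of how the C-based Mission Manager might exchange data with the VHDL blocks over the Avalon bus is shown below. It uses the standard Nios II HAL memory-mapped I/O macros, but the base-address symbol and the register offsets are hypothetical placeholders, not taken from the actual design.

```cpp
#include "io.h"       // Nios II HAL memory-mapped I/O macros (IORD/IOWR)
#include "system.h"   // base addresses generated for the SoC system

// Hypothetical register map of a GNC slave on the Avalon bus.
enum {
    REG_STATUS     = 0,  // bit 0: navigation solution valid
    REG_DEPTH_MM   = 1,  // estimated depth, millimeters
    REG_HEADING_CD = 2   // commanded heading, centidegrees
};

// Read the depth estimate published by the GNC block (placeholder base symbol).
static int read_depth_mm(void) {
    if (IORD(GNC_SLAVE_BASE, REG_STATUS) & 0x1)
        return IORD(GNC_SLAVE_BASE, REG_DEPTH_MM);
    return -1;  // no valid solution yet
}

// Write a new heading reference for the control loop.
static void set_heading_cd(int heading_cd) {
    IOWR(GNC_SLAVE_BASE, REG_HEADING_CD, heading_cd);
}
```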

Figure 3. Software architecture.

The mathematical operations are based on IEEE 754 64-bit floating-point arithmetic and a floating-point CORDIC algorithm for the trigonometric functions [10]. The operations use Altera megafunctions for addition, multiplication and division; in the worst case the latency is 10 clock cycles. The onboard sensor data are converted from fixed point to 64-bit floating point and stored in M9K internal memory blocks. A state machine runs the GNC routines, and the results are finally converted from floating point back to fixed point for PWM signal generation. Figure 4 shows the VHDL functional partition of the GNC.

The baseline of the GNC is an Extended Kalman Filter (EKF) that implements the Inertial Navigation System (INS). The EKF fuses data from the IMU, GPS and pressure sensor to estimate the position, velocity and orientation of the vehicle in an inertial reference frame. The EKF has been developed from a set of differential equations that describe the vehicle dynamics and the noise processes. The AUV automatic control system employs PID and Fuzzy Logic controllers; the PID controllers use the discrete type A equation.

The purpose of the 3D Computer Vision system is to create a point cloud that represents the seabed and surrounding objects, to assist the GNC and to generate the depth images used by the DNN. A calibration matrix is applied to the incoming frames; it accounts for the water column, window distortions, the distance between the cameras and other environmental factors. The CV employs the SURF (Speeded Up Robust Features) algorithm in conjunction with classic stereo vision techniques [11].
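As an offline reference for the VHDL implementation, the kind of keypoint detection and stereo matching described here can be prototyped with OpenCV; the sketch below is only such a prototype (it assumes the opencv_contrib SURF module and illustrative parameter values), not the onboard code.

```cpp
#include <opencv2/opencv.hpp>
#include <opencv2/xfeatures2d.hpp>

// Prototype of the CV front end: SURF keypoints/descriptors on the left frame
// plus a dense disparity map from classic block-matching stereo.
void process_stereo_pair(const cv::Mat& left_gray, const cv::Mat& right_gray) {
    // SURF interest points and descriptors (Hessian threshold is illustrative).
    auto surf = cv::xfeatures2d::SURF::create(400.0);
    std::vector<cv::KeyPoint> keypoints;
    cv::Mat descriptors;
    surf->detectAndCompute(left_gray, cv::noArray(), keypoints, descriptors);

    // Classic stereo correspondence: semi-global block-matching disparity.
    auto sgbm = cv::StereoSGBM::create(/*minDisparity=*/0,
                                       /*numDisparities=*/64,
                                       /*blockSize=*/9);
    cv::Mat disparity;
    sgbm->compute(left_gray, right_gray, disparity);
    // The disparity map and the camera calibration would then be reprojected
    // to a point cloud (e.g. cv::reprojectImageTo3D) for the GNC and the DNN.
}
```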

The SURF algorithm includes key point detectors and descriptors [12]; a SURF implementation on FPGA was developed for the ExoMars programme [13]. The algorithm has three basic steps: interest points are found in the frame by a Fast-Hessian detector; Haar wavelet responses in the x and y directions are computed around each interest point and the dominant direction is selected to obtain rotation invariance; finally, the descriptor of the area surrounding the interest point is calculated from Haar wavelet responses. The implementation of the CV is similar to that of the GNC: both functional partitions are described in VHDL and run in parallel. One of the problems of the SURF implementation was the memory demanded by the Hessian matrix. With the information available from the GNC, the CV, the RF communications and the mission parameters, the Mission Manager generates the reference coordinates for the GNC using Bayesian Networks. The Bayesian Networks also help to perform a self-test of all onboard equipment and to isolate spurious information from faulty sensors [14]; isolation is performed by changing the Bayesian network structure according to the evidence interpreted from the onboard equipment.
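The sensor-validation idea can be illustrated with a single Bayes' rule update: given a prior probability that a sensor is faulty and the likelihood of an observed disagreement with redundant information, the posterior fault probability is recomputed and used to decide whether to isolate the sensor. The numbers below are purely illustrative assumptions, not parameters of the onboard Bayesian network.

```cpp
// One-node illustration of Bayesian sensor validation: update the probability
// that a sensor is faulty after observing whether it disagrees with redundant
// measurements, then isolate it when the posterior exceeds a threshold.
struct SensorHealth {
    double p_fault = 0.02;  // assumed prior probability of a fault
};

// Assumed likelihoods: a faulty sensor disagrees 90% of the time,
// a healthy one 5% of the time.
void update_fault_belief(SensorHealth& s, bool disagreement) {
    const double p_dis_given_fault = 0.90;
    const double p_dis_given_ok    = 0.05;
    double like_fault = disagreement ? p_dis_given_fault : 1.0 - p_dis_given_fault;
    double like_ok    = disagreement ? p_dis_given_ok    : 1.0 - p_dis_given_ok;
    double evidence   = like_fault * s.p_fault + like_ok * (1.0 - s.p_fault);
    s.p_fault = like_fault * s.p_fault / evidence;   // Bayes' rule
}

bool should_isolate(const SensorHealth& s) { return s.p_fault > 0.5; }
```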

Figure 4. VHDL functional partition for GNC.
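To make the EKF stage described above more concrete, the sketch below shows a reduced, depth-only special case of the filter: the state holds depth and vertical velocity, the prediction uses gravity-compensated vertical acceleration from the IMU, and the correction uses depth derived from the pressure sensor. Because this reduced model is linear it is an ordinary Kalman filter; it is a behavioral reference under assumed noise parameters, not the 64-bit VHDL implementation.

```cpp
#include <Eigen/Dense>

// Depth-channel Kalman filter: state x = [depth (m), vertical velocity (m/s)].
struct DepthFilter {
    Eigen::Vector2d x = Eigen::Vector2d::Zero();
    Eigen::Matrix2d P = Eigen::Matrix2d::Identity();
    Eigen::Matrix2d Q = 1e-3 * Eigen::Matrix2d::Identity(); // assumed process noise
    double R = 0.01;                                         // assumed depth noise (m^2)

    // Prediction with gravity-compensated vertical acceleration az (m/s^2).
    void predict(double az, double dt) {
        Eigen::Matrix2d F;
        F << 1.0, dt,
             0.0, 1.0;
        Eigen::Vector2d B(0.5 * dt * dt, dt);
        x = F * x + B * az;
        P = F * P * F.transpose() + Q;
    }

    // Correction with a depth measurement (m) derived from the pressure sensor.
    void update(double depth_meas) {
        Eigen::RowVector2d H(1.0, 0.0);
        double S = (H * P * H.transpose())(0, 0) + R;
        Eigen::Vector2d K = P * H.transpose() / S;
        x += K * (depth_meas - x(0));
        P = (Eigen::Matrix2d::Identity() - K * H) * P;
    }
};
```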

The main goal of the DNN is to perform near-real-time onboard classification and characterization of the specimens detected by SURF, and then to reconfigure the mission path and vehicle behavior according to the relevance of the specimen. The original idea contemplated processing 4K frames with the DNN, but the required computational resources, equivalent to about 20 NVIDIA GeForce GPUs, are not suitable for this AUV because of its size and power limitations. In our approach a preliminary image analysis is performed by the SURF algorithm, whose result is a set of 28x28-pixel, 8-bit grayscale regions of interest (ROI). The DNN operates on these ROI, looking for specific features to classify and characterize the object of interest using the shapes stored on the external hard drive, for example a jellyfish or a piece of a shipwreck. Since the FPGA and HPS processing capacity is limited, the DNN implementation uses fixed-point arithmetic and 3-bit weights, with VHDL blocks and scripts running on the embedded Linux system.


The DNN algorithms and architecture optimization are based on [15]. The weights are trained off-line; the training images are stored on the external hard drive. Initially the system was evaluated with handwritten digits, and then images with geometrical patterns such as starfish, sponges and corals were used for training. The DNN network configuration can be changed during the AUV mission by reconfiguring the FPGA from the HPS through an application program. The possible DNN network configurations and weights are stored on the external hard drive, for example a configuration dedicated to feature recognition on the 28x28 ROI or a configuration for estimating AUV mission risks.
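A minimal software model of the kind of fixed-point, 3-bit-weight computation described above is sketched below. The use of a plain dense layer, the layer size and the weight range are illustrative assumptions, since the actual network structure is defined by the configurations stored on the hard drive.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// One fully connected layer evaluated with integer-only arithmetic: 8-bit
// input pixels (the 28x28 ROI), weights quantized to 3-bit signed values
// {-4..3}, 32-bit accumulators and a ReLU on the accumulator. Weight and bias
// values are hypothetical; they would come from the off-line training step.
std::vector<int32_t> dense_q3(const std::vector<uint8_t>& in,    // 784 inputs
                              const std::vector<int8_t>& w,      // outputs*784, each in [-4, 3]
                              const std::vector<int32_t>& bias,  // per-output bias
                              std::size_t outputs) {
    std::vector<int32_t> out(outputs, 0);
    for (std::size_t o = 0; o < outputs; ++o) {
        int32_t acc = bias[o];
        for (std::size_t i = 0; i < in.size(); ++i)
            acc += static_cast<int32_t>(in[i]) * w[o * in.size() + i];
        out[o] = std::max<int32_t>(acc, 0);  // ReLU
    }
    return out;
}
```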

V. RESULTS

Sea trials in Ecuadorian waters were carried out to test the proposed AUV at the marine reserve "El Pelado", 5 NM from the coast. The initial sea trials tested the performance of the mechanical systems and of the GNC and SURF algorithms. After the GNC and CV worked successfully, the second development board, the DE0-Nano-SoC, was installed. The DNN processing time for recognizing 10000 images was 570 ms, about 57 µs per image. The DNN successfully recognized fish in the water column and starfish on the sea floor.

Figure 5. AUV sea trials at the marine reserve "El Pelado", Ecuador.

VI. CONCLUSIONS AND FUTURE WORK

This paper described the implementation of the mechanics, electronics and software of a man-portable AUV with a hybrid architecture. The FPGA development boards met the computational demands of the GNC, CV, Bayesian Network and DNN algorithms, and the overall system was evaluated successfully during the sea trials. Future work includes improving the propulsion system by adding more thrusters for better control under strong currents, further DNN testing under real working conditions, and a robotic arm to collect benthic samples fully autonomously.

REFERENCES

[1] Tamaki U., "Observation of Deep Seafloor by Autonomous Underwater Vehicle", Indian Journal of Geo-Marine Sciences, Vol. 42 (8), pp. 1028-1033, December 2013.

[2] Russell B., Veerle A.I., Timothy P.Le, Bramley J. Douglas P., Brian J., Henry A., Kirsty J., Jeffrey P., Daniel R., Esther J., Stephen E., Robert M., James E. Autonomous Underwater Vehicles (AUVs): Their past, present and future contributions to the advancement of marine geoscience. Marine Geology. Volume 352, 1 June 2014, Pages 451-468.

[3] Kunz, C.; Murphy, C.; Singh, H.; Das, S. B.; Jackson, R. H.; Kukulya, A.; Littlefield, R.; Maksym, T. L.; Plueddemann, A. J.; Sohn, R. A.; Straneo, F.; Wilkinson, J. "Under-Ice Science in the Polar Regions with Autonomous Underwater Vehicles", American Geophysical Union, Fall Meeting 2012.

[4] Michael Cashmore, Maria Fox, Derek Long, Daniele Magazzeni, Bram Ridder. "Artificial Intelligence Planning for AUV Mission Control". IFAC-PapersOnLine Volume 48, Issue 2, 2015, Pages 262-267.

[5] M Dinc, C Hajiyev. "Integration of navigation systems for autonomous underwater vehicles". Journal of Marine Engineering & Technology Volume 14, 2015 - Issue 1.

[6] Yoerger, D. R., Jakuba, M., Bradley, A. M., & Bingham, B. (2007). Techniques for deep sea near bottom survey using an autonomous underwater vehicle. The International Journal of Robotics Research, 26(1), 41-54.

[7] Lugli, A. B., & de Melo, M. G. (2017). Computer vision and artificial intelligence techniques applied to robot soccer. International Journal of Innovative Computing, Information and Control, 13(3), 991-1005.

[8] Silver, D., Huang, A., Maddison, C. J., Guez, A., Sifre, L., Van Den Driessche, G., ... & Dieleman, S. (2016). Mastering the game of Go with deep neural networks and tree search. Nature, 529(7587), 484-489.

[9] Nurvitadhi, E., Venkatesh, G., Sim, J., Marr, D., Huang, R., Ong Gee Hock, J., ... & Boudoukh, G. (2017, February). Can FPGAs beat GPUs in accelerating next-generation deep neural networks?. In Proceedings of the 2017 ACM/SIGDA International Symposium on Field-Programmable Gate Arrays (pp. 5-14). ACM.

[10] Neji, N., Boudabous, A., Kharrat, W., & Masmoudi, N. (2011, March). Architecture and FPGA implementation of the CORDIC algorithm for fingerprints recognition systems. In Systems, Signals and Devices (SSD), 2011 8th International Multi-Conference on (pp. 1-5). IEEE.

[11] Moeslund, T. B., & Granum, E. (2001). A survey of computer vision-based human motion capture. Computer vision and image understanding, 81(3), 231-268.

[12] Battezzati, N., Colazzo, S., Maffione, M., & Senepa, L. (2012, March). SURF algorithm in FPGA: A novel architecture for high demanding industrial applications. In Design, Automation & Test in Europe Conference & Exhibition (DATE), 2012 (pp. 161-162). IEEE.

[13] Lentaris, G., Stamoulias, I., Diamantopoulos, D., Siozios, K., & Soudris, D. (2013, January). An FPGA implementation of the SURF algorithm for the ExoMars programme. In Proceedings of the 7th HiPEAC Workshop on Reconfigurable Computing (WRC2013), Berlin, Germany (pp. 21-23).

[14] Mengshoel, O. J., Darwiche, A., & Uckun, S. (2008). Sensor validation using Bayesian networks. In Proc. 9th International Symposium on Artificial Intelligence, Robotics, and Automation in Space (iSAIRAS-08).

[15] Park, J., & Sung, W. (2016, March). FPGA based implementation of deep neural networks using on-chip memory only. In Acoustics, Speech and Signal Processing (ICASSP), 2016 IEEE International Conference on (pp. 1011-1015). IEEE.
