
Cloud Based Autonomous Vehicle Control and Cooperative Navigation System

William Prince Smith
University of California, Los Angeles

Wireless Health Institute
[email protected]

ABSTRACT
Recent advances in computing and embedded sensing have given rise to autonomous vehicle technology. If current trends continue, it is feasible that by the year 2045 the majority of vehicles on the road will be autonomous. This will eventually pose a serious computational challenge to the average embedded vehicle computer, which will be responsible for an ever-greater number of decisions, all of them uninformed by, and independent of, other vehicles.

Instead, we envision a new system that exploits the Internet of Things paradigm to make decision-making amongst autonomous vehicles cooperative via cloud communication. We first outline a simple node-based vehicle control algorithm, then detail a prototype design of an autonomous vehicle control and navigation system using a cloud server and several RC cars embedded with Intel Edison microprocessors and a network of sensors, and discuss our progress in implementing this system.

We further demonstrate possible benefits of this system by placing obstacles within a map and demonstrating cooperative obstacle avoidance and path planning in order to find unencumbered paths for each vehicle. We conclude by comparing this system to modern GPS and evaluating benefits of our system on a real-world scale.

1. INTRODUCTION
The current state of the art in navigation systems is the Global Positioning System (GPS), which utilizes a constellation of precisely located satellites that constantly transmit a global standard time. Receivers intercept this time from several satellites and perform a calculation based on signal transmission time and the locations of the satellites, which in turn allows the system to place the user’s location within a preloaded map [1]. This technology has impacted the average user’s life in myriad ways, not the least of which is with respect to travel and efficient transportation. Various technologies, such as Google Maps and Google Traffic, aim to exploit this data to provide navigation assistance, as well as real-time traffic information, to vehicles on the road. In the case of Google Traffic, for example, data is obtained by crowdsourcing GPS location data from users currently using the app, analyzing it to obtain velocity information, and aggregating it on a server in order to deduce areas of high traffic congestion [2][3].

Despite the merits of this system, it has several inherent limiting factors. Crowdsourcing, for example, is limited by the size of the “crowd,” which in this case consists only of users of the Google Maps app. This would be fine if Google had a monopoly on the navigation-app business, but of course it does not. Additionally, this technique requires a certain critical mass of vehicle congestion before any difference in road conditions registers in the app [4]. This puts the user at a disadvantage in situations where an abnormal road condition has just occurred. Perhaps most important is the sustained scalability of such a system in the face of an ever-growing population of vehicles, an ever-greater percentage of which use navigation apps to serve their needs.

This problem will come to a head in the coming years with the rise of autonomous vehicles, a technology predicted to shift our current transportation paradigm in as little as three decades [5]. By the time this occurs, not only will the number of vehicles on the road have increased by an estimated 200% [6], but these vehicles will be reliant on information systems, such as Google Traffic, to provide reliable, real-time data that allows each vehicle to make an optimal decision without human input.

One advantage of the nature of autonomous vehicles, however, is the network of sensors that they require in order to provide environment information to their control systems. As in the sensor-laden human-operated vehicles of the present day, these sensors would otherwise be wholly underutilized. The ideal system of the future, therefore, would utilize the pre-existing sensor network installed on autonomous vehicles to provide data to navigation systems and, indeed, to other vehicles. This vehicle-to-vehicle cooperative navigation would provide automated, real-time road conditions to other vehicles without requiring a critical mass of vehicles or a common software service. Vehicles would instead need only a common communication protocol and a cloud service serving as an intermediary between them to provide map updates. In the following proposal, we outline our attempt at prototyping such a system, using a system of RC vehicles, sensors, and computing devices.


2. DESIGN
Our system is built around a Dominus 10SC v2 RC car, whose speed control unit and steering control servo are driven by a pulse-width-modulation (PWM) signal. We embed this vehicle with an Intel Edison microprocessor in order to make it programmable and, hence, autonomous. The system further consists of three ultrasonic range sensors used to detect obstacles around the vehicle in a 270-degree arc, with the rear quadrant omitted.

We place the vehicle in a physical setting that has been pre-analyzed and encoded as a graph data structure whose nodes represent street intersections. At runtime the vehicle executes a control program that, upon reaching a node, queries a remote server running the navigation algorithm described in Section 2.2. As the vehicle navigates the environment, its sensors detect “anomalies,” here represented as an arbitrary object on the road that impedes the vehicle’s path. Upon registering such an anomaly, the vehicle informs the remote server via a cloud interface; the server updates its database of available paths to reflect the change and, in turn, correctly informs other vehicles.

Note that in this system the optimal decision for a given vehicle is determined by the remote server, not the vehicle, which need only know its environment data and current location. This makes the system scalable in two senses: vehicles with limited computing resources carry limited computing responsibility, and the remote server need only maintain the available paths in a preloaded map, rather than aggregate data from every vehicle using the system.

2.1 Vehicle Sensing and Actuation
The vehicle is driven via digital PWM signals to the steering control servo and the speed control motor. During execution of the vehicle control program, the vehicle proceeds forward guided by constant feedback from the three ultrasonic distance sensors, situated around the vehicle in a 270-degree arc of coverage, each providing the distance between the vehicle and an adjacent wall. The forward-mounted ultrasonic sensor is programmed to detect objects within 30 centimeters of the front of the vehicle. Should this condition be satisfied, the vehicle writes a PWM value predesignated as idle to the speed control unit, effectively stopping the vehicle. It then sends a message through the cloud service to the remote server, notifying it of the vehicle’s current position so that the server can mark all routes to that node as unavailable.
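The forward-sensor behavior above can be sketched as pure decision logic, separated from I/O so it can be tested. The threshold is from the text; the idle PWM value is an assumption, as the paper does not give the actual number:

```c
#include <stdbool.h>

#define STOP_DISTANCE_CM 30   /* forward obstacle threshold from the text */
#define PWM_SPEED_IDLE   1500 /* assumed idle PWM value; not given in the paper */

/* Returns true when the latest forward ultrasonic reading requires the
 * vehicle to stop. On a true result, the control loop would write
 * PWM_SPEED_IDLE to the speed control unit and report the vehicle's
 * node position to the cloud so the server can mark routes to that
 * node as unavailable. */
bool obstacle_ahead(int front_distance_cm)
{
    return front_distance_cm <= STOP_DISTANCE_CM;
}
```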

Figure 1: Dominus 10SC v2 RC car with 3 ultrasonic sensors mounted on the roof

The purpose of the two side-mounted ultrasonic sensors is twofold. Their primary purpose is to provide centering data while the vehicle is moving straight forward: the outputs of the two sensors are compared, and adjustments to the steering are made until the outputs agree to within a threshold of 5 centimeters. Their second purpose is to register when the vehicle has reached a node. The approximate range of the ultrasonic sensors is 3 meters; any distance beyond this is registered by the sensor as 1,000+ centimeters. We therefore set a cutoff of 500 centimeters. Should either side-mounted sensor register a distance greater than this, the vehicle has likely come into the opening of a corridor, with the sensor looking down an open hallway. We register this condition as a node, write idle to the speed control unit to stop the vehicle, and query the cloud for a navigation decision.
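Both side-sensor roles can be sketched as testable decision functions. The 5 cm tolerance and 500 cm cutoff are from the text; the mapping of the sign of the correction to a steering direction is an assumption, since the paper states only that the two readings are equalized:

```c
#include <stdbool.h>

#define CENTER_TOLERANCE_CM 5   /* side-sensor difference tolerance from the text */
#define NODE_THRESHOLD_CM   500 /* reading beyond this means an open corridor */

/* Steering correction from the two side readings: -1 steers left,
 * +1 steers right, 0 means the vehicle is centered within tolerance. */
int centering_correction(int left_cm, int right_cm)
{
    int diff = left_cm - right_cm;
    if (diff > CENTER_TOLERANCE_CM)  return -1; /* closer to right wall: steer left */
    if (diff < -CENTER_TOLERANCE_CM) return  1; /* closer to left wall: steer right */
    return 0;
}

/* Node detection: either side sensor looking down an open hallway
 * (reading above the cutoff) marks arrival at an intersection node. */
bool at_node(int left_cm, int right_cm)
{
    return left_cm > NODE_THRESHOLD_CM || right_cm > NODE_THRESHOLD_CM;
}
```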

We also use a rotary encoder as a feedback mechanism for the steering control, in order to combat loose coupling in the mechanical arms that translate the angular displacement of the steering servo into movement of the wheels. This problem, commonly known as backlash, proved to be a pervasive issue in this particular RC model, which was designed for manual remote control. The rotary encoder provides real-time data to the steering control code in order to keep the wheels centered during straight forward operation, steer the wheels right and left during turns, and provide fine tuning during centering.
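The encoder feedback loop can be sketched as a simple bang-bang correction toward a commanded encoder count. This is an illustrative sketch only; the paper specifies that the encoder closes the loop but not the control law, step size, or servo API:

```c
/* Encoder-based steering feedback: the encoder count reflects the
 * actual wheel angle despite backlash, so the servo is nudged until
 * the count matches the commanded position. Returns the nudge to
 * apply this iteration: +1 (one step right), -1 (one step left),
 * or 0 when the wheels are at the commanded angle. */
int servo_adjust(int target_count, int encoder_count)
{
    if (encoder_count < target_count) return 1;
    if (encoder_count > target_count) return -1;
    return 0;
}
```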

2.2 Navigation Algorithm
The navigation algorithm is designed to run on a remote system that interfaces with the cloud. It serves as a traffic control system: upon a vehicle’s arrival at a node, it issues a decision based on the vehicle’s current position, its final destination node, and its current heading on an absolute scale (north, south, east, or west), together with a map database that is updated in real time and contains all nodes. From this information it computes the optimal next path branch for the vehicle at that moment.
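Since the map (described below) weights every edge equally, Dijkstra’s Algorithm reduces to breadth-first search for the per-query decision. A minimal sketch of the first-hop computation, using an illustrative index-based grid encoding rather than the paper’s pointer graph; `blocked[]` stands in for nodes removed after an anomaly:

```c
#include <string.h>

#define GRID_W 4
#define GRID_H 4
#define CELLS (GRID_W * GRID_H)

/* Returns 'N','S','E','W' for the first hop of a shortest path from
 * src to dst on a unit-weight grid (cells indexed y*GRID_W + x),
 * '.' if already there, or '-' if dst is unreachable. */
char first_move(int src, int dst, const int *blocked)
{
    int prev[CELLS], queue[CELLS], head = 0, tail = 0;
    if (src == dst) return '.';
    memset(prev, -1, sizeof prev);
    prev[src] = src;
    queue[tail++] = src;
    while (head < tail) {
        int c = queue[head++];
        if (c == dst) break;
        int x = c % GRID_W, y = c / GRID_W;
        int nbrs[4] = {
            y + 1 < GRID_H ? c + GRID_W : -1,  /* north */
            y > 0          ? c - GRID_W : -1,  /* south */
            x + 1 < GRID_W ? c + 1      : -1,  /* east  */
            x > 0          ? c - 1      : -1,  /* west  */
        };
        for (int i = 0; i < 4; i++) {
            int n = nbrs[i];
            if (n >= 0 && !blocked[n] && prev[n] < 0) {
                prev[n] = c;
                queue[tail++] = n;
            }
        }
    }
    if (prev[dst] < 0) return '-';
    int step = dst;
    while (prev[step] != src) step = prev[step]; /* walk back to the first hop */
    int d = step - src;
    if (d == GRID_W)  return 'N';
    if (d == -GRID_W) return 'S';
    if (d == 1)       return 'E';
    return 'W';
}

/* Demo: with the cell directly north of the start blocked by an
 * anomaly, the shortest path to the diagonal neighbor starts east. */
char first_move_demo(void)
{
    int blocked[CELLS] = {0};
    blocked[4] = 1;
    return first_move(0, 5, blocked);
}
```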

Figure 2: Intel Edison microprocessor beneath vehicle cover

Nodes are represented in memory as a data structure with six members: X, Y, North, South, East, and West. X and Y represent a simple coordinate system that locates the node on a 2-dimensional plane. These numbers do not represent the distance between adjacent nodes; rather, they serve as a sequential numbering system that quantifies the order of the nodes in the North/South direction (Y) and the East/West direction (X). The North, South, East, and West members are pointers to the next node in the given direction. If there is no node available in a given direction, the corresponding member is set to the NULL pointer. In this way, the map is represented as a graph data structure in which every edge has a weight of 1. This allows us to use Dijkstra’s Algorithm to compute the shortest path between a vehicle’s current node and its destination, and then compact this information into a simple decision: the next direction for the vehicle to take in order to follow that path.
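The node layout above can be sketched in C (the paper’s software is written in C, but the helper function and the demo map here are illustrative, not the actual implementation):

```c
#include <stddef.h>

/* Node as described in the text: X/Y grid indices plus four pointers
 * to the adjacent node in each absolute direction, NULL when no node
 * exists that way. */
typedef struct Node {
    int x, y;  /* sequential grid indices, not distances */
    struct Node *north, *south, *east, *west;
} Node;

/* Count the open directions out of a node. */
int count_exits(const Node *n)
{
    int exits = 0;
    if (n->north) exits++;
    if (n->south) exits++;
    if (n->east)  exits++;
    if (n->west)  exits++;
    return exits;
}

/* Build a 2x2 demo map and return the number of exits from node
 * (0,0); a corner node has exactly two neighbors. */
int corner_exits_demo(void)
{
    Node a = {0, 0, NULL, NULL, NULL, NULL};
    Node b = {1, 0, NULL, NULL, NULL, NULL};
    Node c = {0, 1, NULL, NULL, NULL, NULL};
    Node d = {1, 1, NULL, NULL, NULL, NULL};
    a.east = &b; a.north = &c;
    b.west = &a; b.north = &d;
    c.south = &a; c.east = &d;
    d.south = &b; d.west = &c;
    return count_exits(&a);
}
```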

This decision is sent to the vehicle from the remote system (via the cloud service), along with the data of the node that the vehicle is headed to next. In this way, the remote server does not have to track the entire path of every vehicle simultaneously. Instead, the vehicle’s current position and destination are sent, the algorithm is run on the shortest path between those points, the vehicle’s position is updated to the next node it will arrive at, and the vehicle is essentially forgotten until the next time it queries the system. Should a vehicle reach an anomaly, it reports its location to the remote server, which updates the graph data structure by setting all direction members of the given node to NULL. This effectively isolates the node from the data structure, so that all future paths are rerouted around it. Because the server acts as a proxy for communication between the different vehicles querying the cloud for navigation decisions, every anomaly is handled in true real time and, furthermore, need be observed only once in order to be considered by all other vehicles. This is effectively cooperative vehicle-to-vehicle communication and navigation, rather than data aggregation.
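The server-side anomaly handling can be sketched as follows. The text here sets the blocked node’s own pointers to NULL; Section 3.2 adds that the connecting edges from adjacent nodes are removed as well, so this sketch clears both sides:

```c
#include <stddef.h>

typedef struct Node {
    int x, y;
    struct Node *north, *south, *east, *west;
} Node;

/* Sever a blocked node from the graph: clear its neighbors' pointers
 * back to it, then clear its own direction pointers. */
void isolate_node(Node *n)
{
    if (n->north) n->north->south = NULL;
    if (n->south) n->south->north = NULL;
    if (n->east)  n->east->west  = NULL;
    if (n->west)  n->west->east  = NULL;
    n->north = n->south = n->east = n->west = NULL;
}

/* Demo: link two nodes east-west, isolate one, and report whether the
 * neighbor can still reach it (0 = no). */
int isolate_demo(void)
{
    Node a = {0, 0, NULL, NULL, NULL, NULL};
    Node b = {1, 0, NULL, NULL, NULL, NULL};
    a.east = &b;
    b.west = &a;
    isolate_node(&b);
    return a.east != NULL;
}
```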

2.3 Cloud-Based Communication
In this system, we use the Intel IoT Cloud Analytics Engine as the proxy between the remote system and the vehicles. On each vehicle, a simple User Datagram Protocol (UDP) client communicates locally between the Intel Edison and the Intel IoT Kit Agent installed on the device, which in turn communicates with the Intel Cloud through a REST interface by sending JSON messages. When a vehicle reaches a node and wishes to contact the Intel Cloud, it formats a JSON message containing its current node, destination node, and heading, encoded as integers, and sends it to the local IoT Kit Agent for transmission to the cloud.
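The query message can be sketched with `snprintf`. The exact key names used by the system are not given in the paper; `node`, `dest`, and `heading` are illustrative placeholders, with the heading an integer code for north/south/east/west:

```c
#include <stdio.h>

/* Format the vehicle's query as a JSON string for the local IoT Kit
 * Agent. Returns the number of characters written (excluding the
 * terminator), as snprintf does. */
int format_query(char *buf, size_t len, int node, int dest, int heading)
{
    return snprintf(buf, len,
                    "{\"node\": %d, \"dest\": %d, \"heading\": %d}",
                    node, dest, heading);
}

/* Demo: format a query, then check the fields round-trip via sscanf. */
int format_demo(void)
{
    char buf[64];
    int n, d, h;
    format_query(buf, sizeof buf, 3, 7, 1);
    if (sscanf(buf, "{\"node\": %d, \"dest\": %d, \"heading\": %d}",
               &n, &d, &h) != 3)
        return 0;
    return n == 3 && d == 7 && h == 1;
}
```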

Upon receipt, this information is registered as sensor data for a virtual sensor that has been registered on the Intel Cloud Analytics Account. This information is written to a text file that is unique to the vehicle. The remote system parses this text file to extract the information, runs the navigation algorithm based on the given data, and writes the navigation command decision to a separate text file, which is subsequently passed by the Cloud Service, registered into a separate virtual sensor, and converted into JSON format. It is then sent to the corresponding vehicle, which reads the command and executes the appropriate function in order to either take a left or right turn, go straight, or stop in the case of a deadlock condition where there are no open routes.
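On the vehicle side, executing the returned decision amounts to a simple dispatch. The single-character command codes here are assumptions; the paper states only that the vehicle turns left or right, goes straight, or stops when no open route exists:

```c
/* Dispatch a navigation decision received from the cloud.
 * 'L' = turn left, 'R' = turn right, 'S' = go straight,
 * 'X' = stop (deadlock: no open routes). Returns 0 on a
 * recognized command, -1 otherwise. The commented-out calls
 * mark where the actuation functions would be invoked. */
int dispatch_command(char cmd)
{
    switch (cmd) {
    case 'L': /* turn_left();   */ return 0;
    case 'R': /* turn_right();  */ return 0;
    case 'S': /* go_straight(); */ return 0;
    case 'X': /* stop_idle();   */ return 0;
    default:  return -1;
    }
}
```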

3. IMPLEMENTATION
The implementation of the system was divided into three main parts. The first is hardware, which involved the physical construction of the system and its components, as well as interfacing the sensors and actuators with the computing devices through programming and firmware development. The second is software, which involved the navigation algorithm, the cloud interaction, and the remote server implementation. The third is data analytics, which involved registration and operation of the Intel Cloud Analytics Engine, as well as the design of a simple user interface to port the data to an Android phone in visual form.

3.1 Hardware
Control of both the speed control motor and the steering control servo was achieved through a digital pulse-width-modulation output signal. However, in the case of the steering control servo, backlash was a significant obstacle, causing a given PWM duty cycle to result in varying steering positions depending on the weight of the vehicle and the previous position of the tires. To overcome this, we utilized a rotary encoder mounted to the underbelly of the vehicle and attached to the servo by way of a crown gear. The encoder keeps a running track of the steering position in order to provide a feedback loop to the turning and centering functions. In addition to the encoder, the ultrasonic sensors provide feedback to the Intel Edison board during straight forward operation. These sensors are interfaced via general-purpose I/O (GPIO) libraries included in the Yocto Embedded Linux distribution, which we flashed onto the microprocessor. A simplified wiring diagram of the interaction of the sensors with the Intel Edison microprocessor is included in Figure 4.

Figure 3: Rotary encoder mounted to the underbelly of the RC vehicle with crown gear attached

3.2 Software
All software running locally on the vehicle’s computing device was written in C and shell script to provide a lightweight, efficient implementation. Maps were created by drafting text files containing all of the pertinent information for the nodes. The file name is passed on the command line, parsed by the program, and transformed into a directed graph data structure for use by the navigation algorithm. The vehicle control code is equipped with a program that communicates with the local Intel IoT Kit Agent via its REST interface, formatting JSON messages and sending them through a local socket. The message is then ported to the Intel Cloud via TCP, where the data is registered to the appropriate sensor.

In deciding which controls should be dictated by the cloud (as opposed to locally on the vehicle computer), we concluded that the best implementation was the most scalable one. This led us to place vehicle control decisions locally, so that the cloud does not suffer greater computational overhead as users increase. Instead, we limited the scope of the cloud to managing the maps, recording reported anomalies by making nodes unavailable on the map (removing the connecting edges between that node and adjacent nodes), and running Dijkstra’s Algorithm for vehicles on an ad hoc basis whenever a node is reached.
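The paper does not specify the map-file format, so the following parsing sketch assumes a hypothetical one-node-per-line layout, "x y n s e w", where n/s/e/w are 1 if a neighbor exists in that direction and 0 otherwise. A real loader would then wire up the direction pointers from these flags:

```c
#include <stdio.h>

/* Parsed fields for one node line of the hypothetical map file. */
typedef struct {
    int x, y;
    int north, south, east, west;
} NodeSpec;

/* Parse one line of the assumed map format. Returns 1 on success,
 * 0 if the line does not contain six integers. */
int parse_node_line(const char *line, NodeSpec *out)
{
    return sscanf(line, "%d %d %d %d %d %d",
                  &out->x, &out->y,
                  &out->north, &out->south, &out->east, &out->west) == 6;
}

/* Demo: parse a sample line and verify a few fields. */
int parse_demo(void)
{
    NodeSpec n;
    if (!parse_node_line("2 3 1 0 1 1", &n))
        return 0;
    return n.x == 2 && n.y == 3 && n.south == 0 && n.east == 1;
}
```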

3.3 Data Analytics
The Intel Cloud Analytics Engine provides data analytics tools for Intel Edison and Galileo IoT Agents registered with its servers. Using this service, users need only register new sensors and actuators and perform a few simple commands to send data between the IoT Agent and the Intel Cloud. This data is analyzed by the cloud, and useful charts and metrics are displayed. However, the display must be manually refreshed in order to view the most recent information. We therefore developed a simple Android app that allows the data ported to the Intel Cloud to be displayed in real time in a simple user interface. We hope to extend this interface so that it can instead render a real-time map display based on nodes visited and current heading, as last updated on the remote system. This will allow us to monitor the navigation algorithm and traffic routing system running on the remote machine, and to see that vehicles respond simultaneously, and in real time, to reported anomalies.

Figure 4: Simplified wiring diagram of the interaction between the Intel Edison microprocessor and the sensor system

4. EVALUATION
Although the project has not yet been fully implemented, we succeeded in controlling the vehicle via the embedded microprocessor, as well as through the cloud interaction. We also successfully trial-tested the navigation algorithm and traffic control system on the remote server: a map was loaded and test-case vehicle queries were sent to the system, which responded appropriately. We likewise succeeded in interfacing the ultrasonic sensors with the vehicles in order to detect anomalies. With more time, we will be able to test the automated cloud interaction upon registering an anomaly and thus bring these two functioning sides of the system together.

In addition to the remaining work on the cloud interaction, further effort is required to precisely control the steering.

5. CONCLUSIONS
In this paper, we outlined a unique vehicle-to-vehicle, cloud-based navigation system that exploits embedded networked sensors and the cloud to provide real-time, individualized road status updates in a highly scalable manner. We detailed how such a system can be demonstrated using a fleet of RC vehicles and a network of ultrasonic sensors, and discussed our progress in implementing it. Although our work is not yet complete, this system, if brought to full scale, could be a viable solution to the issues that arise as a growing number of autonomous vehicles rely on centralized, crowdsourced navigation systems.


References
1. Erickson, Kristen. http://spaceplace.nasa.gov/gps/en/
2. Lardinois, Frederic. Google Maps Gets Smarter: Crowdsources Live Traffic.
3. Barth, Dave. The bright side of sitting in traffic: Crowdsourcing road congestion data (2009).
4. https://www.ncta.com/platform/broadband-internet/how-google-tracks-traffic/
5. The Year is 2045. https://www.transportation.gov/sites/dot.gov/files/docs/The%20Blue%20Paper_0.pdf
6. Sperling, D., and D. Gordon. Two Billion Cars: Driving Toward Sustainability. Oxford University Press, 2009.
