




Embry-Riddle Aeronautical University Self-Operating Aerial Reconnaissance (SOAR) Unmanned Aerial System

2010 AUVSI Student UAS Competition

Team Members: Rajiv Khatri, Jon-Erik Jaegersen, Thisara Pinto, Michael Perstin, Brian Garret, Robert Hook, Peter Carros. Embry-Riddle Aeronautical University, Daytona Beach, Florida, 32114 USA.

Project Advisors: Dr. Charles Reinholtz, Dr. Cameron Wang. Department of Mechanical Engineering, Embry-Riddle Aeronautical University, Daytona Beach, Florida, 32114 USA.


1. Introduction

Embry-Riddle Aeronautical University (ERAU) is proud to present the Self-Operating Aerial Reconnaissance (SOAR) Unmanned Aerial System (UAS). In developing SOAR, the team focused on the systems nature of the problem, the Statement of Work (SOW) issued by the Seafarer Chapter of the Association for Unmanned Vehicle Systems International (AUVSI), and, most importantly, on the overall Concept of Operations.

The SOAR UAS comprises the Airborne Subsystem, the Ground Station Subsystem, and the Network Subsystem. SOAR UAS provides autonomous intelligence, surveillance, and reconnaissance (ISR) while complying with the Air Tasking Order (ATO) and Special Instructions (SPINS) for departure and arrival procedures, all while remaining within assigned airspace. As demonstrated in our proof-of-flight video, which shows a typical Marine patrol sweep deployment, the system has been developed and rigorously tested to ensure operational success.

This paper describes the systems design approach employed by the team and discusses how we expect to exceed the Key Performance Parameters (KPP). SOAR UAS aims to raise the level of competition with its blend of autonomous operations and high-quality, real-time imaging, in a safe, man-portable system. Figure 1 shows the assembled SOAR UAS Airborne System.

Figure 1: SOAR UAS Airborne Subsystem.

1.A. Mission Requirements

Design, development, and testing decisions for the SOAR system were continually analyzed and updated based on current information, keeping the Statement of Work, the Concept of Operations, and the mission requirements of the competition in mind. Our goal was always the development of an autonomous UAS capable of exceeding the objective Key Performance Parameters in all five areas (autonomy, imagery, target location, mission time, and in-flight re-tasking). The target characteristics consist of background color, alphanumeric character, alphanumeric color, shape, location, and orientation.

The capabilities of changing waypoints, updating the flight plan in flight, and determining the GPS coordinates of the extracted targets were also considered key elements of the mission. Above all, the safety of the UAS was given priority in the design and testing process. Safety includes planning and procedures for routine operations as well as for emergency operations, such as takeover of manual control during the mission and safe flight termination in the event of a lost communications link.

1.B. Systems Engineering Approach

Systems engineering focuses on the design and management of complex engineering problems, such as the problem posed by this competition. Systems engineering addresses the technical and logistical aspects of a project along with the human aspects, such as project management and organization.

Team SOAR is primarily a Mechanical Engineering senior design project group. As such, the team was assembled at the beginning of the academic year and presented with the challenge of developing a complete solution to the problems posed by the Student Unmanned Air Systems (UAS) Competition. To help manage the technical and academic elements of the project, the team elected to divide into three main subsystem teams, namely the Airborne Subsystem, Network Subsystem, and Ground Station Subsystem teams.

A quality assurance and management system, including rigorous verification and validation procedures, was implemented in order to successfully test the modules developed for each subsystem. With the groups concentrating on their own subsystems, it was vital to hold weekly general team meetings in which the subsystem groups would meet to discuss subsystem integration feasibility, general safety concerns, and mission progress. This approach and methodology provided the SOAR system with a robust and dynamic working structure.

The SOAR design philosophy further extends into the management of specific modules within each of the subsystems. By allocating specific modules to specific members of each of the subsystem groups, the team was able to monitor overall system goals and provide additional support when an individual encountered difficulties. Table 1 shows the main subsystems of SOAR and the main modules assigned to each of them.

Each module was placed under the most appropriate subsystem group. Modules that were closely related to more than one subsystem required cross-group collaboration to ensure seamless integration with the related subsystems.


Table 1: Categorization of the modules within each of the three main subsystems of SOAR.

Airborne Subsystem
- Airframe Module - airframe design and modifications, including power, testing, and system safety evaluations.
- Avionics System Module - autopilot system configuration, telemetry/ground control data link integration.

Network Subsystem
- Data Link Module - telemetry/ground control data link configuration, image data link configuration.
- Camera Payload Module¹ - configuring the network camera and the point-and-shoot camera.

Ground Station Subsystem
- Ground Control Station Module - telemetry and flight operations analysis.
- Data Processing Module - image capture, target recognition, target characteristics recognition, and target location.

Building models of the Network Subsystem and the Data Processing Module gave the team a larger-scale view of the methodology behind how each system would work. Because the Data Processing Module is the more complex system, the model allowed the team to set up a step-by-step process for implementing the Data Processing and Image Acquisition concepts. This type of modeling provided SOAR with crucial insight into where the team should invest time and resources, and into aspects such as the level of autonomy and functionality that could feasibly be implemented within the restricted development time.

1.C. Testing Philosophy

Having a solid testing methodology meant that the team could minimize downtime in the development process and make sure the system met required safety standards. Each module was tested at the following three stages:

1. Individual Software/Hardware Test
2. Software/Hardware-in-the-Loop Simulation
3. Integrated System Field Test

If a module failed any of the tests or did not perform as expected, a report detailing all malfunctions and issues was generated, and the module was re-evaluated by the corresponding development team to make the necessary corrections and restart the testing procedure. This entire process was repeated until all modules performed as intended.

¹ Since the Camera Payload Module (CPM) is directly connected to the airframe, it falls directly within the Airborne Subsystem. However, it is categorized in Table 1 under the Network Subsystem solely for dividing priorities among members during the design and development process.


2. Airborne Subsystem

The SOAR Airborne Subsystem consists primarily of the Airframe Module and the Avionics System Module. However, the Data Link and Camera Payload Modules play a vital part in the selection of an airframe and power system. To provide seamless module integration, the Airborne and Network Subsystem groups worked closely together during the airframe testing and selection process. Figure 2 shows the modules and connections related to the Airborne Subsystem.

Figure 2: Airborne Subsystem Modules and Related Connections.

A key decision made early in the development process was to use a modified commercial, off-the-shelf (COTS) airframe rather than designing a custom one. In part, this decision was made to allow the team to focus time and effort on the more critical mission-related aspects of the competition, including the KPPs.

However, the selected EasyStar™ airframe is hardly a compromise. The size, weight, durability, and safety features of this aircraft make it an excellent choice for the Marine patrol ConOps. The plane is 35" long, with a 54" wingspan and a 370 sq. in. wing area. It uses a pusher prop and has a stock weight of 24 ounces. It is easily assembled and disassembled in the field, making it suitable for storage and transport in a standard-issue MOLLE pack. The entire airplane is made of resilient and easily repairable Elapor foam. The EasyStar has highly stable flight characteristics, and the size and weight of the platform, combined with the pusher prop, make it relatively safe to hand launch and fly. Many of the logistical issues associated with UAS operation, such as finding a safe flying field, are partially mitigated by this aircraft.


The modules incorporated in the Airborne Subsystem were constrained by the weight, size, and power limits of the airframe and power supply, provided that the modules in question satisfied the primary mission requirements.

2.A. Avionics System Module

The autopilot is the primary component of the Avionics System Module and a vital component in providing autonomy for a UAS. Four different autopilots were considered: the Piccolo Plus, Micropilot MP2028, Ardupilot and Ardupilot Mega. All the autopilots under consideration were adequate to satisfy the primary mission requirements, but other factors including size, weight, cost and International Traffic in Arms Regulations (ITAR) restricted status influenced the final decision. Table 2 shows the primary factors considered in the selection process.

Table 2: Comparison chart with testing and research results for the autopilots considered.

Cloudcap Piccolo Plus
- Estimated Cost: $6000
- Support: Available at a high cost
- Software Adaptability: Proprietary software³
- Attitude Measurement: IMU
- Hardware Integration and Options: Proprietary 3rd-party hardware available at high cost

Micropilot MP2028g
- Estimated Cost: $5500
- Support: Available at a high cost
- Software Adaptability: Proprietary software
- Attitude Measurement: IMU
- Hardware Integration and Options: Proprietary 3rd-party hardware available at high cost

Ardupilot
- Estimated Cost: $350²
- Support: Free/open source
- Software Adaptability: C-based open source/self-written software
- Attitude Measurement: Thermopile
- Hardware Integration and Options: Inexpensive 3rd-party hardware

Ardupilot Mega
- Estimated Cost: $450
- Support: Free/open source
- Software Adaptability: C-based open source/self-written software
- Attitude Measurement: Thermopile/IMU
- Hardware Integration and Options: Inexpensive 3rd-party hardware

² Unlike the Piccolo Plus or the Micropilot MP2028g, the Ardupilot and the Ardupilot Mega do not come with built-in receivers, GPS, or an altitude measurement system. This makes the autopilot highly customizable, with the total system cost changing accordingly.
³ Cloudcap sells a Software Development Kit (SDK) separately at an additional cost.

2.B. Autopilot Solutions

While SOAR had the privilege of testing the Cloudcap and Micropilot autopilots, these systems came with several major disadvantages, including high cost and restrictions under ITAR. The high initial cost would also have limited the team to a single autopilot, virtually eliminating the possibility of repair or replacement in the event of a malfunction or crash near the time of competition. As an alternative to expensive autopilot systems, the team chose to invest effort in developing the Ardupilot and the Ardupilot Mega. These small, inexpensive autopilot systems allowed the team to develop two nearly identical platforms and conduct simultaneous tests with minimal downtime. The Ardupilot open-source community served as a valuable source of information, providing the team with an efficient way to master the system. Having no proprietary software, the Ardupilot can be programmed according to the requirements of SOAR using a C-based language. Figure 3 shows the relative dimensions of the two Ardupilot autopilot systems and the Cloudcap Piccolo Plus autopilot system.

Figure 3: Top to bottom - Ardupilot, Ardupilot Mega, Piccolo Plus Autopilot Systems.

2.C. Autopilot Development to Meet Mission Requirements

Initial research on the Ardupilot revealed two potential problems: it had no support for an Inertial Measurement Unit (IMU), used for attitude control, and it supported only one-way communication, which meant that SOAR would not be able to update waypoints during flight or initiate flight termination in the event of a lost communication link. Through testing, the team determined that the supported thermopile-based attitude measurement was sufficient to satisfy the mission requirements. To overcome the problem of one-way communication, the team transitioned from the Ardupilot to the Ardupilot Mega, which adds IMU support and two-way communications. Although the team has purchased an IMU for integration with the Ardupilot Mega, this has been a low-priority task because the thermopile solution has proven to be accurate and reliable. Figure 4 shows the Avionics System Module, including the Ardupilot Mega, integrated into the airframe.
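For context, thermopile attitude sensing compares the infrared signatures of sky and ground seen by opposing sensor pairs. The sketch below illustrates the principle only; the sensor layout, scaling, and names are illustrative assumptions and are not the Ardupilot's implementation.

```cpp
// Sketch of thermopile-based attitude sensing: opposing infrared sensors
// compare sky and ground temperatures, and the differential readings
// approximate roll and pitch. Scaling and calibration are illustrative.
#include <cmath>

struct ThermopileReadings {
    double left, right;   // lateral pair (raw sensor values)
    double front, back;   // longitudinal pair
    double top, bottom;   // vertical pair, gauges the sky/ground contrast
};

// Approximate roll and pitch in degrees from differential IR readings.
void estimateAttitude(const ThermopileReadings& r,
                      double& rollDeg, double& pitchDeg)
{
    // Total sky-to-ground contrast; avoids dividing by ~0 on uniform scenes.
    double contrast = std::fabs(r.top - r.bottom);
    if (contrast < 1e-6) { rollDeg = 0.0; pitchDeg = 0.0; return; }

    // Differential readings normalized by the contrast, mapped to +/-90 deg.
    rollDeg  = 90.0 * (r.left  - r.right) / contrast;
    pitchDeg = 90.0 * (r.front - r.back)  / contrast;
}
```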


Figure 4: Assembled Avionics Hardware including the Ardupilot Mega.

2.D. Airframe Module

The selection of a feasible airframe was a vital aspect of integrating the hardware modules of the Airborne Subsystem. The decision to use a COTS airframe allowed the team to focus its attention on other vital subsystems of the SOAR UAS while keeping a readily available stock of airframes. By compiling a list of design requirements, the team settled on the almost-ready-to-fly Multiplex EasyStar as the new airframe. Made of Elapor foam, the EasyStar provides a lightweight, portable, and highly durable solution compared to the previously used airframes. Figure 5 shows a top view of the Multiplex EasyStar in comparison with two previously used airframes, the Hangar 9 Ultra Stick and the Senior Telemaster, along with the specifications of each airframe.


Figure 5: Visual Airframe Comparison with Dimensions and Weight.

2.E. Airframe Safety Considerations

The EasyStar airframe provides the SOAR system with an unparalleled safety standard compared to previous airframe selections. The propeller, located behind the wings, complements the all-foam body in minimizing the impact load and resulting damage in the case of a collision or crash landing. The location of the propeller also enables safe autonomous hand launch of the aircraft, an important consideration in the Concept of Operations, where a prepared runway would likely not be available. Figure 6 shows the proper hand-launching method, demonstrated by a student posing as a Marine; note the additional safety precautions of a helmet and safety glasses.


Figure 6: Aircraft hand launching stance.

2.F. Camera Payload Module

The quality of aerial images plays a pivotal role in the SOAR Data Processing Module. This system depends both on the telemetry data and on the quality of the pictures obtained from the onboard cameras. Recognizing the character, shape, orientation, and colors of a target requires a certain level of detail in the images received at the ground station. The team decided to use two cameras for this critical portion of the system. The first camera is an Axis 207MW IP camera, shown mounted under the wing of the EasyStar in Figure 7; it is used for Motion JPEG streaming while in flight. The second camera is a point-and-shoot Sony Webbie HD MHS-PM1, which provides high-resolution images for post-flight processing.

Figure 7: Axis 207MW attached on the left wing of the aircraft.

The resolution, angle of view, frame rate, data transmission rate, physical dimensions, power consumption, and weight were crucial elements taken into consideration when selecting the cameras. Scaled, simulated targets, shown in Table 3, were tested in the laboratory under different lighting conditions and at different scaled heights to determine how many pixels each target (ranging from 4x4 ft to 8x8 ft) would occupy and how many pixels would be sufficient for the SOAR Data Processing Module to recognize the target characteristics.


Table 3: Pictures of test targets used to determine the minimum required pixel density (4x4 ft targets at 150 ft using different resolutions).

- 320x240 resolution: target pixel density 6x6
- 320x240 resolution: target pixel density 7x7
- 800x600 resolution: target pixel density 16x16
- 800x600 resolution: target pixel density 17x17

After extensive testing in varying lighting conditions, vibration testing, and indoor-versus-outdoor image comparisons, the effective minimum number of pixels needed to identify all the characteristics of a 4x4 ft target at a height of 200 ft was concluded to be 16x16 pixels. It should be noted that resolution and angle of view are of equal importance, since either property can be varied to obtain the desired pixel density. Table 4 shows the specifications of the selected Axis 207MW IP camera. The SOAR Data Link Module was tested to determine the Motion JPEG frame rate given in the table.

Table 4: Axis 207MW relevant specifications.

- Video Resolution: 640x480 to 1280x1024 pixels
- Frame Rate: 12 fps @ 1280x1024
- Horizontal Angle of View: 94.0° (for selected lens)
- Size: 2.2" W x 3.5" H
- Weight: 0.42 lb
- Power Consumption: 4.0 W
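To make the relationship between altitude, angle of view, resolution, and pixel density concrete, the following is a minimal C++ sketch of the underlying ground-footprint geometry. It assumes a nadir-pointing camera over flat ground and uses the camera figures quoted in this paper only as example inputs; the function and its numbers are illustrative and do not reproduce the team's test procedure.

```cpp
#include <cmath>
#include <cstdio>

// Pixels spanned by a square target of side targetFt when imaged from
// altitudeFt with the given horizontal angle of view (degrees) and image
// width (pixels). Simplified nadir-view, flat-ground model.
double targetPixels(double altitudeFt, double hFovDeg, int imageWidthPx, double targetFt)
{
    const double kDegToRad = 3.14159265358979323846 / 180.0;
    const double groundWidthFt = 2.0 * altitudeFt * std::tan(0.5 * hFovDeg * kDegToRad);
    const double pxPerFt = imageWidthPx / groundWidthFt;   // ground sample density
    return targetFt * pxPerFt;                             // pixels across the target
}

int main()
{
    // Axis 207MW: 94 deg horizontal angle of view, 1280 px wide frames.
    std::printf("Axis 207MW, 4 ft target at 200 ft: %.1f px\n",
                targetPixels(200.0, 94.0, 1280, 4.0));
    // Sony Webbie HD: 78 deg horizontal angle of view, 2592 px wide stills.
    std::printf("Sony Webbie HD, 4 ft target at 200 ft: %.1f px\n",
                targetPixels(200.0, 78.0, 2592, 4.0));
    return 0;
}
```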

The selected Sony Webbie HD point-and-shoot camera, shown in Figure 8, provided the team with high-resolution images for post-flight processing. Unfortunately, the off-the-shelf camera does not have the capability to capture auto-timed images. To overcome this limitation, the team used an Arduino Pro Mini microcontroller (also shown in Figure 8) to trigger image capture electrically at a fixed interval. The interval between images was set to 5 seconds. At a height of 200 ft and at typical flight speeds and turn rates, testing confirmed that this capture rate was sufficient to capture a target of interest in at least two successive images.
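The paper does not detail how the Arduino drives the Webbie's shutter, so the following Arduino-style sketch is only an illustration of a 5-second intervalometer; the trigger pin, pulse length, and electrical interface are assumptions.

```cpp
// Sketch of a simple camera intervalometer, illustrating how an Arduino Pro
// Mini could trigger a shutter every 5 seconds. Pin choice and pulse length
// are assumptions; the paper does not describe the team's actual wiring.
const int SHUTTER_PIN = 9;                    // assumed pin driving the shutter circuit
const unsigned long INTERVAL_MS = 5000UL;     // 5 s between captures (from the paper)
const unsigned long PULSE_MS = 200UL;         // assumed pulse length to register a press

unsigned long lastCapture = 0;

void setup() {
  pinMode(SHUTTER_PIN, OUTPUT);
  digitalWrite(SHUTTER_PIN, LOW);
}

void loop() {
  unsigned long now = millis();
  if (now - lastCapture >= INTERVAL_MS) {
    lastCapture = now;
    digitalWrite(SHUTTER_PIN, HIGH);          // "press" the shutter
    delay(PULSE_MS);
    digitalWrite(SHUTTER_PIN, LOW);           // release
  }
}
```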


Figure 8: Left to right: Arduino Pro Mini, Sony Webbie HD.

Table 5: Sony Webbie HD relevant specifications.

- Still Picture Resolution: 5038K pixels
- Horizontal Angle of View: 78.0°
- Size: 15/16" W x 4 1/8" H x 2 1/4" B
- Weight: 0.25 lb
- Power Consumption: 2.0 W

3. Ground Station Subsystem

The Ground Control Station Subsystem comprises the equipment used to command, control, and monitor mission progress. The subsystem includes two laptops and an Antenna Tracking System (ATS).

Figure 9 shows how each component integrates into the Ground Control Station Subsystem. Given the need for high portability of the SOAR system, the team pursued a compact design for the Ground Control Station. Using only two laptops minimizes the manpower required to transport the equipment and facilitates easy setup for mission execution. The two laptops provide different functionality: the Image Acquisition Laptop receives the Motion JPEG feed during flight and performs post-flight image processing, while the Autopilot Laptop runs the Ardupilot Ground Station (AGS) software, monitoring the flight and receiving telemetry data. Telemetry data is transferred from the aircraft in flight to the Ground Control Station via a 900 MHz connection to the Autopilot Laptop. The user then analyzes the telemetry data for in-flight re-tasking and flight termination. In addition to the Autopilot Laptop, the telemetry data is sent to the Image Acquisition Laptop for in-flight and post-flight target location analysis. The Image Acquisition Laptop sends the telemetry data to the ATS, which keeps the antenna oriented toward the aircraft at all times to maintain a reliable 2.4 GHz WiFi connection.
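As one plausible implementation detail, forwarding each telemetry record from the Image Acquisition Laptop to the ATS could be done with a small UDP sender like the sketch below; the address, port, and message format are assumptions, since the paper does not specify how the laptops and the ATS exchange data.

```cpp
// Illustrative UDP forwarder: one plausible way for the Image Acquisition
// Laptop to pass each telemetry record on to the antenna tracker.
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <string>

bool forwardTelemetry(const std::string& line,             // e.g. "lat,lon,alt,..."
                      const char* atsIp = "192.168.1.50",  // assumed ATS address
                      uint16_t atsPort = 5000)             // assumed port
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) return false;

    sockaddr_in dest{};
    dest.sin_family = AF_INET;
    dest.sin_port = htons(atsPort);
    if (inet_pton(AF_INET, atsIp, &dest.sin_addr) != 1) { close(sock); return false; }

    ssize_t sent = sendto(sock, line.data(), line.size(), 0,
                          reinterpret_cast<sockaddr*>(&dest), sizeof(dest));
    close(sock);
    return sent == static_cast<ssize_t>(line.size());
}
```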

Figure 9: Ground Control Station.

3.A. Ardupilot Ground Station

To get a visual representation of how the aircraft is performing in flight, the autopilot onboard the aircraft sends data to the Ground Control Station, which displays it to the user via the AGS, as seen in Figure 10.

The AGS provides the team with the functionality to command, control, and monitor the mission during flight. Custom built to satisfy the team's requirements, the AGS can visually track the aircraft over accurate background satellite imagery of the location, supports in-flight re-tasking, and can indicate a loss of communication along with the resulting approximate landing site.

Figure 10: The Ardupilot Ground Station.


3.B. Data Processing Module

The SOAR Data Processing Module depends on the telemetry data from the Ardupilot, the Motion JPEG feed from the Axis 207MW IP camera, and the high-resolution imagery from the Sony Webbie HD point-and-shoot camera. The telemetry data is used for monitoring the flight and in-flight re-tasking on the Autopilot Laptop, and for determining the location of targets on the Image Acquisition Laptop. The Motion JPEG feed is used to visually recognize targets during flight and mark their approximate locations on a large poster-sized map of the search area; this is done manually by the team members in charge of command and data handling during the mission. The telemetry data is recorded in conjunction with the images taken by the Sony Webbie HD so that the data relevant to each picture taken by the point-and-shoot camera is available. Figure 11 shows the telemetry data flow from the aircraft to the Ground Control Station.

Figure 11: Telemetry Data Flow.

Although the image characteristics relating to color, shape, and alphanumeric are identified manually, the GPS coordinates of the targets are obtained using the SOAR Ground Plane Interpolation (SGPI) software. Since the Webbie HD is a fixed-position camera, steps had to be taken to accurately locate the target based on the orientation of the aircraft and the position of the target in the image. Each image is corrected before processing to eliminate the barrel distortion of the lens, as shown in Table 6.
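A common way to remove barrel distortion is a radial (Brown) lens model; the sketch below applies a first-order radial correction. The coefficients and intrinsics are hypothetical, and the paper does not state which correction method the team used.

```cpp
// Sketch of radial (barrel) distortion removal with a simple Brown model.
// k1, k2 and the intrinsics are hypothetical placeholder values.
struct Pixel { double u, v; };

// Map a distorted pixel toward its undistorted location. Applying the radial
// factor directly is a first-order approximation; a full correction would
// invert the distortion model iteratively.
Pixel undistort(const Pixel& p,
                double cx, double cy,      // principal point (pixels)
                double f,                  // focal length (pixels)
                double k1, double k2)      // radial distortion coefficients
{
    // Normalize to the camera plane.
    double x = (p.u - cx) / f;
    double y = (p.v - cy) / f;
    double r2 = x * x + y * y;
    // Radial correction factor.
    double scale = 1.0 + k1 * r2 + k2 * r2 * r2;
    return { cx + f * x * scale, cy + f * y * scale };
}
```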

The telemetry data from the Ardupilot autopilot is used to perform coordinate transformations between the picture coordinate system and the ground coordinate system. Using these transformations along with camera specifications such as focal length and pixel size, the pixel coordinates are converted into a ground location relative to the aircraft. Using the position of the plane, the absolute GPS position of the target is then determined and automatically exported to a spreadsheet containing the target images and their corresponding GPS coordinates.
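The following is a minimal sketch of the kind of pinhole-to-ground-plane projection described above. It assumes a nadir-mounted camera aligned with the aircraft body axes, a flat ground plane, and a local flat-earth conversion to latitude and longitude; all names, conventions, and the interface are illustrative and are not the SGPI implementation.

```cpp
#include <cmath>

struct Telemetry {                       // aircraft state at the moment of exposure
    double lat, lon;                     // degrees
    double altFt;                        // height above ground, feet
    double rollRad, pitchRad, yawRad;    // attitude, radians
};

struct LatLon { double lat, lon; };

// Rotate a body-frame ray into the local NED frame (ZYX yaw-pitch-roll).
static void bodyToNed(const Telemetry& t, const double b[3], double n[3])
{
    const double cr = std::cos(t.rollRad),  sr = std::sin(t.rollRad);
    const double cp = std::cos(t.pitchRad), sp = std::sin(t.pitchRad);
    const double cy = std::cos(t.yawRad),   sy = std::sin(t.yawRad);
    n[0] = cy*cp*b[0] + (cy*sp*sr - sy*cr)*b[1] + (cy*sp*cr + sy*sr)*b[2];
    n[1] = sy*cp*b[0] + (sy*sp*sr + cy*cr)*b[1] + (sy*sp*cr - cy*sr)*b[2];
    n[2] = -sp*b[0]   + cp*sr*b[1]             + cp*cr*b[2];
}

// Convert a pixel (u, v) in an image of size (w, h) to a target latitude and
// longitude. focalPx is the focal length expressed in pixels.
LatLon pixelToGround(const Telemetry& t, double u, double v,
                     int w, int h, double focalPx)
{
    const double kDegToRad = 3.14159265358979323846 / 180.0;

    // Body-frame ray for a downward-looking camera: pixel offsets map to
    // forward/right components, and the optical axis points "down".
    const double ray[3] = { (0.5 * h - v) / focalPx,   // forward (rows grow downward)
                            (u - 0.5 * w) / focalPx,   // right
                            1.0 };                     // down, along the optical axis

    double ned[3];
    bodyToNed(t, ray, ned);
    if (ned[2] <= 0.0) return { t.lat, t.lon };        // ray misses the ground; fall back

    // Scale the ray so its vertical component equals the height above ground.
    const double s = t.altFt / ned[2];
    const double northFt = s * ned[0];
    const double eastFt  = s * ned[1];

    // Flat-earth conversion from feet offsets to degrees of latitude/longitude.
    const double ftPerDegLat = 364000.0;               // approximate
    const double ftPerDegLon = ftPerDegLat * std::cos(t.lat * kDegToRad);
    return { t.lat + northFt / ftPerDegLat, t.lon + eastFt / ftPerDegLon };
}
```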


Table 6: Image correction (original image → corrected image).

4. Network Subsystem

The aircraft is connected to the Ground Control Station by three different data communication links. Each link is dedicated to a specific function, transmitting the necessary information between the Ground Control Station and the aircraft. Figure 12 illustrates the data links used in the system.

Figure 12: SOAR UAS data links.

4.A. Telemetry/Ground Control Data Link

The Telemetry/Ground Control Data Link runs on the 900 MHz band. This data link transmits telemetry data from the SOAR Airborne Subsystem to the Ground Control Station. The Ground Control Station uses the AGS software to send new waypoints and instructions to the SOAR Airborne Subsystem, with an effective line-of-sight range of 5 miles.
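For illustration only, a waypoint update sent over a serial telemetry link could be framed as a small binary packet with a checksum, as sketched below; the header bytes, field layout, and checksum are hypothetical and do not describe the actual Ardupilot/AGS protocol.

```cpp
// Hypothetical framing of a waypoint update for a serial telemetry link.
#include <cstdint>
#include <cstring>
#include <vector>

struct Waypoint {
    int32_t latE7;     // latitude  * 1e7 (fixed point, degrees)
    int32_t lonE7;     // longitude * 1e7
    int16_t altFt;     // target altitude, feet
    uint8_t seq;       // waypoint index
};

// Build a framed packet: [0xA5][0x5A][len][payload...][checksum].
std::vector<uint8_t> frameWaypoint(const Waypoint& wp)
{
    uint8_t payload[sizeof(Waypoint)];
    std::memcpy(payload, &wp, sizeof(Waypoint));

    std::vector<uint8_t> pkt = { 0xA5, 0x5A, sizeof(Waypoint) };
    uint8_t sum = pkt[2];
    for (uint8_t b : payload) { pkt.push_back(b); sum ^= b; }
    pkt.push_back(sum);                       // simple XOR checksum
    return pkt;
}
```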

4.B. Imagery Data Link

The imagery data link was specifically designed for the mission to allow smooth communication and video transmission between the Ground Control Station and the SOAR Airborne Subsystem over a high-bandwidth 2.4 GHz WiFi link. To communicate with the aircraft over the WiFi link, a modified Linksys WRT54G router was used. Although many different routers were tested for this mission, the WRT54G was chosen for several reasons.

Firstly, the WRT54G has two external antennas with standard N-type connectors, which are compatible with many different antennas available on the market. This feature allowed the team to remove one of the router's built-in antennas and mount an external 13 dBi directional WiFi patch antenna, vastly increasing the range provided by the router's stock antennas. Secondly, the WRT54G gave the team the option of loading 3rd-party firmware onto the router instead of using the standard Linksys firmware preloaded at the factory.

Loading 3rd-party firmware provided the team with full upload and download bandwidth monitoring, signal amplification, and Quality of Service (QoS), allowing the network and imagery data link to be customized and optimized for improved performance. While many 3rd-party firmware projects can be installed on WRT54G routers, the DD-WRT firmware was chosen as the main firmware. DD-WRT, the largest open-source firmware project currently under active development, contains all the features needed to set up a powerful and reliable network between the Ground Control Station and the aircraft.

The network is configured as shown in Figure 13. The antenna allows the router to transmit the wireless signal over a very long distance and the Signal Amplifier enables the camera to connect to the wireless network by amplifying its wireless signal.

Figure 13: Imagery Data Link.

The limited field of view of the patch antenna meant that it had to point at the SOAR Airborne Subsystem at all times to maintain a stable network connection. To ensure this, the ATS, shown in Figure 14, was implemented. The ATS consists of hardware and software developed specifically to follow the SOAR Airborne Subsystem using the telemetry data received from the autopilot.
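A tracker of this kind typically converts the aircraft's reported position into pan and tilt commands for the antenna mount. The sketch below shows that calculation under a flat-earth approximation; the interface and names are illustrative and are not the team's ATS software.

```cpp
// Sketch of an antenna-pointing calculation: given the ground station position
// and the aircraft telemetry, compute the pan (bearing) and tilt (elevation)
// angles for the antenna. Uses a local flat-earth approximation.
#include <cmath>

struct PanTilt { double panDeg, tiltDeg; };

PanTilt antennaAngles(double gsLat, double gsLon, double gsAltFt,
                      double acLat, double acLon, double acAltFt)
{
    const double kDegToRad = 3.14159265358979323846 / 180.0;
    const double ftPerDegLat = 364000.0;                     // approximate
    const double ftPerDegLon = ftPerDegLat * std::cos(gsLat * kDegToRad);

    // Offsets from the ground station to the aircraft, in feet.
    double northFt = (acLat - gsLat) * ftPerDegLat;
    double eastFt  = (acLon - gsLon) * ftPerDegLon;
    double upFt    = acAltFt - gsAltFt;

    // Pan: bearing from north, clockwise. Tilt: elevation above the horizon.
    double panDeg  = std::atan2(eastFt, northFt) / kDegToRad;
    if (panDeg < 0.0) panDeg += 360.0;
    double rangeFt = std::sqrt(northFt * northFt + eastFt * eastFt);
    double tiltDeg = std::atan2(upFt, rangeFt) / kDegToRad;
    return { panDeg, tiltDeg };
}
```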


Figure 14: 13dBi Patch Antenna Tracking System.

4.C. Safety Link

The safety link consists of a 2.4 GHz Futaba 6-channel RC controller. In case of an emergency, flight control can be taken from the autopilot and given to the RC pilot. The system is also designed to return to home and terminate flight, satisfying the mission requirements.
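The following sketch illustrates the kind of failsafe logic described above: an RC override switch and a link-loss timer select between autonomous flight, manual control, return-to-home, and flight termination. The channel threshold, timeouts, and names are assumptions rather than the team's implementation.

```cpp
// Sketch of a simple failsafe mode selector: an RC switch can override the
// autopilot, and a lost link triggers return-to-home and, after a further
// timeout, flight termination. Thresholds and timeouts are assumptions.
#include <cstdint>

enum class FlightMode { AUTO, MANUAL, RETURN_HOME, TERMINATE };

struct LinkStatus {
    uint16_t rcOverrideUs;      // pulse width of the RC override channel (microseconds)
    uint32_t msSinceLastPacket; // time since the last ground-station packet
};

FlightMode selectMode(const LinkStatus& s)
{
    const uint16_t kManualThresholdUs = 1700;   // assumed switch position for manual
    const uint32_t kReturnHomeMs      = 30000;  // assumed 30 s comms loss -> return home
    const uint32_t kTerminateMs       = 180000; // assumed 3 min comms loss -> terminate

    if (s.rcOverrideUs > kManualThresholdUs)    // safety pilot takes control
        return FlightMode::MANUAL;
    if (s.msSinceLastPacket > kTerminateMs)     // prolonged loss of link
        return FlightMode::TERMINATE;
    if (s.msSinceLastPacket > kReturnHomeMs)    // temporary loss of link
        return FlightMode::RETURN_HOME;
    return FlightMode::AUTO;                    // nominal autonomous flight
}
```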

5. Testing and Evaluation

5.A. Key Performance Parameters

The required and desired capabilities of the SOAR UAS were tested for functionality and safety while closely following the team's testing methodology. Provided below are the results obtained in testing the KPPs of the system.

5.A.I. Autonomy

Table 7 shows the autonomous tests carried out with the SOAR UAS. The results provide sufficient evidence that the parameter objective has been met.

Table 7: Autonomous testing.

- Autonomous Waypoint Navigation: 193 tests
- Autonomous Take-off: 73 tests
- Autonomous Landing: 44 tests

5.A.II. Target Location: Using the Manual Satellite Image Overlay Method.

Using Google Earth, Webster Field is displayed across the full width of the Image Acquisition Laptop screen, as shown in Figure 15. Using the Motion JPEG feed transmitted from the Axis 207MW, the targets are manually overlaid onto the satellite image. Depending on the resolution of the Google satellite imagery, the accuracy of the overlaid target locations may vary. During the competition, the team will manually record the target attributes on a high-resolution printed image of Webster Field. The latest test results indicate an accuracy of 28.8 ft after accounting for errors in the Google imagery. This method is used primarily for preliminary processing while the aircraft is in flight.
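Marking a spotted target in Google Earth can also be done programmatically by writing a KML placemark, as in the sketch below; this is only an illustration of the overlay idea, not the team's workflow, and the file layout shown is minimal valid KML.

```cpp
// Illustrative helper that writes a spotted target as a Google Earth KML
// placemark, a programmatic alternative to marking targets by hand.
#include <fstream>
#include <string>

bool writeTargetKml(const std::string& path, const std::string& name,
                    double latDeg, double lonDeg)
{
    std::ofstream out(path);
    if (!out) return false;
    out.precision(7);
    out << std::fixed
        << "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
        << "<kml xmlns=\"http://www.opengis.net/kml/2.2\">\n"
        << "  <Placemark>\n"
        << "    <name>" << name << "</name>\n"
        << "    <Point><coordinates>" << lonDeg << "," << latDeg
        << ",0</coordinates></Point>\n"
        << "  </Placemark>\n"
        << "</kml>\n";
    return true;
}
```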


Figure 15: Image Overlay Using Google Earth.

5.A.III. Target Location: Using SOAR Ground Plane Interpolation Software.

Table 8 shows the results of the preliminary tests carried out in the laboratory using the SGPI software. During development, a CAD drawing was created with two coordinate systems, which allowed the team to verify the mathematical calculations by comparing measured CAD values with the program output. Stationary pictures were then taken in the lab using orientation data from the Piccolo to ensure the code integrated correctly with the information received; physically measuring the distances verified that the calculated locations were correct. Finally, aerial pictures were taken from the aircraft to test the GPS accuracy: targets were placed at known locations, and the code was tested for the accuracy of the calculated positions. The SGPI software will be used in post-flight processing for more accurate target location analysis.

Table 8: SGPI software preliminary test results.

- Height 750 ft: target location accuracy +/- 82 ft
- Height 500 ft: target location accuracy +/- 52 ft
- Height 200 ft: target location accuracy +/- 16 ft

5.A.IV. Imagery

Table 9 shows an image acquired while in flight and its identifiable characteristics.

Table 9: Sample image characteristics (6 ft x 4 ft target).

- Shape: Rectangle
- Background Color: Blue
- Orientation: East-West
- Alphanumeric: I
- Alphanumeric Color: White


5.A.V. Mission Time

The time currently taken to complete a mission over a search area similar in size to that of Webster Field is shown in Table 10.

Table 10: Allocated mission time.

- Pre-Flight Phase: 5 min
- Flight Phase: 15 min
- Post-Flight Phase: 10 min

6. Conclusion

In response to the Statement of Work issued by the Seafarer Chapter of AUVSI, Embry-Riddle has developed the Self-Operating Aerial Reconnaissance (SOAR) Unmanned Aerial System (UAS). The SOAR system includes a modified ARF EasyStar air vehicle platform, an open-source Ardupilot Mega autopilot, custom-developed Ground Control Station and Data Processing Modules, and reliable data and imagery links operating at 900 MHz and 2.4 GHz, respectively.

An innovative two-camera system provides real-time imagery at a resolution of 1280x1024 and post-flight still images at a resolution of 2592x1944. With a total airborne subsystem cost of less than $1000, SOAR provides full autonomy in all phases of the mission, including takeoff and landing, at a fraction of the cost of most comparable products.

Most importantly, the SOAR system provides the soldier in the field with a simple solution to obtain accurate and easily interpreted intelligence, surveillance and reconnaissance.


7. Acknowledgements

Team SOAR would like to recognize the contributions of Embry-Riddle Aeronautical University for providing the funds and resources needed to make this project a success. The team would like to thank the project advisors, Dr. Charles Reinholtz and Dr. Cameron Wang, for their guidance and support. Team SOAR would also like to acknowledge Chris Sammet, Chris Kirby, and Rajan Katari for their invaluable contributions to this project.

8. References

"2010 RFP." 2010 AUVSI PEO(UW) Student UAS Competition. AUVSI. Web. <http://65.210.16.57/studentcomp2010/rules/2010%20RFP20090824.pdf>.

"AXIS 207MW Network Camera - Wireless Megapixel IP Camera." Axis Communications - Leader in Network Cameras and Other IP Networking Solutions. Axis Communications. Web. <http://www.axis.com/products/cam_207mw/>.

"Sony Webbie HD Camera MHS-PM1." Sony Consumer Electronics. Web. <http://www.sonystyle.com/webapp/wcs/stores/servlet/ProductDisplay?catalogId=10551&storeId=10151&langId=-1&productId=8198552921665736688>.

"TowerHobbies.com - Multiplex Easy Star Kit 54"." Tower Hobbies. Web. <http://www3.towerhobbies.com/cgi-bin/wti0001p?&I=LXFRU7>.