
A Quadrocopter Hovering Above a Person Wearing a Modified Cap

Karl E. Wenzel, Andreas Masselli, Andreas Zell

Chair of Computer Architecture, Department of Computer Science

University of Tübingen, 72076 Tübingen, Germany

Tel: (+49/0) 7071 / 29 78989, Fax: (+49/0) 7071 / 29 5091

{karl.e.wenzel, andreas.masselli, andreas.zell}@uni-tuebingen.de

Abstract

We describe an application of a quadrocopter hovering stably above a person wearing a cap with four infrared LEDs. The application was originally developed in the context of wearable computing. Our “wearable helicopter” hovers above the user. In future work, the system could send a bird’s eye view of the area to the operator while the robot follows him. The aerial robot works fully autonomously; all processing is done onboard. We use a miniature quadrocopter with an inertial measurement unit and a low-cost Wii remote camera, i.e., commodity consumer hardware. This proof of concept gives insight into flight control. The system is evaluated in a series of experiments.

1. Introduction

In this paper, we introduce the idea of using an autonomous quadrocopter flying stably above the user’s head. The user simply has to wear a modified cap.

The key idea of our approach is to track a pattern of infrared spots located on the cap. The camera looks downwards and is attached to the center of the quadrocopter frame. The pattern is designed to allow an efficient calculation of its position. In addition to the information given by the camera, we require data from the inertial measurement unit (IMU). By analyzing this information, we have four degrees of freedom (x, y, z, and yaw angle ψ) as inputs to the control loop to hover above the cap. The roll and pitch angles ϕ and θ are controlled internally by the control unit shipped with the quadrocopter, using only IMU data. ϕ and θ are adjusted in order to keep the x and y position around 0. Our control loop is designed as a simple algorithm running on an onboard microcontroller at a high frequency.

Currently, existing quadrocopters are remotely controlled by an operator. Some models offer more comfortable commands such as setting a specific GPS position or route. Nevertheless, the operator is busy handling the aerial vehicle.

Our system would allow for hands-free operation, while the quadrocopter holds a constant relative pose over the user.

Our wearable helicopter prototype has been tested in a number of experiments. In this paper, we describe its design and our experimental results of hovering above the cap. This is a proof of concept. Criteria such as user safety, user acceptance and aesthetics have to be examined more closely later.

In our experiments, the aircraft hovers above the cap at a specified height between 30 cm and 1 m. The system is capable of hovering with a root mean square deviation below 3 cm at a height of 40 cm.

2. Related Work

Most flying vehicles which perform onboard vision-based tracking and navigation have a significant size and weight [1, 9, 10]: they are thus capable of carrying personal computers with modern gigahertz processors and accurate industrial stereo cameras. Smaller aircraft have to cope with very limited payload capacity. To find a solution for flight control, some approaches transfer computation to a ground station; this allows for complex flight control mechanisms [2, 5, 6], but limits autonomy. Controlling unmanned aerial vehicles without an onboard camera, tracked instead by an external motion capture system, allows for accurate positioning at a high control frequency, as high-quality cameras and fast computers can be used [3, 12]. However, the robots can then only operate in a bounded laboratory environment. Mak et al. [7] describe a six degree-of-freedom visual tracking system for a miniature helicopter using only three onboard LEDs and a single on-ground camera. Roberts et al. [8] present a low-cost flight control system for a small outdoor helicopter where all processing is performed onboard. For a comparable solution on miniature flying robots such as the Hummingbird quadrocopter, the system has to be even smaller and lighter. Hay et al. [4] explain how objects can be tracked optically using two Wiimotes and achieved accurate results with a minimum of hardware costs. These solutions


deal with a minimum of hardware demands, but still depend on a permanent connection to the ground station to control a flying robot. Our approach is different from those described above, as we are using the IR camera from a Wiimote onboard. The approach only depends on four IR diodes as external landmarks, which can easily be integrated in a wearable cap, for instance.

Another project related to our approach is the Topshot helmet constructed by Julius von Bismarck [11]. The helmet displays the user in a bird’s eye view, as known from some computer games. His first idea was to use a quadrocopter, but he did not have a proper tracking solution. The final configuration employs a balloon connected to a helmet.

3. System Components

Our system consists of a miniature quadrocopter, equipped with an infrared (IR) camera, and a cap provided with four infrared LEDs. We use an AscTec Hummingbird AutoPilot quadrotor helicopter. The camera is part of the Wii remote controller (informally known as the Wiimote), distributed by Nintendo. It is able to track infrared blobs at a frequency of 200 Hz and more. The integrated circuit supplies the pixel position of each tracked blob. This section describes the hardware of the quadrocopter, the infrared camera and the cap.

3.1. The Quadrocopter

The AscTec Hummingbird AutoPilot quadrocopter (Fig. 1) has a diameter of 53 cm and a weight of only 0.5 kg. With these properties it is able to fly indoors and outdoors with high agility. A 2,100 mAh lithium-polymer battery powers the electric motors and the onboard electronics. A flight time of up to 23 minutes can be achieved, depending on additional payload and flight maneuvers.

The Hummingbird AutoPilot platform provides a three-axis gyroscope, an accelerometer, a compass module, a GPS sensor and a pressure sensor. The sensors, the microcontroller and the flight control run at 1 kHz, so that a reliable stabilization of the quadrocopter is ensured. Hence, the Hummingbird already provides basic outdoor autonomy without any additional hardware. Abilities like hovering above a given position are realized with the GPS sensor. GPS-based flight is not feasible for our application, because GPS is not accurate enough, error-prone near buildings and restricted to outdoor usage.

As we also did experiments with other aircraft, we decided to program a separate board, populated with an ATmega644P AVR microcontroller clocked at 14 MHz. The entire control is processed on this board. Additionally, the

Figure 1. The AscTec X3D-BL Hummingbird quadrocopter.

microcontroller sends sensor information to the optional base station, which is solely used for monitoring purposes.

The microcontroller framework is able to communicate with different control protocols, such that our approach can be applied to various quadrocopters. The Hummingbird has proven to provide roll and pitch estimates of satisfactory quality, leading to accurate position estimates.

We attached the camera in the center of the quadrocopter frame. Thus, we do not need to calculate additional translations for position computations.

3.2. The Wii Remote Infrared Camera

When choosing a sensor for a miniature flying vehicle, we have to attach great importance to its size and weight. The Wiimote infrared camera (Fig. 2) has a dimension of 8 × 8 × 5 mm at a weight of 0.4 g. It is part of a Bluetooth compatible controller, designed for interfacing with the Nintendo Wii game console. Besides the camera, the controller provides a three-axis accelerometer, 12 digital buttons and an expansion port to connect other input devices. The price is 40 €.

Figure 2. The sensor of the Wii remote.

Originally, the camera is used in conjunction with two infrared spots. In this way the system can determine the


position and orientation of the controller to manipulate a cursor on the screen.

The camera system – a multi-object tracking engine (MOT sensor) – is manufactured by PixArt Imaging¹. It is already capable of blob tracking of up to four IR sources. The actual resolution is internally increased eightfold to 1024 × 768 pixels by subpixel analysis. The camera sensitivity can be varied. Its horizontal field of view is approximately 45°. The sensor provides information about the dot size, the intensity and the bounding box of an IR blob. We operate with a basic sensor mode, which conveys merely the pixel position at which a blob is located. Data are transferred via the I2C bus with a refresh rate of up to 250 Hz.

Only a few electronic components are required to integrate the sensor in a microcontroller circuit. The camera runs at 3.3 V and requires an external 24 MHz clock for synchronization. As we know the I2C bus address and the communication protocol, data can easily be obtained by an I2C host via polling.

3.3. The Cap

The system requires a pattern of infrared lights which can be recognized by the camera. We constructed a cap with four infrared LEDs attached on top (Fig. 3(a)). This ensures precise positioning of the light sources and wearing comfort for the user. We chose a configuration with all LEDs lying in one horizontal plane.

The pattern is arranged as a T-shape (Fig. 3(b)). It measures 90 mm from the left to the right IR spot (s1), and 60 mm from the middle to the front IR spot (s4). With these dimensions it fits easily on a standard cap. Each spot is represented by a single 940 nm wavelength infrared LED. This configuration has proven to be of good size for tracking, when the pattern must be recognized at a relatively close distance.

4. Pose Estimation

The quadrocopter is supposed to hover above the pattern which is integrated in the cap. To be steered, it has to know the position of the pattern relative to the quadrocopter. This section describes how the current position vector p = (x, y, z, ψ)ᵀ is estimated. x, y and z are the Cartesian coordinates of the pattern center relative to the camera and ψ is the orientation in yaw. p can be estimated by using the Wiimote camera information. Roll and pitch are estimated by the IMU.

4.1. Pattern Analysis

In the first step the four elements of the pattern have to be identified. The size of the pattern has strong influence

1. http://www.pixart.com.tw/

Figure 3. A pattern of infrared LEDs, enabling the aircraft to recognize the user’s position. (a) The cap, modified for tracking. (b) The pattern configuration.

on tracking precision at different distances. Additionally, hovering close to the pattern is bounded by the field of view, while hovering at a large distance is limited by the intensity of the LEDs and the camera resolution.

By running the Wiimote camera in basic information mode, the positions of four points are transmitted to the microcontroller. To get unambiguous position information, the disordered points F (front), L (left), M (middle) and R (right) (Fig. 3(b)) first have to be identified. To sort the points, we choose an approach that focuses on fast processing on a microcontroller. Thus, we aim to avoid floating point operations as well as trigonometric functions.

We determine a line defined by two of the four points, chosen at random. Then the distances of all remaining points to the line are calculated. This procedure is done for all twelve possible combinations. As M, L and R are lying on a line, a combination of M, L, R leads to the minimum point-to-line distance. The point not used in this combination can be indirectly identified as F.

M can be identified by finding the maximum distance between M, L and R. Let A and B be the remaining two points to recognize. By writing A = (a_x, a_y)ᵀ, M = (m_x, m_y)ᵀ and F = (f_x, f_y)ᵀ, the variable s is given by

s = sgn[(a_x − m_x)(f_y − m_y) − (a_y − m_y)(f_x − m_x)]   (1)


A can be identified unambiguously as A = L if s > 0, otherwise A = R. Now, we can proceed to determine the pattern position vector p relative to the quadrocopter. The yaw angle ψ and the distance z can be calculated directly from information given by the Wiimote. By contrast, the x and y position estimation additionally depends on accurate roll and pitch angle estimates as provided by the IMU.
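The sorting procedure above can be sketched as follows. This is a minimal illustration in Python; the actual onboard implementation avoids floating point and trigonometric operations, and the left/right outcome of the sign test depends on the camera’s axis convention, so the function names and layout here are assumptions for clarity, not the authors’ firmware.

```python
import math
from itertools import combinations

def point_line_dist(p, a, b):
    # Perpendicular distance from point p to the line through a and b.
    num = abs((b[0]-a[0])*(a[1]-p[1]) - (a[0]-p[0])*(b[1]-a[1]))
    return num / math.hypot(b[0]-a[0], b[1]-a[1])

def identify_pattern(points):
    """Sort four (x, y) blob positions into (F, L, M, R)."""
    idx = range(4)
    # The minimum point-to-line distance over all twelve line/point
    # combinations marks the collinear trio {M, L, R}.
    _, trio = min(
        (point_line_dist(points[k], points[i], points[j]), (i, j, k))
        for i, j in combinations(idx, 2) for k in idx if k not in (i, j))
    trio = set(trio)
    f = (set(idx) - trio).pop()          # the leftover point is F
    # Within the trio, the pair with maximum mutual distance is {L, R};
    # the remaining point is M.
    a, b = max(combinations(sorted(trio), 2),
               key=lambda q: math.dist(points[q[0]], points[q[1]]))
    m = (trio - {a, b}).pop()
    A, M, F = points[a], points[m], points[f]
    # The sign test of Eq. (1) distinguishes left from right.
    s = (A[0]-M[0])*(F[1]-M[1]) - (A[1]-M[1])*(F[0]-M[0])
    l, r = (a, b) if s > 0 else (b, a)
    return points[f], points[l], points[m], points[r]
```

For the T-shaped pattern, the four blobs are recovered in a fixed order regardless of the order in which the camera reports them.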

4.2. Analysis of the Wiimote Camera Data

The following equations can be computed efficiently on a microcontroller and assume only a small displacement from the target position. The system has proven satisfactory accuracy, and thus this simplified method can be used. For a larger working area, where the quadrocopter can hover at positions other than zero, another method has been developed.

The yaw angle ψ is measured relative to the line defined by M and F and can easily be calculated by

ψ = atan2(m_x − f_x, m_y − f_y)   (2)

z-distance: The distance z from the camera to the pattern is computed by incorporating the view angle of the camera, which is given as ρ = 0.000761 rad per pixel. Let s_i be the metric distance between two LEDs and d_i the pixel distance of the corresponding points. The distance z_i from the camera to the pattern can be estimated by:

z_i = s_i / tan(d_i ρ)   (3)

We use this basic equation to estimate the distance, because the roll and pitch angles of the aircraft are usually only between ±3° during controlled flight, assuming a stationary pattern. Thus, we can assume a nearly orthogonal view angle to the pattern. We used the following four distances in our experiments: d_1 = |L − R|, d_2 = |L − F|, d_3 = |R − F|, d_4 = |M − F|. Using only d_1 proved to give satisfactory results in distance measuring with a stationary camera. However, there might be disturbances during flight causing noise in the subpixel analysis. This noise can be decreased by using the mean value:

z = (z_1 + z_2 + z_3 + z_4) / 4   (4)

After the identification of the points, the point order is kept. Using this additional information for sequential analysis, most information can be extracted with only two recognized dots. This is of great benefit when occlusion or close distances prevent capturing all points.
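Equations (2)–(4) translate directly into code. The sketch below is illustrative Python; the metric spacings in s are derived from the pattern dimensions in Section 3.3 (the two diagonal spacings of 75 mm follow from the 90 mm × 60 mm T-shape) and are an assumption insofar as the exact cap geometry may differ.

```python
import math

RHO = 0.000761  # camera view angle per pixel in rad

def yaw_angle(M, F):
    # Eq. (2): yaw relative to the line defined by M and F.
    return math.atan2(M[0] - F[0], M[1] - F[1])

def distance_to_pattern(F, L, M, R, s=(90.0, 75.0, 75.0, 60.0)):
    # Eq. (3) for d1=|L-R|, d2=|L-F|, d3=|R-F|, d4=|M-F| in pixels,
    # averaged as in Eq. (4) to suppress subpixel noise.
    d = (math.dist(L, R), math.dist(L, F), math.dist(R, F), math.dist(M, F))
    return sum(si / math.tan(di * RHO) for si, di in zip(s, d)) / 4.0
```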

4.3. Combining Wiimote Data With Information Given by the IMU

The x and y position of the pattern relative to the aircraft can only be estimated when the camera orientation is known. The camera orientation, which is identical to the quadrocopter orientation, is estimated by the onboard IMU (roll angle ϕ_quad and pitch angle θ_quad). Additionally, the position of the pattern in the camera image has to be calculated. The position of the pattern center is defined by M. Let C = (c_x, c_y)ᵀ = (384, 512)ᵀ be the center of the image, defined by the camera resolution of 1024 × 768 pixels. Now, we can calculate the distance between M and C in x- and y-direction in pixels. As we know which view angle the camera provides per pixel, we can calculate the orientation of the pattern relative to the camera center (ϕ_cam and θ_cam).

θ_cam = (m_x − c_x) ρ   (5)
ϕ_cam = (m_y − c_y) ρ   (6)

Potential inaccuracies in the camera arrangement are compensated with the calibration angles ϕ_c and θ_c. The calibration angles are constants, determined after attaching the camera by measuring the deviation with an adjusted robot. By combining the angles, the total displacement of the pattern can be estimated:

ϕ = ϕ_quad + ϕ_cam + ϕ_c   (7)
θ = θ_quad + θ_cam + θ_c   (8)
x = tan(ϕ) z   (9)
y = tan(θ) z   (10)
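A direct transcription of Eqs. (5)–(10) looks as follows; setting the calibration angles to zero is an assumption for illustration, as the real constants are determined per camera mount.

```python
import math

RHO = 0.000761                 # rad per pixel
CX, CY = 384.0, 512.0          # image center C as given above
PHI_C = THETA_C = 0.0          # calibration constants, assumed zero here

def xy_position(M, phi_quad, theta_quad, z):
    # M: pixel position of the pattern center; phi_quad/theta_quad:
    # IMU roll/pitch estimates in rad; z: height from Eq. (4).
    theta_cam = (M[0] - CX) * RHO              # Eq. (5)
    phi_cam = (M[1] - CY) * RHO                # Eq. (6)
    phi = phi_quad + phi_cam + PHI_C           # Eq. (7)
    theta = theta_quad + theta_cam + THETA_C   # Eq. (8)
    return math.tan(phi) * z, math.tan(theta) * z  # Eqs. (9), (10)
```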

5. Flight Control

The idea of autonomously hovering above a person requires accurate flight controllers for the quadrocopter. With our system, even autonomous take off and landing is possible, which is described in [13]. However, in this paper we focus only on hovering, assuming the quadrocopter has already reached its stationary position above the cap with the IR markers.

By estimating the current position vector p = (x, y, z, ψ)ᵀ of the pattern, a flight controller is able to navigate the aircraft to a desired position relative to p. As the goal is to hover stationary above the pattern, x, y and ψ should stay zero, and z was set to 40 cm. Our control algorithms are similar to the independent, fast onboard controllers mentioned in [3], where four individual controllers are operating.

The control loop is currently performed every 50 ms. Most of the time is needed to request the current pose estimation from the Hummingbird quadrocopter (30 ms). An additional 10 ms is required to send sensor information to the base station, where the current status is monitored and visualized. The remaining 10 ms are for retrieving camera sensor information and running the control algorithm. A considerably higher control frequency would be possible with an accelerated IMU request.
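The timing budget above can be pictured as a fixed-rate loop. The callback names in this sketch are placeholders, not the authors’ firmware API:

```python
import time

CYCLE = 0.05  # 50 ms control period (20 Hz)

def control_cycle(read_imu, read_camera, run_controllers, send_status):
    # One iteration: IMU request (~30 ms), camera polling and control
    # math (~10 ms), telemetry to the base station (~10 ms).
    start = time.monotonic()
    pose = read_imu()
    blobs = read_camera()
    commands = run_controllers(pose, blobs)
    send_status(pose, commands)
    # Sleep away whatever remains of the 50 ms budget.
    time.sleep(max(0.0, CYCLE - (time.monotonic() - start)))
    return commands
```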


Figure 4. The robot hovering autonomously in an angular attitude above the cap. (a) The robot hovering. (b) Schematic lateral view of the configuration.

5.1. Height Controller

The thrust control value, which has to be sent to the quadrocopter to hover at a desired height, varies while flying, as it depends on the actual payload and battery charge. This value can be determined by beginning with a default value and increasing it whenever the robot is below the desired height for some cycles, and decreasing it otherwise. We implemented a fast PID loop which controls the thrust value around the hover value. K_D is the major component. Fig. 5 shows the structure of the thrust controller.

Figure 5. The thrust controller.
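A minimal sketch of such a thrust loop in Python; the gains, the accumulator step size and the class layout are illustrative assumptions, not the values used onboard:

```python
class ThrustController:
    """PID around an adaptive hover value, as described above."""

    def __init__(self, kp=0.5, ki=0.05, kd=2.0, dt=0.05):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0
        self.hover = 0.0  # accumulator: baseline thrust, adapted slowly

    def update(self, target_z, z):
        error = target_z - z
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt  # dominant K_D term
        self.prev_error = error
        # Drift toward the thrust that actually holds the height,
        # compensating payload and battery charge.
        self.hover += 0.001 * error
        return (self.hover + self.kp * error
                + self.ki * self.integral + self.kd * derivative)
```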

5.2. Roll, Pitch and Yaw Controller

Our aircraft has a symmetric rotor configuration, thus the x and y controllers are identical. Their control parameters are more difficult to derive than the height controller parameters, since the control response is not proportional to the horizontal speed but to the rotational velocities. To achieve a stable position hold, a predictive control, much like a human pilot would apply, is required. By designing a cascaded control loop, in which speed and acceleration are highly weighted, this prognostic ability has been realized. Fig. 6 shows the structure of the x and y controller.

Figure 6. The x and y controller.
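The cascaded structure with the extra second-derivative branch (K_D′) can be sketched as below. The gains and the finite-difference derivatives are illustrative assumptions, not the tuned onboard values:

```python
class XYController:
    """PID with an additional second-derivative term (K_D'), giving the
    predictive behavior described above."""

    def __init__(self, kp=0.3, ki=0.02, kd=1.5, kd2=0.8, dt=0.05):
        self.kp, self.ki, self.kd, self.kd2, self.dt = kp, ki, kd, kd2, dt
        self.integral = 0.0
        self.e1 = self.e2 = 0.0  # last two errors for the derivative terms

    def update(self, error):
        d1 = (error - self.e1) / self.dt                   # speed term
        d2 = (error - 2*self.e1 + self.e2) / self.dt**2    # acceleration term
        self.integral += error * self.dt
        self.e2, self.e1 = self.e1, error
        return (self.kp * error + self.ki * self.integral
                + self.kd * d1 + self.kd2 * d2)
```

Weighting d1 and d2 heavily makes the controller react to motion trends before the position error itself grows, which is the prognostic ability mentioned above.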

The integrated compass of the Hummingbird quadrocopter is used to control the coarse orientation. Our yaw controller is able to achieve accurate orientation to the pattern, even when the pattern is rotated. It is implemented as a PID loop with a large K_P and a small K_I.

6. Experimental Results

Our aircraft provides basic autonomy for outdoor flights. However, GPS and air-pressure position hold are not safe and accurate enough for hovering precisely above a person. Therefore, the GPS-position controller and the air-pressure


Table 1. Controller characteristics of five minutes of flight 40 cm above the cap

                     ∆x/mm   ∆y/mm   ∆z/mm   ∆ψ/°
Minimum               -43     -50    -122    -8.4
Maximum                40      40     136     3.7
Peak to peak           83      90     258    12.1
Mean                    0.36   -0.07   -0.23  0.0
Standard deviation     13.2    13.47   27.18  0.2

sensor for height control have been disabled for our experiments. The yaw angle is controlled by the internal magnetic compass of the quadrocopter.

First experiments were done with a pattern constructed on a standard circuit board. Following a large number of flights and the retrieval of working parameters, the system has shown accurate position hold capability. Details about hovering at a defined height above the pattern can be found in [14]. The system was capable of autonomous take off and landing as well [13]. For our wearable helicopter, the pattern is embedded on a cap and thus had to be slightly redesigned.

The quadrocopter produces a considerable amount of wind, leading to noticeable cooling. One problem is the loud noise coming with the wind, another the wearer’s lack of knowledge about the current behavior of the aircraft, since he does not see it.

The position-hold results achieved in our experiments are comparable to those with the slightly larger pattern used in [13] and [14]. Table 1 shows the controller characteristics of the flight. A standard deviation below 3 cm in all three axes ensures proper hovering at 40 cm above the cap.

As mentioned in Section 4.3, the estimations of roll and pitch are essential for accurate x and y approximations. While the standard deviations of ∆x and ∆y are rather comparable, one can notice a larger standard deviation in ∆z position control. The yaw angle is uncritical for our intention, but it is affected by all balancing maneuvers in flight and thus depends on the position changes in x-, y- and z-direction.

Figure 7 shows a detailed record of one minute of a flight 40 cm above the cap. The plot shows the smooth position estimation sent to the base station while flying. Outliers, caused by external influences such as occasional light reflections, can easily be filtered.

As sunlight contains a notable fraction of infrared light, reflections of sunlight can lead to wrong pattern interpretation. This could be reduced by stronger infrared sources and should be investigated in future work.

7. Conclusion and Future Work

Our wearable helicopter has proven that it is possible to hover autonomously above a person wearing a special cap. The deviation from the desired position, which we achieve

Figure 7. Position errors of five minutes of hover flight 40 cm above the cap.

by means of our control algorithm, is small enough for quasi-stationary flight in an indoor environment.

The Wii remote camera is capable of accurate infrared (IR) blob tracking and provides the pixel position information of each source. A microcontroller is capable of estimating the position to the target and controlling a miniature flying robot in hovering flight without additional ground sensors and without a base station. However, in many countries regulations on the use of model airplanes require that a remote control must be present to take over control if needed. As the wearer does not see the aircraft while operating, a second person must be present, assuming this role. This rules out truly autonomous operation.

The distance to the target is limited by the dimension of the pattern and the IR light emission. By using multiple IR LEDs per point or stronger LEDs, the operating distance can be increased. However, an accurate roll and pitch estimation provided by the aircraft’s IMU is essential for stable position hold.

The current frequency of 20 Hz is sufficient for robust hovering control. Nonetheless, when the user moves quickly, the camera loses sight of the pattern. An increased control frequency or a panning camera would improve the control accuracy and enable the wearable helicopter to track a moving person over time.

Indisputably, our approach can only be understood as a proof of concept, but reasonable tasks are also imaginable: A video camera attached to the helicopter could monitor a person or ground vehicle better than a human pilot could. This enables surveillance tasks of a novel category and enables the wearer to see himself from a new perspective.


The system could act as a digital periscope for observation from a concealed position. Inspection of places which are normally not reachable becomes possible. A similar system could be used at sporting events, providing unusual camera shots. One could even envision a wearable helicopter flying above runners in running events, or above the quarterback in football games to give a bird’s eye view from their position, if the current short flying times of 15 minutes were longer and the wind from the rotors was judged acceptable or beneficial rather than giving unfair advantages or disadvantages. Police officers could also use such an aircraft for uncomplicated documentation of their assignments.

A stationary robot at a high altitude, not necessarily directly above a person, could be used to extend mobile phone reception. Thus, a life-saving phone call might become possible in areas where connectivity is restricted.

References

[1] Eric Frew, Tim McGee, ZuWhan Kim, Xiao Xiao, S. Jackson, M. Morimoto, S. Rathinam, J. Padial, and R. Sengupta. Vision-based road-following using a small autonomous aircraft. In IEEE Aerospace Conference, volume 5, pages 3006–3015, 2004.

[2] Nicolas Guenard and Tarek Hamel. A practical visual servo control for an unmanned aerial vehicle. In IEEE International Conference on Robotics and Automation (ICRA), pages 1342–1348, 2007.

[3] Daniel Gurdan, Jan Stumpf, Michael Achtelik, Klaus-Michael Doth, Gerd Hirzinger, and Daniela Rus. Energy-efficient autonomous four-rotor flying robot controlled at 1 kHz. In IEEE International Conference on Robotics and Automation (ICRA), pages 361–366, Roma, Italy, 2007.

[4] Simon Hay, Joseph Newman, and Robert Harle. Optical tracking using commodity hardware. In 7th IEEE/ACM International Symposium on Mixed and Augmented Reality (ISMAR), pages 159–160, 2008.

[5] Bruno Herisse, Francois-Xavier Russotto, Tarek Hamel, and Robert E. Mahony. Hovering flight and vertical landing control of a VTOL unmanned aerial vehicle using optical flow. In IEEE International Conference on Intelligent Robots and Systems (IROS), pages 801–806, 2008.

[6] Christopher Kemp. Visual Control of a Miniature Quad-Rotor Helicopter. PhD thesis, Churchill College, University of Cambridge, 2006.

[7] Lin Chi Mak and Tomonari Furukawa. A 6 DoF visual tracking system for a miniature helicopter. In 2nd International Conference on Sensing Technology (ICST), pages 32–37, November 2007.

[8] Jonathan Roberts, Peter Corke, and G. Buskey. Low-cost flight control system for a small autonomous helicopter. In IEEE International Conference on Robotics and Automation (ICRA), pages 546–551, Taipei, Taiwan, September 2003.

[9] Omid Shakernia, Yi Ma, T. John Koo, and Shankar Sastry. Landing an unmanned air vehicle: Vision based motion estimation and nonlinear control. Asian Journal of Control, 1:128–145, 1999.

[10] Cory S. Sharp, Omid Shakernia, and S. Shankar Sastry. A vision system for landing an unmanned aerial vehicle. In IEEE International Conference on Robotics and Automation (ICRA), pages 1720–1727, Seoul, Korea, 2001.

[11] Julius von Bismarck. Topshot helmet. http://www.juliusvonbismarck.com/topshot-helmet/index.html.

[12] Kei Watanabe, Yasushi Iwatani, Kenichiro Nonaka, and Koichi Hashimoto. A visual-servo-based assistant system for unmanned helicopter control. In IEEE International Conference on Intelligent Robots and Systems (IROS), pages 822–827, 2008.

[13] Karl E. Wenzel and Andreas Zell. Automatic take off, hovering and landing control for miniature helicopters with low-cost onboard hardware. In Proceedings of the AMS'09, Autonome Mobile Systeme 2009, pages 73–80, Karlsruhe, Germany, December 3–4, 2009.

[14] Karl E. Wenzel and Andreas Zell. Low-cost visual tracking of a landing place and hovering flight control with a microcontroller. Journal of Intelligent and Robotic Systems, 57:297–311, 2009.