DEGREE PROJECT IN COMPUTER SCIENCE AND ENGINEERING, SECOND CYCLE, 30 CREDITS
STOCKHOLM, SWEDEN 2019

Autonomous Recharging System for Drones: Detection and Landing on the Charging Platform

MARÍA ÁLVAREZ CUSTODIO

KTH ROYAL INSTITUTE OF TECHNOLOGY
SCHOOL OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCE



KTH Royal Institute of Technology

Degree program in Electrical Engineering

MASTER THESIS

Autonomous Recharging System for Drones: Detection and Landing on the Charging Platform

María Álvarez Custodio

Supervisor: Patric Jensfelt

Examiner: Joakim Gustafson

2019


Abstract

In recent years, the use of indoor drones has increased significantly in many different areas. However, one of the main limitations of the potential of these drones is the battery life. This is because the battery size has to be limited, since the drones have a maximum payload in order to be able to take off and maintain flight. Therefore, a recharging process needs to be performed frequently, involving human intervention and thus limiting the drones' applications.

In order to solve this problem, this master thesis presents an autonomous recharging system for a nano drone, the Crazyflie 2.0 by Bitcraze AB. By automating the battery recharging process, no human intervention is needed, and thus the overall mission time of the drone can be considerably increased, broadening the possible applications.

The main goal of this thesis is the design and implementation of a control system for the indoor nano drone, in order to steer it towards a landing platform and accurately land on it. The design and implementation of an actual recharging system is carried out as well, so that in the end a complete, fully autonomous system exists.

Before this controller and system are designed and presented, a research study is first carried out to obtain background and to analyze existing solutions to the autonomous landing problem.

A camera is integrated with the Crazyflie 2.0 to detect the landing station and control the drone with respect to the station's position. A visual system is designed and implemented for detecting the landing station. For this purpose, a marker from the ArUco library is used to identify the station and to estimate the distance to the marker and the camera orientation with respect to it.

Finally, tests are carried out to evaluate the system. The flight time obtained is 4.6 minutes and the landing performance (the rate of correct landings) is 80%.

Keywords

Nano drone, Crazyflie 2.0, controller, autonomous recharging system, autonomous landing, ArUco.


Sammanfattning

In recent years, the use of indoor drones has increased considerably in many different areas. One of the biggest limitations of these drones is the battery life. This is because the battery size must be limited, since the drones have a very limited maximum payload in order to be able to fly. They therefore need to be charged often, which involves human intervention and thereby limits the drone applications.

To solve this problem, this degree project presents an autonomous charging system for a nano drone, the Crazyflie 2.0. By automating the battery charging process, no human intervention is needed, and the mission time of the drone can thereby be increased considerably, broadening the possible applications.

The main goal of this thesis is the design and implementation of a control system for an indoor drone, in order to steer it towards a landing platform and land correctly on it. The work also includes the actual charging system, so that the end result is a completely autonomous system.

Before the controller and the system are designed and presented, a review of background material and an analysis of existing solutions to the autonomous landing problem is given.

A camera is mounted on the Crazyflie 2.0 to detect and localize the landing station and to steer the drone with respect to it. For detection, the markers of the ArUco library are used, which also make it possible to compute the camera's position and orientation with respect to the marker and thereby the charging station.

Finally, tests are carried out to evaluate the system. The flight time obtained is 4.6 minutes and the landing performance (the share of correct landings on the first attempt) is 80%.

Keywords

Nano drone, Crazyflie 2.0, controller, autonomous charging system, autonomous landing, ArUco.


Contents

1 Introduction and goals
1.1 Introduction
1.2 Goals
1.3 Assumptions
1.4 Outline

2 Research
2.1 Basic concepts
2.2 Research study results

3 Platform description
3.1 Crazyflie 2.0
3.1.1 Hardware
3.1.2 Flow deck
3.2 ROS driver for Crazyflie
3.3 Camera, transmitter and receiver

4 System design
4.1 Proposed solution
4.2 Position controller design
4.3 Marker detector design
4.4 Charging platform design
4.5 System architecture design

5 Visual system
5.1 Marker creation with ArUco module
5.2 Marker detection with ArUco module
5.3 Camera calibration with ArUco module
5.4 Camera pose estimation using ArUco module
5.5 Implementation

6 Drone controller
6.1 Pose message transformation
6.2 Estimate drone distance to landing point
6.3 PID controller
6.4 Landing
6.5 Implementation

7 Landing and recharging platform
7.1 Landing station
7.2 Battery and battery charger
7.3 Recharging system
7.3.1 Reverse voltage protection
7.3.2 System design and implementation

8 Evaluation and testing
8.1 Battery life characterization
8.2 Detection performance
8.3 Landing performance

9 Conclusions and future work lines
9.1 Conclusions
9.2 Future work lines

A Annex I: Ethical, economic, social and environmental aspects
A.1 Introduction
A.2 Description of relevant impacts related to the project
A.3 Detailed analysis of some of the main impacts
A.4 Conclusions

B Annex II: Economic budget


List of Figures

1 Direction of rotation for each of the quadcopter's rotors
2 Quadcopter axes
3 Asctec Pelican quadcopter
4 3D model of the Asctec Pelican base with landing foot installed inside the landing platform
5 Crazyflie 2.0
6 Crazyflie 2.0 expansion port pinout
7 Crazyflie 2.0 system architecture
8 Flow deck expansion board for Crazyflie 2.0
9 Camera and transmitter module VM275T
10 Crazyflie 2.0 with the camera on board
11 Feature tracking over time
12 Overall system architecture
13 ArUco marker to identify the landing platform
14 Pinhole camera model
15 ChArUco board used to calibrate the camera
16 PnP problem formulation
17 Flow diagram of the visual system
18 Camera and Crazyflie 2.0 coordinate systems
19 Problem formulation for estimating the distance to the landing point
20 PID controller
21 Flow diagram of the controller node
22 Landing platform
23 Alternatives for the protection circuit
24 Landing pad design
25 Overall landing and recharging platform
26 Battery life characterization with and without camera
27 Marker deformation problem
28 Economic budget of the project


List of Tables

1 Effects of independently increasing the value of the PID parameters
2 Parameters of the four PID controllers
3 Marker detection performance
4 Dead angle estimation
5 Landing performance


1 Introduction and goals

1.1 Introduction

In recent years, the use of Unmanned Aerial Vehicles (UAVs), or drones, has increased considerably in many different applications and areas, such as industry and the military. Surveillance, transportation, delivery and even search and rescue tasks are just some examples of the applications that drones have nowadays.

However, the potential of indoor drones is limited by the flight time due to limited battery capacity, which in many cases allows a flight of only a few minutes. This is because the batteries cannot be very heavy, as the drones have a limited payload in order to be able to take off and fly. Therefore, a recharging process needs to be performed frequently. Usually, these recharging processes involve direct human intervention, thus interrupting the autonomous operation of the drones and limiting their use and applications. By automating the battery recharging process of UAVs, no human intervention is needed, and thus the overall mission time of the drone can be significantly increased.

For this reason, part of the research on drones is focused on autonomous recharging systems. In fact, there are still many open challenges regarding the detection of and landing on the platform, as will be explained further in chapter 2.

The main difference from many of the already existing solutions for autonomous flight, landing and recharging is that this thesis provides a complete system, including the ability to autonomously fly towards the charging station, accurately land on it and also recharge the battery. This is done in a real system, without human intervention to give any commands, such as starting the landing maneuver.

1.2 Goals

In order to solve the autonomous charging problem, the main goal of this thesis is the design and implementation of a control system for an indoor nano drone, in order to steer it towards a landing platform whose position is roughly known, and to accurately land on it. Another goal is the design and implementation of an actual recharging system,


so that in the end a complete, fully autonomous system exists.

1.3 Assumptions

The type of drone used in this project is a nano quadcopter. A quadcopter, or quadrotor, is a type of UAV that has become very popular [1], and it is widely used in research laboratories and aerial vehicle events.

Therefore, the proposed solution involves a nano quadrotor, specifically the Crazyflie 2.0 by Bitcraze¹. The Crazyflie 2.0 was chosen because it is a low-cost nano quadcopter, and it is small and lightweight, which means that it is safe to perform the flights and test the solution. Moreover, it is an open-source project, with both the source code and the hardware design documented and available.

1.4 Outline

The document is divided into several chapters covering the different tasks carried out. First, chapter 2 covers the results obtained from the research study. In chapter 3 the platform used in the thesis is detailed, including the drone and all the hardware and software. Chapter 4 explains the proposed solution and the system design. Chapters 5, 6 and 7 cover the visual system, the position controller and the recharging system, respectively. The evaluation results are collected in chapter 8. Finally, in chapter 9 the conclusions and future research and work lines are presented.

1 https://www.bitcraze.io


2 Research

In this chapter the research work carried out in the thesis is presented. First, a little

background is given in section 2.1, covering basics concepts of quadcopters in order to

understand the terminology and coming explanations. In section 2.2 the results of the

research study are explained.

2.1 Basic concepts

A quadcopter is a flying mechanical vehicle with four arms, each of which has a rotor attached to a propeller. Two of these rotors turn clockwise (CW) while the other two turn counter-clockwise (CCW). Figure 1 shows the rotation direction of each rotor and propeller.

Figure 1: Direction of rotation for each of the quadcopter's rotors. (Source: Bitcraze)

Each rotating propeller produces two different forces. One is an upward thrust. The other is an opposing rotational torque: a rotor turning CW produces a torque which causes the body of the drone to spin in the CCW direction, and vice versa. Therefore, while hovering, the moments from the two CW rotors and the two CCW rotors compensate each other, thus preventing the quadcopter from spinning around its body axis. By varying the speeds of the four rotors, different movements are possible.


It is worth noting that when flying a quadcopter there are four main dimensions of control: roll, pitch, yaw and thrust. Roll is the rotation around the horizontal axis going through the quadcopter from back to front. By controlling this parameter we can control the movement to the left or right. Pitch is the rotation around a horizontal axis going through the quadcopter from left to right. This tilts the drone and moves it forwards or backwards. Yaw is the rotation around a vertical axis, which rotates the quadcopter left or right, thus changing the direction the drone is facing. Finally, the thrust adjusts the altitude of the quadcopter. Figure 2 shows the different axes of the quadcopter that define the roll, pitch and yaw angles and the thrust. The yaw, pitch and roll angles define the drone's attitude, which is its angular position.
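Varying the four rotor speeds mixes the roll, pitch, yaw and thrust commands into individual motor outputs. A minimal sketch of such a mixer for an X-configuration quadcopter (the sign convention here is purely illustrative, not the Crazyflie firmware's actual mixer):

```python
def mix(thrust, roll, pitch, yaw):
    """Mix collective thrust and roll/pitch/yaw commands into the four
    rotor outputs of an X-configuration quadcopter.

    Rotors on the same diagonal spin in the same direction, so the yaw
    term carries the same sign for each diagonal pair. The signs are an
    illustrative convention, not the Crazyflie firmware's actual mixer.
    """
    m1 = thrust + roll + pitch - yaw   # front-left  (CW)
    m2 = thrust - roll + pitch + yaw   # front-right (CCW)
    m3 = thrust - roll - pitch - yaw   # rear-right  (CW)
    m4 = thrust + roll - pitch + yaw   # rear-left   (CCW)
    return m1, m2, m3, m4

# Hovering: equal thrust on all rotors, the CW and CCW torques cancel.
print(mix(40000, 0, 0, 0))   # (40000, 40000, 40000, 40000)
```

Note that a pure roll, pitch or yaw command leaves the total thrust unchanged, since its contributions cancel across the four rotors.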

Figure 2: Quadcopter axes. (Source: Bitcraze)

2.2 Research study results

As already explained in chapter 1, research on the autonomy of UAVs is of great interest in order to improve mission time and expand their applications. This is the reason why there are many projects and contributions in this area, suggesting different ways to address the problem and proposing different solutions.

In this section some of these projects and solutions are explained. The research effort has been mainly focused on autonomous landing. It is worth noting that some of the solutions found have inspired this thesis, as will be detailed in chapter 4.

In [2] a controller for a quadcopter (UAVision Quadcopter UX-4001 mini) is developed,


with the aim of stabilization and autonomous landing. The robot is fully autonomous, using only internal sensors and on-board processing. The landing method is based on recognizing a landing tag with computer vision, then estimating the quadcopter's relative pose with respect to this target and controlling it to approach the landing station. The camera is installed on the quadcopter's bottom side and the tag (formed by several markers obtained from the ArUco² library [3]) is placed horizontally on the landing platform. In order to obtain the camera pose, they formulate the problem as the Perspective-n-Point (PnP) problem. This consists of obtaining the camera pose from the camera matrix and a set of correspondences between image points (2D points) and real-world points (3D points). For the control they use a set of decoupled PID controllers (x, y, z and yaw). However, they rely on simulations to evaluate the system: they use a physics simulator called Gazebo, which operates on the Robot Operating System (ROS). The implementation is not done on a real quadcopter.

In [4] a controller is implemented for the AR.Drone for fully autonomous landing on a given visual pattern and hovering above the pattern. However, the drone is not charged autonomously. The landing point is located on a horizontal plane (with respect to the floor) and is identified by a blob detection algorithm (a method that aims at detecting regions in an image that differ in properties from the surrounding regions; in this case these regions are two different circles, red and blue). The camera is on the quadrotor's bottom, as in [2]. To measure the distance to the landing point, they analyze the position of this point in the captured image and apply trigonometry (making some approximations), also using the altitude measured by the quadcopter's ultrasound sensor. As in [2], they use four decoupled PID controllers. The experiments in [5] show that the success rate is around 80%, considering a landing successful if it is at most 30 cm from the point. This is quite a high error considering the drone dimensions (52.5 x 51.5 cm). Thus, with this error it would be hard to use this landing system for recharging the drone on the landing platform.
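The trigonometric estimate in [4] can be sketched with a pinhole model and a downward-facing camera: the pixel offset of the landing point from the principal point, scaled by altitude over focal length, gives the horizontal offset on the ground. The intrinsics below are placeholder values, not those of any real camera:

```python
def ground_offset(u, v, altitude, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Horizontal offset (metres) from a downward-facing camera to a
    landing point seen at pixel (u, v), given the measured altitude.

    Pinhole model: (u - cx) = fx * X / altitude, hence
    X = altitude * (u - cx) / fx, and likewise for Y. The intrinsics
    are placeholder values; a real system would use calibrated ones.
    """
    x = altitude * (u - cx) / fx
    y = altitude * (v - cy) / fy
    return x, y

# A point 600 px right of the image centre, seen from 1 m up:
print(ground_offset(920, 240, 1.0))   # (1.0, 0.0)
```

This is the same approximation as in [4]: it assumes the camera looks straight down, so any tilt of the drone introduces an error that shrinks as the drone levels out.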

In [6] a complete on-board recharging solution for the Asctec Pelican drone is given: autonomous take-off, navigation and landing, recharging the quadrotor's battery by using their custom-designed landing platform. Figures 3 and 4 respectively show the quadcopter used and a 3D model of the quadcopter's landing foot inside the landing platform. They use computer vision techniques to detect the landing station, as in [2] and [4], by using a tag. Apart from this vision system, they have another on-board vision system for navigation when the target is not visible, which is based on optical flow to estimate the

2 https://www.uco.es/investiga/grupos/ava/node/26


position of the drone. For the control system, PIDs are used to set the drone position and yaw angle. The landing and charging platform is based on slip rings installed on the platform (on the bottom of the cones that can be observed in figure 4) and on the bottom of the drone's feet, connected respectively to the battery charger and the UAV battery. A fuse protects the system in case of a short circuit. The designed system allows a landing error of 5 cm along the X and Y axes and a rotation of 10 degrees.

Figure 3: Asctec Pelican used in [6]

Figure 4: 3D model of the Asctec Pelican base with landing foot installed, once landed inside the landing platform, from [6]

Finally, in [7] a vision-based tracking and landing approach for a quadrotor (AR.Drone) is implemented. The image processing and position control are performed at a ground station. The quadrotor is controlled with PIDs, as in [2], [4] and [6], the output being the attitude angle commands. The vision algorithm uses enhancement of red, green and blue (RGB) color information, making it robust under different lighting conditions. The camera is installed in the lower part, as in [2], [4] and [6], and in this case the drone position is estimated from the relative position along the X and Y axes in the image. The commands need to be manually sent from the ground station; that is, the drone does not land autonomously once it is in the right position until the command giving the landing order has been sent.
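All four systems above close the loop with decoupled PID controllers, one independent loop per controlled dimension. A minimal positional PID sketch (the gains are arbitrary illustration values, not tuned ones):

```python
class PID:
    """Textbook PID: output = kp*e + ki*integral(e) + kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt            # rectangular-rule integral
        derivative = (0.0 if self.prev_error is None
                      else (error - self.prev_error) / dt)
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# One decoupled controller per dimension, as in [2], [4] and [6].
controllers = {axis: PID(kp=1.0, ki=0.1, kd=0.05)
               for axis in ("x", "y", "z", "yaw")}
```

Each control cycle, the position error along each axis is fed to its own controller, and the outputs become velocity or attitude commands; the decoupling is what makes the four loops simple to tune independently.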

Compared with many of the already existing solutions for autonomous flight, landing and recharging, one important difference is that in


this thesis the system is implemented with a nano quadcopter. There are many projects and research works focused on autonomous flight control of nano quadcopters, such as [8], [9], [10] or [11]. Some projects also consider tracking capabilities, such as [12] or [13]. However, they do not address the autonomous charging process. Because of their size, nano quadcopters usually have an even shorter flight time than normal-size quadcopters; the battery life is usually around 10 minutes. Therefore, in this thesis we develop a full solution so that this kind of drone can be used in cases where normal-size drones cannot (for instance in tasks that involve being near people), even if these applications require long mission times.


3 Platform description

In this chapter the platform used will be described, including the Crazyflie 2.0 and the

ROS library for the Crazyflie. Moreover it will be explained how the camera is installed

on the quadcopter so that it can send images to the computer (ground station) for further

processing.

3.1 Crazyflie 2.0

The Crazyflie 2.0 is a light and small quadcopter: it weighs 27 grams and fits in the palm of a hand [14]. It is quickly assembled by simply attaching the motors to the circuit-board frame; no soldering is needed. In spite of its size, it has a very durable design. The exact size of the quadrotor is 92 x 92 x 29 mm (motor-to-motor and including motor mount feet). Figure 5 shows an image of the Crazyflie 2.0 assembled.

Figure 5: Crazyflie 2.0. (Source: Bitcraze)

It can communicate with a computer through the Crazyradio PA, which is a 2.4 GHz radio USB dongle with a 20 dBm power amplifier. According to the specification, the communication range between the quadrotor and the Crazyradio PA is about 1 km line-of-sight (LOS). The Crazyflie 2.0 can be identified and accessed through a Uniform Resource Identifier (URI), for instance: radio://0/80/250k.

The flight time is around 7 minutes and the charging time is 40 minutes, both considering the stock battery, which is the one used in this project. The maximum recommended payload weight is 15 g, which needs to be considered when choosing the camera or mounting additional pieces or systems on the Crazyflie 2.0.
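These two figures already bound how much an autonomous recharging system can help: in a repeated fly-charge cycle, the drone can be airborne for at most the flight time out of every flight-plus-charge period. A back-of-the-envelope check with the stock-battery numbers above:

```python
flight_min = 7.0    # stock-battery flight time quoted above (minutes)
charge_min = 40.0   # stock-battery charging time quoted above (minutes)

# Fraction of each fly-charge cycle actually spent in the air.
availability = flight_min / (flight_min + charge_min)
print(f"airborne about {availability:.1%} of each cycle")   # about 14.9%
```

With the camera on board the measured flight time drops further (see the battery life characterization in chapter 8), so the real figure is lower still.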


3.1.1 Hardware

The Crazyflie 2.0 has two microcontrollers, one for the main application and another in charge of the radio and the power management. The microcontroller for the main application is an STM32F405, with a Cortex-M4 core, 168 MHz, 192 kB SRAM and 1 MB flash. Its functions are sensor reading and motor control, flight control, telemetry and additional user development. The radio and power management microcontroller is an nRF51822, with a Cortex-M0 core, 32 MHz, 16 kB SRAM and 128 kB flash. The functions of this microcontroller are enabling power to the rest of the system (STM32, sensors and expansion board), battery charging management and voltage measurement, master radio bootloader, radio and BLE communication, and detecting and checking installed expansion boards.

It also has an on-board lithium polymer (LiPo) charger that can be accessed through a micro-USB connector. The memory unit is an 8 kB EEPROM. The Inertial Measurement Unit (IMU) is formed by a 3-axis gyroscope (MPU-9250), a 3-axis accelerometer (MPU-9250), a 3-axis magnetometer (MPU-9250) and a high-precision pressure sensor (LPS25H).

There is an expansion port (figure 6) with the following connections:

• VCC (3.0V, max 100mA)

• GND

• VCOM (unregulated VBAT or VUSB, max 1A)

• VUSB (both for input and output)

• I2C (400kHz)

• SPI

• 2 x UART

• 4 x GPIO/CS for SPI

• 1-wire bus for expansion identification

• 2 x GPIO connected to nRF51

Figure 7 shows the Crazyflie 2.0 system architecture with all the elements described

above.


Figure 6: Crazyflie 2.0 expansion port pinout. (Source: Bitcraze wiki)

Figure 7: Crazyflie 2.0 system architecture. (Source: Bitcraze wiki)


3.1.2 Flow deck

The Crazyflie 2.0 has several expansion decks also designed by Bitcraze that can be used

to extend the quadrotor’s functionalities and that are easy to include thanks to the 1-wire

automatic detection implemented on the Crazyflie 2.0 board.

In our case, we use the Flow deck, which gives the Crazyflie 2.0 the ability to detect its motion along the X, Y and Z axes. It is mounted under the quadrotor and has two sensors: the VL53L0x Time of Flight (ToF) sensor measures the distance to the ground with high precision (up to 2 meters, within a few millimeters), and the PMW3901 optical flow sensor measures movements relative to the ground. Figure 8 shows the Flow deck board.

Figure 8: Flow deck expansion board for Crazyflie 2.0. (Source: Bitcraze wiki)

With this deck the quadrotor is very stable, and we can easily control it by giving velocity commands along each axis, or even the desired flight height. How and why the Flow deck is used for position control is further explained in chapter 4.

It only weighs 1.6 g, so regarding the payload limitation there is still enough margin to install the camera and other necessary elements.

3.2 ROS driver for Crazyflie

The Robot Operating System³ (ROS) is a flexible, open-source framework for writing robot software. It is formed by a collection of tools, libraries and conventions that simplify the task of creating complex and robust robot behavior across a wide variety of robotic platforms.

It has been decided to use ROS because it facilitates a modular design structure, where the software can be separated into different components and processes but it is easy to

3 http://www.ros.org/about-ros/


establish communication among these different processes. Moreover, several ROS drivers for the Crazyflie already exist, which eases development.

For this thesis the ROS Crazyflie driver introduced in [15] is used 4. It has been decided

to use this instead of other similar projects because it is the most complete one. The

features included that are relevant for this thesis are:

• Support for Crazyflies 1.0 and 2.0 using the stock firmware, which makes it easy to

use.

• It publishes on-board sensors in ROS standard message formats.

• A tutorial [16] which, despite being for a slightly older version, is useful for an overall understanding of the different packages.

Other projects including ROS drivers for the Crazyflie lack documentation, are based on custom firmware or are incomplete.

The main unit for organizing software in ROS is the package, which may contain ROS runtime processes (nodes), a ROS-dependent library, datasets, configuration files, or anything else that is usefully organized together 5. The Crazyflie driver is formed by six packages:

• Crazyflie Cpp: a package that contains a C++ library for the Crazyradio and

Crazyflie. It can be used independently of ROS.

• Crazyflie driver: a driver package that contains a server (communicating with one

or more Crazyflies) and a script to add Crazyflies to be able to communicate with

them through this server.

• Crazyflie tools: this package contains tools that are helpful although not required for normal operation, for instance a tool for scanning for a Crazyflie.

• Crazyflie description: this package contains a 3D model of the Crazyflie.

• Crazyflie controller: this package contains a simple PID controller for hovering or

waypoint navigation. It can be used with external motion capture systems.

• Crazyflie demo: this package contains a wide set of examples to get quickly started

with the Crazyflie.

4 https://github.com/whoenig/crazyflie_ros

5 http://wiki.ros.org/ROS/Concepts


The most important package for this thesis is the driver package, since it provides a simple interface to communicate with the Crazyflie, making it possible to send all the required commands as well as to read different parameters from the Crazyflie, such as the battery voltage, which is very relevant for the recharging part. Moreover, the demo package provides many useful scripts to learn how to actually send the commands and communicate through the server.

3.3 Camera, transmitter and receiver

The most limiting requirements when selecting the camera to install on the drone are the weight and the power consumption. The Crazyflie payload is 15 g, but taking into account the flow deck weight (1.6 g), the available payload is reduced to 13.4 g. However, this limit should not be reached, so that the quadcopter can fly without using much extra power and thus without reducing flight time too much. Therefore the camera should be as light as possible. The logic for power consumption is the same: low consumption is needed so that autonomy is reduced as little as possible. The same requirements apply to the transmitter that sends the video obtained from the camera.

For the reasons explained above, it has been decided to use a combined camera and transmitter module, the VM275T v1.3 with a cloverleaf antenna. This module meets the requirements. Figure 9 shows the module.

Figure 9: Module of camera and transmitter VM275T. (Source: HobbyKing)

The module supply voltage can vary from 2.9 to 5.5 V, so it can be supplied from the Crazyflie, as will be detailed later. The power consumption is around 200 mA at 5 V. The module dimensions are 14.5 x 12.2 x 12 mm and the weight (without the antenna) is 3.35 g. The antenna weight has not been measured, but it does not increase the total weight significantly compared to the maximum payload.

The transmitter operates in the 5.8 GHz band, with 48 channels and FM modulation. The camera resolution is 600 Television lines (TVL) and it supports both NTSC and PAL video standards. The lens field of view is 120° horizontally and 100° vertically.

In order to supply power to this module, the VCOM pin of the Crazyflie expansion port (figure 6) can be used. The voltage on this pin is either the unregulated battery voltage or, when connected, the voltage from the USB port. This means that the range is 3.0 - 5.5 V, and thus it meets the camera module requirements. Therefore the camera and transmitter are powered from the VCOM pin and the ground (GND) pin, which are respectively pin 9 on the right and pin 10 on the left of the expansion port (figure 6). Figure 10 shows the Crazyflie 2.0 equipped with the camera.

Regarding the receiver, a 5.8 GHz receiver that can be connected through USB to the

computer is used.

Figure 10: Crazyflie 2.0 with the camera on board


4 System design

In this chapter the proposed solution will be described and, based on this, the system

design will be explained.

4.1 Proposed solution

As already mentioned, the goal of the thesis is the design and implementation of a control

system for an indoor nano drone, in order to control it towards a landing platform,

accurately land on it and recharge the drone using this platform, so that in the end we have a fully autonomous system. With this, we address the autonomy problem of drones.

Considering the different goals and in order to achieve them, the project can be divided into different areas where a solution is needed:

· Landing platform design and implementation

· Landing platform detection

· Controller design and implementation

Inspired by [2], [4], [6] and [7] the solution proposed in this thesis uses a camera installed

on the Crazyflie 2.0 to detect the landing station by computer vision techniques, and

the control of the landing maneuver is based on estimating the position of a tag on the

landing platform. A ground station (computer) will be used for processing the images

and carrying out the computations.

In the following sections, the different parts of the system, which are designed to provide solutions to the areas mentioned above, will be described.

4.2 Position controller design

Regarding the controller for the quadrotor, all the solutions presented in chapter 2 have in common the use of PID controllers, as they are simple to implement yet effective and accurate for controlling normal-size quadrotors. However, nano quadcopters are very sensitive to external disturbances, and it can be difficult to control their position with PID controllers alone [8].

In our case we want to control the quadcopter's position with respect to the tag. To achieve this, the position of the tag needs to be estimated very accurately, to make sure that this position can be used as the input to the controller. Therefore, we propose to address this position problem by using two cameras. One camera, placed vertically on top of the drone, is used to determine the target location, that is, where to land. Another camera pointing down is used to stabilize the drone by applying an optical flow algorithm. This combination of cameras yields a robust solution because the drone remains stable even if the tag goes out of view.

In order to make the system robust, the quadcopter's inner control loop (the one in charge of stabilization), which uses the downward camera, should be accurate and have the least possible delay. For this reason, it has been decided to close the inner control loop onboard, which implies that the implementation must require low computation power.

This is where the flow deck, explained in the previous chapter, comes into play, since its optical flow sensor can be used as the downward-pointing camera to measure the drone's movement, helped by the ToF sensor measuring the distance to the ground. The optical flow sensor included in the flow deck provides X-Y motion information with a wide working range from 80 mm to infinity.

The flow deck's optical flow sensor is a camera that identifies features on the surface below it and tracks their motion between frames (basically, it tracks how patterns are moving). Figure 11 shows a feature being tracked over time. The distance sensor is then needed to know the distance to these features and thus obtain the real dimensions of the movement. This approach is similar to the one in [11], where a nano quadrotor is fully controlled based on optic flow with a downward-looking camera. It is also similar to the solutions implemented for state estimation of drones using Semidirect Visual Odometry (SVO), which obtains motion by combining direct visual odometry (motion extracted from intensity values in the image) and feature tracking, resulting in a lightweight algorithm that can be implemented onboard. An example of this is developed in [17], where SVO is implemented on a quadcopter with the camera pointing down, an approach similar to the one chosen in this thesis.
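The scaling step above (pixel flow plus ToF distance giving metric motion) can be sketched with a simple pinhole approximation. All numbers here are illustrative, not the flow deck's actual constants:

```python
def flow_to_velocity(dpx, dpy, height_m, focal_px, dt):
    """Convert optical-flow pixel displacement into metric ground velocity.

    Pinhole approximation: a ground feature seen at height h shifts by
    d pixels when the drone translates d * h / focal_px meters.
    """
    vx = dpx * height_m / focal_px / dt
    vy = dpy * height_m / focal_px / dt
    return vx, vy

# 5 px of flow per frame at 0.5 m height, focal length 200 px, 100 Hz frames
vx, vy = flow_to_velocity(5, 0, 0.5, 200.0, 0.01)   # roughly 1.25 m/s along X
```

This illustrates why the ToF reading is essential: without the height term the pixel displacement cannot be turned into a metric velocity.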

It has been decided to use the flow deck because it meets the requirements already stated and it is easily integrated with the Crazyflie 2.0, making it possible to control the quadcopter just by sending the desired velocity commands from the ground station.


Figure 11: Feature tracking over time (Source: Bitcraze blog)

Therefore we have two control layers, needed to ensure fully autonomous flight. The inner control layer, implemented onboard, ensures the stability of the drone's attitude and movement. The outer control layer (offboard) stabilizes the quadcopter's position in space with respect to the tag. Since we have control onboard, it is now possible to use PID controllers for the high-level control (offboard) and guide the drone towards the landing platform.
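As a minimal sketch of the offboard layer, one discrete PID controller per axis could look like the following. The gains and numbers are purely illustrative, not the values used in the thesis:

```python
class PID:
    """Minimal discrete PID controller (illustrative gains, not tuned values)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, error):
        self.integral += error * self.dt
        # Avoid a derivative kick on the very first sample
        if self.prev_error is None:
            derivative = 0.0
        else:
            derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# One controller per axis: the error is the estimated distance to the marker,
# and the output is interpreted as a velocity command for the Crazyflie.
pid_x = PID(kp=0.5, ki=0.05, kd=0.1, dt=1 / 60)
vx_command = pid_x.update(0.30)   # 30 cm to the marker along X
```

In the real system such controllers run in the ground station loop, fed by the pose estimated from the camera, as detailed in chapter 6.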

4.3 Marker detector design

The marker identifying the landing platform will be obtained from the ArUco library, as in [2]. ArUco is an open-source library for camera pose estimation using squared markers. It has been decided to use this library because it is a fast fiducial marker system, especially appropriate for camera localization in applications such as augmented reality or robotics.

The library includes a complete set of functions for detecting markers and estimating camera pose. For these reasons, instead of using our own algorithm as in the other solutions presented in chapter 2, the functions of the library will be used for calibrating the vertical camera, detecting the marker and estimating the camera pose, because good results are achieved in a simple way, with low error and at high speed [3]. All these computations will be done on the ground station.

The implementation of this system will be explained in detail in chapter 5.


4.4 Charging platform design

The concept of the charging platform designed and implemented in this thesis is similar to the one proposed in [6]. Our system has been designed to include contacts on the platform and on the quadrotor's legs, and a protection system for the reverse voltage problem is also included. However, in order to keep the system simple, it has been decided to design and implement a system with no landing error correction, but with error tolerance. Our design is therefore sufficient for a complete autonomous recharging system, with some margin for the landing and without having to make big modifications to the Crazyflie 2.0. The design and implementation will be further detailed in chapter 7.

4.5 System architecture design

Having described the different parts of the system, it is worth summarizing the system architecture to get an overall idea of how everything is integrated and how the communication is done.

The overall system architecture can be divided into the Crazyflie 2.0 quadcopter, with the flow deck, camera and transmitter installed on-board; the charging platform with the tag; and the ground station, where the video processing for tag recognition and the computation for position control are done. The computer acting as ground station includes the receiver for obtaining the video from the camera and the Crazyradio PA for communicating with the Crazyflie 2.0. In this ground station, the video is processed and the camera pose is estimated to obtain the quadcopter pose with respect to the landing station; this pose is then used as the input of several PID controllers, which determine the commands that need to be sent to the quadcopter in order to approach the station and land on it. Figure 12 shows the overall architecture of the system.

It would have been ideal to do all the computations on-board, as is done in [6]. This would avoid having to communicate with the ground station, thus yielding a completely autonomous system. However, our quadcopter is small and lacks sufficient computation power to implement complex algorithms and add additional features, as explained in [12], where an algorithm for recognizing and detecting a pattern is implemented on a Crazyflie (on-board), obtaining a mean position error of 32 cm in the horizontal axis and 57 cm in the vertical axis. On-board computation would have implied a landing error


Figure 12: Overall system architecture

high enough to make recharging impossible. Therefore, any useful application using the Crazyflie would require an off-board computer anyway, due to its low computation power.

The main reason to have the recharging functionality onboard is that it could be part of the low-level "safety" functionality of the system, which would ideally work even if the radio connection with the ground station is lost. However, this would most likely require a custom hardware solution, thus increasing the complexity of the solution.

These are the reasons why it has been decided to split the control problem into two, as already mentioned: one high-bandwidth controller onboard that stabilizes the drone's position, and one controller offboard that is used mainly to specify the goal location. Nevertheless, the solution developed here could be implemented in the future on bigger drones which allow on-board computation.


5 Visual system

This chapter explains the recognition system designed to detect the marker that identifies the landing station, as well as to estimate the camera (and drone) pose with respect to this marker.

5.1 Marker creation with ArUco module

As already stated in previous chapters, in order to identify the platform the ArUco library

is used. This library has binary square fiducial markers that can be used for camera pose

estimation. Their main benefit is that their detection is robust, fast and simple.

In our case, the ArUco module for OpenCV (Open Source Computer Vision Library) 6 is used. It has both C++ and Python interfaces; in this project the Python interface is used. This ArUco module includes the detection of ArUco markers and the tools to employ them for pose estimation and camera calibration.

An ArUco marker is a square marker formed by a wide black border and an inner binary

matrix which determines its identifier. The black border contributes to its fast detection

while the binary codification allows its identification and, moreover, the application of

error detection and correction techniques. The size of the internal matrix determines the marker size; that is, a 4x4 marker is composed of 16 bits.

ArUco markers can be grouped in dictionaries, defined by the dictionary size, which is the number of markers that compose the dictionary, and the marker size of those markers. The ArUco module used includes some predefined dictionaries covering a range of different dictionary sizes and marker sizes. It is worth noting that a marker ID is the marker index inside the dictionary it belongs to.

In order to detect the marker, first we have to create it and print it. To create the marker,

a function provided by the ArUco library can be used. In this project a Python script has been written with the required steps to create and save the marker image.

First, the dictionary object is created by choosing one of the predefined dictionaries in

the ArUco module. In our case we use a dictionary of 50 different 4x4-bit markers. The reason for using this marker size is that, for the same printed size, a marker formed by fewer (and therefore bigger) bits is easier to detect from farther away than a marker with more bits.

6 https://opencv.org/

In order to create the marker, the function needs the following parameters:

· The dictionary object, which is the one explained above.

· Marker ID. In our case, the range of valid IDs goes from 0 to 49 (50 markers) and

the one chosen is the marker with ID 1.

· The size of the marker image. In our case we choose 200, which means that the

image size is 200x200 pixels. In order to avoid deformations, this parameter should

be proportional to the number of bits plus border size, or at least much higher than

the marker size (like 200 here), so that deformations are insignificant.

· There is an optional parameter to specify the width of the marker black border. In

our case we use the default value, which is 1.

The marker created is shown in figure 13.

Figure 13: ArUco marker to identify the landing platform


5.2 Marker detection with ArUco module

The detection process returns a list of detected markers. Each detected marker includes

the position of its four corners in the image and the marker ID. In the ArUco module

there is a function that performs this detection, but before detailing the parameters used

by this function, the concepts behind the detection process will be briefly explained.

The process can be divided into two steps. First, the detection of marker candidates.

In this step the image is analysed to find square shapes that can be candidates to be

markers. An adaptive thresholding is applied to segment the markers, and then the

contours are extracted to discard those that are not convex or do not approximate to

a square shape. Furthermore, some filtering is applied to remove candidates that do

not meet some requirements. After this candidate detection, the inner codification is

analysed to determine the true markers, by extracting the bits from each marker. To

do this, first, the markers in their canonical form are obtained by applying a perspective transformation. Then, the canonical image is thresholded using Otsu's method to separate white and black bits. The image is divided into different cells according to the marker size and the border size, and the number of black or white pixels in each cell is counted to determine whether it is a black or a white bit. Finally, the bits are analysed to check

if the marker belongs to the specific dictionary. Moreover, error correction techniques are

employed when necessary.

The parameters needed by the function that performs the detection are the following:

· The image where the markers will be detected.

· The dictionary object that contains the marker(s) used.

· There is an optional parameter, an object that includes all the parameters that can

be customized during the detection process.

The function returns a list with the corners of the detected markers in their original order (clockwise, starting from the top left), and a list with the IDs of each detected marker. Moreover, it also returns a list with the rejected candidates, that is, squares that were found but do not present a valid codification.


5.3 Camera calibration with ArUco module

After detecting the marker, we want to estimate the camera pose using this marker.

However, to perform camera pose estimation, the calibration parameters of the camera must first be known; these are the camera matrix and the distortion coefficients. It is worth noting that, in order to use the functions provided in the ArUco module and OpenCV, the camera is modeled with the pinhole camera model, which is shown in figure 14. It is important to take the camera coordinate axes into account, since they differ from the drone axes and this needs to be considered.

Figure 14: Pinhole camera model (Source: OpenCV documentation)

The camera matrix and distortion coefficients are:

$$\text{Camera matrix} = \begin{bmatrix} f_x & \gamma & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$$

$$\text{Distortion coefficients} = \begin{bmatrix} k_1 & k_2 & p_1 & p_2 & k_3 \end{bmatrix}$$

In the camera matrix, the parameters fx and fy represent the focal length in pixel units, γ represents the skew coefficient between the x and y axes (which in our case, as will be seen later, is 0, as is usually the case) and, finally, cx and cy are the coordinates of the camera center (principal point). The distortion coefficients vector is formed by five parameters that model the distortion produced by the camera.
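To make the role of these parameters concrete, a 3D point in camera coordinates projects onto the image plane as follows. The intrinsics here are hypothetical (zero skew, no distortion), chosen only for illustration:

```python
import numpy as np

# Hypothetical intrinsics: fx = fy = 220 px, principal point (320, 240), skew 0
K = np.array([[220.0,   0.0, 320.0],
              [  0.0, 220.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(point_cam):
    """Pinhole projection of a 3D point (camera frame) onto the image plane."""
    x, y, z = point_cam
    u = K[0, 0] * x / z + K[0, 2]
    v = K[1, 1] * y / z + K[1, 2]
    return u, v

# A point 1 m ahead and 0.1 m to the right lands 22 px right of the center
u, v = project((0.1, 0.0, 1.0))
```

The calibration procedure described below recovers exactly these K entries (plus the distortion coefficients) for the real camera.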

The ArUco module is also used to calibrate the camera. It is worth noting that the camera parameters remain fixed unless the camera optics are modified, so the calibration needs to be done only once.

Performing the calibration requires some correspondences between environment points and their projections in the camera image from different viewpoints. In general, these correspondences are obtained from the corners of chessboard patterns when using the OpenCV function for calibrating cameras. In this project, a function included in the ArUco module is used instead of this OpenCV function, because the calibration is more versatile than with traditional chessboard patterns, since it allows occlusions or partial views.

Using the ArUco module, we have two options for calibration: based on ArUco marker corners or on ChArUco corners. However, it is recommended to use the ChArUco corners approach, since the corners it provides are much more accurate than the marker corners. A ChArUco board is similar to a chessboard pattern but with ArUco markers in place of the white squares. An example board, which was created in order to calibrate the camera used in this project, is shown in figure 15.

Figure 15: ChArUco board used to calibrate the camera


Therefore, to calibrate the camera using this ChArUco board, it is necessary to detect the board from different viewpoints. However, as already mentioned, occlusions and partial views are allowed, and not all corners need to be visible in every viewpoint.

A full working example is available in the ArUco module, and this has been used to calibrate the camera and obtain the required parameters. First, the ChArUco board was created, also using functions in the module and providing reasonable parameters such as the number of horizontal and vertical squares or the marker size, and then the camera was calibrated using this board. As a result, a yaml file is obtained containing the camera matrix, the distortion coefficients vector and the average reprojection error.

In this calibration process, the average reprojection error obtained was 0.38 and the

camera parameters are the following:

$$\text{Camera matrix} = \begin{bmatrix} 223.32 & 0 & 333.08 \\ 0 & 221.82 & 244.42 \\ 0 & 0 & 1 \end{bmatrix}$$

$$\text{Distortion coefficients} = \begin{bmatrix} 2.01 \times 10^{-1} & -2.25 \times 10^{-1} & -9.29 \times 10^{-4} & -6.99 \times 10^{-4} & 5.56 \times 10^{-2} \end{bmatrix}$$

5.4 Camera pose estimation using ArUco module

Once we have the camera calibrated and we have detected the marker, the next step is

estimating the camera pose from this detected marker.

The approach used to estimate the camera pose is the already mentioned PnP problem, a common simplification of the general problem that assumes known calibration parameters, which holds in our case since the camera has been calibrated and the parameters are known. An overall view of the problem formulation is shown in figure 16.

This problem formulation consists in retrieving the pose (rotation R and translation t) of

the camera with respect to the world reference frame and the focal length, given a set of

correspondences between 3D points pi expressed in the world reference frame, and their

2D projections ui onto the image.

The ArUco function to estimate the pose is based on an OpenCV function to solve the

PnP problem already explained. However, in this case the camera pose is obtained with


Figure 16: PnP problem formulation (Source: OpenCV tutorials)

respect to the marker and not with respect to the world coordinate system. This facilitates controlling the drone, since we actually want to control it with respect to the marker position, because the landing station position is tied to the marker position.

Therefore the camera pose is the 3D transformation from the marker coordinate system

to the camera coordinate system, and it is also specified by a rotation and a translation

vector.

The ArUco function takes as parameters the vector of marker corners already detected,

the size of the marker side in meters (the translation vectors of the estimated poses will

be in this same unit) and the camera calibration parameters that have been obtained

after the calibration process. It returns the rotation and translation vectors for each of

the markers in corners. The marker coordinate system that is assumed by this function

is placed at the center of the marker with the Z axis pointing out.

The translation vector obtained is the (x, y, z) coordinates of the marker in the coordinate

system of the camera (figure 14), and the rotation vector is the rotation of the marker

with respect to the camera. However, we need the rotation of the camera with respect

to the marker so that the rotation of the drone with respect to the marker is directly

known.


The rotation vector is a more compact representation of a rotation matrix (since any rotation matrix has just 3 degrees of freedom). Rodrigues' rotation formula allows the rotation matrix to be obtained from the rotation vector (and vice versa). There is a function in the OpenCV library that applies Rodrigues' formula (Rodrigues()), and it is the one used here. Once the rotation matrix is obtained, the Euler angles can be computed with an OpenCV function (RQDecomp3x3()) that performs an RQ decomposition of 3x3 matrices (decomposition of a matrix into a product of an orthogonal matrix Q and an upper triangular matrix R) and returns the Euler angles in degrees (yaw, pitch and roll).

Therefore, once all the steps explained in this section have been applied, the distance to the marker along each axis and the rotation of the camera with respect to the marker are obtained, and thus they can be used to control the drone with respect to the marker.

5.5 Implementation

This section explains how all the steps described in the previous sections have been implemented. Some of the tasks (marker creation and camera calibration) need to be done only once, but others need to run continuously so that the controller loop can work correctly.

A ROS package called camera_aruco has been created to implement, in a ROS node, all the tasks related to the visual system. A script has been created to do the processing for detecting the marker and computing the camera pose. In this script, after performing the initial configuration, a loop analyzes the image, detects the marker, computes the camera pose and sends this pose in a message so that the controller can work with it, as will be explained in chapter 6.

The camera configuration is the following: channel frequency 5.74 GHz (group A, channel 1) and NTSC video format, whose video frequency is 60 Hz. For this reason, the loop frequency is also chosen to be 60 Hz. This way, every frame received is analyzed to check whether the marker appears.

The flow diagram in figure 17 shows the different steps carried out to perform the tasks already explained. First, after the initialization required by ROS, the yaml file is read to obtain the camera calibration parameters (camera matrix and distortion


coefficients), then the message used to send the camera pose is initialized. A ROS publisher is created, through which the message will be sent to the controller, which subscribes to this topic. After these configurations, the 60 Hz loop is executed to read a frame, analyze the image to detect markers and, if a detected marker ID coincides with the marker of our system, compute the camera pose and send the message.
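The loop structure can be sketched independently of ROS as a fixed-rate loop, with process_frame standing in for the grab-detect-publish step (all names here are illustrative, not the thesis code):

```python
import time

def run_loop(process_frame, rate_hz=60.0, n_frames=3):
    """Run process_frame at a fixed rate, like a 60 Hz rate loop in a ROS node."""
    period = 1.0 / rate_hz
    results = []
    for _ in range(n_frames):
        start = time.monotonic()
        results.append(process_frame())   # grab frame, detect marker, publish pose
        elapsed = time.monotonic() - start
        if elapsed < period:              # sleep away the rest of the period
            time.sleep(period - elapsed)
    return results

poses = run_loop(lambda: "pose_msg", n_frames=3)
```

In the actual node, the rate limiting is handled by ROS itself and process_frame is the detection plus pose-estimation pipeline described in the previous sections.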

Figure 17: Flow diagram of the visual system


6 Drone controller

In this chapter, the ROS node created for the position controller of the quadcopter will

be explained in detail, including all the steps and tasks needed to compute the distance

to the landing point and send the control commands to the Crazyflie.

6.1 Pose message transformation

Once the message from the image processing node is received, the camera pose needs to

be saved and transformed from the camera coordinate system to the drone coordinate

system. Both the camera and the drone coordinate systems are shown in figure 18.

Figure 18: Camera and Crazyflie 2.0 coordinate system

It can be seen that, in order to obtain the camera coordinate system from the drone coordinate system, we need to perform a rotation of π/2 about the Y axis (drone coordinate system) and then a rotation of −π/2 about the Z axis (the new coordinate system obtained after the first rotation).

Although the origins of the two coordinate systems do not coincide exactly, both origins are considered to be at the same position, since the difference is not significant: the camera is installed close to the drone center.


The rotation matrix about the Y axis is:

$$R_Y(\theta = \pi/2) = \begin{bmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{bmatrix} = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ -1 & 0 & 0 \end{bmatrix} \qquad (1)$$

And the rotation matrix about the Z axis is:

$$R_Z(\theta = -\pi/2) = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} 0 & 1 & 0 \\ -1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix} \qquad (2)$$

Therefore, the rotation matrix combining both rotations is:

$$R = R_Y R_Z = \begin{bmatrix} 0 & 0 & 1 \\ -1 & 0 & 0 \\ 0 & -1 & 0 \end{bmatrix} \qquad (3)$$

Thus, the coordinates of a point in the Crazyflie coordinate system, given its coordinates in the camera system, can be obtained by solving:

$$\begin{bmatrix} X_{drone} \\ Y_{drone} \\ Z_{drone} \end{bmatrix} = R \begin{bmatrix} X_{camera} \\ Y_{camera} \\ Z_{camera} \end{bmatrix} \qquad (4)$$

Therefore we have the following equivalences:

$$X_{drone} = Z_{camera}, \qquad Y_{drone} = -X_{camera}, \qquad Z_{drone} = -Y_{camera} \qquad (5)$$
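The equivalences in (5) amount to a fixed axis permutation with sign flips; as a minimal sketch (the helper function is hypothetical, not the actual thesis code):

```python
def camera_to_drone(x_cam, y_cam, z_cam):
    """Apply the fixed rotation R = R_Y(pi/2) R_Z(-pi/2) from equation (3):
    X_drone = Z_camera, Y_drone = -X_camera, Z_drone = -Y_camera."""
    return (z_cam, -x_cam, -y_cam)
```

Since R is orthogonal, the inverse mapping is simply its transpose, i.e. swapping the roles of the two frames.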

Due to the transformations performed during the image processing phase to obtain the Euler angles (pitch, roll and yaw), no further transformations are needed to express these angles in the drone reference system.

The ROS interface makes it possible to control the Crazyflie 2.0 (with the flow deck included) by specifying the velocity along the X and Y axes (m/s), the height in Z (meters) and the yaw rate (degrees/s), and thus this is the control command used. Therefore, we need to know the distance from the marker center along the X, Y and Z axes plus the yaw angle. Thus, when a pose message is received (at a frequency of 60 Hz), the parameters in (5) are saved, as well as the yaw angle, for further processing to estimate the velocity, height or yaw rate needed.

6.2 Estimate drone distance to landing point

After applying the rotations explained in the previous section, the distance from the drone to the marker center is known. However, in order to control the drone we need to know the distance to the landing point, which is not the same as the marker center. The landing point is 40 cm from the marker, as shown in figure 19. This figure shows the problem formulation for estimating the distance from the quadcopter to the landing point.

Figure 19: Problem formulation for estimating the distance to the landing point

First, we have the distance to the marker center in the drone coordinate system (the point (x_marker, y_marker) in the figure). We know that the landing point is 40 cm from the marker along the X axis, considering a coordinate system with no rotation, that is, with the X axis perpendicular to the marker plane and the Y axis parallel to it (the coordinate system X'Y' in the figure). Since the Crazyflie rotation with respect to this system is given by the yaw angle, we can apply a rotation to obtain the marker center coordinates expressed in the X'Y' system, named (x'_marker, y'_marker). After this rotation, we obtain the point (x'_landing point, y'_landing point) in the figure by subtracting 40 cm from the x coordinate of (x'_marker, y'_marker). Finally, in order to obtain the distance to the landing point in the actual coordinate system XY, the reverse rotation has to be applied, this time with rotation angle −yaw. This way we know the distance (x_landing point, y_landing point) with respect to the real drone pose.

It is worth noting that this problem does not affect the drone Z axis, because the rotation affecting the distance lies in the XY plane, so the drone height is independent of the yaw angle. The height estimate would be affected by large pitch and roll angles, but this is not the case here: these angles only vary slightly to move the drone in one direction or another, and the variation is too small to considerably affect the Z distance measured in the image processing phase.

Therefore, starting from the measured point (x_marker, y_marker), the computations needed to obtain the true distance to the landing point are detailed below.

First, the rotation matrix is:

$$R = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \qquad (6)$$

And thus:

$$\begin{bmatrix} x'_{marker} \\ y'_{marker} \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x_{marker} \\ y_{marker} \end{bmatrix} \qquad (7)$$

where $\theta$ is the yaw angle. Then:

$$\begin{bmatrix} x'_{landing\ point} \\ y'_{landing\ point} \end{bmatrix} = \begin{bmatrix} x'_{marker} \\ y'_{marker} \end{bmatrix} - \begin{bmatrix} x_{setpoint} \\ y_{setpoint} \end{bmatrix} \qquad (8)$$

where the setpoint is the goal the drone needs to reach in order to land correctly. In this case, as already mentioned:

$$\begin{bmatrix} x_{setpoint} \\ y_{setpoint} \end{bmatrix} = \begin{bmatrix} 40\ \mathrm{cm} \\ 0\ \mathrm{cm} \end{bmatrix} \qquad (9)$$

Finally, the reverse rotation needs to be applied. In this case, the rotation matrix is:

$$R^{-1} = R^T = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix} \qquad (10)$$

So therefore:

$$\begin{bmatrix} x_{landing\ point} \\ y_{landing\ point} \end{bmatrix} = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x'_{landing\ point} \\ y'_{landing\ point} \end{bmatrix} \qquad (11)$$

After this process, we have obtained the distance to the landing point in the drone coordinate system.
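Equations (6)–(11) can be condensed into a short function (a sketch under the same definitions; the function name and signature are illustrative):

```python
import math

def distance_to_landing_point(x_marker, y_marker, yaw, setpoint_x=0.40):
    """Rotate the measured marker offset into the marker-aligned frame X'Y'
    (eq. 7), subtract the 40 cm setpoint (eqs. 8-9), and rotate back with
    the inverse rotation (eqs. 10-11). Distances in meters, yaw in radians."""
    c, s = math.cos(yaw), math.sin(yaw)
    # Equation (7): rotation by the yaw angle theta
    xp = c * x_marker - s * y_marker
    yp = s * x_marker + c * y_marker
    # Equations (8)-(9): subtract the setpoint (40 cm along X', 0 along Y')
    xp -= setpoint_x
    # Equation (11): inverse (transpose) rotation back to the drone frame
    return (c * xp + s * yp, -s * xp + c * yp)
```

With yaw = 0 the result reduces to simply subtracting 40 cm from the X distance, as expected.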

6.3 PID controller

As already explained in chapter 4, it has been decided to use PID controllers for the off-board control, that is, the control of the drone to guide it to the landing point given the distance to this point (obtained as explained in the previous section).

There are four different components in the control command (velocity in X, velocity in Y, height in Z and yaw rate). Therefore, four decoupled PIDs have been designed to control each of these four parameters.

A PID (Proportional Integral Derivative) controller is a control loop feedback mechanism. It is widely used in industrial control systems, robotics and a variety of other applications requiring continuously modulated control. This controller continuously computes an error value as the difference between a desired setpoint and a measured variable, and then applies a correction based on proportional, integral, and derivative terms. Figure 20 shows the block diagram of a typical PID controller.

In our case, we compute the error as the difference between the actual distance (or yaw angle) to the landing point and the desired distance (or yaw angle), that is, the setpoint. Therefore, the PID input is the distance or yaw angle to the landing point, and the PID output is the velocity in the X and Y axes, the height in the Z axis or the yaw rate, depending on the parameter.

Figure 20: PID controller

Each PID term affects the control signal in a different way [18]:

· Proportional: this component depends only on the error. The output is directly

proportional to the error, depending on the gain Kp. In general, increasing this

proportional gain will increase the speed of the control system response. However,

if the proportional gain is too large, the process variable will begin to oscillate.

If Kp is increased further, the oscillations will become larger and the system will

become unstable and may even oscillate out of control.

· Integral: this component sums the error term over time. Therefore the result is

that even a small error term causes the integral component to increase slowly. The

integral response will continually increase over time unless the error is zero, so the

effect is to drive the steady-state error to zero. This steady-state error is the final

di↵erence between the process variable and the setpoint.

· Derivative: this component causes the output to decrease if the process variable is

increasing rapidly. The derivative response is proportional to the rate of change of

the process variable. Increasing the Kd parameter will cause the control system to

react more strongly to changes in the error term and will increase the speed of the

overall control system response. However, this parameter is usually small since the

derivative response is highly sensitive to noise in the process variable signal. If the

sensor feedback signal is noisy or if the control loop rate is too slow, the derivative

response can make the control system unstable.
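As an illustration of such a controller (a generic textbook sketch, not the thesis implementation), a discrete PID running at the 60 Hz loop rate could look like:

```python
class PID:
    """Minimal discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt=1.0 / 60.0):  # 60 Hz loop, as in the text
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, error):
        # Integral term: accumulate the error over time
        self.integral += error * self.dt
        # Derivative term: rate of change of the error (zero on the first call)
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Setting ki = 0 yields the PD controllers used later for the X and Y velocities.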


The PIDs have been tuned experimentally and a good performance is obtained. In order to make a good tuning, the effects of independently increasing each of the PID parameters have been considered [19]; they are summarized in table 1.

Gain  Rise time  Overshoot  Convergence time  Steady-state error  Stability
Kp    decrease   increase   small change      decrease            degrade
Ki    decrease   increase   increase          eliminate           degrade
Kd    increase   decrease   decrease          no effect           improve if Kd small

Table 1: Effects of independently increasing the value of the PID parameters

In our case, the setpoint is 40 cm in the X axis, 0 cm in the Y axis, 0 cm in the Z axis and a yaw of 0° (with respect to the center of the marker). This setpoint is above the landing point, that is, the landing point is at the same position but at height 0 in the Z axis.

The velocity in X and Y and the yaw rate are obtained directly from the controller output. A limitation has been set so that the drone does not move extremely fast: a maximum velocity of 0.5 m/s in the X and Y axes and a maximum yaw rate of 45 degrees/s. The height in the Z axis works differently, since the height needs to be maintained; thus, the output obtained from the controller is added to (or subtracted from) the height value. The initial height is 0.5 m, which is the height chosen for taking off.

As already explained in chapter 4, the Crazyflie 2.0 performs on-board attitude and velocity control thanks to the flow deck. Therefore, any bias in the quadcopter, caused for example by unbalanced motors or external disturbances affecting the flight, is compensated by the on-board controller; thus the integral term of the velocity controllers (X and Y axes) is not required, resulting in PD controllers.

The PID parameters of each of the four controllers are summarized in table 2:

Output         Input                            Error           Kp    Ki      Kd
Velocity in X  Distance to landing point X, dx  dx − 40 cm      0.4   0.0     0.005
Velocity in Y  Distance to landing point Y, dy  dy − 0 cm       0.4   0.0     0.005
Height in Z    Distance to landing point Z, dz  dz − (−10 cm)   0.05  0.0002  0.01
Yaw rate       Yaw angle                        yaw − 0°        0.5   0.005   0.005

Table 2: Parameters of the four PID controllers

It is worth noting that these PID controllers are executed in a 60 Hz loop, which is the same frequency at which the images are received and the camera pose is estimated.

6.4 Landing

Once the drone is flying and the landing process has to start, the quadcopter rotates around its Z axis (yaw) in order to find the marker.

Once the setpoint is reached and the Crazyflie 2.0 is above the landing point, the drone position must remain stable for some time before the landing starts. In order to measure this stability, a margin region has been defined: if the Crazyflie position stays within this area for a given time, the landing can start. The margin of error is ±3 cm in the X axis, ±4 cm in the Y axis and ±8 cm in the Z axis. These values have been chosen according to the size of the pads where the drone needs to land in order to charge the battery, as will be explained in the next chapter. The margin in the Z axis is not as important as the others, since this axis is perpendicular to the landing plane and does not affect the landing in terms of lateral movement, so a larger margin is chosen. If the drone stays within this region for 1.25 seconds, the landing starts. This value corresponds to roughly 70 iterations of the 60 Hz loop; it has been chosen experimentally and verified to be enough for a good landing. On one hand, if this time is very long, it can take a long time to start landing, since the count is reset whenever the drone leaves the region for even one iteration. On the other hand, if the time is very short, it cannot be guaranteed that the drone is stable enough for landing. The chosen time yields an accurate landing, although similar values could also be used.
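The stability check described above can be sketched as follows (an illustration; the margins and iteration count are taken from the text, the function itself is hypothetical):

```python
# Sketch of the stability check before landing.
MARGIN_M = {'x': 0.03, 'y': 0.04, 'z': 0.08}   # +/-3 cm, +/-4 cm, +/-8 cm
REQUIRED_ITERATIONS = 70                        # ~1.25 s of the 60 Hz loop

def ready_to_land(errors):
    """errors: iterable of (ex, ey, ez) offsets from the setpoint, one per
    60 Hz iteration. Returns True once the drone has stayed inside the
    margin region for REQUIRED_ITERATIONS consecutive iterations."""
    count = 0
    for ex, ey, ez in errors:
        inside = (abs(ex) <= MARGIN_M['x'] and
                  abs(ey) <= MARGIN_M['y'] and
                  abs(ez) <= MARGIN_M['z'])
        count = count + 1 if inside else 0   # leaving the region resets the count
        if count >= REQUIRED_ITERATIONS:
            return True
    return False
```

The reset-on-exit behavior is what makes a very long required time risky: a single iteration outside the region restarts the count from zero.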

The landing process takes 4 seconds, and the goal is to gradually decrease the height until 8 cm is reached; then the motors are shut down. This 8 cm value is chosen because it is the minimum altitude at which the optical flow sensor can work and, moreover, when the drone is close to the floor its dynamics change: the propellers cause turbulence and the drone becomes harder to control. The step size used to gradually reduce the height is computed from the initial altitude and the landing time, and this step size is subtracted from the current height, also in a 60 Hz loop.
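The step-size computation can be sketched as (illustrative; the 4 s duration, 60 Hz rate and 8 cm cut-off come from the text, the function itself is hypothetical):

```python
def landing_schedule(initial_height, final_height=0.08, landing_time=4.0, rate=60.0):
    """Compute the fixed descent step and the resulting height profile:
    the height is reduced from initial_height to the 8 cm cut-off over
    landing_time seconds, one step per 60 Hz iteration."""
    steps = int(landing_time * rate)                      # 240 iterations
    step_size = (initial_height - final_height) / steps
    heights = [initial_height - i * step_size for i in range(steps + 1)]
    return step_size, heights
```

Starting from the 0.5 m hover height, this gives a descent of 1.75 mm per iteration.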


6.5 Implementation

Throughout this chapter we have covered the different tasks needed to control the quadcopter: pose estimation, calculation of the distance to the landing point, PID controller design, and the design of the take-off and landing processes. This section covers how these tasks have been implemented and integrated.

Different ROS nodes are needed in order to communicate with the Crazyflie and control it. The driver already described in chapter 3 is used. This driver contains the server, whose functions can be used to send commands to the Crazyflie (like the control command specifying the velocities, height and yaw rate) or to receive parameters and measurements from the Crazyflie (like the battery level).

A ROS launch file, which allows launching different ROS nodes at the same time, has been created. With this file we launch the server, the node to add the Crazyflie and communicate with it (also included in the driver), and finally the node that implements the controller.

It has been decided to start the landing maneuver once the battery voltage drops below a given level. This level will be decided after carrying out tests to characterize the battery life in chapter 8. Until then, the drone is simply in hovering mode; when the battery threshold is reached, the rotating process for finding the marker starts.

Moreover, once the approaching process has started, the Crazyflie is landed for safety reasons if the marker is lost. This can happen because the marker goes out of the image or is not well detected. This safety landing is triggered if the marker is lost for 20 iterations, which corresponds to approximately 0.3 seconds, since the loop frequency is 60 Hz. While the marker is not detected, the previous command is continuously sent (velocities, height and yaw rate are maintained).

As already said in the previous chapter, the controller node is subscribed to the camera pose publisher. These messages are received at a 60 Hz frequency, which is why the controller is also executed at this rate; each message is processed to estimate the drone pose as already explained. Furthermore, the node is also subscribed to the battery level publisher (included in the driver server), so that the battery level can be read to check when it is time to start the approach.


In order to clarify the different steps followed, figure 21 shows the flow diagram of the node that implements the drone controller.

Figure 21: Flow diagram of the controller node


7 Landing and recharging platform

In this chapter the recharging system design and implementation will be explained in

detail, as well as the integration of this system together with a landing platform.

7.1 Landing station

In order to have a landing station where both the marker and the recharging system can be included, it has been decided to build an L-shaped platform. The marker is attached to the vertical wall, while the charging system is implemented on the horizontal floor, where the drone lands. Figure 22 shows the landing station, including its dimensions.

Figure 22: Landing platform


The station is built using wooden planks. The floor has been covered with a pattern to facilitate the flow deck operation, since the flow sensor requires a floor that varies and is not uniform in color: visual variations on the floor can then be found and tracked to estimate the movement.

7.2 Battery and battery charger

Before designing the recharging system, an analysis of the battery and the charger included in the Crazyflie 2.0 is carried out in order to know the charge and discharge characteristics, and thus the voltage and current needed.

The battery used is the one included with the Crazyflie 2.0, which is of the LiPo (Lithium-Polymer) type. This battery type is one of the most popular, since it has among the best power-to-weight ratios and discharge currents. It is worth noting that the Crazyflie 2.0 has a Protection Circuit Module (PCM) that prevents the user from under- or over-charging the battery.

The battery voltage is 3.7 V and it has a capacity of 240 mAh. The charge rate is 2C,

which means that it is charged at 480 mA. The discharge rate is 15C, which means that

the current is 3600 mA. The battery charging time is about 40 minutes and the battery

life about 7 minutes.

The Crazyflie 2.0 includes an integrated circuit (IC) which is an integrated Li-Ion linear

charger and system power path management device. In this case, it operates from the

USB port and supports charge currents up to 1.5 A. The chip features dynamic power path

management (DPPM) that powers the system while simultaneously and independently

charging the battery. This feature reduces the number of charge and discharge cycles on

the battery, allows for proper charge termination and enables the system to run with a

defective or absent battery pack [20].

Therefore, thanks to this charger, the battery can be charged directly by connecting the

power supply to the USB pin and the ground pin (pins 10 right and left in figure 6,

respectively). The USB standard voltage is 5.0 V.


7.3 Recharging system

7.3.1 Reverse voltage protection

It has been decided to include a system that protects the drone from a reverse polarity connection. If the drone is connected the wrong way round (voltage to ground and vice versa), due to a bad landing or any other reason, the drone circuitry is thus protected from system failures.

There are different options to implement this protection system [21]:

· Using a diode in series with the supply line (figure 23c). When the power is correctly supplied, the diode is in the forward region and conducts the current, so the supply voltage is applied to the load circuit. When the power supply is reversed, the diode sees a negative voltage (reverse region) and does not conduct current through the load circuit. Its advantages are simplicity and low cost. However, it has a big disadvantage: the diode's high voltage drop (typically around 0.7 V).

· Using a Schottky diode in series with the supply line. This option is similar to the

one above but with the advantage that the diode voltage drop is much lower.

· Using a PNP transistor (figure 23a). In normal operation, the base is at a lower voltage than the emitter and thus the transistor turns on. When the circuit is reversed, the transistor is reverse biased and shuts down the rest of the circuit. As in the diode circuit, there is still some voltage drop, and for higher-power circuits the transistor will not be able to handle the high current loads. Otherwise it is a good option with a reasonable cost.

· Using a P-channel FET (figure 23b). This option is more complex, but it is the best one in terms of performance, considering its low voltage drop and high current capability. With this configuration, the slight leakage current through the FET's intrinsic body diode biases the FET on when the polarity is correct, while a reversed polarity blocks the current, shutting off the FET. The diode and resistor are only needed if the maximum gate-to-source voltage (VGS) is lower than the supply voltage.

After studying the different alternatives, it has been decided to use the Schottky diode, which is the simplest solution that meets our requirements: it is simple and low cost, and a Schottky diode has a lower voltage drop than a regular diode. Moreover, the usual packaging options for PNP transistors and FETs are more difficult to integrate with the drone.

(a) Protection circuit with PNP transistor (b) Protection circuit with P-channel FET (c) Protection circuit with diode

Figure 23: Alternatives for the protection circuit. (Source: Reverse Polarity Protection Circuits)

The Schottky diode chosen is the 1N5818, which has a maximum voltage drop of 0.55 V at 1 A and 0.33 V at 0.1 A. The maximum battery charging current is about 0.5 A, so the voltage drop will be between these two values, which is a reasonable drop.
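As a rough sanity check of the expected drop at the charging current, one can interpolate between the two datasheet points (an approximation only: a diode's forward drop actually grows roughly logarithmically with current, so the true value will lie somewhat above this linear estimate but still between the two bounds):

```python
def estimated_drop(i_amps, p1=(0.1, 0.33), p2=(1.0, 0.55)):
    """Linear interpolation of the 1N5818 forward voltage drop between
    two datasheet points (current in A, drop in V). Rough estimate only."""
    (i1, v1), (i2, v2) = p1, p2
    return v1 + (v2 - v1) * (i_amps - i1) / (i2 - i1)
```

At 0.5 A this yields roughly 0.43 V, comfortably inside the 0.33–0.55 V range stated above.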

In order to integrate this diode with the drone, the diode anode has been soldered to a

metal pad and attached to one of the feet, while the cathode is attached to the VUSB

pin (pin 10 right in figure 6). As already explained, this pin is connected directly to the

charger input voltage.

7.3.2 System design and implementation

As already mentioned in chapter 4, the recharging system is based on metal pads installed on the platform and on the quadcopter feet. The overall system includes two metallic pads connected to a power supply (Vcc and ground), configured in constant voltage mode at 5.5 V with a maximum current of 1 A. On the drone side, metallic pads are connected to the ground pin and to the diode that is connected to the USB pin, as already mentioned.

In order to have the maximum possible separation between Vcc and ground, and thus avoid possible short circuits and have a greater margin for landing, the chosen feet are diagonally opposite (the front right and back left feet).

The idea is to charge the drone once it has landed on the pads, through the contact between the metallic pieces. Therefore, the drone should land so that the foot connected to the diode lands on the pad connected to Vcc, and the foot connected to ground lands on the ground pad.

Regarding the material used for the pads on the station, copper has been chosen due to its good conductivity. In a first stage of the project, aluminium pads were used since they were available; however, this material did not work because the drone's low weight did not provide enough contact.

The pad design is shown in figure 24.

Figure 24: Landing pads design


The separation between the Crazyflie 2.0 feet is 7.4 cm (blue square), so these dimensions limit the pad size. The Vcc and GND distribution makes it possible to tolerate more error in one of the axes, since the pads can be rectangular either along the X axis (vertically) or along the Y axis (horizontally). Although both options could be considered, it has been decided to make the rectangles wider than they are long. The reason is to allow more error in the Y axis, because it has been checked experimentally that the drone tends to move more along the Y axis than the X axis when landing, due to the fact that the dynamics change close to the floor and the wall in front of the drone also affects the flight. The pads are separated by 1 cm and thus, considering the feet separation, the maximum error allowed is ±3.2 cm in the X axis. The design tolerates an error of ±8.4 cm in the Y axis. With these dimensions a good landing performance can be achieved.
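The ±3.2 cm tolerance in the X axis follows directly from the geometry; as a worked check:

```python
# With the feet 7.4 cm apart and the pads separated by a 1 cm gap, a foot
# sits 3.7 cm from the drone center while the pad edge is 0.5 cm from the
# platform center, so the drone may be displaced by up to 3.7 - 0.5 = 3.2 cm
# before a foot crosses onto the gap (or the wrong pad).
feet_separation_cm = 7.4
pad_gap_cm = 1.0
max_error_x_cm = (feet_separation_cm - pad_gap_cm) / 2  # = 3.2 cm
```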

The pads have been installed at a distance of 40 cm from the wall where the tag is attached, so that the marker appears in the image frame obtained by the camera at all times. Considering that the maximum distance at which the marker can be correctly detected is around 1.4 - 1.5 m, the maximum distance to the landing point is around 1 m.

The overall system is shown in figure 25, where the drone sits on the pads, charging its battery.

Figure 25: Overall landing and recharging platform


8 Evaluation and testing

8.1 Battery life characterization

According to the Crazyflie 2.0 specifications, the flight time is around 7 minutes; however, it is necessary to characterize the battery life with the camera installed on the drone. This is also important to decide at what battery level the approach to the landing station and the landing maneuver should start.

To test this, the battery voltage has been measured while the drone was continuously flying. The voltage measured over time has been plotted in both situations, with and without the camera, for comparison. Figure 26a shows the voltage over time without the camera on board, while figure 26b shows the voltage with the camera installed on the quadcopter and working.

To measure the flight time, the time at which the voltage decreases abruptly (approximately below 3.1 V) has been considered. Under this condition, the figures show that the flight time for the original Crazyflie (without the camera) is around 425 seconds (7.1 minutes), which coincides with the expected value. The flight time with the camera decreases considerably compared to the 7 minutes of the original version: specifically, the duration is about 275 seconds (4.6 minutes). However, almost 5 minutes of flight is still a good duration, considering that the camera has a high power consumption compared to the Crazyflie 2.0 power capability and that it increases the payload considerably.

Given these results, it has been decided to start the landing maneuver when the battery voltage drops below 3.3 V, which corresponds to an approximate flight time of 3.75 minutes before starting the approach to the recharging station. This value provides a safe margin, leaving enough time for landing. Since the maximum distance to the landing point is around 1 m, as already said, the approach will not be very time-consuming and most of the available time can be spent on the landing procedure.


(a) Battery life without camera on board

(b) Battery life with camera on board

Figure 26: Battery life characterization with and without camera


8.2 Detection performance

In order to evaluate the marker detection performance, experiments have been carried out to check the maximum distance at which the marker can be detected, depending on the size of the printed marker and its number of bits. The results are shown in table 3.

Marker size  Number of bits  Maximum distance
11 cm        6x6             0.9 - 1 m
11.8 cm      4x4             1.3 - 1.4 m
14.7 cm      4x4             1.5 - 1.6 m

Table 3: Marker detection performance

As already mentioned, these results show that detection works better with markers with fewer bits: for similar marker sizes (11 cm and 11.8 cm), the 4x4 marker is detected from farther away than the 6x6 marker.

Based on these conclusions, the 4x4 marker of 14.7 cm has been chosen, which can be detected from approximately 1.5 m. It is worth noting that, although the marker can be detected from this distance, detection is also limited by where the marker appears in the image, which depends on the camera rotation with respect to the tag. More detailed evaluations have therefore concluded that, although the marker can be detected from 1.5 - 1.6 m, for continuous detection the maximum distance between the tag and the camera should be around 1.45 m.

Apart from these tests to estimate the maximum distance, other evaluations have been done to obtain the limits for detecting the marker in different situations. Specifically, the dead angle has been obtained, as well as the detection limit due to marker deformation.

In order to obtain the dead angle, the procedure followed is to measure the maximum distance from the drone to the marker at which the marker can still be detected, that is, until the marker leaves the field of view. This measurement is always done under the condition that there is no rotation of the drone with respect to the marker (around the drone's Z axis), that is, the yaw is always 0 degrees. These measurements have been taken at several points (different distances), so that the angle can be obtained by computing the average of the different results. Since the distance from the drone to the marker along the X and Y axes is known, the angle can be computed easily. The results are collected in Table 4.


X coordinate (m)   Y coordinate (m)   Angle (°)
0.37               0.42               48.65
0.57               0.69               50.44
0.83               0.93               48.62
1.07               1.44               53.39

Table 4: Dead angle estimation

From the results in Table 4 it can be concluded that the dead angle for detecting the marker is approximately 50°, obtained as the mean of the different measured angles.
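As an illustration (a sketch, not code from the thesis), the computation described above, recovering each angle from the known X and Y offsets and averaging over the measurement points, can be written as:

```python
import math

def dead_angle_deg(x, y):
    """Angle between the camera axis and the marker, from known X/Y offsets."""
    return math.degrees(math.atan2(y, x))

# Measurement pairs (X, Y) in metres, taken from Table 4
samples = [(0.37, 0.42), (0.57, 0.69), (0.83, 0.93), (1.07, 1.44)]
angles = [dead_angle_deg(x, y) for x, y in samples]
mean_angle = sum(angles) / len(angles)  # close to the reported 50 degrees
```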

There is another type of limitation that occurs when the drone is facing towards the center of the marker, due to deformation of the marker in the image. This deformation is caused by the rotation of the drone (and thus the camera) with respect to the marker. To better illustrate this problem, Figure 27 shows the marker appearing deformed. In this image the marker is at the limit of being detected, meaning that with an even slightly greater rotation of the drone the marker would no longer be detected. This maximum angle of rotation of the drone with respect to the marker has also been obtained: for a reasonable distance, close enough to the marker (0.54 m along the X axis and 0.04 m along the Y axis), the angle obtained is 70°.

Figure 27: Marker deformation problem


8.3 Landing performance

After the whole system has been designed and implemented, landing tests should be carried out to numerically evaluate the landing performance.

It has been decided to carry out several landing maneuvers starting from different points, all of them between 1 and 1.5 meters from the marker of the landing station. The rate of correct landings is then measured, as well as the landing time. A landing is counted as correct if the drone lands and starts charging because its feet are over the pads. The time is measured as the whole duration since the drone's take-off. The results are collected in Table 5.

Landing attempt   Correct (1=yes, 0=no)   Time (s)
1                 0                       19
2                 1                       47
3                 1                       25
4                 0                       26
5                 1                       20
6                 1                       23
7                 1                       25
8                 1                       20
9                 1                       21
10                0                       19
11                1                       24
12                1                       20
13                1                       24
14                1                       22
15                1                       21
16                1                       23
17                1                       17
18                1                       20
19                0                       23
20                1                       25

Table 5: Landing performance

From this table it can be derived that the rate of correct landings is 80% and the average landing time is 23.2 seconds.
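For reference, these aggregate figures follow directly from the data in Table 5; a minimal script to reproduce them:

```python
# Each pair is (correct: 1 = successful landing, 0 = failed, time in seconds),
# transcribed from Table 5 in attempt order.
results = [(0, 19), (1, 47), (1, 25), (0, 26), (1, 20),
           (1, 23), (1, 25), (1, 20), (1, 21), (0, 19),
           (1, 24), (1, 20), (1, 24), (1, 22), (1, 21),
           (1, 23), (1, 17), (1, 20), (0, 23), (1, 25)]

success_rate = sum(ok for ok, _ in results) / len(results)  # 16/20 = 0.80
mean_time = sum(t for _, t in results) / len(results)       # 23.2 s
```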


9 Conclusions and future work lines

9.1 Conclusions

In this project a controller for an indoor nano drone, the Crazyflie 2.0, has been designed and implemented, with the goal of controlling the drone towards a landing platform and accurately landing on it. The aim is to solve the autonomous recharging problem for indoor drones, since battery life limits the potential applications of this type of drone.

Before starting to design and develop the project, a research study was carried out, mainly

focused on autonomous landing. The results obtained from this study have inspired this

project.

A camera has been integrated with the Crazyflie 2.0 in order to detect the landing station and control the drone with respect to the station's position. The project can therefore be divided into three main parts: the visual system, the drone controller, and the landing and charging station.

A visual system has been designed and implemented for detecting the landing station.

For this purpose, the ArUco library has been used. An ArUco marker has been used to

identify the station, and functions from this library have been used in order to estimate

the camera pose, that is, the distance to the marker and the camera orientation with

respect to it. This has been executed on a computer which receives the video from the

camera on the drone.

A controller for the nano drone has been designed and implemented. To facilitate control and improve stabilization, the Crazyflie 2.0 Flow deck has been used. This deck uses a downward-pointing camera to apply an optical flow algorithm, thus stabilizing the drone as well as controlling its velocity and attitude. In addition to this on-board control, four decoupled PID controllers have been implemented off-board to control the velocity of the drone along the X and Y axes, its height and its yaw rate. This off-board control is also executed on the computer, which communicates with the drone through an off-the-shelf ROS driver designed for the Crazyflie.
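As an illustrative sketch of this structure (gains, names and setpoints below are placeholders, not the values used in the thesis), each decoupled off-board loop could follow a standard discrete PID form:

```python
class PID:
    """Simple discrete PID controller; one independent instance per axis."""

    def __init__(self, kp, ki, kd, setpoint=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement, dt):
        """Return the control output for one sampling step of length dt."""
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# One loop per decoupled quantity: X velocity, Y velocity, height, yaw rate
# (placeholder gains for illustration only)
pids = {name: PID(kp=1.0, ki=0.1, kd=0.05) for name in ("vx", "vy", "z", "yaw_rate")}
```

At each camera update, the pose estimated by the visual system would feed the measurements, and the four outputs would be sent to the drone as velocity and yaw-rate commands.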

A landing and charging station has been designed and built. Two copper pads (Vcc and ground) have been used as the landing platform, connected to a power supply. Metal contacts have also been attached to two of the Crazyflie 2.0's feet. The drone lands on these copper pads and, thanks to the contact, the battery starts charging. The copper platforms have been dimensioned to allow a 3.2 cm error along the X axis and an 8.4 cm error along the Y axis.
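These pad dimensions translate into a simple acceptance test on the final position error; a hypothetical helper (names are illustrative, only the tolerance values come from the text):

```python
# Maximum landing-position error still placing the feet over the pads,
# as derived from the copper-pad dimensions above.
X_TOL = 0.032  # metres, allowed error along the X axis
Y_TOL = 0.084  # metres, allowed error along the Y axis

def within_pad_tolerance(err_x, err_y):
    """True if a landing with these position errors still starts charging."""
    return abs(err_x) <= X_TOL and abs(err_y) <= Y_TOL
```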

Finally, some tests have been carried out to evaluate the system. The battery life has

been characterized, so that the maximum flight time and the battery voltage at which

the landing process should start have been obtained. The landing performance has been

measured too, by carrying out several landing maneuvers, counting the successful landings

and measuring the total time of the process.

Therefore, in this thesis a complete autonomous recharging system for an indoor nano drone has been designed and implemented, including the ability to autonomously fly towards the charging station, accurately land on it and recharge its battery. The use of a nano drone is one of the main differences with respect to other existing solutions, and it has great potential, since this kind of drone can be used in cases where normal-size drones cannot: for instance, they are safer to use in tasks that involve being close to people.

9.2 Future work lines

Although this thesis presents a complete system for autonomous recharging of a nano drone, several future lines of work could complete the project or extend its functionality.

As already explained in chapter 4, the project can be extended to bigger drones that also run on batteries but have more computational power, so that the implementation can be done fully on-board. This would avoid the communication with a ground station, yielding a fully autonomous drone. Although this would imply important changes, since the ROS driver and libraries used here are specific to the Crazyflie 2.0, the same visual system and recharging station could be used, and the logic of the controller would remain the same.

Another future line of work is to extend the system with functionality to detect when the battery charging process is completed, so that the drone can take off again and start a new flight.


Finally, it would be very interesting to implement a system for detecting the landing station from farther away. This would broaden the drone's applications, because right now it is limited to operating within approximately 1.5 m in order to detect the landing station. One option would be to detect the landing station using radio frequency. This type of communication has a long range, so the drone could perform tasks far away from the recharging station. Then, once the drone is close enough to the landing station, the visual system already implemented could be used to accurately land on the platform.


References

[1] P. Castillo-García, L. E. M. Hernández, and P. G. Gil, "Chapter 1 - State-of-the-art," in Indoor Navigation Strategies for Aerial Autonomous Systems, P. Castillo-García, L. E. M. Hernández, and P. G. Gil, Eds. Butterworth-Heinemann, 2017, pp. 3-30. [Online]. Available: http://www.sciencedirect.com/science/article/pii/B9780128051894000020

[2] T. G. Carreira, “Quadcopter automatic landing on a docking station,” 2013.

[3] S. Garrido-Jurado, R. Muñoz-Salinas, F. Madrid-Cuevas, and M. Marín-Jiménez, "Automatic generation and detection of highly reliable fiducial markers under occlusion," Pattern Recognition, vol. 47, no. 6, pp. 2280-2292, 2014. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0031320314000235

[4] R. Bartak, A. Hrasko, and D. Obdrzalek, “A controller for autonomous landing of

ar.drone,” in The 26th Chinese Control and Decision Conference (2014 CCDC), May

2014, pp. 329–334.

[5] R. Bartak, A. Hrasko, and D. Obdrzalek, “On autonomous landing of ar.drone:

Hands-on experience,” in FLAIRS Conference, 2014.

[6] F. Cocchioni, A. Mancini, and S. Longhi, “Autonomous navigation, landing and

recharge of a quadrotor using artificial vision,” in 2014 International Conference on

Unmanned Aircraft Systems (ICUAS), May 2014, pp. 418–429.

[7] Y. Bi and H. Duan, “Implementation of autonomous visual tracking and

landing for a low-cost quadrotor,” Optik - International Journal for Light and

Electron Optics, vol. 124, no. 18, pp. 3296 – 3300, 2013. [Online]. Available:

http://www.sciencedirect.com/science/article/pii/S0030402612008698

[8] X. Zhang, B. Xian, B. Zhao, and Y. Zhang, “Autonomous flight control of a nano

quadrotor helicopter in a gps-denied environment using on-board vision,” IEEE

Transactions on Industrial Electronics, vol. 62, no. 10, pp. 6392–6403, Oct 2015.

[9] O. M. W. Dunkley, “Visual inertial control of a nano-quadrotor,” Master’s thesis,

Technical University Munich, Germany, Sept. 2014.


[10] S. Shen, Y. Mulgaonkar, N. Michael, and V. Kumar, “Vision-based state estimation

for autonomous rotorcraft mavs in complex environments,” in 2013 IEEE Interna-

tional Conference on Robotics and Automation, May 2013, pp. 1758–1764.

[11] A. Briod, J.-C. Zufferey, and D. Floreano, "Optic-flow based control of a 46 g quadrotor," 2013.

[12] J. Singh, “Study and development of target following capability on nano-size un-

manned aerial vehicles,” Master’s thesis, Swiss Federal Institute of Technology

Zurich, March 2017.

[13] B. Muresan and S. Sadeghi Esfahlani, “Autonomous flight and real-time tracking

of unmanned aerial vehicle,” in Intelligent Computing, K. Arai, S. Kapoor, and

R. Bhatia, Eds. Cham: Springer International Publishing, 2019, pp. 945–956.

[14] Bitcraze, "Bitcraze wiki: Crazyflie 2.0." [Online]. Available: https://wiki.bitcraze.io/projects:crazyflie2:index

[15] W. Hoenig, C. Milanes, L. Scaria, T. Phan, M. Bolas, and N. Ayanian, “Mixed reality

for robotics,” in IEEE/RSJ Intl Conf. Intelligent Robots and Systems, Hamburg,

Germany, Sept 2015, pp. 5382 – 5387.

[16] W. Hönig and N. Ayanian, "Flying multiple UAVs using ROS," Studies in Computational Intelligence, pp. 83-118, May 2017.

[17] C. Forster, M. Pizzoli, and D. Scaramuzza, “Svo: Fast semi-direct monocular visual

odometry,” in 2014 IEEE International Conference on Robotics and Automation

(ICRA), May 2014, pp. 15–22.

[18] N. Instruments, “Pid theory explained,” Nov 2018. [Online]. Available:

http://www.ni.com/white-paper/3782/en/

[19] K. Heong Ang, G. Chong, and Y. Li, “Pid control system analysis, design, and

technology,” Control Systems Technology, IEEE Transactions on, vol. 13, pp. 559 –

576, 08 2005.

[20] T. Instruments, "bq2407x 1.5-A USB-friendly Li-ion battery charger and power-path management IC," 2018. [Online]. Available: http://www.ti.com/lit/ds/symlink/bq24075.pdf


[21] S. Daniels, “Reverse polarity protection circuits,” Jul 2012. [Online]. Available:

https://provideyourown.com/2012/reverse-polarity-protection-circuits/

[22] F. J. Romero-Ramirez, R. Muñoz-Salinas, and R. Medina-Carnicer, "Speeded up detection of squared fiducial markers," Image and Vision Computing, vol. 76, pp. 38-47, 2018. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0262885618300799

[23] O. Dunkley, J. Engel, J. Sturm, and D. Cremers, “Visual-inertial navigation for a

camera-equipped 25g nano-quadrotor,” in IROS2014 Aerial Open Source Robotics

Workshop, 2014.

[24] P. Jaiswal, "Demystifying drone dynamics," 2018. [Online]. Available: https://www.bitcraze.io/2018/11/demystifying-drone-dynamics/

[25] K. Richardsson, "Optical flow," 2017. [Online]. Available: https://www.bitcraze.io/2017/11/optical-flow/

[26] B. Rao, A. G. Gopi, and R. Maione, “The societal impact of commercial drones,”

Technology in Society, vol. 45, pp. 83–90, 05 2016.

[27] R. Clarke, "Understanding the drone epidemic," Computer Law & Security Review, vol. 30, pp. 230-246, June 2014.


A Annex I: Ethical, economic, social and environmental aspects

A.1 Introduction

It is undeniable that emerging technologies have economic, social, environmental and even ethical impacts [26]. Engineers have an important role in today's society, and thus engineering projects should be economically, socially and environmentally sustainable. This annex covers the impacts that drones have in different fields and, specifically, the impacts that possible applications of the system developed in this thesis can have, from an economic, social, environmental and ethical point of view.

In recent years, the use of drones has increased considerably in many different areas, such as industry and the military. The twenty-first century is seeing a rapid proliferation of aerial vehicles that do not have a human controller on board [27]. They have multiple applications, for instance surveillance, transportation, delivery or search and rescue tasks. It is therefore important to highlight the social relevance of drones, since they can be used to improve our daily lives but also to help in important emergency situations, such as search and rescue tasks.

Nevertheless, as already explained, the potential of many drones is limited by the flight time due to the need for small batteries, which in many cases allow a flight of only a few minutes. Because of this, a recharging process needs to be performed frequently. Usually, these recharging processes involve direct human intervention, thus interrupting the autonomous operation of drones and limiting their use and applications, since humans usually cannot reach all the places that drones can.

In order to keep improving people's lives and to broaden this social impact, multiple research topics in drones remain open. Indeed, one of them is focused on autonomous recharging systems. By automating the battery recharging process, no human intervention would be needed, and thus the overall mission time of the drone could be significantly increased, making it possible to extend drone applications to reach multiple places without worrying about battery autonomy. This has been the main goal of the thesis: to design and implement a controller for a nano drone to perform autonomous recharging.

Moreover, from an economic point of view, the project has been carried out trying to maintain a low cost. In fact, the Crazyflie 2.0 used in the thesis is a low-cost nano drone. This has also been taken into account when choosing the different devices and materials required.

Apart from this, the use of drones in general raises a series of considerations and ethical issues. For instance, surveillance drones in civil applications have an impact on privacy and other civil liberties.

A.2 Description of relevant impacts related to the project

The capabilities and applications of small drones have greatly increased, while their manufacturing costs have been reduced. These small drones are therefore proliferating; the increase in market size has attracted higher investment, and a leap in the functionality-to-cost ratio has taken place. These facts multiply the potential benefits of drones, but also aggravate the risks [27].

Drone applications in general have a positive social impact, since drones are used for instance as delivery vehicles, for monitoring crops or fires, in sports broadcasting or in journalism. There are even drones for emergency situations, carrying first aid kits or even a defibrillator.

This project has been carried out in a university context, within a research group. The system has not been implemented for a particular application, since the goal is to design and implement a controller for a drone in order to obtain an autonomous recharging system. However, although no specific application has been implemented for the drone used, the system developed offers multiple applications and increases the possibilities of using this kind of drone in important tasks.

One possible application of the system is in rescue work. It could be further extended for use in rescue missions to identify people in collapsed buildings. The social benefits of this application are obvious, since the system would definitely increase the quality of rescue work, improving the chances of finding people, and doing so faster. This can also have an economic impact, since it affects the overall mission cost.

Following the same logic, the system could be used in search tasks, for instance to locate people with special needs, such as people with Alzheimer's disease. The main benefit here is also of a social nature, since it improves not only the lives of these people but also their families' lives.

Another application would be to use the drone for indoor surveillance. This could be used by security companies in cases where important and comprehensive surveillance is needed for certain companies or buildings. This would imply economic benefits for these security companies, but also a greater guarantee of protection for customers.

However, all these applications also require special attention to privacy policies, since they involve recording images in which people appear. This has both a social and an ethical impact, as analyzed in the next section.

A.3 Detailed analysis of some of the main impacts

Nowadays, the use of drones in civilian airspace has sparked debate about the challenges to basic rights such as safety and security. The main criticism of flying drones over public spaces is that mistakes, however small, could result in crashes that threaten the health, well-being and property of the public [26].

Considering the civil sphere, the wide range of drone applications, and their relatively small impact if compromised (compared, for instance, to military drones), have limited and delayed regulation of the commercial and private environment. This is also affected by the open-source development of the technology, which makes it difficult to keep track of changes. Thus, technology has already far exceeded the regulatory process. This has implications for the widespread acceptance and adoption of drones as a viable platform.

Regarding the different applications of our system explained in the previous section, and from a social and ethical point of view, personal injury and privacy invasion are two of the most important issues.

Since the drone uses a camera and can be used around people, it is important to comply with privacy laws when people appear in the images. Moreover, the controller needs to be accurate to avoid crashes with objects and, more importantly, with people. Currently our drone does not include this functionality, although Bitcraze offers a deck for the Crazyflie 2.0 that, together with the Flow deck already used in this project, can be used to avoid obstacles in any direction. However, although this point needs to be covered, it is worth noting that the Crazyflie 2.0 is a nano quadcopter, which means that it is very small, and thus possible crashes would not cause serious harm to people.

Furthermore, the system's applications open up new economic opportunities in different sectors. For instance, security and health companies can use this type of system for indoor surveillance or locating people, respectively, as already mentioned. This would expand and improve their service offering and thus increase their income.

The current implementation of the system uses a power supply connected to the copper pads to charge the battery. This means that the power supply needs to be always on and supplying the required power if the system is to recharge autonomously without human intervention. Although this is a valid solution, it is not the optimal one from an environmental and economic point of view. However, as already said, the main goal has been the design and implementation of the controller, and thus the main effort has been put into this part of the system. Moreover, the scope of the project is mainly academic, and thus the system designed for the power supply has no great impact. But if the system is to be implemented for specific tasks or applications, this power supply alternative should be revised to avoid energy consumption when the drone is not charging.

A.4 Conclusions

The introduction of drones in the private and commercial sectors, and specifically the system implemented in this project, offers numerous possibilities and applications to improve people's quality of life, thus having a positive social impact. In addition, it offers new economic and business opportunities. However, an ethical regulatory framework that protects the privacy of users and civilians in general is necessary. The process of determining these regulations is currently taking place, although it is proving to be slow.


B Annex II: Economic budget

Figure 28: Economic budget of the project


TRITA-EECS-EX-2019:32

www.kth.se