Building an Aerial-Ground Robotics System for Precision Farming

Alberto Pretto1,∗, Stéphanie Aravecchia2, Wolfram Burgard3, Nived Chebrolu4, Christian Dornhege3, Tillmann Falck5, Freya Fleckenstein3, Alessandra Fontenla6, Marco Imperoli1, Raghav Khanna7, Frank Liebisch8, Philipp Lottes4, Andres Milioto4, Daniele Nardi1, Sandro Nardi6, Johannes Pfeifer8, Marija Popović7, Ciro Potena1, Cédric Pradalier2, Elisa Rothacker-Feder5, Inkyu Sa7, Alexander Schaefer3, Roland Siegwart7, Cyrill Stachniss4, Achim Walter8, Wera Winterhalter3, Xiaolong Wu2 and Juan Nieto7,†

Abstract—The application of autonomous robots in agriculture is gaining more and more popularity thanks to the high impact it may have on food security, sustainability, resource use efficiency, reduction of chemical treatments, minimization of human effort, and maximization of yield. The Flourish research project faced this challenge by developing an adaptable robotic solution for precision farming that combines the aerial survey capabilities of small autonomous unmanned aerial vehicles (UAVs) with flexible targeted intervention performed by multi-purpose agricultural unmanned ground vehicles (UGVs). This paper presents a comprehensive overview of the scientific and technological advances and outcomes obtained in the Flourish project. We introduce the multi-spectral perception algorithms and the aerial and ground-based systems developed to monitor crop density, weed pressure, and crop nitrogen nutrition status, and to accurately classify and locate weeds. We then introduce the navigation and mapping systems that deal with the specificity of the employed robots and of the agricultural environment, highlighting the collaborative modules that enable the UAVs and UGVs to collect and share information in a unified environment model. We finally present the ground intervention hardware, software solutions, and interfaces we implemented and tested in different field conditions and with different crops. We describe a real use case in which a UAV collaborates with a UGV to monitor the field and to perform selective spraying treatments in a fully autonomous way.

Index Terms—Robotics in Agriculture and Forestry, Multi-Robot Systems, Autonomous Vehicle Navigation, Mapping, Computer Vision for Automation

I. INTRODUCTION

Cooperation between aerial and ground robots undoubtedly offers benefits to many applications, thanks to the complementarity of the characteristics of these robots. This is especially useful in robotic systems applied to

This work was supported by the EC under Grant H2020-ICT-644227-Flourish and by the Swiss State Secretariat for Education, Research and Innovation under contract number 15.0029.

Affiliations at the time of completion of the work: 1Department of Computer, Control, and Management Engineering, Sapienza University of Rome, Rome, Italy. 2International Research Lab 2958 Georgia Tech-CNRS, Metz, France. 3Department of Computer Science, University of Freiburg, Germany. Wolfram Burgard is also with the Toyota Research Institute, Los Altos, USA. 4Department of Photogrammetry, University of Bonn, Germany. 5Robert Bosch GmbH, Corporate Research, Renningen, Germany. 6Agency for Agro-food Sector Services of the Marche Region (ASSAM), Osimo, Italy. 7Autonomous Systems Lab., Department of Mechanical and Process Engineering, ETH Zurich, Zürich, Switzerland. 8Department of Environmental Systems Science, Institute of Agricultural Sciences, ETH Zurich, Zürich, Switzerland. ∗Corresponding author. †Flourish project coordinator.

Fig. 1. A conceptual overview of the Flourish project. A UAV continuously surveys a field over the growing season (top left), collecting data about crop density and weed pressure (top right) and coordinating and sharing information with a UGV (bottom left) that is used for targeted intervention and data analysis (bottom right). The gathered and merged information is then delivered to farm operators for high-level decision making.

precision agriculture, where the areas of interest are usually vast. A UAV allows rapid inspections of large areas and can then share information, such as the weed distribution or crop nutrition status indicators, with an agricultural UGV. The ground robot can operate for long periods of time, carry high payloads, and perform targeted actions, such as selective weed treatment or fertilizer application, on the areas selected by the UAV. One of the main objectives of the Flourish project [1] was to leverage this complementarity by developing an autonomous robotic system for precision agriculture able to achieve high yields while minimizing or avoiding the application of agro-chemicals to the field.

This paper provides an overview of the scientific and technical outcomes obtained within the Flourish project, covering research areas ranging from robot navigation, mapping and coordination, up to robot vision, multi-spectral data analysis, and phenotyping.
To develop the experimental robots, we built upon existing state-of-the-art aerial and ground platforms, extending them in various aspects (Sec. II) with the installation of a large number of specific sensors, built-in computing power, and modules for weed detection, tracking, and removal. We proposed several novel perception methods and algorithms designed to

automatically perform cyclical and exhaustive in-field measurements and interpretations (Sec. III), such as the inference of weed density from multi-spectral images, the mapping and classification of crops and weeds, and the computation of plant health indicators. Robot positioning and cooperative environment modeling (Sec. IV) have been addressed by proposing novel algorithms that leverage specific field representations, or by building upon existing methods tailored to the specificity of the environment, e.g., by integrating multi-spectral imaging in the UAV mapping algorithms and by exploiting specific environment priors in the UGV positioning algorithms and the temporal map registration algorithms. We proposed a novel mission planner that allows the UAV to adaptively map large areas while respecting battery constraints, and we addressed safe UGV navigation in a cultivated field by integrating accurate relative localization, crop row detection, and an ad-hoc controller (Sec. V); the mission coordination has been ensured by a lightweight task scheduler and communication framework. We finally present a use case of ground intervention in the field, by means of accurate weed tracking for precision tool placement, and the development of a selective spraying and mechanical weed treatment module that is suitable for commercialization (Sec. VI). Another important outcome of the project is the large amount of open source software modules released and datasets generated, which we hope the community will benefit from (Sec. VII).

A. Robotics in Agriculture: An Overview of Recent Projects

The application of robots in agricultural contexts is commonly perceived as an enabling technology to improve the potential for intervention and farm field monitoring. Although this technology is still in a preliminary assessment phase, with many possible uses yet to be explored, many research projects and start-ups have been funded in recent years.
A project similar to Flourish is RHEA [2], which aimed at reducing the use of agricultural chemical inputs by 75%, improving crop quality, health and safety for humans, and reducing production costs by means of sustainable crop management using a fleet of small, heterogeneous robots (ground and aerial) equipped with advanced sensors, enhanced end-effectors and improved decision control algorithms. Another similar project is PANTHEON [3], whose aim is to develop a Supervisory Control And Data Acquisition (SCADA) system for a team of aerial and ground robots performing precision farming in hazelnut orchards. The information is collected in a central unit that performs automatic feedback actions (e.g., regulating the irrigation system) and supports the decisions of the agronomists.
Other recent projects dealing with the development of autonomous ground platforms are the GRAPE [4] and the SWEEPER [5] projects. The former aims at enabling agricultural service companies and equipment providers to develop vineyard robots that can increase the cost-effectiveness of their products with respect to traditional practices. In particular, the project addresses the market of instruments for biological control by developing the tools required to execute (semi-)autonomous vineyard monitoring and farming tasks with UGVs.

This reduces the environmental impact with respect to traditional chemical control. Similarly, the main objective of SWEEPER is to bring the first generation of greenhouse harvesting robots onto the market.
In contrast to the projects above, SAGA [6] is a collaborative research project that aims at demonstrating the application of swarm robotics principles to the agricultural domain. Specifically, it targets a decentralised monitoring/mapping scenario and implements a use case for the detection and mapping of weeds in a field by a group of small UAVs. Other interesting applications of UAVs in the agricultural context are 3D tree reconstruction and canopy estimation [7], fruit counting [8], yield estimation [9], and light-weight devices for automated monitoring [10].
On the industry side, several start-ups have been launched, and many more are expected to be funded. The major services provided are UGVs for weed removal [11], [12], [13], and in-season data analytics or early pest and disease detection through aerial or satellite imagery.

II. EXPERIMENTAL PLATFORMS

The Flourish project exploited existing state-of-the-art farming and aerial robots, extending them in various aspects to improve both autonomous navigation and environment modeling capabilities, and to enable them to perform robust plant classification and/or selective weed removal operations.

A. Multirotor Used in Flourish

The main UAV platform used in the project is a fully sensorized DJI Matrice 100 (Fig. 2, left). The platform includes an Intel NUC i7 computer for on-board processing, an NVIDIA TX2 GPU for real-time weed detection in the field, a GPS module and a visual-inertial (VI) system for egomotion estimation. We employ a VI sensor developed at the Autonomous Systems Lab [14]; we also tested and integrated a commercially available sensor, the Intel ZR300, for wider usage.

B. Ground Vehicle

1) The BoniRob Farming Robot: The Bosch Deepfield Robotics BoniRob (Fig. 2, right) is a flexible research platform for agricultural robotics. Its four wheels can be independently rotated around the vertical axis, resulting in omnidirectional driving capabilities, and are mounted at the end of lever arms, letting the robot adjust its track width from 1 m to 2 m. In order to be able to execute its complex tasks, the BoniRob carries a multitude of sensors: GPS, RTK-GPS, a push-broom lidar, two omnidirectional lidars, RGB cameras, a VI system, hyperspectral cameras, wheel odometers, etc. These sensors are directly connected to a set of on-board PCs that run the Robot Operating System (ROS) and communicate through an internal network. The BoniRob's batteries are complemented by a backup generator that facilitates long-term field application.

Fig. 2. The two main robots used in the experiments and demonstrations: a DJI Matrice 100 UAV multi-rotor performing an autonomous flight over a sugar beet field (top-left); the UAV with the installed sensors highlighted (bottom-left) and the Bosch BoniRob farming UGV (right).

2) Weed Intervention Module: Supporting the target use case of selective weed intervention, the robot is equipped with an extension module, the Weed Intervention Module (Fig. 13). This module consists of a perception system for weed classification, multi-modal actuation systems and their supporting aggregates.

The main design objectives of this unit are high weed throughput, precise treatment, and flexibility. The weeds are treated mechanically with two ranks of stampers or chemically with one rank of sprayers. The weeds are detected and tracked in real-time with three cameras with non-overlapping fields of view (FoV).

The perception system of the weed unit consists of three ground-facing global shutter cameras and three narrow-beam sonars. To protect this perception system from natural light sources, the weed control unit is covered, and artificial lights have been installed to control the illumination. The first camera (RGB+NIR) is used for weed detection and tracking, while the other two RGB cameras are used for tracking only. The sonars help recover the absolute scale of the camera images. Further details about the weed intervention module can be found in Sec. VI-A.

III. DATA ANALYSIS AND INTERPRETATION IN A FARMING SCENARIO

Precision farming applications aim to improve farm productivity while reducing the usage of fertilizers, herbicides and pesticides. To meet these challenges, continuous and exhaustive in-field measurements of plant health indicators and weed density are required. We addressed both these requirements from a robotic point of view, by proposing a set

Fig. 3. Example results obtained by our plant classification systems. Left: UGV-based semantic segmentation into crop, weed, and grass-weed. Middle: stem detection providing accurate locations of crops and weeds. Right: UAV-based semantic segmentation.

of methods to accurately detect plants and to distinguish them as crops and weeds (Sec. III-A and III-B) and to automatically analyze the nitrogen status of crops from multi-spectral aerial images (Sec. III-C).

A. Crop and Weed Detection

A prerequisite for selective and plant-specific treatments is that farming robots need to be equipped with an effective plant classification system providing the robot with the information about where and when to trigger its actuators to perform the desired action in real time.

In the Flourish project, we focus on vision-based approaches for plant classification and use machine learning techniques to cope with the large variety of different crops and weeds as well as changing environmental conditions in an effective manner. Fig. 3 illustrates results obtained by our plant classification systems for both UGV and UAV platforms. The further distinction between weeds and grass-weeds allows our system to perform different treatments in a targeted manner depending on the type of weed. For example, local mechanical treatments

are most effective when applied to the stem location of the plants. In contrast, grass-like weeds can effectively be treated by spraying herbicides onto their leaf surfaces.

At the beginning of the Flourish project, we developed plant classification approaches based on handcrafted features and Random Forests [15], [16]. During the project, the classifiers evolved into lightweight Fully Convolutional Network (FCN) approaches, as these types of machine learning models are (i) easier to use, because no feature engineering is necessary, (ii) faster at run-time thanks to dedicated hardware, and (iii) provide superior performance and better generalization capability.

A key challenge is that the plant classification system must be able to generalize well to new and changing field environments. To effectively generalize to new conditions, we exploit geometric patterns that result from the fact that several crops are sown in rows. Within a field of row crops, the plants share a similar lattice distance along the row, whereas weeds appear randomly. In contrast to the visual cues, this geometric signal is much less affected by changes in the visual appearance. In Lottes et al. [17], we propose a semi-supervised online approach for the vision-based classification of crops and weeds that exploits additional information about the arrangement of the crops in order to adapt the visual classifier. In Lottes et al. [18], we present a novel FCN-based plant classification approach that operates on image sequences obtained along the crop rows, allowing the classifier to learn features describing the plant arrangement. We show that incorporating the arrangement information boosts the classification performance and the generalization capabilities of the plant classifiers.

In terms of UAVs, we deploy the same underlying lightweight FCN architecture on our UAV system [19]. Here, we use the FCN in a classical single-image fashion, as the larger footprint of the camera implicitly covers enough information about the plant arrangement. Through our crop and weed classification systems, we enable UGVs to perform plant-specific high-precision in-field treatments and transform UAVs into an efficient system for crop monitoring applications.
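To make the segmentation pipeline concrete, the following is a minimal sketch of an encoder-decoder FCN for per-pixel soil/crop/weed segmentation. It is an illustrative toy model only; the layer sizes, the class set, and the choice of PyTorch are assumptions of this sketch and do not reproduce the lightweight architecture of [18], [19].

```python
# Minimal encoder-decoder FCN sketch for 3-class (soil / crop / weed)
# segmentation. Illustrative only: NOT the Flourish architecture.
import torch
import torch.nn as nn

class TinyFCN(nn.Module):
    def __init__(self, in_channels=3, num_classes=3):
        super().__init__()
        # Encoder: two downsampling stages.
        self.enc = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
        )
        # Decoder: two upsampling stages back to the input resolution.
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(16, num_classes, 2, stride=2),
        )

    def forward(self, x):
        return self.dec(self.enc(x))  # per-pixel class scores

if __name__ == "__main__":
    net = TinyFCN()
    rgb = torch.rand(1, 3, 256, 256)      # dummy RGB image batch
    labels = net(rgb).argmax(dim=1)       # per-pixel soil/crop/weed labels
    print(labels.shape)                   # torch.Size([1, 256, 256])
```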

B. Automatic Synthetic Dataset Generation

To be effective, data-driven plant classification approaches usually require large annotated datasets, acquired across different plant growth stages and weather conditions. Creating such datasets, e.g., with pixel-level annotations, is an extremely time-consuming task. We address this problem by proposing an automatic, model-based dataset generation procedure [20] that generates large synthetic training datasets by randomizing the key features of the target environment (i.e., crop and weed species, soil, light conditions, . . . ). We model a leaf of the target plants (Fig. 4, top-left) by means of kinematic chains that span from the stem toward the leaves' principal veins, applied over RGB textures taken from real-world plant pictures. Ambient occlusions, normals and height maps are taken into account to improve the photorealism of the resulting rendered leaf. We then model the single plants with multi-layer radial distributions of leaves (Fig. 4, top center), where

Fig. 4. An overview of the automatic model-based dataset generation procedure. A generic kinematic model of a leaf (top-left) is used to generate a number of different leaf configurations and plants (top center); by sampling plants, spatial distributions and soils, it is possible to render realistic agricultural scenes with associated pixel-wise segmentation masks (top-right), e.g., to be used to train deep encoder-decoder segmentation networks (bottom).

the leaf and plant parameters are sampled over a set of ranges that depend on both the plant species and the growing stage. In a final step, we can render a virtually infinite number of realistic agricultural scenes by adding a random background soil and by sampling lighting conditions and plant placement; we also automatically generate the ground truth segmentation masks (Fig. 4, top-right). These pairs of synthetic images and pixel-wise labels can be used to train modern deep learning based image segmentation architectures (Fig. 4, bottom).

C. Multi-spectral N-Status Detection and Phenotyping

The fertilization status of a crop is an important farming parameter connected to the environmental footprint of agronomy. Well-fertilized crops usually produce optimal yield and quality, and their stands are more stress-resilient. Applying too little fertilizer diminishes yields, whereas surplus application increases the risk of nutrient losses to the environment and increases susceptibility to pests and diseases. Nitrogen application plays a prominent role in the management of most arable crops, because of the generally high fertilization demand and the very high mobility of nitrogen in the soil. Although the N-fertilizer (nitrogen fertilizer) demand of sugar beet is relatively low, the yield and quality of the harvested produce are strongly dependent on optimal N-management: too low an N-application limits tuber yield, while too high an N-application reduces the extractable sugar content in the tuber [21].

Therefore, it is important to apply N-fertilizer at the right time, rate and place during the vegetation period. These decisions can be supported by optical proximal or remote sensing tools making use of visible or non-visible parts of the spectral reflection of crop stands [22], [23]. The methodology used for N-status detection in Flourish is described in [24], [25]. Additionally, plant phenotyping, i.e., the image-based assessment of plant traits such as greenness, plant size and plant developmental stage [26], but also weed pressure, plays an increasingly prominent role in the development of sustainable agronomic practices in precision farming approaches [27], [28].

To validate the spectral and imaging methodology for sugar beets, randomized field trials were established in commercial sugar beet fields within the Flourish project, and different nitrogen input treatments were applied from 2015 to 2017. Aerial image spectroscopy was realized with a multi-spectral Gamaya VNIR 40 camera mounted on a Solo UAV from 3D Robotics. As a ground truth measure, spectral reflectance was recorded manually with an ASD Fieldspec 4 spectroradiometer. For the two spectral devices [24], a set of spectral indices was calculated and related to plant N-status, tuber yield and sugar content.

In our results, red edge based spectral indices, such as the simple ratio [22] and the normalized difference red edge ratio [29], successfully reflected the N-status in sugar beets for both spectroscopy methods (ground- and UAV-based), and can be used to calculate N-fertilizer application maps.
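For illustration, the two red-edge indices can be computed per pixel from reflectance maps as sketched below; the generic formulations and the band names are assumptions of this sketch and are not taken verbatim from [22], [29].

```python
# Hedged sketch: red-edge simple ratio (SR) and normalized difference red
# edge (NDRE) computed per pixel from reflectance maps.
import numpy as np

def red_edge_indices(nir, red_edge, eps=1e-6):
    """nir, red_edge: 2-D arrays of reflectance in the NIR and red-edge bands."""
    sr = nir / (red_edge + eps)                        # simple ratio
    ndre = (nir - red_edge) / (nir + red_edge + eps)   # norm. diff. red edge
    return sr, ndre

# Placeholder reflectance maps standing in for radiometrically corrected bands.
nir = np.random.rand(100, 100)
red_edge = np.random.rand(100, 100)
sr_map, ndre_map = red_edge_indices(nir, red_edge)
```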

IV. POSITIONING AND ENVIRONMENT MODELING

The ability to localize and build a model of the surrounding environment is an essential requirement to support the reliable navigation of an autonomous robot. These tasks are even more challenging in a farming scenario, where (a) the environment is mainly composed of repetitive patterns, with no distinctive landmarks, and (b) multi-spectral information should be included in the modeling process to support decision making for farm management. Moreover, in a multi-robot setup such as Flourish, the UAV and the UGV should be able to cooperatively build a shared model of the environment. This section presents the main contributions we proposed to localize and model cultivated fields with a UAV (Sec. IV-A) and a UGV (Sec. IV-B), and to fuse this information between robots (Sec. IV-C) and across time (Sec. IV-D).

A. UAV Localization and Mapping

The aim of the UAV perception system is the remote collection of high-resolution spatio-temporal multi-spectral maps of the field. In precision farming, this data is critical as it allows for mission planning before the UGV actually enters the field, thereby optimizing the time and location of ground intervention procedures, i.e., weed elimination or fertilizer application, without the risk of crop damage and soil compaction. The perception pipeline requires two main competencies: (1) motion estimation and precise localization within the field, and (2) multi-resolution multi-spectral aerial mapping based on the indicators needed to assess plant health.

A key challenge for vision-based localization on agricultural fields is the homogeneous appearance of the crops. On the other hand, the accuracy of GPS alone is inadequate for constructing high-resolution maps or defining paths for UGV intervention. To address this, we develop an on-board state estimation system which combines data from a synchronized VI sensor, a GPS sensor, the UAV IMU and, optionally, a downward-facing laser altimeter to estimate the 6 DoF pose. Fig. 5 (top) provides an overview of the major hardware and software components. The Robust Visual Inertial Odometry (ROVIO) [30] framework is used to produce a 6 DoF pose output for the sensor IMU based on the raw images and IMU

Fig. 5. Top: Block diagram of our UAV state estimation framework depicting the sensor suite and major software components. Bottom: Comparison of our VI-GPS fusion-based state estimation with raw GPS and ground truth.

data from the VI sensor. We then use the Multi-Sensor Fusion (MSF) framework [31] to combine the ROVIO output and the UAV IMU data and obtain 6 DoF state estimates, which are passed directly to our Model Predictive Controller (MPC) for trajectory tracking. To further improve accuracy and robustness, we integrate our system with MAPLAB [32], a VI framework with map maintenance and processing capabilities. On-field results using AscTec NEO and DJI Matrice 100 UAV platforms demonstrate high accuracy in state estimation compared to positions obtained with a Leica Geosystems Total Station as ground truth (Fig. 5, bottom).

Accurate high-resolution field map models are a key prerequisite for enabling robots in precision agriculture applications. To this end, we develop a UAV environmental modeling framework using the pose estimate from our localization system, and colour and multi-spectral camera information over multiple flights, to create a spatio-temporal-spectral field model. Fig. 6 shows an overview of our pipeline. Taking raw RGB and multi-spectral images and UAV poses as inputs, we radiometrically correct the spectral data to create a spatial field model in the form of a dense point cloud. For each point, the spectral reflectances in the wavelength bands observed by the multi-spectral camera are estimated and stored in a database. The field development over time can be viewed through layered orthomosaics generated from this data through a custom-built browser-based visualization module (Fig. 7),

Fig. 6. Block diagram of our environment modeling framework depicting the sensor suite and major software components (images courtesy of Intel®, Pix4D S.A. and Ximea).

ensuring portability between client computers. We use the higher-quality RGB camera images, recovering the geometry of the field at a high resolution, and use the relative position and orientation between the RGB and multi-spectral camera to estimate its spectral reflectance. Importantly, our reconstruction/reprojection strategy removes the need for a separate reconstruction step for each band and the following alignment step, which are commonly used, thus providing a high-fidelity, fast, and efficient spatio-temporal-spectral modeling system.

B. UGV Global Positioning and Mapping

Currently, most positioning systems used in commercial farming robots rely on high-end Real-Time Kinematic Global Positioning Systems (RTK-GPS), which, however, are not robust against base station signal loss or multipath interference, and cannot provide the full 6D pose (translation and rotation) of the vehicle. We tackle this problem by taking advantage of the heterogeneity of the UGV sensors (Fig. 2, right). We model the global localization problem as a 6D pose graph optimization problem (Fig. 8, [33]). The constraints between consecutive nodes (superscript X in Fig. 8) are represented by motion estimations provided by the UGV wheel odometry, local point-cloud registration, and a visual odometry (VO) front-end that

provides a full 6D ego-motion estimation. Noisy, but drift-free GPS readings, along with pitch and roll estimates provided by an IMU, are directly integrated as prior nodes (superscript Y in Fig. 8). Driven by the fact that both GPS and VO provide poor estimates along the z-axis (i.e., parallel to the gravity vector), we introduce two additional altitude constraints:

1) An altitude prior, provided by a Digital Elevation Model (DEM) (superscript Y in Fig. 8);

2) A smoothness constraint for the altitude of adjacent nodes (superscript MRF in Fig. 8).

The integration of such constraints not only improves the accuracy of the altitude estimation, but also positively affects the estimates of the remaining state components.
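As a rough sketch of the resulting optimization problem (the notation below is assumed for illustration and is not taken verbatim from [33]), the pose graph minimizes a sum of weighted squared residuals from the three kinds of constraints described above:

```latex
% Hedged sketch of the pose graph cost; x_i denotes the 6D pose of node i,
% h_i its altitude component, and Omega the corresponding information matrices.
\min_{\{x_i\}} \;
    \sum_{i} e^{X\,\top}_{i,i+1}\, \Omega^{X}_{i,i+1}\, e^{X}_{i,i+1}
  + \sum_{i} e^{Y\,\top}_{i}\, \Omega^{Y}_{i}\, e^{Y}_{i}
  + \sum_{i} e^{MRF\,\top}_{i,i+1}\, \Omega^{MRF}_{i,i+1}\, e^{MRF}_{i,i+1},
% where e^X encodes odometry / VO / point-cloud motion constraints between
% consecutive nodes, e^Y the GPS, IMU pitch/roll and DEM altitude priors,
% and e^{MRF}_{i,i+1} = h_i - h_{i+1} the altitude smoothness term.
```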

C. Cooperative UAV-UGV Environment Modeling

The UAV and the UGV should collaborate to generate 3D maps of the environment, e.g., annotated with parameters such as crop density, suitable for supporting the farmer's decision making. Building a common map is an essential but challenging task: the UAV can provide a coarse reconstruction of large areas, which should be updated with more detailed map portions generated by the UGV. To this aim, we introduced AgriColMap [34], an acronym for Aerial-Ground Collaborative 3D Mapping for Precision Farming, an effective map registration pipeline that leverages a multimodal field representation and casts the data association problem as a large displacement dense optical flow (LDOF) estimation.
We assume that both the UAV and the UGV can generate colored, geotagged point clouds of a farm environment, $\mathcal{M}_A$

and $\mathcal{M}_G$, e.g., by means of photogrammetry-based 3D reconstruction (Fig. 9 bottom row, column 1). Our goal is to estimate an affine transformation $F : \mathbb{R}^3 \rightarrow \mathbb{R}^3$ that allows us to accurately align them and to correct the geotag misalignments and the reconstruction and scale errors (Fig. 9 bottom row, column 2). We start by looking for a set of point correspondences, $m_{A,G} = \{(p, q) : p \in \mathcal{M}_A, q \in \mathcal{M}_G\}$, that represent point pairs belonging to the same global 3D position. Inspired by the fact that points in $\mathcal{M}_A$ locally share a coherent "flow" towards corresponding points in $\mathcal{M}_G$, we cast the data association problem as a dense, regularized matching approach. This problem recalls the dense optical flow estimation problem for RGB images; however, solutions to the latter are not directly applicable to point clouds. We therefore introduce a multimodal environment representation that allows us to exploit such methods while enhancing both the semantic and geometric properties of the map. We exploit two intuitions:

• A Digital Surface Model (DSM) well approximates the structure of a cultivated field;

• A vegetation index can highlight the meaningful parts of the field and its visual patterns.

We transform $\mathcal{M}_A$ and $\mathcal{M}_G$ into 2D grid maps $\mathcal{J}_A, \mathcal{J}_G : \mathbb{R}^2 \rightarrow \mathbb{R}^2$ (Fig. 9 bottom row, column 3), where for each cell $p$ we provide the surface height $h$ and the Excess Green index, $ExG(p) = 2 g_p - r_p - b_p$, with $r_p$, $g_p$, $b_p$ the (average) RGB components of the cell. To estimate the offsets map, we employ a modified version of the recent LDOF Coarse-to-fine PatchMatch framework (CPM) [35]. We apply

Fig. 7. Visualization interface for the spatio-temporal-spectral database showing both RGB orthomosaics (left) and corresponding index maps (right) for a sugar beet field over time. The user can select spectral layers to view a georeferenced reflectance orthomosaic corresponding to a wavelength band, view the color (RGB) orthomosaic and toggle through all available surveys using the timeline.

Fig. 8. Left: A noisy GPS trajectory optimized by using our multi-cue positioning system that can be used, for example, to stitch together images acquired from a downward looking camera. Right: Overview of the built pose graph. Solid arrows represent graph edges, which encode conditional dependencies between nodes; dotted arrows represent temporal relationships between nodes. The errors $e^{\cdot}_{i,j}$ are computed between the actual measurement and the predicted measurement; the information matrix $\Omega^{\cdot}_{i,j}$ represents the measurement uncertainty.

the visual descriptor of the original CPM method directly to the ExG channel of $\mathcal{J}_A$ and $\mathcal{J}_G$, while we exploit a 3D descriptor computed over the DSM to extract salient geometric information; the matching cost has been modified accordingly to take both descriptors into account.
Once we have computed the flow between $\mathcal{J}_A$ and $\mathcal{J}_G$, we extract the largest set of coherent flows by employing a voting scheme (Fig. 9 bottom row, columns 4, 5); these flows define a set of matches $m_{A,G}$ that are used to infer a preliminary alignment $\tilde{F}$. Using $\tilde{F}$ as an initial guess, we finally estimate the target affine transformation $F$ by exploiting the Coherent Point Drift registration algorithm [36] over the point clouds $\mathcal{M}^{veg}_A$ and $\mathcal{M}^{veg}_G$, obtained from $\mathcal{M}_A$ and $\mathcal{M}_G$ by extracting, with an ExG-based thresholding operator, only the points that belong to vegetation (Fig. 9, columns 6, 7).
In our experiments, AgriColMap outperforms several state-of-the-art map registration techniques by a large margin, and has a higher tolerance to large initial misalignments (e.g., Fig. 9 top right).
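To illustrate the multimodal grid representation, the following sketch converts a colored point cloud into a DSM layer and an ExG layer. The cell size and the simple max-height / mean-color aggregation are assumptions of this example, not details of the AgriColMap implementation [34].

```python
# Minimal sketch: build the 2-D multimodal grid (surface height + ExG) from a
# colored point cloud. Illustrative aggregation choices only.
import numpy as np

def point_cloud_to_grid(points, rgb, cell_size=0.05):
    """points: (N,3) array of x, y, z in meters; rgb: (N,3) array in [0, 1]."""
    xy = np.floor(points[:, :2] / cell_size).astype(int)
    xy -= xy.min(axis=0)                      # shift cell indices to start at (0, 0)
    w, h = xy.max(axis=0) + 1
    dsm = np.full((w, h), -np.inf)            # per-cell surface height
    exg_sum = np.zeros((w, h))
    counts = np.zeros((w, h))
    exg = 2 * rgb[:, 1] - rgb[:, 0] - rgb[:, 2]   # ExG = 2g - r - b
    for (i, j), z, e in zip(xy, points[:, 2], exg):
        dsm[i, j] = max(dsm[i, j], z)         # keep the highest point per cell
        exg_sum[i, j] += e
        counts[i, j] += 1
    exg_map = np.where(counts > 0, exg_sum / np.maximum(counts, 1), 0.0)
    return dsm, exg_map
```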

D. Long-Term Temporal Map Registration

Continuous crop monitoring is an important aspect of phenotyping and requires the registration of sensor data over the entire season. This task is challenging due to the strong changes in the visual appearance of the growing crops and of the field itself. Conventional image registration based on visual descriptors is typically unable to deal with such drastic changes in appearance. To address this challenge, we developed a method for registering temporally separated images by exploiting the inherent geometry of the crop arrangement in the field, which remains relatively invariant over the season. We propose a scale-invariant, geometric feature descriptor that encodes the local plant arrangement geometry and use these descriptors to register the images even in the presence of strong visual changes [37]. The registration results allow for monitoring growth parameters at a per-plant level, as illustrated in Fig. 10.
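One simple way to build such a scale-invariant arrangement descriptor is sketched below: each detected plant is described by the distances to its k nearest neighbours, normalized by the closest one, so the description is unchanged when the whole image is uniformly scaled. This is a generic illustration only, not the specific descriptor proposed in [37].

```python
# Generic sketch of a scale-invariant plant-arrangement descriptor and a
# brute-force matching step between two temporally separated sessions.
import numpy as np

def arrangement_descriptors(plants, k=4):
    """plants: (N,2) array of detected plant (stem) positions."""
    desc = []
    for p in plants:
        d = np.linalg.norm(plants - p, axis=1)
        d = np.sort(d)[1:k + 1]      # k nearest neighbour distances (skip self)
        desc.append(d / d[0])        # normalize by the closest neighbour
    return np.array(desc)

def match(plants_a, plants_b, k=4):
    """For each plant in session A, return the index of the most similar
    plant in session B according to the arrangement descriptor."""
    da = arrangement_descriptors(plants_a, k)
    db = arrangement_descriptors(plants_b, k)
    return np.argmin(np.linalg.norm(da[:, None, :] - db[None, :, :], axis=2), axis=1)

# Dummy usage: the same layout observed at two sessions, uniformly scaled.
plants_session1 = np.random.rand(30, 2)
plants_session2 = plants_session1 * 1.4
print(match(plants_session1, plants_session2))
```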

Fig. 9. The AgriColMap method aims to accurately merge UGV and UAV colored point clouds (top left) by means of an affine transformation that registers the UGV submap (red rectangular area) into the UAV aerial map (blue rectangular area). Top right: Some qualitative registration results seen from aerial and ground points of view. The UGV clouds are clearly visible due to their higher point density. Bottom row: The full AgriColMap pipeline.

Fig. 10. Monitoring crop growth parameters. Left: Same crop identified in the bounding box over different sessions using our registration results. Right: plot of leaf cover over time at five different sites in the field.

V. PLANNING, NAVIGATION AND COORDINATION

The UAV and the UGV have different working areas and roles within each field analysis and targeted intervention mission. The planning of their actions and the navigation policies should reflect these differences. We introduced an ad-hoc UAV navigation module (Sec. V-A) that integrates a mission planner to effectively perform field monitoring missions while respecting battery constraints. Crop row localization and safe in-field UGV navigation are addressed in Sec. V-B, where the high number of DoFs of the UGV is used to improve the efficiency and smoothness of motions. The inter-robot mission coordination framework is then introduced in Sec. V-C.

A. UAV Mission Planning and Navigation

A key challenge in agricultural monitoring is developing mission planning algorithms to define the path for the UAV

to optimally survey the field. The planning module should take into account field coverage, scientifically defined areas of interest, and battery life constraints. To address this, we develop an informative path planning (IPP) framework for adaptive mission planning [38], [39].

The generic structure of our system is depicted in Fig. 11 (top). Our framework is suitable for mapping either discrete or continuous variables on a terrain depending on the type of sensor data received, e.g., from a depth or multi-spectral camera, as specified by the farmer. In particular, our main contribution is a method for continuous variable mapping that considers the patterns of the target distributions on the farm. To achieve this, we leverage Gaussian Processes (GPs) as a natural way of encoding the spatial correlations common in biomass distributions. The strategy exploits a GP as a prior for recursive Bayesian data fusion with probabilistic, variable-resolution sensors. In doing so, it supports mapping using dense visual imagery without the computational burden of standard GP regression, making it suitable for online on-platform application in agricultural scenarios.

In terms of planning, a fundamental challenge we address is trading off image resolution and FoV to find the most useful measurement sites, while accounting for limited endurance and computational resources. During the mission, the terrain maps built online are used to plan trajectories in continuous 3D space for maximum gain in an information metric reflecting the mission aim, e.g., targeted high-resolution mapping of areas containing excessive green biomass levels, i.e., regions with weed infestations. Our planning scheme proceeds in a finite-horizon fashion, alternating between replanning and plan execution. This allows us to create adaptive plans, taking new

Fig. 11. Top: System diagram of our IPP framework. A field map is built using measurements extracted from a sensor. During a mission, the map state is used to plan informative trajectories for data collection. These are then executed by the UAV, allowing for map updates in a closed-loop manner. Middle: Example comparison of our CMA-ES-based approach to "lawnmower" coverage (left and right, respectively) for continuous variable mapping in 200 s missions. The colored lines and spheres represent the traveled trajectories and measurement sites. Ground truth maps are rendered. Bottom-left: Comparison of the final map uncertainties (measured by the GP covariance matrix trace) for various path budgets. Ten CMA-ES trials were run for each budget. Bottom-right: Comparison of times taken to achieve the same final map uncertainty, given a fixed CMA-ES budget.

sensor data into account to focus on areas of interest as they are discovered. For replanning, we leverage an evolutionary technique, the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), to optimize initial trajectory solutions obtained by a coarse 3D grid search in the UAV workspace.

Our approach was evaluated extensively in simulation, where it was shown to outperform existing methods (Fig. 11, bottom), and was validated in the field.
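To illustrate the map update at the core of this pipeline, the sketch below performs a recursive Bayesian (per-cell Kalman) fusion of a grid field map with measurements of different resolutions and noise levels. The independent per-cell update is a simplification for illustration; the actual framework [38], [39] uses a GP prior that also encodes spatial correlations.

```python
# Minimal sketch of recursive Bayesian fusion of a terrain field map with
# variable-resolution, probabilistic measurements (per-cell Kalman update).
import numpy as np

class FieldMap:
    def __init__(self, shape, prior_mean=0.0, prior_var=1.0):
        self.mean = np.full(shape, prior_mean)   # e.g., green biomass level
        self.var = np.full(shape, prior_var)     # prior uncertainty per cell

    def fuse(self, cells, z, meas_var):
        """Fuse a measurement z with variance meas_var into the given cells."""
        k = self.var[cells] / (self.var[cells] + meas_var)   # Kalman gain
        self.mean[cells] += k * (z - self.mean[cells])
        self.var[cells] *= (1.0 - k)

field = FieldMap((50, 50))
# Low-altitude image: high-resolution, low-noise observation of a small patch.
field.fuse((slice(10, 15), slice(20, 25)), z=0.8, meas_var=0.01)
# High-altitude image: coarse, noisier observation of a large area.
field.fuse((slice(0, 50), slice(0, 50)), z=0.3, meas_var=0.2)
```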

B. UGV Position Tracking and Navigation

For autonomous navigation on fields, the BoniRob UGV (Fig. 2, right) needs to accurately steer along the crop rows without crushing any of the valuable crop. Moreover, it needs to be able to transition between different crop rows by performing tight and accurate turns at the end of the field. Thus, first the robot needs to estimate its pose relative to the crop rows; second, it needs to plan a path along the crop rows through the field; and third, it needs to generate smooth velocity commands that precisely follow this path. To this end, we developed a crop row detection algorithm, the Pattern Hough Transform [40]. We first process the input from either vision

or lidar data by extracting plant features and projecting them onto a feature grid map in the local robot frame (see Fig. 12, top left). Then we apply our Pattern Hough Transform to these feature maps to find the best matching set of parallel and equidistant lines, i.e., the best matching pattern (see Fig. 12, top right). Such a pattern is defined using the parameters $\theta$ for the orientation, $s$ for the spacing between adjacent lines and $o$ for the offset of the first line to the origin, as shown in Fig. 12 (top center). Our approach finds the best matching pattern over all parameters in a single step. The advantages of this approach, shown in extensive experiments, are that it is robust against outliers like weeds growing between the crop rows, and that it also yields accurate results during turning, i.e., when the crop rows are not necessarily aligned with the orientation of the robot.
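A toy version of this voting idea is sketched below: an accumulator over the pattern parameters (θ, s, o) counts how many plant features lie close to any line of the candidate pattern. The grid resolutions and the distance test are assumptions of this sketch, not the implementation of [40].

```python
# Illustrative Pattern-Hough-style voting over (theta, spacing, offset).
import numpy as np

def pattern_hough(points, thetas, spacings, offsets, tol=0.05):
    """points: (N,2) plant features in the local robot frame (meters)."""
    acc = np.zeros((len(thetas), len(spacings), len(offsets)))
    for ti, theta in enumerate(thetas):
        n = np.array([np.cos(theta), np.sin(theta)])   # common normal of the lines
        d = points @ n                                  # signed distance along n
        for si, s in enumerate(spacings):
            for oi, o in enumerate(offsets):
                # distance of each point to the nearest line of the pattern
                r = np.abs((d - o + s / 2) % s - s / 2)
                acc[ti, si, oi] = np.sum(r < tol)       # inlier count = vote
    # indices of the accumulator maximum give the best (theta, s, o)
    return np.unravel_index(np.argmax(acc), acc.shape)

best = pattern_hough(np.random.rand(200, 2) * 5,
                     thetas=np.linspace(0, np.pi, 36),
                     spacings=np.linspace(0.3, 0.8, 11),
                     offsets=np.linspace(0.0, 0.8, 17))
```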

We integrated the output of our Pattern Hough Transform into the localization module of the BoniRob. The localization is based on an Extended Kalman Filter. Fused odometry and IMU measurements are used for the prediction. In the correction step, we align the detected crop row pattern with a GPS-referenced map of crop rows to correct the pose estimate of the robot relative to the field. Since the crop row pattern only provides lateral and orientation information, i.e., no correction along the crop rows, we use GPS measurements to correct the position estimate of the robot along the crop rows.

We implemented a global planner based on a state lattice planner to ensure that the BoniRob is able to find a path to any reachable pose in the field. Whether it is able to pass through a narrow gap or over an obstacle depends on the current wheel positions (see Fig. 12 bottom left), as the BoniRob is able to change its track width by adjusting the angles of the lever arms to which the wheels are attached (Fig. 2, right). We developed a path planner that considers these lever angles explicitly [41]. Our state space includes not only the 2D pose of the robot, but also the arm angles to account for wheel position changes. Furthermore, we added actions to the planner that allow the robot to change its arm angles. The additional degrees of freedom from the arm angles greatly increase the size of the state space. This makes planning with commonly used search algorithms inefficient. Thus, we introduced a novel method to represent the robot state with a reduced cardinality, that is, we track valid arm angle intervals instead of single arm angles in the robot state.

A pose path from the global planner is translated into velocities by our local planner, which considers steering constraints. This enables the BoniRob to execute such a path. Any robot with slow-turning, independently steerable wheels, such as the BoniRob, has certain steering constraints, the most prominent one being the limited steering velocity. Other constraints are non-continuous steering or wheel angle instabilities when the robot tries to turn around a wheel. Violations of these steering constraints need to be avoided. To this end, we presented a new approach to generate velocity rollouts that incorporates steering constraints, shown in Fig. 12 (bottom right) [42]. It makes use of the correspondence between the wheel angles and the instantaneous center of rotation (ICR) of the robot. After projecting the steering constraints into ICR space, we compute a valid ICR path that matches the constraints. From

Fig. 12. Overview of our navigation system. Top left: Our Pattern Hough Transform can detect crop rows in either lidar or camera data using the extracted plant features. Top center: A pattern is defined as the set of parallel and equidistant lines (red) with orientation $\theta$, offset $o$ and spacing $s$. Top right: Result of the Pattern Hough Transform (red) on medium sized sugar beets (≈ 5 cm). Bottom left: Valid arm angle intervals when moving close to an obstacle. Bottom center: Velocity rollouts in the local planner. The rollouts are color coded with their respective costs. Bottom right: The ICR constraints derived from the hardware constraints (red) and the maximum steering velocity (green).

this ICR path, we calculate valid velocity sequences that the robot can execute smoothly. Results of real-world experiments show that our local planner incorporating steering constraints improves efficiency and leads to smoother execution.

C. UAV-UGV Mission Coordination

To unlock the potential of the Flourish robotics system, it is essential to be able to run coordinated missions between the robots. Since both robots share information via Wi-Fi, this information needs to be kept at a minimal level and the coordination needs to be ensured even when communication is lost. The only data exchanged are: the UAV pose, the UGV pose, the coordinates of the areas of interest, the requests from one robot to the other, and their status messages. Because of the lossy communication, exchanging requests and statuses is a reliable way to ensure that a message sent by one robot is indeed received by the other one.

The mission framework used on both robots is based on ros task manager [43]. This is a task scheduler developed for ROS that is particularly easy to use and allows multiple behaviours to be combined, with elements running in sequence or in parallel and possibly interrupting each other. The framework is based on tasks, implemented in C++, which are combined into complex missions implemented in plain Python. The following is an example of a coordinated mission (a hedged sketch of such a mission script is given after the list):

1) The final user launches the coordinated mission in the user interface;

2) The UAV takes off and flies over the field to detect areas of interest;

3) As soon as an area of interest is detected, it is sent to the UGV;

4) The UGV reaches the area of interest, activating the weed intervention module and starting the treatment;

5) When exiting the area of interest, the UGV stops the treatment and follows the row until its end;

6) When the whole field has been surveyed, the UAV flies back to the landing point and lands;

7) The UGV moves to the next row containing an area of interest;

8) When all the required rows have been treated, the UGV moves back to its meeting point and the mission is over.
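The sketch below illustrates the structure of such a coordinated mission as a plain Python script. The TaskClient class and the task names are hypothetical placeholders chosen for this example; they are not the actual API of ros task manager [43].

```python
# Hedged sketch of a coordinated UAV/UGV mission script. Task names and the
# TaskClient interface are hypothetical placeholders for illustration only.
class TaskClient:
    """Placeholder standing in for a task-manager client on each robot."""
    def run(self, task, **params):
        print(f"running {task} with {params}")

uav, ugv = TaskClient(), TaskClient()

def coordinated_mission(areas_of_interest):
    uav.run("TakeOff")
    uav.run("SurveyField")                   # detects areas of interest
    for area in areas_of_interest:           # each area is sent to the UGV
        ugv.run("GoToArea", area=area)
        ugv.run("TreatWeeds", area=area)     # weed intervention module active
        ugv.run("FollowRowToEnd")
    uav.run("ReturnAndLand")
    ugv.run("GoToMeetingPoint")

coordinated_mission(areas_of_interest=[(10.0, 2.5), (32.0, 7.0)])
```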

VI. IN-FIELD INTERVENTION: THE COLLABORATIVE WEEDING USE CASE

The main use case addressed in the Flourish project is the collaborative weeding application (Fig. 1). The UAV flies over the field running the navigation and planning algorithms of Sec. IV-A and V-A, while analyzing the weed pressure by using the classification algorithms presented in Sec. III-A. High weed pressure areas are notified to the UGV by using the coordination framework described in Sec. V-C. Thus, the UGV starts to move toward the selected areas, running the algorithms of Sec. IV-B and V-B. In this section, we describe the tools (Sec. VI-A) and methods (Sec. VI-B) used for the actual weed treatment, with the possible agronomic impacts reported in Sec. VI-C.
We successfully tested the whole pipeline in a public demonstration during a dissemination event held near Ancona (Italy) in May 2018.

A. Selective Weed Removal

The weed intervention module (Fig. 13), whose perception system was introduced in Sec. II-B, includes further tools designed to address the targeted weed treatment: a weed stamping tool and a selective spraying tool (Fig. 13). The stamping tool is composed of 18 pneumatic stamps arranged in two ranks. All stamps are individually controllable, and high

Fig. 13. Schematic 3D model of the weed intervention module (left), spraying nozzles (top right), and a single row of mechanical stampers (bottom right).

positioning precision is ensured by allowing only one degree of freedom for the positioning, transverse to the driving direction. The spraying tool is positioned at the back. It is assembled from nine nozzles, individually controlled by off-the-shelf magnetic valves.

Both weeding tools are controlled by a scalable, programmable logic controller (PLC). Modules requiring more computational resources, i.e., weed detection and tracking, are implemented on a computer dedicated to the weed control, running Linux and ROS.

The bolts of the stamps have a 10 mm diameter, whereas the footprint of a sprayer is 30 mm when set to the lowest position, as in our experiments. Actually treating a weed while the robot is moving is a time-critical part of the process, because a small delay can lead to a centimeter-level position error that is large enough to miss a small weed. In our experiments, the decision about which tool is used on which weed is based only on a size criterion: large weeds are sprayed, while small weeds are stamped.
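As a rough worked example of this timing sensitivity (the delay value is assumed for illustration), at the highest speed tested for spraying the tolerated trigger error is on the order of a few tens of milliseconds:

```latex
% With a robot speed of v = 0.4 m/s, a trigger-timing error of \Delta t = 25 ms
% already displaces the tool by one full stamp-bolt diameter:
\Delta x = v \, \Delta t = 0.4~\mathrm{m/s} \times 0.025~\mathrm{s} = 10~\mathrm{mm}
```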

B. Weed Tracking

The main challenge in weed tracking with non-overlapping multi-camera systems (Fig. 13, Sec. II-B2) is dealing with the highly variable delay between the instant when the image of the first camera is acquired and the instant when a target is output by the detection system. To address this issue, a novel tracking system has been developed. The inputs are the images and the coordinates of the targets given by the classifier (Sec. III-A) in the images of the first camera (detection camera in Fig. 13, top-left camera in Fig. 14). The outputs are the trigger time and position for the actuators. In a nutshell, the process performed by the algorithm is illustrated in Fig. 14 (top). The main steps are:

1) The intra-camera tracking module estimates the camera pose and the 3D scene map using direct VO methods;

2) After receiving the delayed classification results and scene structures, the object initializer and updater module creates the templates of the received objects, propagates their updated poses, and accumulates their labels;

3) As an object moves out of the FoV of the detection camera, a naive Bayes classifier (NBC) validates its classification based on the accumulated labels, to prevent destroying a mis-classified crop;

Fig. 14. Top: An overview of our proposed weed control system, which is composed of weed detection, tracking and predictive control modules. The weeds are tracked across the cameras and finally fed into a predictive control module to estimate the time and position of treatment as they approach the weeding tools. Bottom: An example of a reconstructed inverse depth map and a 3D point cloud of plants and ground surface from our proposed intra-camera tracking algorithm.

4) Once the tracking camera finds a new weed object moving into its FoV, inter-camera tracking performs illumination-robust direct tracking to find its new pose and creates a new template for intra-camera tracking;

5) After repeated intra-camera tracking, updating, and inter-camera tracking, the weed finally approaches the end-effector, where the control algorithm predicts the trigger time and position of actuation for intervention.

The novelty in this module resides in the intra- and inter-camera tracking [44] (Fig. 14, bottom).

Intra-camera tracking: unlike conventional multi-object tracking algorithms, which receive classification results, extract the image template, and then propagate its pose and track it, our proposed VO approach recovers the 3D scene structure before obtaining object information, and then formulates each template as a combination of a trimmed image and an inverse depth map for later tracking upon arrival of the classification results. As a result, our proposed intra-camera tracking strategy guarantees constant-time operation regardless of the number of tracked objects. The algorithm consists of two major components: camera tracking and scene structure mapping.


Inter-camera tracking: taking advantage of the fact that only the 2D positions of the weeds in image space are of interest, rather than the estimation of camera poses, we extract a small frame template for each weed and combine it with a global illumination-invariant cost to perform local image alignment. Then, the weed center and its template boundary are transformed into the current frame using the pose estimate, which is used to generate a new template for intra-camera tracking. It should be noted that the retrieval of weed objects is achieved by using 3D-2D direct template-based matching instead of conventional 2D-2D image correspondences, since the change of viewpoint can induce significant changes in the appearance of objects, especially for those with high depth variance.
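A possible sketch of the geometric part of this 3D-2D template transfer is given below, assuming known camera intrinsics and the relative pose estimated by the tracker. It covers only the back-projection / re-projection step, not the illumination-invariant alignment; the helper and its signature are hypothetical.

import numpy as np

def transfer_template(uv, inv_depth, K_det, K_track, T_track_det):
    """uv: (N,2) pixel coords in the detection camera; inv_depth: (N,) inverse depths;
    K_det, K_track: 3x3 intrinsics; T_track_det: 4x4 pose of the detection camera
    expressed in the tracking camera frame. Returns (N,2) pixels in the tracking camera."""
    z = 1.0 / inv_depth                                      # metric depths
    pts_h = np.concatenate([uv, np.ones((uv.shape[0], 1))], axis=1)
    rays = np.linalg.inv(K_det) @ pts_h.T                    # normalized viewing rays
    pts_det = rays * z                                       # 3D points, detection camera frame
    pts_det_h = np.vstack([pts_det, np.ones((1, pts_det.shape[1]))])
    pts_track = (T_track_det @ pts_det_h)[:3]                # 3D points, tracking camera frame
    proj = K_track @ pts_track
    return (proj[:2] / proj[2]).T                            # re-projected template pixels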

To evaluate mechanical weed removal, real leaves with an average radius of 10 mm were chosen as targets, counting the successfully stamped ones. To evaluate selective spraying, we set up a webcam to monitor the targets after spraying. From Tab. I we can observe that the successful treatment rate is almost invariant with respect to the speed on both flat and rough field ground: for instance, the spraying success rate stays above 98% at all tested speeds. These experiments are illustrated in Fig. 15.

TABLE I
TREATMENT RATES

flat    vel. [m/s]   0.05      0.1       0.2       0.3       0.4
        stamping     112/113   99/102    90/91     -         -
        spraying     125/125   124/124   129/129   132/134   112/113

rough   vel. [m/s]   0.05      0.1       0.2       0.3       0.4
        stamping     120/121   84/86     93/96     -         -
        spraying     121/121   111/112   118/119   131/133   129/129

Fig. 15. Left: real field experiments for spraying evaluation. Right: experiments in a simulated environment for stamping evaluation.
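As a quick sanity check of the speed-invariance observation, the per-speed success rates can be computed directly from the counts in Tab. I; the short snippet below does so for spraying on rough ground (the data are copied from the table).

spraying_rough = {0.05: (121, 121), 0.1: (111, 112), 0.2: (118, 119),
                  0.3: (131, 133), 0.4: (129, 129)}

for speed, (hit, total) in spraying_rough.items():
    # Success rate in percent for each tested forward speed.
    print(f"{speed} m/s: {100.0 * hit / total:.1f}%")
# -> 100.0%, 99.1%, 99.2%, 98.5%, 100.0%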

C. Agronomic Impacts in Sugar Beet and Sunflower Crops

The potential impacts generated by the introduction of the Flourish precision farming technologies into crop management practices, and consequently into the farming market, have been investigated through field experimentation and stakeholder inquiries, respectively. The results of a three-year field trial campaign in Italy (2016-2018), simulating the Flourish selective in-field spraying and comparing different chemical treatment scenarios (i.e., full-field treatment, patch treatment, no treatment) on sugar beet and sunflower seedlings, show that for sugar beet (Tab. II, values in tons/hectare) patch-field treatment (performed on 30% of the crop), when associated with a pre-emergence treatment, represents a valid, sustainable alternative to traditional full-field treatments. It allows achieving comparable sugar production levels while reducing chemical inputs. Conversely, sunflower oil production (data not shown) did not benefit equally from patch selective weeding, probably due to intrinsic characteristics of the crop (e.g., faster seedling growth, and competition with weeds being less relevant at later crop development stages than for sugar beet).

TABLE II
SUGAR PRODUCTION (TONS/HECTARE) IN THREE DIFFERENT SUGAR BEET SEEDLINGS TREATMENT SCENARIOS (AFTER PRE-EMERGENCE TREATMENT).

Sugar beet                      2016   2017   2018
Full-field treatment            15.2   8.4    10.1
Patch-field treatment (30%)     12.3   8.6    11.5
No treatment                    11.0   3.4    6.6

These results suggest that the introduction of the Flourish selective weeding system in new crops may lead to different levels of effectiveness, depending on the respective crop cycle. Hence, its effectiveness and affordability have to be assessed case by case through in-field experimentation. Besides this strictly agronomic impact, the introduction of innovative, autonomous and sustainable precision agriculture technologies into crop management practices could have a larger-scale impact on the farming sector. In this regard, a participatory evaluation (Metaplan model and SWOT analysis), carried out within the project and involving a panel of 17 stakeholders professionally operating in the farming sector, highlighted that such technologies could potentially improve the efficiency, efficacy and safety of farming operations, and thus reduce labour costs, provide more information on crops, field structure and meteorological events, and increase farming environmental sustainability and food safety. There is also potential to increase the competitiveness of agriculture, shorten the technological gap between farming and industry, prompt young people to work in agriculture, and reverse the depopulation tendency affecting marginal areas.

VII. OPEN-SOURCE SOFTWARE AND DATASETS

Many of the methods presented above have been released as open-source software, with download links reported in the corresponding papers. A short list is given here:

• A modified version of the DJI Onboard ROS Software Development Kit (SDK), aiming to enable the DJI UAVs to interface easily with generic frameworks for state estimation and control, and users to employ their own software to communicate with DJI systems [45];

• Plant stress phenotyping dataset and analysis software [46] (Sec. IV-A);


• The IPP framework (Sec. V-A) for terrain monitoring [47]. Additionally, we provide a package for more general UAV-based high-level waypoint navigation1;

• MCAPS (Multi-Cue Agricultural Positioning System), a multi-sensor framework developed for UGV localization within the field (Sec. IV-B) [33];

• AgriColMap (Aerial-Ground Collaborative 3D Mapping): a tool developed to register 3D maps gathered by aerial and ground farming robots (Sec. IV-C) [34];

• Algorithms for synchronizing clocks2.

The development, training, and testing of reliable plant classification, mapping, and navigation software requires large-scale agricultural datasets, ideally covering all the plant growth stages relevant for the operation of the robot. Since no datasets with all or some of these features were publicly available, we created and made publicly available several novel datasets:

• Sugar Beets 2016: a novel, vast long-term dataset of a sugar beet field [48]. 27 recording sessions, distributed over the course of two months, provide 4-channel multi-spectral camera images, RGB-D depth data, LIDAR measurements, GPS, and wheel encoder data.

• Flourish Sapienza Datasets [49]: a collection of datasets, with related ground truths, acquired from farming robots. It includes: a dataset acquired with 9 different sensors to benchmark multi-sensor UGV self-localization and mapping algorithms; a dataset acquired to benchmark UAV-UGV 3D map registration algorithms; a synthetically generated dataset for crop/weed detection in sugar beet fields.

• A dataset of 4- and 5-channel multi-spectral aerial images dedicated to plant semantic segmentation [19], [50].

• A pixel-wise ground-truthed sugar beet and weed dataset collected from a controlled field experiment [19]. 375 training and 90 testing images were annotated and released3.

• WeedMap dataset [50]: contains high-fidelity, large-scale, and spatio-temporal multi-spectral images (4 and 5 channels). This sugar beet dataset includes 10,196 images collected from 3 different farms.

VIII. CONCLUSIONS

The main goal of the Flourish research project was to develop an adaptable robotic solution for precision farming by combining the aerial survey capabilities of a small autonomous UAV with a multi-purpose agricultural UGV. In this paper, we presented an overview of the custom-built hardware solutions, methods, and algorithms specifically developed, emphasizing the complementarity of and synergies between the involved aerial and ground robots. We finally described a successful demonstration of an in-field intervention task integrating the various modules developed in the project.

1 github.com/ethz-asl/waypoint_navigator
2 github.com/ethz-asl/cuckoo_time_translator
3 goo.gl/UK2pZq

We believe that the proposed solutions represent, from several points of view, a step forward in the state of the art of robotic systems applied to precision agriculture, with solutions that are easily applicable to a wide range of robots, farm management activities, and crop types.

REFERENCES

[1] Flourish project. [Online]. Available: http://flourish-project.eu

[2] Rhea project. [Online]. Available: http://www.rhea-project.eu/

[3] A. Gasparri, G. Ulivi, N. Bono Rossello, and E. Garone, "The H2020 project Pantheon: precision farming of hazelnut orchards (extended abstract)," in Convegno Automatica, Florence, Italy, Sep 2018.

[4] P. Astolfi, A. Gabrielli, L. Bascetta, and M. Matteucci, "Vineyard autonomous navigation in the echord++ grape experiment," IFAC-PapersOnLine, vol. 51, pp. 704–709, 01 2018.

[5] Sweeper project. [Online]. Available: http://www.sweeper-robot.eu/

[6] D. Albani, J. IJsselmuiden, R. Haken, and V. Trianni, "Monitoring and mapping with robot swarms for agricultural applications," in Intelligent Technologies for Environmental Monitoring Workshop, IEEE AVSS Conference, 2017, pp. 1–6.

[7] W. Dong, P. Roy, and V. Isler, "Semantic mapping for orchard environments by merging two-sides reconstructions of tree rows," Journal of Field Robotics, pp. 1–25, 2019.

[8] S. W. Chen, S. S. Shivakumar, S. Dcunha, J. Das, E. Okon, C. Qu, C. J. Taylor, and V. Kumar, "Counting apples and oranges with deep learning: A data-driven approach," IEEE Robotics and Automation Letters, vol. 2, no. 2, pp. 781–788, April 2017.

[9] R. Ehsani, D. Wulfsohn, J. Das, I. Lagos, and Z. Ins, "Yield estimation; a low-hanging fruit for application of small UAS," Resource: Engineering and Technology for Sustainable World, vol. 23, pp. 16–18, 07 2016.

[10] J. Das, G. Cross, C. Qu, A. Makineni, P. Tokekar, Y. Mulgaonkar, and V. Kumar, "Devices, systems, and methods for automated monitoring enabling precision agriculture," in 2015 IEEE International Conference on Automation Science and Engineering (CASE), Aug 2015, pp. 462–469.

[11] Ecorobotix. [Online]. Available: https://www.ecorobotix.com/en/

[12] Bluerivertechnology. [Online]. Available: http://about.bluerivertechnology.com/

[13] Sagarobotics. [Online]. Available: https://sagarobotics.com/

[14] J. Nikolic, J. Rehder, M. Burri, P. Gohl, S. Leutenegger, P. T. Furgale, and R. Siegwart, "A synchronized visual-inertial sensor system with FPGA pre-processing for accurate real-time SLAM," in IEEE International Conference on Robotics and Automation. IEEE, 2014, pp. 431–437.

[15] P. Lottes, M. Hoferlin, S. Sander, and C. Stachniss, "Effective Vision-based Classification for Separating Sugar Beets and Weeds for Precision Farming," Journal of Field Robotics, vol. 34, pp. 1160–1178, 2017.

[16] P. Lottes, R. Khanna, J. Pfeifer, R. Siegwart, and C. Stachniss, "UAV-based Crop and Weed Classification for Smart Farming," in International Conference on Robotics and Automation, 2017.

[17] P. Lottes and C. Stachniss, "Semi-supervised online visual crop and weed classification in precision farming exploiting plant arrangement," in Proc. of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2017.

[18] P. Lottes, J. Behley, A. Milioto, and C. Stachniss, "Fully convolutional networks with sequential information for robust crop and weed detection in precision farming," IEEE Robotics and Automation Letters (RA-L), vol. 3, pp. 3097–3104, 2018.

[19] I. Sa, Z. Chen, M. Popovic, R. Khanna, F. Liebisch, J. Nieto, and R. Siegwart, "weedNet: Dense Semantic Weed Classification Using Multispectral Images and MAV for Smart Farming," IEEE Robotics and Automation Letters, pp. 588–595, 2018.

[20] M. Di Cicco, C. Potena, G. Grisetti, and A. Pretto, "Automatic model based dataset generation for fast and accurate crop and weeds detection," in Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2017.

[21] C. M. Hoffmann, "Changes in nitrogen composition of sugar beet varieties in response to increasing nitrogen supply," Journal of Agronomy and Crop Science, vol. 191, no. 2, pp. 138–145, 2005.

[22] M. Gnyp, M. Panitzki, S. Reusch, J. Jasper, A. Bolten, and G. Bareth, "Comparison between tractor-based and UAV-based spectrometer measurements in winter wheat," in Proceedings of the 13th International Conference on Precision Agriculture (unpaginated, online). Monticello, IL: International Society of Precision Agriculture, 2016.


[23] L. Prey and U. Schmidhalter, "Simulation of satellite reflectance data using high-frequency ground based hyperspectral canopy measurements for in-season estimation of grain yield and grain nitrogen status in winter wheat," ISPRS Journal of Photogrammetry and Remote Sensing, vol. 149, pp. 176–187, 2019.

[24] F. Liebisch, J. Pfeifer, C. Müller-Ruh, and A. Walter, "Proximal and remote quantification of nitrogen fertilizer demand: a case study in sugar beet," in Proceedings book of the XVIII International Plant Nutrition Colloquium with boron and manganese satellite meetings, 2017.

[25] A. Walter, R. Khanna, P. Lottes, C. Stachniss, R. Siegwart, J. Nieto, and F. Liebisch, "A robotic approach for automation in crop management," in Proceedings of the 14th International Conference on Precision Agriculture (unpaginated, online). Monticello, IL: International Society of Precision Agriculture, 2018.

[26] A. Walter, F. Liebisch, and A. Hund, "Plant phenotyping: from bean weighing to image analysis," Plant Methods, vol. 11, p. 14, 2015.

[27] A. Walter, R. Finger, R. Huber, and N. Buchmann, "Opinion: Smart farming is key to developing sustainable agriculture," Proceedings of the National Academy of Sciences, vol. 114, no. 24, pp. 6148–6150, 2017.

[28] R. Finger, S. M. Swinton, N. E. Benni, and A. Walter, "Precision farming at the nexus of agricultural production and the environment," Annual Review of Resource Economics, vol. 11, no. 1, 2019.

[29] F. Argento, T. Anken, F. Liebisch, and A. Walter, "Crop imaging and soil adjusted variable rate nitrogen application in winter wheat," in Precision Agriculture '19, 2019, pp. 511–517.

[30] M. Bloesch, S. Omari, M. Hutter, and R. Siegwart, "Robust visual inertial odometry using a direct EKF-based approach," in Intelligent Robots and Systems (IROS), 2015 IEEE/RSJ International Conference on. IEEE, 2015, pp. 298–304.

[31] S. Lynen, M. Achtelik, S. Weiss, M. Chli, and R. Siegwart, "A robust and modular multi-sensor fusion approach applied to MAV navigation," in Proc. of the IEEE/RSJ Conference on Intelligent Robots and Systems (IROS), 2013.

[32] T. Schneider, M. Dymczyk, M. Fehr, K. Egger, S. Lynen, I. Gilitschenski, and R. Siegwart, "maplab: An open framework for research in visual-inertial mapping and localization," arXiv preprint arXiv:1711.10250, 2018.

[33] M. Imperoli, C. Potena, D. Nardi, G. Grisetti, and A. Pretto, "An effective multi-cue positioning system for agricultural robotics," IEEE Robotics and Automation Letters, vol. 3, no. 4, pp. 3685–3692, October 2018.

[34] C. Potena, R. Khanna, J. Nieto, R. Siegwart, D. Nardi, and A. Pretto, "AgriColMap: Aerial-ground collaborative 3D mapping for precision farming," IEEE Robotics and Automation Letters, vol. 4, no. 2, pp. 1085–1092, 2019.

[35] Y. Hu, R. Song, and Y. Li, "Efficient coarse-to-fine patch match for large displacement optical flow," in Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), 2016, pp. 5704–5712.

[36] A. Myronenko and X. Song, "Point set registration: Coherent point drift," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 12, pp. 2262–2275, 2010.

[37] N. Chebrolu, T. Labe, and C. Stachniss, "Robust Long-Term Registration of UAV Images of Crop Fields for Precision Agriculture," IEEE Robotics and Automation Letters, vol. 3, no. 4, pp. 3097–3104, 2018.

[38] M. Popovic, G. Hitz, J. Nieto, I. Sa, R. Siegwart, and E. Galceran, "Online Informative Path Planning for Active Classification Using UAVs," in IEEE International Conference on Robotics and Automation. IEEE, 2017, pp. 5753–5758.

[39] M. Popovic, T. Vidal-Calleja, G. Hitz, I. Sa, R. Siegwart, and J. Nieto, "Multiresolution Mapping and Informative Path Planning for UAV-based Terrain Monitoring," in IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2017, pp. 1382–1388.

[40] W. Winterhalter, F. Fleckenstein, C. Dornhege, and W. Burgard, "Crop Row Detection on Tiny Plants With the Pattern Hough Transform," IEEE Robotics and Automation Letters (RA-L), vol. 3, no. 4, pp. 3394–3401, 2018.

[41] F. Fleckenstein, C. Dornhege, and W. Burgard, "Efficient Path Planning for Mobile Robots with Adjustable Wheel Positions," in International Conference on Robotics and Automation (ICRA), 2017.

[42] F. Fleckenstein, W. Winterhalter, C. Dornhege, C. Pradalier, and W. Burgard, "Smooth Local Planning Incorporating Steering Constraints," in 12th Conference on Field and Service Robotics (FSR), 2019.

[43] C. Pradalier, "A task scheduler for ROS," Jan. 2017. [Online]. Available: https://hal.archives-ouvertes.fr/hal-01435823

[44] X. Wu, S. Aravecchia, and C. Pradalier, "Design and implementation of computer vision based in-row weeding system," in 2019 International Conference on Robotics and Automation (ICRA). IEEE, 2019, pp. 4218–4224.

[45] I. Sa, M. Kamel, M. Burri, M. Bloesch, R. Khanna, M. Popovic, J. Nieto, and R. Siegwart, "Build your own visual-inertial drone: A cost-effective and open-source autonomous drone," IEEE Robotics & Automation Magazine, 2017.

[46] R. Khanna, L. Schmid, A. Walter, J. Nieto, R. Siegwart, and F. Liebisch, "A spatio temporal spectral framework for plant stress phenotyping," Plant Methods, vol. 15, 2019.

[47] M. Popovic, T. Vidal-Calleja, G. Hitz, J. J. Chung, I. Sa, R. Siegwart, and J. Nieto, "An informative path planning framework for UAV-based terrain monitoring," Autonomous Robots, 2019, under review.

[48] N. Chebrolu, P. Lottes, A. Schaefer, W. Winterhalter, W. Burgard, and C. Stachniss, "Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields," J. of Robotics Research, 2017.

[49] Flourish Sapienza Datasets. [Online]. Available: http://www.dis.uniroma1.it/~labrococo/fsd/

[50] I. Sa, M. Popovic, R. Khanna, Z. Chen, P. Lottes, F. Liebisch, J. Nieto, C. Stachniss, A. Walter, and R. Siegwart, "WeedMap: A Large-Scale Semantic Weed Mapping Framework Using Aerial Multispectral Imaging and Deep Neural Network for Precision Farming," Remote Sensing, vol. 10, no. 9, 2018.