
Light Interception Estimation: Ground Data vs. Aerial

Anjana K N, Dr. Stephen Nuske | Field Robotics Centre, Robotics Institute, Carnegie Mellon University

INTRODUCTION

The amount of light intercepted is what decides a plant's productivity, hence it is important to measure Light Interception (LI).

The proposed algorithm is robust to illumination variations, which allows LI to be estimated on the go.

Fig.1: Ground robot and aerial vehicle which were used for data collection.

OBJECTIVES

To estimate Light Interception (LI) as a function of canopy architecture.

To correlate the Light Interception estimated from data collected by the ground robot with that estimated from the aerial vehicle.

GROUND DATA

An upward-facing camera with a fisheye lens is mounted on the robot to collect RGB images.

Adaptive sky mapping is applied to the data from the upward-facing camera on the ground robot.

Fig.2: (a) Raw images obtained from the ground robot; (b) processed images with the calculated LI (LI = 18.17%, LI = 53.61%, LI = 82.5%).
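The exact adaptive sky-mapping procedure is not spelled out on the poster; as a minimal sketch of the underlying idea, LI from an upward-facing fisheye image can be taken as the fraction of the circular field of view occupied by canopy rather than sky. The blue-dominance sky test, the thresholds, and the file name below are illustrative assumptions, not the authors' method.

```python
import cv2
import numpy as np

def estimate_li(image_path, blue_margin=20, min_sky_brightness=120):
    """Estimate Light Interception (LI, in %) from an upward-facing
    fisheye RGB image as the canopy (non-sky) fraction of the view.

    The sky test here is a crude colour heuristic, standing in for the
    adaptive sky mapping mentioned on the poster."""
    img = cv2.imread(image_path)                     # BGR, uint8
    h, w = img.shape[:2]

    # Restrict to the circular fisheye field of view.
    yy, xx = np.mgrid[0:h, 0:w]
    fov = (xx - w / 2) ** 2 + (yy - h / 2) ** 2 <= (min(h, w) / 2) ** 2

    b = img[..., 0].astype(int)
    g = img[..., 1].astype(int)
    r = img[..., 2].astype(int)
    sky = (b > r + blue_margin) & (b >= g) & (b > min_sky_brightness)

    canopy = fov & ~sky
    return 100.0 * canopy.sum() / fov.sum()

# e.g. print(f"LI = {estimate_li('ground_frame.jpg'):.2f}%")  # path is a placeholder
```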

AERIAL DATA

Drones from Near Earth Autonomy were used for aerial data collection.

Plots (bounded by white boxes in the aerial image) were extracted.

Shadow removal had to be incorporated to achieve leaf vs. non-leaf segmentation for every plot.

Fig.3: (a) Aerial view of the field; (b) raw image of one plot; (c) segmented image.
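The poster does not describe the plot segmentation or the shadow-removal step in detail. A generic stand-in, assuming an excess-green index with Otsu thresholding and a simple brightness test to discard shadowed pixels, is sketched below; the function name and thresholds are assumptions.

```python
import cv2
import numpy as np

def leaf_mask(plot_bgr, shadow_value_thresh=40):
    """Rough leaf vs. non-leaf segmentation of an aerial plot image.

    Assumes vegetation separates well with an excess-green index plus
    Otsu thresholding, with very dark (shadowed) pixels discarded first;
    this is a generic stand-in for the poster's shadow-removal step."""
    img = plot_bgr.astype(np.float32)
    b, g, r = img[..., 0], img[..., 1], img[..., 2]

    # Discard deeply shadowed pixels using the HSV value channel.
    value = cv2.cvtColor(plot_bgr, cv2.COLOR_BGR2HSV)[..., 2]
    lit = value > shadow_value_thresh

    # Excess-green index 2G - R - B, rescaled to 8 bits for Otsu.
    exg = 2 * g - r - b
    exg8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, veg = cv2.threshold(exg8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    return (veg > 0) & lit

# Per-plot LI could then be approximated as leaf_mask(plot_img).mean() * 100.
```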

RESULTS

10 images captured under different illumination conditions were used as ground truth; they were manually segmented into leaf and non-leaf pixels.
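The poster does not state which agreement metric was computed against these manual masks; a straightforward pixel-wise comparison, assuming the estimated and manual masks are available as boolean arrays, might look like this:

```python
import numpy as np

def segmentation_agreement(pred_mask, gt_mask):
    """Pixel-wise accuracy and IoU between an estimated leaf mask and a
    manually labelled one (boolean arrays of the same shape)."""
    accuracy = np.mean(pred_mask == gt_mask)
    intersection = np.logical_and(pred_mask, gt_mask).sum()
    union = np.logical_or(pred_mask, gt_mask).sum()
    iou = intersection / union if union else 1.0
    return accuracy, iou

# Averaging (accuracy, iou) over the 10 manually segmented images gives the
# overall agreement; mask file names and formats are left unspecified here.
```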

Since ground truth to validate Light Interception itself is nearly impossible to obtain manually, the estimated LI was compared with manually collected harvest data. LI had a positive correlation of 0.47 with leaf weight at harvest (Fig.4).

LI had the highest correlation with the yield data compared to any other phenotype.

The processed data can then be used to quantify genotype-by-phenotype and genotype-by-environment interactions.

Fig.4: Correlation of Light Interception (x-axis) with total leaf weight at harvest (y-axis); R² = 0.47, f(x) = 0.111x + 0.5397.
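As an illustration of how such a correlation and linear fit are obtained from per-plot LI estimates and harvest weights (variable names are placeholders; no data values are assumed):

```python
import numpy as np

def li_vs_harvest_fit(li_percent, leaf_weight):
    """Pearson correlation and least-squares line between per-plot LI (%)
    and total leaf weight at harvest. Returns (r, slope, intercept)."""
    li = np.asarray(li_percent, dtype=float)
    w = np.asarray(leaf_weight, dtype=float)
    r = np.corrcoef(li, w)[0, 1]                  # Pearson correlation
    slope, intercept = np.polyfit(li, w, deg=1)   # f(x) = slope * x + intercept
    return r, slope, intercept

# Usage with per-plot measurements (plot_li and plot_leaf_weight are placeholders):
# r, m, c = li_vs_harvest_fit(plot_li, plot_leaf_weight)
# print(f"r = {r:.2f}, f(x) = {m:.3f}x + {c:.4f}")
```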

FUTURE WORK

Estimating Light Interception from a LIDAR point cloud and comparing it with the estimates from the RGB cameras.

Capturing more plant traits.

Extending the technique to other crop varieties.

ACKNOWLEDGEMENT

Thanks to Stephen Nuske for this opportunity and guidance.

Thanks to George Kantor and Near Earth Autonomy.

Thanks to Omeed Mirbod for helping out in data collection and support.

A special thanks to John Dolan, Rachel Burcin, Mikana Maeda and the RISS cohort.