Multi-Crop Recognition using UAV based High Resolution NDVI Time-Series
Muhammad Ahsan Latif
Department of Computer Science, Faculty of Sciences, University of Agriculture, Faisalabad, Pakistan.
Abstract

Multi-crop recognition is an inherently nonlinear task, as many dynamic factors must be addressed. In this paper, a decision tree (DT) based approach is presented to classify and recognize 17 different crops. High spatial and temporal resolution NDVI signatures were extracted from multispectral imagery acquired with a multispectral sensor onboard a quadrotor. Detailed datasets were prepared by sampling from normal distributions with different standard deviations, and the impact of reduced dimensionality was also tested using principal component analysis. A very high classification accuracy was achieved. The results further indicate that NDVI values pertaining to the early-to-mid season carry much more weight in the classification of multiple crops.
Keywords: Crop classification; UAV; Decision Tree; NDVI
1. Introduction

The farming systems in most Asian regions are generally characterized by small field sizes, diverse crop species with similar traits, and intercropping. These small-scale farming practices produce complex cropping patterns that are difficult to map with traditional satellite-based remote sensing techniques. Site-specific crop mapping is important, as it helps agronomists develop estimates of yield, biomass, vigor, crop cover, etc. for multiple crops discretely. Nevertheless, mapping multiple crops within a small area is challenging, as many crops exhibit almost identical color hues and plant phenology, overlapping spectral signatures, and similar growth cycles and patterns. Together, these dynamics make multi-crop mapping difficult. Against this backdrop, this contribution presents a methodology for site-specific
multi-crop recognition using a high-resolution multispectral sensor onboard a quadcopter UAV (unmanned aerial vehicle).
Recognizing multiple crops through classification approaches has been a key area of interest in satellite-based remote sensing (Huang et al. 2018; Bargiel 2017; Wu et al. 2017; Hütt et al. 2016; Hao et al. 2015). For UAVs, however, only very limited literature on multi-crop classification is available. For example, Rebetez et al. (2016) proposed a hybrid method based on a deep neural network for crop classification using an RGB image dataset; the computationally expensive approach used texture patterns and color distributions to classify a wide variety of crops. For the delineation and classification of maize, Hall et al. (2018) developed object-oriented approaches coupled with image texture, hue, saturation, and intensity using RGB and NIR imagery acquired aerially with a quadcopter. To differentiate sugar beet plants from weeds, Lottes et al. (2017) exploited the spatial arrangement of plants through geometric features and a random forest approach, using two digital cameras (RGB, RGB + NIR) onboard a quadcopter. Lin et al. (2019) proposed a Fourier dense network based on a convolutional neural network to classify plants; the methodology involves learning and extracting plant features in the time and frequency domains from optical images captured by a UAV. Liu (2018) proposed a multistep process to detect an invasive parsnip plant using a simple RGB camera fixed on a UAV; the results provided evidence that UAVs with only visible-band cameras serve as an economical resource for detecting small and irregular vegetation. Using a convolutional neural network approach, Wei and Mao (2018) recognized rice crops in UAV imagery. An object-based mangrove classification methodology was presented by Cao et al. (2018) involving hyperspectral and RGB sensors onboard a UAV; they extracted texture features, spectral features, and vegetation indices from the hyperspectral imagery, whereas digital surface models were used to
estimate the height of the plants. Böhler, Schaepman, and Kneubühler (2018) presented a crop classification method using aerial imagery obtained with uncalibrated consumer-grade cameras fixed on a UAV; they evaluated different spectral and spatial resolutions with different structuring elements to map textural features, and the optimal classification was achieved with RGB bands and textural features extracted at 0.5 m resolution. Liu et al. (2018) used an integrated approach by combining digital surface models and texture features from RGB images in an SVM (support vector machine) classification framework. An object-based technique was tested for crop classification, and an accuracy of 85% was achieved, as reported by Park and Park (2015). Other closely related tasks widely performed with UAVs include crop-weed separation or weed detection (Zisi et al. 2018; Pérez-Ortiz et al. 2016; López-Granados et al. 2016; Hung et al. 2014) and land cover/usage estimation (Hyeok and Wan 2017; Feng et al. 2015). A comprehensive insight into UAV-based crop phenotyping is given by Yang et al. (2017) and Latif (2018).
Following the Introduction, this paper is divided into three main parts: Materials and Methods, Results, and Discussion. The Materials and Methods section further comprises subsections on site description, image acquisition, data sampling, classification, and dimension reduction.
2. Materials and Methods

2.1. Site Description
The study was conducted at the main agronomy research farm of the University of Agriculture Faisalabad, Punjab, Pakistan. A plot of size 35 m × 18 m (elevation 182 m; latitude 31°26'29.75'' N, longitude 73°4'23.33'' E) with 17 different crops was selected for aerial operations, as shown in Figure 1. The plot contained 17 subplots, each of size 18 m × 1 m, with a gap of 1 m between adjacent subplots. Each subplot contained 4 rows of the same crop with a row-to-row distance of 0.3 m. Details of the crops observed are provided in Table 1.
2.2. Image Acquisition and Processing

We used a quadrotor platform (Phantom-4, DJI) for aerial operations. For precision agriculture work, the Phantom-4 was integrated with a customized payload (Figure 2) comprising a multispectral camera (ADC Micro, Tetracam Inc., Chatsworth, CA, USA), a FirePoint 100 GPS, and a battery. The GPS module and battery were attached directly to the landing gear of the quadrotor. The ADC Micro sensor was fixed to the quadrotor, facing downward, via a lightweight aluminum-alloy frame. The ADC Micro is a multispectral sensor with three fixed bands (Red-Green-NIR) equivalent to the Landsat thematic bands TM2, TM3, and TM4. It contains a yellow long-pass filter permanently fixed in front of the lens, which blocks blue light and allows the remaining visible and NIR bands to pass through. Inside the camera there is an Aptina CMOS sensor and a Bayer RGB filter array with a checkerboard pattern. The very lightweight (90 g) sensor operates within the wavelength range 520 nm - 920 nm and is well suited to remote sensing applications in agriculture. The ADC Micro was pre-set to capture multispectral images at nadir in an uncompressed RAW-10 format. In total, 8 flights were conducted at two-week intervals at an altitude of 50 m AGL (above ground level), giving a ground sampling distance (spatial resolution) of 2 cm. The flight operations were executed around mid-day to avoid the impact of shadows. The calibration of the IMU (inertial measurement unit) and compass was performed just once at the site, before the very first flight, and was never required again during the whole season. As pixel values are a direct manifestation of reflectance, which changes continuously, it is important to establish the correct association between the two measures to produce a true representation of the vegetation. To achieve this, we used a Teflon calibration target, which gives 100% reflectance over the electromagnetic wavelength range 400 nm - 1000 nm. This spectrally flat response helps determine the in-situ spectral balance of sunlight and is
subsequently used to make corrections to the NIR/R/G bands. This information was later used in the post-processing of the multispectral images to calculate NDVI values (Equation 1). The image coordinates were corrected using the calibration parameters given in Table 2 through certain geometric transformations, i.e., rotation, projection, and shear. The rectified multispectral images were further segmented into two groups, i.e., crop pixels and soil pixels (Figure 3). The quality of the segmentation process is crucial to ensure that only the contribution of pixels representing the canopies is considered when deriving NDVI. The segmentation is based on the Red and NIR reflectance in the image content: vegetation shows high NIR reflectance, whereas soil (as well as background, shadows, etc.) shows higher red values. To derive NDVI values from the same region(s) throughout the season, a fixed area of interest (1 m²) was marked in the middle of each subplot.
Normalized Difference Vegetation Index (NDVI) = (NIR - RED) / (NIR + RED) (1)
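As a minimal illustration of this step, the sketch below computes a per-pixel NDVI map from the NIR and Red bands and averages it over canopy pixels only; the NDVI threshold used here to separate canopy from soil is an illustrative assumption, not the segmentation rule of the original pipeline.

```python
import numpy as np

def ndvi_map(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI = (NIR - RED) / (NIR + RED), guarding against zero division."""
    nir, red = nir.astype(np.float64), red.astype(np.float64)
    denom = nir + red
    return np.where(denom > 0, (nir - red) / denom, 0.0)

def mean_canopy_ndvi(nir: np.ndarray, red: np.ndarray, threshold: float = 0.3) -> float:
    """Average NDVI over canopy pixels only. Soil, background, and shadow
    pixels (low NDVI) are masked out; the 0.3 threshold is a common
    heuristic standing in for the paper's crop/soil segmentation."""
    ndvi = ndvi_map(nir, red)
    canopy = ndvi > threshold
    return float(ndvi[canopy].mean()) if canopy.any() else 0.0
```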
For crop monitoring and vegetation studies, NDVI is the most widely accepted vegetation index (Colwell 1974). It is an effective indicator, as it is relatively robust to atmospheric effects and variations in sunlight. According to the Köppen-Geiger climate classification system, the Faisalabad region is categorized as a hot desert climate zone (Peel, Finlayson, and McMahon 2007). Climate plays a key role in defining the seasonal growth and phenology of crops; therefore, different climate variables were also recorded with a meteorological station (CR-1000, Campbell Scientific, Utah, USA) installed at the University of Agriculture, Faisalabad, as shown in Figure 4.
2.3. Data Sampling

As mentioned above, a fixed area of 1 m² was marked in the middle of each crop subplot for aerial data acquisition throughout the season. The canopy pixels (≈ 3000) inside each fixed area
were averaged to produce a single NDVI value for a particular crop on a specific date. Following the same procedure, NDVI values for all 17 crops were acquired on 8 different days at regular intervals, covering almost the complete phenological cycle. Each crop sample is thus represented by a distinct NDVI vector of 8 scalar values. These NDVI vectors were repeatedly checked and validated to ensure that they truly represent their respective crops; each NDVI vector can be regarded as the NDVI signature of a particular crop. To train a DT classifier for crop classification, a dataset with a considerable number of samples is required. Moreover, despite our utmost care in preparing the NDVI vectors for the crops in the experiment, there is always room for variation in NDVI values as a form of generalization. These issues were addressed by modeling the NDVI vectors with a normal distribution, as follows:
f(x | µ, σ) = (1 / (σ√(2π))) e^(−(x − µ)² / (2σ²))    (2)
where µ represents the mean NDVI value for one crop on a specific day (there are 8 mean NDVI values in each crop's NDVI vector) and the standard deviation σ defines the scale of variation for the NDVI values. Each NDVI vector sample generated this way takes the following form:
(X, Y) = (x₁, x₂, …, x_k, Y)    (3)
Here X is the set of NDVI vectors, where each vector sample contains k = 8 predictors, i.e., 8 NDVI scalar values for 8 different days, and Y represents the actual class assigned to the NDVI vector, i.e., the crop label.
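As a minimal sketch of this sampling scheme (assuming NumPy; the per-crop mean signatures below are hypothetical, and the count of 500 samples per crop is taken from the confusion matrices), the following generates labeled NDVI vectors according to Equation 2:

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_samples(mean_signatures, sigma=0.02, n_per_crop=500):
    """Draw synthetic NDVI vectors around each crop's 8-value mean
    signature, one normal distribution per date (Equation 2)."""
    X, y = [], []
    for label, mu in mean_signatures.items():
        samples = rng.normal(loc=mu, scale=sigma, size=(n_per_crop, len(mu)))
        X.append(np.clip(samples, -1.0, 1.0))  # NDVI is bounded in [-1, 1]
        y += [label] * n_per_crop
    return np.vstack(X), np.array(y)

# Hypothetical signatures for two of the 17 crops, for illustration only
demo = {"A": [0.20, 0.35, 0.50, 0.60, 0.65, 0.60, 0.50, 0.40],
        "B": [0.15, 0.30, 0.55, 0.70, 0.70, 0.65, 0.55, 0.45]}
X, y = generate_samples(demo, sigma=0.02)
```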
2.4. Classification

To classify the 17 crops based on the high-resolution NDVI time series, we used the decision tree classification approach (Rokach and Maimon 2014), as it reflects easy-to-understand logic closer to human decision making than other approaches (James et al. 2013). The DT algorithm
breaks a dataset down into smaller subsets while a tree structure is incrementally developed. Optimal split decisions are made at the nodes, considering one feature at a time, and the algorithm continues splitting until pure nodes are reached. The topmost node is regarded as the best predictor, whereas the final classification is given by the leaf nodes. In our case, the dataset comprises NDVI vectors formulated for the different crops on specific dates; the mean NDVI values inside these vectors serve as a diverse feature set that forms the basis for splitting nodes. We used different splitting criteria, given below, to test the robustness of the decision tree (Jost 2006).
Gini's Diversity Index

Gini's index is a measure of impurity at a node. Here p(i) represents the fraction of samples labeled with class i at a given node. A node whose samples all belong to a single class has a Gini index of 0; any other state yields a positive value.

Gini's Index = 1 − Σᵢ p²(i)    (4)

Deviance

With the same definition of p(i) as for Gini's index, the deviance splitting rule has the following form. For a pure node the deviance is 0, and positive otherwise.

Deviance = −Σᵢ p(i) log p(i)    (5)
Twoing Rule

The twoing splitting rule maximizes the following change in the impurity measure:

Δi(t) = (P_L P_R / 4) [ Σⱼ₌₁ᴶ |p(j | t_L) − p(j | t_R)| ]²    (6)

where P_L and P_R are the proportions of samples that split to the left and right child nodes, respectively. A relatively large value of this expression yields pure child nodes, whereas a small value produces similar child nodes and thus decreases node purity.
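The paper does not state which software was used; as a sketch, scikit-learn's DecisionTreeClassifier supports the Gini and deviance (entropy) criteria directly, while the twoing rule is not available there (MATLAB's fitctree offers it as a split criterion). Reusing X and y from the sampling sketch above:

```python
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hold out 30% of the synthetic NDVI vectors for testing
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# "entropy" corresponds to the deviance splitting rule of Equation 5
for criterion in ("gini", "entropy"):
    tree = DecisionTreeClassifier(criterion=criterion, random_state=0)
    tree.fit(X_train, y_train)
    print(criterion, "accuracy:", tree.score(X_test, y_test))
```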
2.5. Dimension Reduction

In a multidimensional system like ours, different variables may produce redundant information. This happens when some variables move together under a hidden correlation. In our case, each sample is an NDVI vector of 8 mean NDVI values calculated on 8 different dates for each crop. By removing redundant information from the data, we can identify the NDVI values (dates) that contribute significantly to the classification process. This is achieved through principal component analysis (Jolliffe 2002). The approach transforms the original multidimensional space into a new space in which all dimensions are orthogonal to each other; this orthogonality eliminates the redundancy in the original data. Each new dimension is referred to as a principal component and can be written as a linear combination of the original variables. The first principal component captures the maximum variance, the second principal component the second-highest variance, and so forth. The matrix of eigenvalues, calculated as the measure of variance of the principal components, is given in Equation 8. The analysis shows that only 4 of the 8 features (mean NDVIs) are enough to explain >95% of the variance (for σ = 0.02).
PC = Σᵢ aᵢxᵢ    (7)

Equation 7 defines a principal component as a linear combination of the original variables, where aᵢ is the ith coefficient and xᵢ is the ith mean NDVI value.
Λ = diag(0.043, 0.009, 0.005, 0.002, 9.07×10⁻⁴, 5.75×10⁻⁴, 4.11×10⁻⁴, 2.43×10⁻⁴)    (8)

The columns correspond, in the original layout, to the NDVI features of Jan 1, Jan 15, Feb 1, Feb 15, Mar 1, Mar 15, Apr 1, and Apr 15; the first four eigenvalues together account for >95% of the total variance.
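A minimal sketch of this analysis with scikit-learn's PCA, assuming the sample matrix X from the earlier sampling sketch; the 0.95 cutoff mirrors the >95% variance criterion used to retain four components:

```python
from sklearn.decomposition import PCA

# Fit PCA on the NDVI vectors (8 features per sample)
pca = PCA().fit(X)

# Eigenvalues of the covariance matrix (cf. Equation 8)
print("eigenvalues:", pca.explained_variance_)

# Keep the smallest number of components explaining >= 95% of the variance
cumulative = pca.explained_variance_ratio_.cumsum()
n_keep = int((cumulative < 0.95).sum()) + 1
X_reduced = PCA(n_components=n_keep).fit_transform(X)
print("components kept:", n_keep)
```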
3. Results

Figure 5 shows the classification of all 17 crops in the form of a classification tree. The tree comprises 16 decision nodes and 17 leaf nodes. The decision nodes show the variables used for the classification decisions; these are the mean NDVI values calculated on Jan 1, Jan 15, Feb 1, Feb 15, Mar 1, Mar 15, Apr 1, and Apr 15. The leaf nodes represent the classified crops, i.e., the crops resulting from the decisions on mean NDVI values imposed at the decision nodes. The class labels at the leaf nodes are A through Q; the actual names of these crops are given in Table 1. NDVI-Jan 15 is used at three decision nodes: it separates crop 'P' from node NDVI-Jan 1, sub-node NDVI-Jan 1 from node NDVI-Feb 1, and crop 'L' from sub-node NDVI-Feb 1. NDVI-Jan 1 appears at five nodes: it separates crop 'Q' from node NDVI-Jan 15, crop 'A' from crop 'J', crop 'H' from crop 'G', crop 'E' from sub-node NDVI-Jan 15, and crop 'M' from sub-node NDVI-Feb 1. NDVI-Feb 1 is used at four decision nodes: it separates sub-nodes NDVI-Feb 1 and NDVI-Feb 15, the sub-nodes NDVI-Jan 1, crop 'D' from crop 'I', and crop 'N' from crop 'O'. The node NDVI-Feb 15 splits into two further sub-nodes, NDVI-Mar 1 and NDVI-Mar 15; crops 'F' and 'K' are separated by NDVI-Mar 1, whereas crops 'B' and 'C' are separated by NDVI-Mar 15. No crop is classified by NDVI-Apr 1 or NDVI-Apr 15. The corresponding confusion matrix given in Table 3 shows very few misclassifications, culminating in 99.9% accuracy. The different splitting criteria described above produced similar classification trees, which indicates the robustness of the NDVI vector dataset used in the classification process. The original dataset is composed of 8 features, i.e., each sample contains 8 mean NDVI values. For dimension reduction, PCA yielded only four principal components offering >95% of the variance; Table 5 shows the coefficients and total variance of the four selected principal components. The coefficients indicate that the mean NDVI values pertaining to the early-to-mid season contribute more significantly to the crop classification process
than the remaining mean NDVI values. The confusion matrix for the classification tree based on the selected features identified through PCA is shown in Table 4. Here we see a classification accuracy of 99.4%, with the misclassifications indicated. Although classification using all 8 features is more accurate than classification using only the selected features, the latter is still close to optimal. Figure 6 shows the relation between the predictive accuracy of the classification models and the classification error (a measure of misclassification) as the standard deviation varies. Two models are tested: the model using all 8 features and the model using only the 4 features selected through PCA. The predictive accuracy of both models (DT and PCA-based DT) decreases, and the respective classification errors (CE) increase, as the standard deviation (sigma) increases. Predictive accuracy drops from 100% to ≈76% as the standard deviation varies from 0.01 to 0.06, while the classification error increases from 0 to ≈0.25 over the same range. This helps in choosing an optimal value of the standard deviation for developing the decision tree classification model.
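As a hedged sketch of how such a sweep could be reproduced, reusing generate_samples and the hypothetical demo signatures from the earlier sketches; the sigma grid matches the 0.01-0.06 range reported above:

```python
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Retrain the classifier for each sigma and record hold-out accuracy (cf. Figure 6)
for sigma in (0.01, 0.02, 0.03, 0.04, 0.05, 0.06):
    X_s, y_s = generate_samples(demo, sigma=sigma)
    X_tr, X_te, y_tr, y_te = train_test_split(
        X_s, y_s, test_size=0.3, random_state=0, stratify=y_s)
    acc = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr).score(X_te, y_te)
    print(f"sigma={sigma:.2f}  accuracy={acc:.3f}  error={1 - acc:.3f}")
```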
4. Discussion

UAVs have greatly facilitated the acquisition of high-resolution image data for applications in precision agriculture. The flexibility, availability, and affordability inherent to this technology have outclassed traditionally used satellite imagery for site-specific precision farming (Stein 2018). The high-resolution crop imagery acquired by UAVs provides easy access to the fine detail at plant level. In this work, we presented multi-crop classification based on high-resolution NDVI data acquired with a multispectral sensor onboard a quadrotor. Many crops share similar visible features in terms of color, shape, growth, etc., making crop identification difficult. We chose a plot with 17 different crops for our experiment and performed 8 flight operations on different days at regular 15-day intervals, from which mean NDVI values and NDVI vectors were calculated for each crop. The provision of high-resolution
spatial and temporal NDVI time-series data laid the foundation for crop classification, as the results revealed. Based on the NDVI vectors, data samples were generated using a normal distribution, with the standard deviation (σ) defining the range within which the NDVI values vary for a given date. Specific NDVI values cannot be attributed discretely to the phenological phases of crops, as many factors affect them, e.g., weather, water stress, nutrient deficiency, calibration errors, image noise, soil reflectance, etc. The variation in NDVI values due to these factors is covered by defining a scale of variation through the standard deviation. We tested standard deviations over a range of values, from 0.01 to 0.06. For σ = 0.01, the generated NDVI samples offer an idealistic crop classification using the DT algorithm; however, this can be the result of overfitting, as the training data lack generality. For σ = 0.02 or σ = 0.03, the modeled NDVI values offer rather realistic sampling data, and the training data turn out to be more general as the NDVI values vary within an optimal interval. For σ ≥ 0.05, the overlap between NDVI values on different dates increases, causing misclassifications to grow. A close look at the DT classifier shows that NDVI values from the early-to-mid season contributed more significantly to multi-crop classification than the other NDVIs; for example, the NDVIs of Jan 1, Jan 15, and Feb 1 appear frequently at different decision nodes. A key finding is therefore that early-season NDVIs play a pivotal role in the classification and recognition of multiple crops using high-resolution aerial imagery. This also indicates that the NDVIs vary considerably during that specific period, making multi-crop recognition easier. The PCA-based dimension reduction likewise verifies the major contributions of specific NDVIs; e.g., the first principal component contains 69.69% of the variance, which mainly comes from the NDVIs of Jan 1, Jan 15, Feb 1, and Feb 15. Using only the selected features (NDVIs), the PCA-based confusion matrix indicates a near-optimal classification result with 99.4% accuracy. This is slightly less than the accuracy we
achieved using all features. The results suggest that high spatial and temporal resolution NDVI data may serve as a primary source for site-specific crop recognition. For future work, we intend to include other important crop traits such as plant height, shape, and color in the classification process for better multi-crop recognition; for this, multi-view crop imagery will be exploited for 3D model development and feature extraction. Hyperspectral sensors may also be used to obtain more precise high-resolution spatial and temporal NDVI signatures for multi-crop classification and recognition.
5. Acknowledgement

The support of the Endowment Fund Secretariat, University of Agriculture, Faisalabad, for this research work is deeply acknowledged.
6. References

Bargiel, D. 2017. A New Method for Crop Classification Combining Time Series of Radar Images and Crop Phenology Information. Remote Sensing of Environment, 198: 369-383.
Böhler, J. E., Schaepman, M. E., and Kneubühler, M. 2018. Crop Classification in a Heterogeneous Arable Landscape Using Uncalibrated UAV Data [online]. Remote Sensing, 10(8). doi:10.3390/rs10081282.
Cao, J., Leng, W., Liu, K., Liu, L., He, Z., and Zhu, Y. 2018. Object-Based Mangrove Species
Classification Using Unmanned Aerial Vehicle Hyperspectral Images and Digital Surface
Models [online]. Remote Sensing, 10(1). doi:10.3390/rs10010089.
Colwell, J. E. 1974. Vegetation Canopy Reflectance. Remote Sensing of Environment, 3(3): 175-183.
Feng, Q., Liu, J., and Gong, J. 2015. UAV Remote Sensing for Urban Vegetation Mapping
Using Random Forest and Texture Analysis [online]. Remote Sensing, 7(1): 1074–94.
https://doi.org/10.3390/rs70101074
Hall, O., Dahlin, S., Marstorp, H., Bustos, M., Öborn, I., and Jirström, M. 2018. Classification of
Maize in Complex Smallholder Farming Systems Using UAV Imagery [online]. Drones,
2(3). https://doi.org/10.3390/drones2030022.
Hao, P., Zhan, Y., Wang, L., Niu, Z., and Shakir, M. 2015. Feature Selection of Time Series
MODIS Data for Early Crop Classification Using Random Forest: A Case Study in
Kansas, USA [online]. Remote Sensing, 7(5): 5347–69.
https://doi.org/10.3390/rs70505347.
Huang, H., Deng, J., Lan, Y., Yang, A., Deng, X., and Zhang, L. 2018. A Fully Convolutional
Network for Weed Mapping of Unmanned Aerial Vehicle (UAV) Imagery [online]. PLoS
ONE, 13(4). https://doi.org/10.1371/journal.pone.0196302.
Hung, C., Xu, Z., and Sukkarieh, S. 2014. Feature Learning Based Approach for Weed
Classification Using High Resolution Aerial Images from a Digital Camera Mounted on a
UAV. Remote Sensing, 6(12): 12037–54.
Hütt, C., Koppe, W., Miao, Y., and Bareth, G. 2016. Best Accuracy Land Use/Land Cover (LULC) Classification to Derive Crop Types Using Multitemporal, Multisensor and Multi-Polarization SAR Satellite Images [online]. Remote Sensing, 8(8). https://doi.org/10.3390/rs8080684.
Hyeok, G., and Wan, J. 2017. Land Cover Classification with High Spatial Resolution Using Orthoimage and DSM Based on Fixed-Wing UAV. Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography, 35(1): 1-10.
James, G., Witten, D., Hastie, T., and Tibshirani, R. 2013. An Introduction to Statistical
Learning. Springer, USA.
Jolliffe, I.T. 2002. Principal Component Analysis, Series: Springer Series in Statistics. Springer
Science & Business Media. USA.
Jost, L. 2006. Entropy and Diversity [online]. Oikos, 113(2): 363-375. https://doi.org/10.1111/j.2006.0030-1299.14714.x.
Latif, M. A. 2018. An Agricultural Perspective on Flying Sensors: State of the Art, Challenges,
and Future Directions. IEEE Geoscience and Remote Sensing Magazine, 6(4): 10–22.
Lin, C. W., Ding, Q., Tu, W. H., Huang, J. H., and Liu, J. F. 2019. Fourier Dense Network to Conduct Plant Classification Using UAV-Based Optical Images. IEEE Access, 7: 17736-17749.
Liu, B., Shi, Y., Duan, Y., and Wu, W. 2018. UAV-Based Crops Classification with Joint
Features from Orthoimage and DSM Data. International Archives of the
Photogrammetry, Remote Sensing & Spatial Information Sciences, 42(3): 1023–28.
Liu, J. 2018. A Combined Method for Vegetation Classification Based on Visible Bands from UAV Images: A Case Study for Invasive Wild Parsnip Plants. M.Sc. Thesis, School of Environmental Studies, Queen's University, Canada.
López-Granados, F., Torres-Sánchez, J., Serrano-Pérez, A., de Castro, A. I., Mesas-
Carrascosa, F. J., and Pena, J. M. 2016. Early Season Weed Mapping in Sunflower
Using UAV Technology: Variability of Herbicide Treatment Maps against Weed
Thresholds. Precision Agriculture, 17(2): 183–99.
Lottes, P., Khanna, R., Pfeifer, J., Siegwart, R., and Stachniss, C. 2017. UAV-Based Crop and Weed Classification for Smart Farming. In 2017 IEEE International Conference on Robotics and Automation (ICRA), IEEE, pp. 3024-3031.
Park, J. K., and Park, J. H. 2015. Crops Classification Using Imagery of Unmanned Aerial Vehicle (UAV). Journal of the Korean Society of Agricultural Engineers, 57(6): 91-97.
Peel, M. C., Finlayson, B. L., and McMahon, T. A. 2007. Updated World Map of the Köppen-Geiger Climate Classification. Hydrology and Earth System Sciences Discussions, 4(2): 439-473.
Pérez-Ortiz, M., Peña, J. M., Gutiérrez, P. A., Torres-Sánchez, J., Hervás-Martínez, C., and López-Granados, F. 2016. Selecting Patterns and Features for Between- and Within-Crop-Row Weed Mapping Using UAV Imagery. Expert Systems with Applications, 47: 85-94.
Rebetez, J., Satizábal, H. F., Mota, M., Noll, D., Büchi, L., Wendling, M., and Burgos, S. 2016.
Augmenting a Convolutional Neural Network with Local Histograms - A Case Study in
Crop Classification from High-Resolution UAV Imagery. In European Symposium on
Artificial Neural Networks, Computational Intelligence and Machine Learning, Bruges
(Belgium).
Rokach, L., and Maimon, O. Z. 2014. Data Mining with Decision Trees: Theory and
Applications. World Scientific.
Stein, N. 2018. The Future of Drones in the Modern Farming Industry. GEOmedia, 21(5).
Wei, H., and Mao, J. 2018. The Recognition of Rice Area Images by UAV Based on Deep
Learning. In MATEC Web of Conferences (EITCE 2018).
Wu, M., Huang, W., Niu, Z., Wang, Y., Wang, C., Li, W., and Yu, B. 2017. Fine Crop Mapping
by Combining High Spectral and High Spatial Resolution Remote Sensing Data in
Complex Heterogeneous Areas. Computers and Electronics in Agriculture, 139: 1–9.
Yang, G., Liu, J., Zhao, C., Li, Z., Huang, Y., Yu, H., and Zhang, R. 2017. Unmanned Aerial
Vehicle Remote Sensing for Field-Based Crop Phenotyping: Current Status and
Perspectives [online]. Frontiers in Plant Science, 8(2017).
https://doi.org/10.3389/fpls.2017.01111.
Zisi, T., Alexandridis, T., Kaplanis, S., Navrozidis, I., Tamouridou, A. A., Lagopodi, A., and
Polychronos, V. 2018. Incorporating Surface Elevation Information in UAV Multispectral
Images for Mapping Weed Patches [online]. Journal of Imaging, 4(11).
https://doi.org/10.3390/jimaging4110132.
Figure Captions
Figure 1. Experimental plot with 17 different crops labeled
Figure 2. Phantom-4 DJI with payload
Figure 3. Classification of pixels for one image (canopy vs soil)
Figure 4. Climate variables observed during the experiment phase (2016-17)
Figure 5. Classification tree for 17 crops with NDVI values at nodes used for decisions (σ=0.02)
Figure 6. Change in models’ accuracy and classification errors as sigma varies
Table 1. List of crops studied

Sr. no.  Label  Crop Name        Seasonal Time (Punjab, Pakistan)
1        A      Wheat            Nov-April
2        B      Barley           Nov-April
3        C      Oat              Nov-April
4        D      Clover           Nov-April
5        E      Alfalfa          Nov-April
6        F      Rapeseed         Nov-April
7        G      UAF-11 Rapeseed  Nov-April
8        H      Mustard          Nov-April
9        I      Linseed          Nov-April
10       J      Kusumba          Nov-April
11       K      Haloon           Nov-April
12       L      Methre           Nov-April
13       M      Lentil           Nov-April
14       N      Chickpea         Nov-April
15       O      Fennel           Nov-April
16       P      Soo ye           Nov-April
17       Q      Black cumin      Nov-April
Table 2. Lens calibration parameters of the ADC Micro sensor

Principal Distance (mm):  8.5097
Optical Center:           xp = 0.0593, yp = -0.0655
Radial Distortion:        K1 = 1.18e-3, K2 = 1.65e-5, K3 = -1.85e-6
Differential Scaling:     b1 = 0, b2 = 0
Pixel Size (mm):          0.0032
Table 3. DT-based confusion matrix for classification with 8 features (99.9% accuracy); diagonal entries, i.e., correctly classified samples out of 500 per crop

A: 499   B: 500   C: 499   D: 500   E: 500   F: 500   G: 500   H: 500   I: 500
J: 500   K: 499   L: 500   M: 499   N: 499   O: 500   P: 500   Q: 500
(five samples in total were misclassified)
Table 4. DT-based confusion matrix for classification with 4 features (99.4% accuracy); diagonal entries, i.e., correctly classified samples out of 500 per crop

A: 496   B: 500   C: 497   D: 476   E: 488   F: 500   G: 499   H: 497   I: 500
J: 500   K: 500   L: 500   M: 500   N: 500   O: 497   P: 500   Q: 500
(50 samples in total were misclassified, most involving crops D and E)
Table 5. Coefficients and variance of 4 principal components

Principal   NDVI     NDVI      NDVI     NDVI      NDVI     NDVI      NDVI     NDVI      Variance
Component   (Jan 1)  (Jan 15)  (Feb 1)  (Feb 15)  (Mar 1)  (Mar 15)  (Apr 1)  (Apr 15)  (%)
PC1          0.26     0.55      0.59     0.34      0.07    -0.04     -0.30    -0.23     69.69
PC2          0.03     0.28      0.21     0.19     -0.03     0.21      0.68     0.56     15.04
PC3         -0.38    -0.15     -0.08     0.49      0.61     0.33     -0.23     0.16      7.94
PC4          0.33     0.16     -0.25    -0.17      0.30     0.55      0.31    -0.51      3.90