
Use of Unmanned Aerial Vehicle for Multi-temporal Monitoring of Soybean Vegetation Fraction

Hee Sup Yun1, Soo Hyun Park2, Hak-Jin Kim1,3*, Wonsuk Daniel Lee4, Kyung Do Lee5, Suk Young Hong5, Gun Ho Jung6

1Department of Biosystems & Biomaterials Science and Engineering, Seoul National University, Seoul 08826, Korea
2KIST Gangneung Institute of Natural Products, Gangneung 25451, Korea
3Research Institute for Agriculture and Life Sciences, Seoul National University, Seoul 08826, Korea
4Department of Agricultural and Biological Engineering, University of Florida, Gainesville, FL 32611, United States
5Department of Agricultural Environment, National Academy of Agricultural Science, Jeollabuk-do 55265, Korea
6Upland Crop Research Div., National Institute of Crop Science, Gyeonggi-do 16429, Korea

Received: March 7th, 2016; Revised: May 12th, 2016; Accepted: May 20th, 2016

Purpose: The overall objective of this study was to evaluate the vegetation fraction of soybeans grown under different cropping conditions using an unmanned aerial vehicle (UAV) equipped with a red, green, and blue (RGB) camera. Methods: Test plots were prepared based on different cropping treatments, i.e., soybean single-cropping with and without herbicide application, and soybean-barley cover cropping with and without herbicide application. The UAV flights were manually controlled using a remote flight controller on the ground, with 2.4 GHz radio frequency communication. For image pre-processing, the acquired images were pre-treated and georeferenced using a fisheye distortion removal function, and ground control points were collected using Google Maps. Tarpaulin panels of different colors were used to calibrate the multi-temporal images by converting the RGB digital number values into RGB reflectance using a linear regression method. Excess Green (ExG) vegetation indices for each of the test plots were compared with the M-statistic method in order to quantitatively evaluate the greenness of soybean fields under the different cropping systems. Results: The reflectance calibration methods used in the study showed high coefficients of determination, ranging from 0.8 to 0.9, indicating the feasibility of a linear regression fitting method for monitoring multi-temporal RGB images of soybean fields. As expected, the ExG vegetation indices changed according to the soybean growth stages, showing clear differences among the test plots with different cropping treatments in the early season of < 60 days after sowing (DAS). With the M-statistic method, the test plots under different treatments could be discriminated in the early season of < 41 DAS, showing values of M > 1. Conclusion: Multi-temporal images obtained with a UAV and an RGB camera can therefore be applied to quantify overall vegetation fractions and crop growth status, and this information could contribute to determining proper treatments for managing the vegetation fraction.

Keywords: Barley cover cropping, Excess green, Image processing, M-statistic method, UAV, Vegetation index

Original Article
Journal of Biosystems Engineering
J. of Biosystems Eng. 41(2):126-137. (2016. 6)
http://dx.doi.org/10.5307/JBE.2016.41.2.126
eISSN: 2234-1862, pISSN: 1738-1266

*Corresponding author: Hak Jin Kim
Tel: +82-2-880-4604; Fax: +82-2-873-2049
E-mail: [email protected]

Copyright ⓒ 2016 by The Korean Society for Agricultural Machinery. This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0), which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Introduction

In recent years, UAVs have been commonly used for low-altitude and high-resolution remote sensing applications in precision agriculture owing to several advantages, such as their versatile applicability, light weight, and low operational costs (Hunt et al., 2010; Garcia-Ruiz et al., 2013). In particular, UAVs have been successfully used to assess the vegetation status of crops and to predict yields, because the operational flexibility of vertical takeoff and landing platforms carrying various image sensors makes it easy to fly over agricultural fields and acquire aerial images with high spatial and temporal resolutions (Torres-Sánchez et al., 2014).

The acquired aerial images can help farmers evaluate crop growth status by providing useful information about canopy greenness, leaf area, and water stress, as well as about various geographical conditions such as crop areas, digital surface models (DSM), and depth contour lines (Jimenez and Agudelo, 2015). Recently, many studies using UAV-acquired aerial images for agricultural applications have been reported, such as the monitoring of early-stage crop growth (Torres-Sánchez et al., 2014) and the development of a weed detection method (Peña et al., 2013).

A common use of remote sensing is to evaluate crop growth status based on canopy greenness by quantifying the distribution of a vegetation index (VI) in the crop field. Various vegetation indices, including the Normalized Difference Vegetation Index (NDVI) and Excess Green (ExG), are defined from reflectance values of the vegetation canopy in a given ground area (Woebbecke et al., 1995). However, since remote sensing images are affected by changes in illumination, atmospheric conditions, and viewing geometry, several image-processing steps must be performed prior to the vegetation index analysis, including radiometric calibration and geometric correction. In particular, when investigating temporal changes in time-series datasets of remotely sensed images, careful radiometric calibration is required to obtain time-invariant normalized images, because vegetation indices are easily affected by radiometric disturbances. A number of image calibration and geometric correction methods for small unmanned vehicles have been developed to account for variations in atmospheric transmission, sun azimuth, and aircraft attitude (Berni et al., 2009).

Separating images into different classes to extract meaningful features, or image thresholding, is a common task in image-processing applications that is usually performed based on certain characteristics of the image pixels, such as thresholds, peaks, valleys, and clusters. Various classification algorithms, such as the Otsu and valley-emphasis methods, are used to perform clustering-based image thresholding automatically. The Otsu method is one of the most commonly used because of the simplicity with which it calculates the optimum threshold separating the two classes of a bimodal histogram. This method assumes that an image contains two classes of pixels and calculates an optimum threshold by minimizing the combined spread of the two classes (Torres-Sánchez et al., 2014).
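For illustration, a minimal sketch of this computation (not the implementation used in the study) follows; it exhaustively scores every candidate threshold of an 8-bit grayscale histogram by the between-class variance, whose maximization is equivalent to minimizing the combined within-class spread:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the Otsu threshold of an 8-bit grayscale image.

    Exhaustively evaluates every candidate threshold and keeps the
    one maximizing the between-class variance, which is equivalent
    to minimizing the combined within-class spread.
    """
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    prob = hist / hist.sum()                      # normalized histogram
    levels = np.arange(256)
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()   # class probabilities
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = (levels[:t] * prob[:t]).sum() / w0  # class means
        mu1 = (levels[t:] * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2  # class separation
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t
```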

The objective of this research was to investigate the potential of a multirotor-based UAV equipped with a low-cost digital RGB camera, along with appropriate calibration methodologies, for quantifying temporal and spatial variations in the vegetation index of soybeans grown under different cropping conditions. The treatments consisted of four cropping systems: soybean single-cropping with and without herbicide application, and soybean cropping with barley as a cover crop with and without herbicide application. The assessment was conducted based on a comparison of the vegetation index maps obtained from the different crop canopies.

Materials and Methods

Test plots

UAV remote sensing was conducted in 2014 in a soybean field of the National Institute of Crop Science (NICS), Rural Development Administration (RDA). The soybean field was located at 894-51, Gosaek-dong, Gwonseon-gu, Suwon-si, Gyeonggi-do, Korea (378530E, 4125344N, 30 m above sea level, UTM zone 52N). The effects of herbicide application and a barley cover crop were tested in the soybean field experiment and compared in terms of soybean growth status and overall vegetation fraction.

A soybean crop (Glycine max L. Merrill) was planted on May 22. A split-plot design was used for the herbicide treatment and the barley cover crop arrangement. The test site was divided into 14 individual plots of 30 × 14 m, corresponding to a total area of about 0.3 ha (Figure 1). A conventional amount of herbicide for soil treatment (a mixture of Ethalfluralin and Linuron) was applied to the main plot, and barley (Fenn beard-less, 15 kg/10a) was planted as a cover crop in the subplot. The subplots contained three replications for the barley-covered soybean and four replications for the conventional soybean field.

The barley cover crop in the test plots was expected to suppress the development of weeds in the early season and maintain a proper vegetation fraction (Kobayashi et al., 2004). In addition, it has been reported that managing a proper vegetation fraction is important in upland fields to prevent soil erosion during the fallow period and early season (Lu et al., 2000; Poggio, 2005).


Figure 1. Split-plot design for the soybean experiments with two treatments and three replications. The letter 'N' indicates no herbicide sprayed, 'H' indicates herbicide sprayed, 'B' indicates a barley cover crop, and 'A' indicates soybean alone.

Figure 2. Views of the UAV platform equipped with a camera (a) and the UAV flying over the experimental crop field with the color reference panels (b).

Table 1. Specifications of the UAV platform used in the study

Item                  Specifications
Airframe              DJI F550 hexa-rotor
Flight controller     DJI NAZA-M Lite
Diagonal wheelbase    550 mm
Propeller             8 × 4.5 in (8045)
Battery               4S Li-Po, 6000 mAh
Motor                 Stator size: 22 × 12 mm, KV: 920 rpm/V
ESC                   30 A OPTO, signal frequency: 30-450 Hz
Takeoff weight        2,272 g
Maximum flight time   15 min (approx.)

UAV platform and image sensor

The UAV airframe system used in this study was built on a hexa-rotor airframe model (F550, DJI, China) equipped with a low-cost digital RGB camera (HERO 3+, GoPro, USA). The UAV flight was manually controlled using an R/C flight controller (NAZA-M Lite, DJI, China) on the ground through 2.4 GHz radio frequency communication. Figure 2 shows the UAV (Figure 2a) and the color reference panels used in the study for calibration (Figure 2b). Detailed specifications of the UAV platform are presented in Table 1.

The camera used in the study to capture RGB images provided 12 megapixels, with a SONY IMX117 image sensor and a 14 mm focal-length lens. The different spectral responses of the red, green, and blue bands to different wavelengths, taken from the datasheet provided by the manufacturer (Figure 3), were considered in the image processing.

Image acquisition and preprocessing

Prior to image acquisition with the UAV, real-time images were monitored to determine a proper flight altitude and field of view. For this purpose, a Wi-Fi remote application (GoPro App, GoPro, USA) was used to relay the camera image to an Android smartphone. The images were acquired using a time-lapse function that took one image every three seconds while the altitude and orientation of the UAV were varied (Figure 4). The UAV aerial images of the test field were taken at 2-week intervals for four months, from June to October 2014.

Figure 3. Relative spectral responses to different wavelengths of the SONY IMX117 RGB sensors.

Figure 4. Footprints of the images acquired during a flight plan conducted by changing the flight altitudes over the soybean test field. Green rectangles correspond to single image frames.

Figure 5. The original image acquired with the GoPro camera (a) and the image corrected for lens distortion (b).

As shown in Figure 5, in the first step of image pre-processing, the acquired images were pre-treated using a fisheye-removal function provided by commercial photo-handling software (Adobe Lightroom 5.5) in order to generate images corrected for lens distortion. The images were then georeferenced with ground control points (GCPs) collected from Google Maps. Finally, image mapping was performed using the GCPs based on the WGS84 UTM 52N coordinate system (Figure 6).
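The study used Lightroom's fisheye-removal function; as a hedged alternative, the same kind of correction can be sketched with OpenCV's fisheye model. The camera matrix K and distortion coefficients D below are hypothetical placeholders that would normally come from a one-time checkerboard calibration of the specific camera:

```python
import cv2
import numpy as np

# Hypothetical intrinsics: in practice K and D come from a checkerboard
# calibration of the specific camera unit (cv2.fisheye.calibrate).
K = np.array([[875.0,   0.0, 960.0],
              [  0.0, 875.0, 540.0],
              [  0.0,   0.0,   1.0]])      # camera matrix (pixels)
D = np.array([0.08, -0.02, 0.0, 0.0])      # fisheye coefficients k1..k4

img = cv2.imread("frame_0001.jpg")         # one time-lapse frame
h, w = img.shape[:2]

# Compute the undistortion maps once, then remap every frame with them.
new_K = cv2.fisheye.estimateNewCameraMatrixForUndistortRectify(
    K, D, (w, h), np.eye(3), balance=0.0)
map1, map2 = cv2.fisheye.initUndistortRectifyMap(
    K, D, np.eye(3), new_K, (w, h), cv2.CV_16SC2)
undistorted = cv2.remap(img, map1, map2, interpolation=cv2.INTER_LINEAR)
cv2.imwrite("frame_0001_undistorted.jpg", undistorted)
```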

Radiometric calibration

A radiometric calibration of the RGB digital numbers in each original image, captured under different outdoor illumination conditions, was conducted using the empirical line method (Smith and Milton, 1999). Seven 600 × 900 mm calibration targets, made of commercial color tarpaulins with uniform reflectance characteristics, were placed within the flight path of the UAV platform. As shown in Figure 7, the reference spectral reflectance of the calibration targets was measured in the 400-700 nm spectral range with a spectrophotometer (ColorMate, Scinco Co., Korea). To determine the RGB reflectance values to be used in the empirical line calibration method, the mean reflectance values of the calibration targets were calculated using Eq. 1 (Table 2).

Figure 6. Ground control points used for image mapping (a) and the resulting image (b).

Figure 7. Reflectance spectra of the calibration reference targets.

Table 2. Calculated mean spectral reflectance of the calibration targets

Calibration target    R      G      B
Red tarpaulin         0.33   0.14   0.11
Green tarpaulin       0.30   0.38   0.28
Blue tarpaulin        0.11   0.10   0.12
Yellow tarpaulin      0.43   0.26   0.15
Gray tarpaulin        0.28   0.25   0.26
Black tarpaulin       0.09   0.09   0.09
White tarpaulin       0.89   0.89   0.89

\bar{\rho}_b = \frac{\int \rho(\lambda)\, S_b(\lambda)\, d\lambda}{\int S_b(\lambda)\, d\lambda}, \quad b \in \{R, G, B\}    (1)

where,
\rho(\lambda): spectral reflectance of the calibration target,
S_b(\lambda): relative spectral sensitivity of the image sensor for the R, G, and B bands.
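A minimal numerical sketch of Eq. 1, assuming the spectrophotometer curve and the sensor response are sampled on a common wavelength grid (the flat gray target and the Gaussian green-band response below are hypothetical stand-ins for the measured curves):

```python
import numpy as np

def band_mean_reflectance(wavelengths, target_rho, sensor_rsr):
    """Sensor-weighted mean reflectance of one target in one band (Eq. 1)."""
    num = np.trapz(target_rho * sensor_rsr, wavelengths)
    den = np.trapz(sensor_rsr, wavelengths)
    return num / den

# Hypothetical example: a flat 30% gray tarpaulin and a Gaussian-shaped
# green-band response standing in for the IMX117 datasheet curve.
lam = np.arange(400.0, 701.0)                      # 400-700 nm grid
rho = np.full_like(lam, 0.30)                      # spectrophotometer curve
rsr_green = np.exp(-0.5 * ((lam - 530.0) / 40.0) ** 2)
print(band_mean_reflectance(lam, rho, rsr_green))  # -> 0.30
```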

The empirical line calibration method derived the coefficients needed to fit the uncalibrated RGB imagery acquired with the GoPro camera to the lab-measured reflectance spectra of the field targets. It was assumed that the reflectance values of the calibration targets were linearly related to the RGB band digital numbers (Wang and Myint, 2015) (Eq. 2).

\rho = a \cdot DN + b    (2)

where,
a and b: coefficients of the linear equation obtained from regression for each band,
\rho: reflectance, DN: digital number for each band.
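A short sketch of the per-band empirical line fit of Eq. 2. The red-band reference reflectances are taken from Table 2, while the digital numbers are hypothetical stand-ins for the panel means extracted from a UAV image:

```python
import numpy as np

def empirical_line_fit(dn, rho):
    """Fit rho = a * DN + b for one band (Eq. 2) and report R^2."""
    dn, rho = np.asarray(dn, float), np.asarray(rho, float)
    a, b = np.polyfit(dn, rho, deg=1)
    pred = a * dn + b
    r2 = 1.0 - np.sum((rho - pred) ** 2) / np.sum((rho - rho.mean()) ** 2)
    return a, b, r2

# Red-band reference reflectances of the seven tarpaulins (Table 2);
# the digital numbers are hypothetical image-extracted means.
rho_red = [0.33, 0.30, 0.11, 0.43, 0.28, 0.09, 0.89]
dn_red = [150, 140, 55, 190, 130, 45, 235]
a, b, r2 = empirical_line_fit(dn_red, rho_red)
```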

Figure 8 shows sample calibration curves relating digital numbers to reflectance for each of the RGB bands. The digital numbers of the standard panels placed on the ground were extracted from the UAV images, and the fitted curves exhibited coefficients of determination higher than 0.9.

The results of the linear fitting obtained with this empirical method demonstrated that the spectral reflectance was successfully calibrated against the standard spectrophotometer measurements, yielding in most instances coefficients of determination between 0.85 and 0.99 (Table 3).

Vegetation index calculation and evaluation

A vegetation index, ExG (Excess Green, Eq. 3), was used in the study to quantify the vegetation fraction in the test plots, since it has been reported that ExG values can effectively visualize the vegetation fraction and vegetation growth based on RGB images (Torres-Sánchez et al., 2014). In this study, the vegetation index was calculated from the radiometrically calibrated RGB reflectance values instead of from the RGB digital numbers.

Figure 8. Sample calibration curves of the RGB sensors, showing the linear relationship between digital numbers and reflectance, with coefficients of determination > 0.9 (20141015, 146 days after sowing).

Table 3. Results of the calibration curves of the RGB sensor for images obtained at different dates

Image date          Band   a        b        R^2
20140619 (28 DAS)   R      0.0250   0.093    0.91
                    G      0.0020   0.125    0.99
                    B      0.0021   0.163    0.88
20140702 (41 DAS)   R      0.0026   0.069    0.98
                    G      0.0029   -0.013   0.96
                    B      0.0022   0.042    0.96
20140716 (55 DAS)   R      0.0023   0.131    0.97
                    G      0.0022   0.109    0.96
                    B      0.0017   0.129    0.85
20140730 (69 DAS)   R      0.0033   -0.155   0.95
                    G      0.0032   -0.144   0.91
                    B      0.0025   -0.065   0.79
20140812 (82 DAS)   R      0.0036   -0.275   0.91
                    G      0.0019   -0.008   0.95
                    B      0.0016   0.036    0.74
20140823 (93 DAS)   R      0.0033   -0.193   0.89
                    G      0.0021   0.015    0.97
                    B      0.0016   0.067    0.54
20140904 (105 DAS)  R      0.0024   0.049    0.95
                    G      0.0026   -0.012   0.95
                    B      0.0020   0.012    0.91
20140922 (123 DAS)  R      0.0018   0.246    0.97
                    G      0.0019   0.193    0.94
                    B      0.0015   0.188    0.90
20141007 (138 DAS)  R      0.0018   0.226    0.98
                    G      0.0019   0.167    0.97
                    B      0.0015   0.150    0.93
20141015 (146 DAS)  R      0.0019   0.237    0.95
                    G      0.0021   0.156    0.94
                    B      0.0017   0.149    0.90

ExG = 2\rho_G - \rho_R - \rho_B    (3)

where,
\rho_R, \rho_G, and \rho_B: reflectance values of an image in the red, green, and blue bands.
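Eq. 3 is a direct per-pixel operation; a minimal sketch on calibrated reflectance arrays (the array name 'refl' is an assumption) is:

```python
def exg_map(rho_r, rho_g, rho_b):
    """Per-pixel Excess Green index (Eq. 3) from the radiometrically
    calibrated reflectance arrays of the red, green, and blue bands."""
    return 2.0 * rho_g - rho_r - rho_b

# Usage sketch: 'refl' is an H x W x 3 array of calibrated reflectance,
# produced by applying each band's empirical line to the raw image.
# exg = exg_map(refl[..., 0], refl[..., 1], refl[..., 2])
```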

Vegetation fraction

In order to perform the image classification, the value of each grayscale image pixel was compared with a predetermined threshold; if the pixel value was higher than the threshold, it was recognized as a vegetation pixel. Once the image pixels were classified, the vegetation fraction (VF) was determined as the percentage of pixels classified as vegetation per unit area of ground surface, i.e., the ratio between the pixels classified as vegetation and the total pixels in the delimited area (Torres-Sánchez et al., 2014). The VF values for each frame were calculated using each of the threshold values. The Otsu threshold was applied to the grayscale images of every temporal acquisition. Because the Otsu method is widely used to perform clustering-based image classification automatically, it was chosen to evaluate the feasibility of automatic threshold determination. The obtained threshold values were compared with the original color images to determine whether the binary image could explain the vegetation fraction and whether this methodology was applicable to quantifying VF from UAV imagery.

Figure 9. Sample histograms used in the M-statistic applied to the test plots (20140702, 41 days after sowing).

Figure 10. Flowchart of the general image-processing steps, including image acquisition, image preparation, vegetation index calculation, and data analysis.
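As a sketch of the vegetation fraction computation described above, using scikit-image's Otsu implementation on a per-plot ExG map (the library choice and array names are assumptions, not the study's code):

```python
import numpy as np
from skimage.filters import threshold_otsu

def vegetation_fraction(exg):
    """Vegetation fraction of a plot: the percentage of pixels whose
    ExG value exceeds the automatically determined Otsu threshold."""
    t = threshold_otsu(exg)        # clustering-based threshold on ExG
    veg = exg > t                  # True = pixel classified as vegetation
    return 100.0 * veg.mean(), veg

# vf_percent, mask = vegetation_fraction(exg)  # exg: 2-D ExG map of a plot
```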

Treatment separability

A statistical index, the M-statistic, computed from histograms obtained from different treatments, was used to assess the vegetation fractions in the test plots under the different treatments, in terms of both temporal and spatial variability (Eq. 4, Kaufman and Remer, 1994). In principle, the M-statistic quantifies the degree of discrimination between two pixel groups, assuming that they are normally distributed. An M value between the pixel groups lower than 1 means that the two histograms overlap each other and have poor separability, whereas an M value higher than 1 means that the histograms are well discriminated (Pereira et al., 1999). Figure 9 shows sample histograms used in the M-statistic.

M = \frac{\mu_1 - \mu_2}{\sigma_1 + \sigma_2}    (4)

where,
\mu_1, \mu_2: mean vegetation index values for treatments 1 and 2,
\sigma_1, \sigma_2: standard deviations of the vegetation index values for treatments 1 and 2.
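Eq. 4 reduces to a few lines of code; a minimal sketch for two treatment pixel groups (the mask names are hypothetical):

```python
import numpy as np

def m_statistic(vi_a, vi_b):
    """Separability (Eq. 4) between the vegetation index pixel groups of
    two treatments; |M| > 1 indicates well-separated histograms."""
    return (np.mean(vi_a) - np.mean(vi_b)) / (np.std(vi_a) + np.std(vi_b))

# Usage sketch with hypothetical per-plot masks:
# m_na_hb = m_statistic(exg[mask_NA], exg[mask_HB])
```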

Figure 10 shows the flowchart of the general steps conducted in the study, including image acquisition, image preprocessing, calculation of the vegetation fraction, and data analysis.

Results and Discussion

Multi-temporal soybean field images

Figure 11 shows changes in the soybean field images taken with the multirotor-based UAV at 2-week intervals, after the radiometric calibrations and removal of lens distortion. As expected, the color images clearly changed over time according to the varying soybean and barley growth stages. This experiment demonstrated that RGB images acquired onboard a UAV can provide useful information about how soybean canopies change over time under different cropping conditions, thus enabling the identification of vegetation fraction changes in soybean fields.

Changes in ExG vegetation index over time

Figure 12 shows ExG vegetation index maps of the soybean test plots calculated from the multi-temporal images taken at different dates. As expected, they show that the vegetation fractions changed over time, owing to changes in the soybean canopy greenness. Furthermore, there were clear differences in ExG values among the individual plots cultivated under the different cropping conditions, especially in the early season of < 55 days after sowing (DAS). However, the ExG values reached steady-state responses, showing no difference between the test plots, particularly in the late season of > 120 DAS.

Figure 11. Multi-temporal soybean field images taken with the UAV, after radiometric calibration and correction for lens distortion, at different days after sowing.

In order to compare the ExG values obtained from the different treatments, mean values and standard deviations were calculated using the UAV images from each of the test plots. Consistent with the graphical observation in Figure 12, there were clear differences in mean ExG values among the test plots when image maps from the early season of < 60 DAS were compared (Figure 13). At 28 DAS, the ExG values for the test plots without herbicide application, NA and NB, were higher than those for the herbicide-sprayed plots, HA and HB. In addition, at this early stage, for plots with the same herbicide treatment, higher ExG values were measured in the barley cover-cropped plots than in those with a soybean monoculture. At 41 DAS, the order of the mean ExG values changed, with values decreasing for the plots without herbicide but with a barley cover crop and increasing for the plots with neither herbicide application nor a barley cover crop.

The vegetation fractions, derived from the ExG vegetation index and the Otsu method, showed significant changes only in the early season, because they were saturated after the mid-growth season. As shown in Figure 14, in images taken 28 days after sowing, the herbicide-sprayed plots showed a lower vegetation fraction than the non-sprayed plots, because the herbicide suppressed overall vegetation growth, including that of the soybean crop. Barley cover-cropped plots showed a higher vegetation fraction, because barley grows faster than soybean and maintains a high vegetation fraction in the spring season. In images taken 41 days after sowing, the vegetation fraction was saturated in the non-sprayed plots. In the herbicide-sprayed plots, the barley cover-cropped plots showed higher vegetation fraction values than the soybean monoculture plots.


Figure 12. ExG vegetation index maps of the soybean test plots, calculated from multi-temporal images taken at different days after sowing.

Figure 13. Multi-temporal change of mean ExG values under different treatments.

Separability tests

M-statistic values were calculated using the means and standard deviations of the ExG pixel groups for the different test plots (Table 4). In the early season, the treatment effect appeared to be well captured by the ExG vegetation index, since the M-statistic values were at their highest at 41 DAS (M > 1 for all treatment pairs except HA-HB; Table 4). However, after 82 DAS, the M-statistic values were lower (M ≅ 0) and the treatment separability became poor. Hence, the treatments could not be discriminated in the late season using the M-statistic values.

The differences in the ExG vegetation index and the treatment separability found in the early season were verified using low-flight-altitude images (< 10 m) (Figure 15), on the assumption that ExG values are proportional to the vegetation fraction or vegetation growth (Torres-Sánchez et al., 2014). Barley was the dominant crop in the test fields, growing faster than soybean in the early stages, which supports the result that barley cover-cropped plots have a higher vegetation fraction than soybean monoculture plots.

Figure 14. Vegetation fraction in monoculture and barley cover-cropped fields in the early season, 28 and 41 days after sowing.

Table 4. Comparison of M-statistic values obtained from the four test plots at different crop growth stages

DAS   NA-NB    NA-HA    NA-HB    NB-HA    NB-HB    HA-HB
28    -0.26    0.73     0.23     1.51     0.73     -0.82
41    1.15     2.25     2.00     1.50     1.07     -0.62
55    1.15     0.04*    0.89     -0.93    -0.27    0.72
69    -1.01    0.26     0.55     1.25     1.20     0.37
82    0.25     -0.41    0.05*    -0.63    -0.16    0.36
93    0.03*    0.51     0.88     0.54     0.98     0.29
105   0.34     -0.41    -0.03*   -0.70    -0.31    0.34
123   0.19     0.03*    0.15     -0.14    -0.08*   0.09*
138   -0.01*   0.59     0.61     0.67     0.70     -0.05*
146   0.01*    0.31     0.12     0.34     0.13     -0.22

Bold for M > 1 (good separability) and * for M ≅ 0 (bad separability).

Figure 15. Low-altitude images used as ground truth: 28 DAS (left) and 41 DAS (right).

Moreover, it was clearly shown that the plots with herbicide application had a lower vegetation fraction than the plots without herbicide, owing to the herbicide's suppression of weeds. The 'NA' treatment showed the highest ExG value at 41 DAS, and such a high index might be related to the presence of uncontrolled weeds (Figure 15). As shown in Figure 15, the barley cover-cropped field showed the highest vegetation fraction compared to the soybean monoculture field at 28 and 41 DAS. Furthermore, the figure shows that the herbicide treatment suppressed the overall vegetation, including soybean, barley, and weeds, thus supporting the vegetation fraction results in Figure 12.

In addition, the effect of the barley cover crop on ExG was less significant after the early season, since the barley perished after 55 DAS (Figure 16). Accordingly, the barley cover crop had no effect on the soybean crop after 55 DAS, indicating that barley can suppress the growth of other plants, including soybean, only in its early growth stage, when the plants are at similar growth stages (Todd et al., 1999; Corre-Hellou et al., 2011). In summary, our analysis of ExG values from low-flight-altitude UAV RGB images demonstrated that ExG values are applicable to identifying the influence of herbicide use and barley cover cropping on plant growth.

Figure 16. Comparison of ground truth images, 55 days (left) and 69 days (right) after sowing.

Conclusion

We focused on visualizing multi-temporal soybean growth using a UAV and an RGB camera. The acquired images were corrected for lens distortion and then georeferenced. The images were calibrated using an empirical method based on linear regression to generate ExG image maps, which could then be used to quantify the vegetation fraction of soybean. The temporal ExG values obtained from the different soybean test plots were considerably affected by the herbicide and barley cover crop treatments in the early season of < 60 DAS. Therefore, multi-temporal images obtained with a UAV and an RGB camera can be applied to quantify overall vegetation fractions and crop growth status, and this information could contribute to determining proper treatments for managing the vegetation fraction.

Conflict of Interest

The authors have no conflicting financial or other interests.

Acknowledgement

This paper was supported by NIA (The National Information Society Agency) and IPET (Korea Institute of Planning and Evaluation for Technology of Food, Agriculture, Forestry and Fisheries) in 2014-2016.

References

Berni, J. A., P. J. Zarco-Tejada, L. Suárez and E. Fereres. 2009. Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Transactions on Geoscience and Remote Sensing 47(3):722-738.

Corre-Hellou, G., A. Dibet, H. Hauggaard-Nielsen, Y. Crozat, M. Gooding, P. Ambus and E. S. Jensen. 2011. The competitive ability of pea-barley intercrops against weeds and the interactions with crop productivity and soil N availability. Field Crops Research 122(3):264-272.

Garcia-Ruiz, F., S. Sankaran, J. M. Maja, W. S. Lee, J. Rasmussen and R. Ehsani. 2013. Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Computers and Electronics in Agriculture 91:106-115.

Hunt, E. R., W. D. Hively, S. J. Fujikawa, D. S. Linden, C. S. Daughtry and G. W. McCarty. 2010. Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring. Remote Sensing 2(1):290-305.

Jimenez, P. L. and D. Agudelo. 2015. Validation and calibration of a high resolution sensor in unmanned aerial vehicles for producing images in the IR range utilizable in precision agriculture. In: Proceedings of the AIAA SciTech, Paper No. 2015-0988. Kissimmee, FL: AIAA Infotech @ Aerospace.

Kaufman, Y. J. and L. A. Remer. 1994. Detection of forests using mid-IR reflectance: an application for aerosol studies. IEEE Transactions on Geoscience and Remote Sensing 32(3):672-683.

Kobayashi, H., S. Miura and A. Oyanagi. 2004. Effects of winter barley as a cover crop on the weed vegetation in a no-tillage soybean. Weed Biology and Management 4(4):195-205.

Lu, Y. C., K. B. Watkins, J. R. Teasdale and A. A. Abdul-Baki. 2000. Cover crops in sustainable food production. Food Reviews International 16(2):121-157.

Peña, J. M., J. Torres-Sánchez, A. I. de Castro, M. Kelly and F. López-Granados. 2013. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images. PLoS One 8(10):e77151.

Pereira, J. M., A. C. Sá, A. M. Sousa, J. M. Silva, T. N. Santos and J. M. Carreiras. 1999. Spectral characterization and discrimination of burnt areas. In: Remote Sensing of Large Wildfires, pp. 123-138. Berlin, Heidelberg: Springer-Verlag.

Poggio, S. L. 2005. Structure of weed communities occurring in monoculture and intercropping of field pea and barley. Agriculture, Ecosystems and Environment 109(1):48-58.

Smith, G. M. and E. J. Milton. 1999. The use of the empirical line method to calibrate remotely sensed data to reflectance. International Journal of Remote Sensing 20(13):2653-2662.

Todd, A. P., C. B. Orvin and H. O. James. 1999. Increasing crop competitiveness to weeds through crop breeding. Journal of Crop Production 2(1):59-76.

Torres-Sánchez, J., J. M. Peña, A. I. de Castro and F. López-Granados. 2014. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Computers and Electronics in Agriculture 103:104-113.

Wang, C. and S. W. Myint. 2015. A simplified empirical line method of radiometric calibration for small unmanned aircraft systems-based remote sensing. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 8(5):1876-1885.

Woebbecke, D. M., G. E. Meyer, K. Von Bargen and D. A. Mortensen. 1995. Color indices for weed identification under various soil, residue, and lighting conditions. Transactions of the ASAE 38(1):259-269.