
14th ICPA - Session

Session
Title: Crop Biomass Sensing
Date: Mon Jun 25, 2018
Time: 3:30 PM - 5:00 PM
Moderator: Sun-ok Chung
Feasibility of Estimating the Leaf Area Index of Maize Traits with Hemispherical Images Captured from Unmanned Aerial Vehicles

Feeding a global population of 9.1 billion in 2050 will require food production to be increased by approximately 60%. In this context, plant breeders are demanding more effective and efficient field-based phenotyping methods to accelerate the development of more productive cultivars under contrasting environmental constraints. The leaf area index (LAI) is a dimensionless biophysical parameter of great interest to maize breeders since it is directly related to crop productivity. The LAI is defined as the one-sided photosynthetically active leaf area per unit ground area. Direct estimates of the LAI through leaf collection and subsequent leaf area determination in the laboratory are tedious and time-consuming. Hence, indirect methods based on gap fraction theory are frequently used for in situ LAI estimation. The LAI obtained from gap fraction analysis by most optical sensors available on the market is not the true LAI, but a term called the “effective LAI” that does not consider foliage clumping. Hemispherical images of the bottom-up view of crop canopies offer important advantages to maize breeders, such as a low cost compared to other commercial sensors, and they may also provide LAI estimates corrected for foliage clumping (i.e., true LAI). However, taking bottom-up hemispherical images in every single plot of a maize breeding program can take time and patience. The use of small-sized unmanned aerial vehicles (UAVs) in agriculture has allowed crop information to be inferred at spatial and temporal resolutions that exceed those of other remote sensing technologies (e.g., airborne, satellite). We assessed the efficacy of using UAVs to collect hemispherical images for estimating the LAI. To do this, we investigated the suitability of using nadir-view hemispherical images taken from a UAV flying at a low altitude (15 m) to accurately derive LAI estimates based on gap fraction analysis in a maize breeding trial carried out near Seville, Spain.
Six maize cultivars grown in a split-plot design with three blocks and two irrigation treatments (well-watered and water-stressed) were used in the experiment. LAI estimates from top-down hemispherical images taken from the UAV were compared with LAI estimates from both bottom-up hemispherical imaging and direct LAI estimates obtained from an allometric relationship derived in the study. The results show that hemispherical images taken from a UAV flying at a low altitude can estimate the LAI of maize breeding plots as accurately as the classical bottom-up hemispherical imaging approach. CAN-EYE software, which includes automatic image classification and allows the processing of a series of hemispherical photographs, was used in this experiment.
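The gap-fraction inversion behind such LAI estimates can be sketched with the standard Beer-Lambert relation. This is a generic textbook inversion, not CAN-EYE's actual implementation; the function names and the spherical leaf-angle assumption (G = 0.5) are illustrative.

```python
import math

def effective_lai(gap_fraction, zenith_deg, G=0.5):
    # Beer-Lambert inversion: P(theta) = exp(-G * LAI / cos(theta)),
    # so LAI_eff = -ln(P) * cos(theta) / G.
    # G = 0.5 assumes a spherical leaf angle distribution.
    theta = math.radians(zenith_deg)
    return -math.log(gap_fraction) * math.cos(theta) / G

def true_lai(lai_eff, clumping_index):
    # Correct the effective LAI for foliage clumping; the clumping
    # index is below 1 for clumped canopies such as row-planted maize,
    # so the true LAI exceeds the effective LAI.
    return lai_eff / clumping_index
```

Dividing by a clumping index below 1 is what turns the "effective LAI" reported by most optical sensors into the clumping-corrected (true) LAI discussed above.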

Enrique Apolo-Apolo (speaker)
PhD Student
University of Seville
Seville, AL, Seville 41013
ES
I'm a PhD student at the University of Seville (Spain). I completed my bachelor's degree in Agricultural Engineering in 2016. I then enrolled in a Master's in Crop Protection (2017) and later in a Master's in Geographical Information Systems (GIS). I am interested in computers, electronics, machine learning, mechanization and irrigation, and I'm convinced that combining these can help enhance current agricultural practices.
Manuel Perez-Ruiz
Professor
University of Seville, Spain
La Rinconada, AL, Seville 41309
ES
Full professor and director of the master’s degree in Digital Agriculture and Agri-Food Innovation at the University of Sevilla (Spain). For more than 18 years he has worked continuously on research lines involving sensors and instrumentation in agricultural machinery, precision agriculture, variable-rate application techniques, analysis of spectral and thermal information, GNSS/RTK technology and intelligent systems for weed control. He has authored 45 scientific papers published in SCI journals as well as various book chapters. He is a founder of the Agrosap and Agroplanning startups; both companies focus on Agriculture 4.0, ensuring connectivity of agricultural equipment.
Gregorio Egea
ES
Jorge Martinez-Guanter
Carlota Marin-Barrero
Length (approx): 15 min
 
Predicting Dry Matter Composition of Grass Clover Leys Using Data Simulation and Camera-Based Segmentation of Field Canopies into White Clover, Red Clover, Grass and Weeds

Targeted fertilization of grass clover leys shows high financial and environmental potential, leading to higher yields of increased quality while reducing nitrate leaching. To realize these gains, an accurate fertilization map is required, which is closely related to the local composition of plant species in the biomass. In our setup, we utilize a top-down canopy view of the grass clover ley to estimate the composition of the vegetation and predict the composition of the dry matter of the forage. Using a deep learning approach, the canopy image is automatically segmented and classified pixel-wise into white clover, red clover, grass and weeds. Robust grass and clover segmentation has proven to be a difficult task to automate, and red and white clover discrimination in images is challenging even for human experts, due to many visual similarities between the two clover species. Using high-resolution color images with a ground sampling distance of 4 to 6 pixels per mm and data simulation of hierarchical labels, a cascaded convolutional neural network was trained for segmentation and classification. Clover, grass and weeds were automatically segmented and classified with a pixel-wise accuracy of 87.3 percent, while red clover and white clover could be distinguished automatically with 89.6 percent accuracy. Applying the image analysis to 179 images of mixed crop plots of ryegrass, white clover and red clover demonstrated a linear correlation between the detected clover and clover species fractions in the canopy and the corresponding compositions in harvested dry matter.
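The downstream use of such a segmentation can be sketched with two small helpers: one for the pixel-wise accuracy against a ground-truth mask, and one for the per-species canopy fractions that are then correlated with dry-matter composition. The class encoding and function names here are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

# Illustrative integer encoding of the four segmentation classes.
CLASSES = {0: "grass", 1: "white_clover", 2: "red_clover", 3: "weeds"}

def pixel_accuracy(pred, truth):
    # Fraction of pixels whose predicted class matches the ground truth.
    return float((np.asarray(pred) == np.asarray(truth)).mean())

def canopy_fractions(mask):
    # Per-class share of the canopy image, e.g. the clover-species
    # fractions later related to harvested dry matter composition.
    mask = np.asarray(mask)
    return {name: float((mask == k).mean()) for k, name in CLASSES.items()}
```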

Søren Skovsen (speaker)
PhD student
DK
Mads Dyrmann
DK
René Gislum
Henrik Karstoft
Rasmus Jørgensen
Length (approx): 15 min
 
Laser Triangulation for Crop Canopy Measurements

From a Precision Agriculture perspective, it is important to detect field areas where variabilities in the soil are significant or where there are different levels of crop yield or biomass. Information describing the behavior of the crop at any specific point in the growing season typically leads to improvements in the manner in which local variabilities are addressed. The proper use of dense, in-season sensor data allows farm managers to optimize harvest plans and shipment schedules under variable plant growth dynamics, which may originate from soil spatial variability and management conditions. Sensing of crop architectonics has been used as a diagnostic tool in this context. Moving from the subjective visual estimation of farm workers to automated sensing technologies allows for improved repeatability and savings in cost, time, and labor. The goal of this paper is to report on the evaluation of a prototype sensor system embedded in a portable, low-cost instrument for green vegetable production. The prototype system is currently in its second iteration, featuring improvements for issues found in a previous experiment. The system involves circular scanning of crop canopies to identify crop biomass yield using laser triangulation. The results of these scans are height profiles along an angular position from 0° to 360°, which are the input for the biomass estimation. Two approaches for processing the laser-based height profiles are discussed: regression of profile-representative features and inference of a canopy density function. An experiment was conducted in a spinach field of a commercial farm in Sherrington, Quebec, Canada. The coefficient of determination (R2) for regression between measured and predicted biomass was 0.78 and 0.94 for the two approaches, respectively, and the root mean square error (RMSE) was in turn 4.18 and 2.16 t/ha. The results indicate that the developed sensor system would be a suitable tool for rapid assessment of fresh biomass in the field. Its application would be beneficial in the process of optimizing crop management logistics, comparing the performance of different varieties of crops, and detecting potential stresses in a field environment.
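The two goodness-of-fit figures reported above (R2 and RMSE between measured and predicted biomass) follow from their standard definitions; a minimal sketch, with illustrative function names:

```python
import numpy as np

def r_squared(measured, predicted):
    # Coefficient of determination: 1 - SS_res / SS_tot.
    m, p = np.asarray(measured, float), np.asarray(predicted, float)
    ss_res = np.sum((m - p) ** 2)
    ss_tot = np.sum((m - m.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

def rmse(measured, predicted):
    # Root mean square error, in the units of the measurement (here t/ha).
    m, p = np.asarray(measured, float), np.asarray(predicted, float)
    return float(np.sqrt(np.mean((m - p) ** 2)))
```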

Roberto Buelvas (speaker)
Mr
McGill University
Montreal, AL, Quebec H3H 2K2
CA
BSc Electrical Engineering - 2015
BSc Mechanical Engineering - 2016
MSc Bioresource Engineering - In progress
Viacheslav Adamchuk
Professor and Chair
McGill University
Ste-Anne-de-Bellevue, AL, Quebec H9X 3V9
CA

Originally from Kyiv, Ukraine, Dr. Adamchuk obtained a mechanical engineering degree from the National Agricultural University of Ukraine (currently National University of Life and Environmental Sciences of Ukraine), located in his hometown. Later, he received both MS and PhD degrees in Agricultural and Biological Engineering from Purdue University (USA). In 2000, Dr. Adamchuk began his academic career as a faculty member in the Department of Biological Systems Engineering at the University of Nebraska-Lincoln (USA). Ten years later, he assumed his current position in the Department of Bioresource Engineering at McGill University (Canada), while retaining his adjunct status at the University of Nebraska-Lincoln. Currently, he serves as the Chair of the Bioresource Engineering Department. In addition, he is Canada’s representative to the International Society of Precision Agriculture. Dr. Adamchuk leads a Precision Agriculture and Sensor Systems (PASS) research team that focuses on developing and deploying soil and plant sensing technologies to enhance the economic and environmental benefits of precision agriculture. His team has designed and evaluated a fleet of proximal sensor systems capable of measuring physical, chemical and biological attributes directly in a field. Most sensors produce geo-referenced data to quantify spatial soil/plant heterogeneity, which may be used to prescribe differentiated treatments according to local needs. Through studies on sensor fusion and data clustering, he investigated the challenges faced by early adopters of precision agriculture. Through his outreach activities, Dr. Adamchuk has taught multiple programs dedicated to a systems approach in adopting smart farming technologies around the world.

Length (approx): 15 min
 
Ground Vehicle Mapping of Fields Using LiDAR to Enable Prediction of Crop Biomass

Mapping field environments into point clouds using a 3D LiDAR has the potential to become a new approach for online estimation of crop biomass in the field. The estimation of crop biomass in agriculture is expected to be closely correlated with canopy heights. The work presented in this paper contributes to the mapping and textural analysis of agricultural fields. Crop and environmental state information can be used to tailor treatments to the specific site. This paper presents the current results with our ground vehicle LiDAR mapping system for broad-acre crop fields. The proposed vehicle system and method facilitate LiDAR recordings in an experimental winter wheat field. LiDAR data are combined with data from Global Navigation Satellite System (GNSS) and Inertial Measurement Unit (IMU) sensors to map the environment into point clouds. The sensory data from the vehicle are recorded, mapped, and analysed using the functionalities of the Robot Operating System (ROS) and the Point Cloud Library (PCL). In this experiment, winter wheat (Triticum aestivum L.) in field plots was mapped into 3D point clouds with a point density at the centimeter level. The purpose of the experiment was to create 3D LiDAR point clouds of the field plots, enabling canopy volume and textural analysis to discriminate different crop treatments. Estimated crop volumes ranging from 3500 to 6200 m3 per hectare were correlated with manually collected samples of cut biomass extracted from the experimental field.
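One common way to turn such a point cloud into a canopy volume is to rasterize the maximum height per grid cell and integrate over the grid. The sketch below illustrates that idea only; the cell size, the ground-at-zero assumption, and the function name are illustrative choices, not the authors' exact PCL-based method.

```python
import numpy as np

def canopy_volume(points, cell_size=0.05):
    # points: (N, 3) array of x, y, z in meters, with z = 0 at the ground.
    # Rasterize the maximum height per grid cell, then sum
    # height * cell area over all occupied cells to approximate
    # the canopy volume in cubic meters.
    pts = np.asarray(points, float)
    cells = np.floor(pts[:, :2] / cell_size).astype(int)
    top = {}
    for (i, j), z in zip(map(tuple, cells), pts[:, 2]):
        top[(i, j)] = max(top.get((i, j), 0.0), z)
    return float(sum(top.values()) * cell_size ** 2)
```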

Søren Skovsen (speaker)
PhD student
DK
Martin Christiansen
Postdoctoral Researcher
Aarhus University, Denmark
Morten Laursen
Rasmus Jørgensen
René Gislum
Length (approx): 15 min
 
Mapping Cotton Plant Height Using Digital Surface Models Derived from Overlapped Airborne Imagery

High resolution aerial images captured from unmanned aircraft systems (UASs) have recently been used to measure plant height over small test plots for phenotyping, but airborne images from manned aircraft have the potential for mapping plant height more practically over large fields. The objectives of this study were to evaluate the feasibility of measuring cotton plant height from digital surface models (DSMs) derived from overlapped airborne imagery and to compare the image-based estimates with data from a tractor-mounted ultrasonic distance sensor. An airborne imaging system consisting of a red-green-blue (RGB) camera and a modified near-infrared (NIR) camera mounted on a Cessna 206 aircraft was flown along six flight lines over a 27-ha field at peak cotton growth and again with tilled bare soil. Images were captured at 370 m above ground level to achieve a ground pixel size of 0.09 m and side/forward overlaps of about 85%. The ultrasonic distance sensor and a centimeter-grade GPS receiver were mounted on a high-clearance tractor to collect cotton plant height data from every 8th row at 1-s intervals. The images taken on the two dates were processed to create orthomosaics and DSMs. Plant height was estimated from the difference between the two DSMs. Results showed that a significant linear relation existed between image-based and ground-based plant height estimates, with an R2 value of 0.657 and a standard error of 0.11 m. The preliminary results from this study indicate that DSMs derived from overlapped airborne imagery have the potential to estimate and map plant height for monitoring crop growth conditions.
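The core DSM-differencing step reduces to a per-pixel subtraction of the bare-soil surface from the peak-growth surface. A minimal sketch follows; clipping small negative values (co-registration and DSM noise) to zero is an assumed cleanup step, not something stated in the abstract.

```python
import numpy as np

def canopy_height_model(dsm_crop, dsm_bare):
    # Per-pixel plant height from two co-registered DSM rasters:
    # height = surface at peak growth - surface of tilled bare soil.
    # Negative differences are clipped to zero.
    chm = np.asarray(dsm_crop, float) - np.asarray(dsm_bare, float)
    return np.clip(chm, 0.0, None)
```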

Chenghai Yang (speaker)
Research Agricultural Engineer
USDA-ARS
College Station, TX 77845
US

Dr. Chenghai Yang is a Research Agricultural Engineer with the USDA-ARS Aerial Application Technology Research Unit in College Station, TX. His research has focused on the development and application of remote sensing technologies for precision agriculture and pest management since 1995.

Length (approx): 15 min
 
Estimating Corn Biomass from RGB Images Acquired with an Unmanned Aerial Vehicle

Above-ground biomass, along with chlorophyll content and leaf area index (LAI), is a key biophysical parameter for crop monitoring. Being able to estimate biomass variations within a field is critical to the deployment of precision farming approaches such as variable nitrogen applications.

With unprecedented flexibility, Unmanned Aerial Vehicles (UAVs) allow image acquisition at very high spatial resolution and short revisit time. Accordingly, there has been increasing interest in these platforms for crop monitoring and precision agriculture. Classic remote sensing techniques typically rely on a vegetation index – such as the popular Normalized Difference Vegetation Index (NDVI) – as a proxy for plant biophysical parameters. However, when applied to UAV imagery, those approaches do not fully exploit the greater detail provided by high resolution.

The purpose of this research is to develop a procedure for assessing above-ground biomass based on the analysis of very high resolution RGB imagery acquired with a UAV platform. A small consumer-grade UAV (the DJI Phantom 3 Professional) with a built-in RGB camera was flown over an experimental corn (Zea mays L.) field. A series of images were acquired in summer 2017 at very low altitudes, resulting in milli-resolution imagery (less than 1 cm per pixel). Two modes of image acquisition were performed: in a grid pattern at an altitude of 10 m AGL (above ground level) for generating orthomosaics, and in a stationary mode at a height of 2.9 m AGL. For stability reasons, the latter mode was simulated by a low-altitude platform hung on a zip-line.

Image acquisitions were repeated over time during the early stages of corn growth, covering phenological stages from V2 to V8. Oblique imagery was also acquired in order to evaluate the effect of viewing angle. Field measurement campaigns were carried out to provide quantitative measurements of several biophysical parameters, including plant fresh biomass, plant dry biomass, plant height, leaf fresh biomass and leaf dry biomass. The method proposed in this study is based on computer vision, which allowed the projected leaf area to be extracted from the images for estimating biomass and detecting differences in corn growth. Using UAV-derived imagery to extract information on biomass proves to be a cost-effective means of monitoring crop biomass spatially and temporally.
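The abstract does not specify the exact computer-vision step; a common baseline for extracting projected leaf area from RGB imagery is thresholding the Excess Green index, sketched below under that assumption (the threshold value and function name are illustrative).

```python
import numpy as np

def leaf_projected_fraction(rgb, threshold=0.1):
    # Fraction of image pixels classified as vegetation using the
    # Excess Green index ExG = 2g - r - b, computed on chromatic
    # coordinates r, g, b (each channel divided by R + G + B).
    img = np.asarray(rgb, float)
    total = img.sum(axis=-1)
    total[total == 0] = 1.0  # avoid division by zero on black pixels
    r, g, b = (img[..., i] / total for i in range(3))
    exg = 2.0 * g - r - b
    return float((exg > threshold).mean())
```

Multiplying this fraction by the ground area covered by the image gives the projected leaf area used as a biomass proxy.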

Kosal Khun (speaker)
CA
Philippe Vigneault
Research professional
Agriculture and Agri-Food Canada
Saint-Jean-sur-Richelieu, AL, Québec J3B 3E6
CA
Edith Fallon
Agriculture and Agri-Food Canada
Nicolas Tremblay
Research Scientist
Agriculture and Agri-Food Canada
St-Jean-sur-Richelieu, AL, Quebec J3B 3E6
CA

ISPA President from 2016 to 2018. On-Farm Experimentation Community co-lead as of October 2020.

Length (approx): 15 min