
14th ICPA - Session

Session
Title: Remote Sensing 4
Date: Tue Aug 2, 2016
Time: 3:40 PM - 5:40 PM
Moderator: Chenghai Yang
Planet Labs' Monitoring Solution in Support of Precision Agriculture Practices

Satellite imagery is particularly useful for efficiently monitoring very large areas and providing regular feedback on the status and productivity of agricultural fields. These data are now widely used in precision farming; however, many challenges to making optimal use of this technology remain, such as easy access to data, management and exploitation of large datasets with deep time series, and sharing of the data and derived analytics with users. Providing satellite imagery through a cloud-computing platform offers an operationally proven way to deliver timely and efficient access to important field information and crop analytics.

Planet’s Monitoring for Agriculture solution leverages the high-frequency revisit schedule and collection capacity of the PlanetScope and RapidEye satellite constellations. It provides access to an enormous pool of multi-temporal, multispectral, orthorectified high-resolution imagery collected throughout the growing season, as well as archive imagery from previous seasons. Subscribers and their end-users benefit from this unique information source for extracting accurate and timely information on crop indicators.

As an example, derived vegetation indices can efficiently map and display the different levels of greenness in crop fields, providing a basis for managing different parts of the fields according to their productivity levels. Additionally, trend analysis on temporally and spatially consistent, high-frequency time-series data improves the ability to monitor crop health, identify stress factors, and provide more actionable and timely information for field management decision support.
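
As a minimal illustration of such a derived index, the sketch below computes the widely used NDVI from red and near-infrared reflectance arrays and bins it into management zones; the loader helper and the zone thresholds are illustrative assumptions, not part of Planet's product.

    import numpy as np

    def ndvi(red, nir):
        """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
        red = red.astype("float32")
        nir = nir.astype("float32")
        denom = nir + red
        # avoid division by zero over water or deep-shadow pixels
        return np.where(denom > 0, (nir - red) / denom, np.nan)

    # Hypothetical usage with two reflectance bands loaded as 2-D arrays:
    # red, nir = load_band("red.tif"), load_band("nir.tif")   # load_band is an assumed helper
    # greenness = ndvi(red, nir)
    # zones = np.digitize(greenness, bins=[0.2, 0.4, 0.6])    # e.g. four productivity zones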

Another key feature of these image collection programs is the cloud-based technology supporting them. Planet’s imagery dissemination and analytics platform gives users fast and easy access to imagery shortly after acquisition through Application Programming Interfaces (APIs). It also allows users to run processing workflows more efficiently by bringing their proprietary algorithms, which derive information about agricultural fields, closer to the enormous pool of imagery. Clients are thus able to accelerate their image analytics and reduce the capital and operational expenses needed to build and maintain custom systems.
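
As a hedged sketch of what API-based access to such a platform can look like (the endpoint URL, filter fields, and authentication scheme below are illustrative assumptions, not Planet's documented interface):

    import os
    import requests

    # Hypothetical catalog search: URL, payload schema and auth are assumptions for illustration.
    API_URL = "https://api.example-imagery.com/v1/search"
    payload = {
        "geometry": {"type": "Point", "coordinates": [-93.6, 41.6]},  # field of interest
        "acquired_after": "2016-05-01T00:00:00Z",
        "cloud_cover_max": 0.1,
    }
    resp = requests.post(API_URL, json=payload, auth=(os.environ["API_KEY"], ""))
    resp.raise_for_status()
    for scene in resp.json().get("results", []):
        print(scene.get("id"), scene.get("acquired"))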

Together, collection programs and cloud-based services, integrated with localized application software, respond to worldwide demand for managing crop production, potential, and problems more effectively than blanket, whole-field applications allow. The results are optimized costs and improved yields, with a reduced burden on the environment.

Finally, the provision of high-resolution, high frequency satellite imagery through a cloud solution allows Planet’s customers to focus more on their core expertise: data analytics and applications.

Ryan Schacht (speaker)
Planet Labs
US
Length (approx): 20 min
 
First Experiences with the European Remote Sensing Satellites Sentinel-1A/-2A for Agricultural Research

The Copernicus program, headed by the European Commission (EC) in partnership with the European Space Agency (ESA), will launch up to twelve satellites, the so-called “Sentinels,” for Earth and environmental observation by 2020. Within this satellite fleet, the Sentinel-1 (microwave) and Sentinel-2 (optical) satellites deliver valuable information on agricultural crops. Due to their high temporal (5 to 6 day revisit time) and spatial (10 to 20 m) resolutions, continuous monitoring of the state of agricultural crops becomes possible. The open data policy offers great opportunities for the operational integration of remote sensing data into agricultural practice.

The technically identical Sentinel-1A and -1B satellites were launched in April 2014 and April 2016, respectively. These radar satellites operate in C-band (λ ≈ 6 cm) with two polarizations (VH, VV), providing a nominal spatial resolution over land of 10 m. Combining the orbits of both satellites will yield a revisit time of 6 days at the equator; in Germany, the revisit time is 4 days even with a single satellite. Because their radar penetrates clouds, the Sentinel-1 satellites offer highly predictable data availability, the lack of which has been a major drawback of remote sensing in agriculture in the past. The high revisit frequency allows the phenological stages of different crops to be detected, which helps produce crop type classifications with high accuracy. Cereals (e.g., barley, rye, and wheat) have so far been difficult to distinguish with optical systems, but this new strategy makes it possible. Important agricultural management events (soil management, harvest dates) can also be identified at the field level.
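
A minimal sketch of the classification idea described above, assuming per-field VV/VH backscatter time series have already been extracted; the placeholder data, feature layout, and random-forest classifier are illustrative choices, not the authors' method:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # X: one row per field, columns = VV and VH backscatter (dB) stacked over the acquisition dates
    # y: crop type labels; both are stand-ins for real, extracted time series
    rng = np.random.default_rng(0)
    n_fields, n_dates = 200, 20
    X = rng.normal(-12.0, 3.0, size=(n_fields, 2 * n_dates))   # placeholder backscatter features
    y = rng.choice(["wheat", "barley", "rye", "maize"], size=n_fields)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    print("overall accuracy:", clf.score(X_test, y_test))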

The multispectral Sentinel-2A satellite was launched in June 2015. It offers good spectral resolution with 10 bands and spatial resolutions of 10 to 20 m. This spectral setup helps retrieve quantitative vegetation parameters such as above-ground fresh or dry biomass and leaf area index, which serve as input variables to spatial crop growth models. A technically identical twin satellite, Sentinel-2B, is scheduled for launch in 2016. Combining the orbits of both satellites will result in a revisit time of 5 days at the equator.

At the time of submission, only a few cloud-free Sentinel-2 images had been acquired for the Braunschweig region, and a proper atmospheric correction was not yet available; for these reasons, this paper focuses mainly on Sentinel-1A data.

Radar backscatter signatures of the most important crops in Germany are examined in order to exploit the high temporal resolution of the data. This new data source offers opportunities to improve crop classification as well as to monitor the different phenological stages of the crops.

Holger Lilienthal (speaker)
Dr.
Julius Kühn-Institut
Braunschweig, Lower Saxony 38116
DE
Heike Gerighausen
Dr.
Julius Kühn-Institut, Institute for Crop and Soil Science
Braunschweig 38116
DE
Length (approx): 20 min
 
Selection and Utility of Uncooled Thermal Cameras for Spatial Crop Temperature Measurement Within Precision Agriculture

Whereas previous research used local, single-point measurements to indicate crop water stress, thermography is presented as a technique capable of measuring spatially distributed temperatures, supporting its use for monitoring crop water stress. This study investigated the measurement accuracy of uncooled thermal cameras under controlled environmental conditions, developed hardware and software to implement uncooled thermal cameras, and quantified intrinsic properties that impact measurement accuracy and repeatability. A DRS Tamarisk® 320 (CAM1) and a FLIR® Tau 2 (CAM2) were selected for this study. Results indicated that wide- and medium-angle lens distortion was 19% for CAM1 and 30% for CAM2. A minimum of four pixels was recommended to maintain surface temperature integrity and maximize image coverage area. Warm-up periods of 19 and 7 min were necessary for CAM1 and CAM2, respectively. A real-time (RT) and a one-time (OT) radiometric calibration provided absolute surface temperatures with environmental compensation. CAM1 analog output yielded a configurable temperature span of 5°C to 156°C, resolution of 0.02°C to 0.61°C, and measurement accuracy of ±0.82°C or ±0.62°C with OT or RT radiometric calibration, respectively, whereas digital output yielded a fixed temperature span of 156°C, resolution of 0.01°C, and measurement accuracy of ±0.43°C or ±0.29°C with OT or RT radiometric calibration, respectively. CAM2 yielded a controllable temperature span of 18°C to 206°C, resolution of 0.07°C to 0.80°C, and measurement accuracy of ±0.87°C or ±0.63°C with OT or RT radiometric calibration, respectively. Both cameras were sensitive to surface temperatures (R² = 0.99), but CAM1 was more controllable. These results highlight that uncooled thermal cameras can measure spatial temperatures and thereby capture subtle crop dynamics for water resource management.
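
A hedged sketch of what a one-time (OT) radiometric calibration of this kind can look like: fitting a linear model from raw sensor output and ambient temperature to blackbody reference temperatures. The linear form, the ambient-temperature term, and all numbers are assumptions, not the study's calibration model.

    import numpy as np

    # Calibration data: raw digital numbers recorded while viewing a blackbody at known
    # temperatures, plus the camera's ambient (housing) temperature at each reading (all assumed).
    dn      = np.array([7200.0, 7800.0, 8500.0, 9300.0, 10100.0])  # raw camera output
    t_amb   = np.array([22.0, 22.5, 23.0, 23.5, 24.0])             # ambient temperature, °C
    t_black = np.array([15.0, 25.0, 35.0, 45.0, 55.0])             # blackbody reference, °C

    # Least-squares fit of T_surface ≈ a*DN + b*T_ambient + c
    A = np.column_stack([dn, t_amb, np.ones_like(dn)])
    coef, *_ = np.linalg.lstsq(A, t_black, rcond=None)

    def dn_to_temperature(dn_img, t_ambient):
        """Apply the fitted calibration to an image of raw digital numbers."""
        return coef[0] * dn_img + coef[1] * t_ambient + coef[2]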

Devin Mangus (speaker)
Research Assistant
Kansas State University
West Des Moines, IA 50265
US
Ajay Sharda
Professor and Director
Kansas State University
Manhattan, KS 66506
US

Dr. Ajay Sharda is an Associate Professor in the Department of Biological and Agricultural Engineering at Kansas State University. He received his Ph.D. in Biosystems Engineering from Auburn University. At K-State, Ajay's research focuses on the development, analysis, and experimental validation of control systems for agricultural machinery systems with a variety of emphases, including automation, sensor testing/development, mechatronic systems, computer vision, artificial intelligence, developing automated test setups for hardware-in-the-loop simulations, unmanned vehicles and thermal infrared imaging. He also serves as Director-Research at K-State's Institute for Digital Agriculture and Advanced Analytics, a people-centered interdisciplinary collective transforming learning, research, and outreach around digital technologies and advanced analytical methods to enhance agricultural, environmental, and socioeconomic decision-making.

Length (approx): 20 min
 
High Resolution 3D Hyperspectral Digital Surface Models from Lightweight UAV Snapshot Cameras – Potentials for Precision Agriculture Applications

Precision agriculture applications need timely information about the plant status to apply the right management at the right place and the right time. Additionally, high-resolution field phenotyping can support crop breeding by providing reliable information for crop rating. Flexible remote sensing systems like unmanned aerial vehicles (UAVs) can gather high-resolution information when and where needed. When combined with specialized sensors they become powerful sensing systems.

Hyperspectral data have been shown to provide information about biophysical and biochemical properties of vegetation and agricultural crops, supporting biotic and abiotic stress detection as well as biomass and yield estimation. Recently, lightweight hyperspectral snapshot cameras have been introduced, which record spectral information as a two-dimensional image with every exposure. Specialized workflows based on photogrammetric algorithms allow the 3D topography of a surface to be reconstructed from these data, retrieving structural and spectral information at the same time.
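
To make the photogrammetric idea concrete, the sketch below reconstructs a sparse 3D point cloud from just two overlapping frames with OpenCV; the frame names, camera intrinsics, and the two-view simplification are assumptions, and an actual UAV workflow would rely on dedicated multi-view software.

    import cv2
    import numpy as np

    # Two overlapping frames from the camera (file names are placeholders)
    img1 = cv2.imread("frame_a.png", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("frame_b.png", cv2.IMREAD_GRAYSCALE)

    # Detect and match features between the frames
    orb = cv2.ORB_create(4000)
    k1, d1 = orb.detectAndCompute(img1, None)
    k2, d2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    pts1 = np.float32([k1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([k2[m.trainIdx].pt for m in matches])

    # Recover the relative camera pose (K is an assumed camera matrix)
    K = np.array([[1000.0, 0.0, 640.0], [0.0, 1000.0, 480.0], [0.0, 0.0, 1.0]])
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

    # Triangulate matched points into a sparse 3D surface
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    xyz = (pts4d[:3] / pts4d[3]).T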

Hyperspectral digital surface models (HS DSMs) derived from snapshot cameras are a novel representation of the surface in 3D space, linked with spectral information about the reflection and emission of the objects covered by the surface. In this contribution, the requirements and workflow to derive HS DSMs of crop canopies are described and their potential for precision agriculture applications is demonstrated. The data are acquired with the hyperspectral snapshot camera UHD 185 – Firefly, which records hyperspectral information from 450 to 950 nm in 138 bands.

Results from a multi-temporal monitoring campaign of an experiment with two nitrogen fertilizer treatments and six different spring barley cultivars are presented. The campaign was carried out at the Campus Klein-Altendorf research station of the University of Bonn, near Bonn, Germany. From the HS DSMs, chlorophyll content, plant height, and biomass were estimated for individual growth stages and across several growth stages. Plant height was estimated with an R² of up to 0.98 and chlorophyll with an R² of 0.13 to 0.64, depending on the growth stage. Additionally, based on three years of experience with hyperspectral UAV snapshot cameras, important remarks regarding the 3D and spectral data quality are given.
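
As a hedged sketch of how plant height can be read off such surface models, the snippet below subtracts a bare-ground terrain model from the crop-stand DSM and summarizes one plot with an upper percentile; the array inputs, plot mask, and percentile choice are illustrative assumptions, not the authors' exact procedure.

    import numpy as np

    def plot_plant_height(dsm_crop, dtm_ground, plot_mask, percentile=90):
        """Estimate plot-level plant height (m) from a crop-stand DSM.

        dsm_crop:   surface model acquired over the growing canopy (m above datum)
        dtm_ground: terrain model from a bare-soil acquisition (m above datum)
        plot_mask:  boolean array selecting the pixels of one plot
        """
        csm = dsm_crop - dtm_ground                # crop surface model = canopy height per pixel
        heights = csm[plot_mask]
        heights = heights[heights > 0]             # drop negative values from co-registration noise
        return np.percentile(heights, percentile)  # upper percentile is robust to gaps in the canopy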

Helge Aasen (speaker)
Institute of Geography, University of Cologne
Cologne 50674
DE
Length (approx): 20 min
 
Comparison Between Tractor-based and UAV-based Spectrometer Measurements in Winter Wheat

In-season variable-rate nitrogen fertilizer application requires fast and efficient determination of the nitrogen status of crops. Common sensor-based monitoring of nitrogen status relies mainly on tractor-mounted active or passive sensors. Over the last few years, researchers have tested different sensors and indicated the potential of in-season monitoring of nitrogen status by unmanned aerial vehicles (UAVs) in various crops. However, UAV platforms and the available sensors are not yet established for monitoring nitrogen status in farm practice. This study compares tractor-based spectrometer measurements with measurements from a UAV to assess their potential for estimating N uptake. The sensors on both platforms were technically identical; the UAV sensor was adapted to the UAV platform and its payload restrictions. The sensors measured reflectance in the 600-1100 nm wavelength domain with a spectral resolution of 10 nm. Measurements were taken in a winter wheat field split into 12 differently fertilized treatments. At three crop growth stages in 2015 (BBCH 31, 49, 59), crop scans were conducted, accompanied by destructive biomass samples to determine above-ground plant N uptake. Spectra from both sensing platforms showed comparable characteristics. Similar correlations between N uptake and a Simple Ratio vegetation index (SR) were observed for both platforms across the three growth stages, whereas the commonly observed saturation of the Normalized Difference Vegetation Index (NDVI) was less pronounced for the UAV-based sensor with nadir view, resulting in a better correlation with N uptake compared to an NDVI calculated from tractor-based sensor spectra obtained at an oblique view. The differences in explained variability between the systems were due to the different sensor viewing angles and footprints. N uptake can be monitored by spectral reflectance measurements with an accuracy acceptable for farming practice, irrespective of the platform (UAV or tractor-mounted) and the related viewing angle. Provided that crop- and growth-stage-specific calibrations are developed, UAV-based spectral crop sensing therefore has the potential to supplement tractor-based sensing where required.
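
For reference, both indices named above are simple combinations of two reflectance bands; the sketch below computes them from a spectrum sampled at 10 nm steps. The band centres (670 nm red, 780 nm near-infrared) are common choices assumed here, not necessarily those used in the study.

    import numpy as np

    wavelengths = np.arange(600, 1101, 10)            # nm, matching the sensors' 10 nm sampling

    def band(spectrum, center_nm):
        """Reflectance at the sampled wavelength closest to center_nm."""
        return spectrum[np.argmin(np.abs(wavelengths - center_nm))]

    def sr_and_ndvi(spectrum, red_nm=670, nir_nm=780):
        red, nir = band(spectrum, red_nm), band(spectrum, nir_nm)
        sr = nir / red                                # Simple Ratio
        ndvi = (nir - red) / (nir + red)              # Normalized Difference Vegetation Index
        return sr, ndvi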

Martin Gnyp (speaker)
Dr.
Research Centre Hanninghof, Yara International ASA
Dülmen 48249
DE
Stefan Reusch
Dr.
Yara International, Research Centre Hanninghof
Dülmen 48249
DE
Jörg Jasper
YARA GmbH & Co. KG
Dülmen, NRW 48249
DE
Georg Bareth
Prof. Dr.
University of Cologne
Cologne 50923
DE

Since 2004, Dr. Georg Bareth has been Professor of Geoinformatics and Head of the GIS & RS Group at the Institute of Geography, University of Cologne, Germany. He is also currently serving as Vice-Dean of the Faculty of Mathematics and Natural Sciences of the University of Cologne and as Vice-President of the German Society for Photogrammetry, Remote Sensing, and Geoinformation. He graduated in Physical Geography from Stuttgart University in 1995 and, from 1996 to 2004, earned his Dr.sc.agr. and Habilitation in Agroinformatics at the University of Hohenheim, Germany. One major research interest of Dr. Bareth is the development of proximal and remote sensing methods for crop monitoring, on which he has been working for the last 25 years.

Length (approx): 20 min
 
Melon Classification and Segmentation Using Low-Cost Remote Sensing Drone Data

Object recognition is currently one of the most rapidly developing and challenging areas of computer vision. This work presents a systematic study of relevant parameters and approaches for semi-automatic or automatic object detection, applied to the case study of counting melons in the field. In addition, it is of central interest to obtain quantitative information about algorithm performance in terms of metrics whose suitability is determined by the final goal of the classification. The research consists of texture analysis, color segmentation in the RGB and YCbCr color spaces, and the combination of all extracted features. Classification methods such as manual threshold tuning and k-nearest neighbors are used after extracting the necessary components to identify melons. Since the aforementioned approaches can be broadly described as feature-based, this work, aiming to cover solutions operating at both local and global scales, subsequently continues with more advanced techniques such as normalized spatial correlation based on a known sample of either the texture or the whole object being sought.
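
A hedged sketch of the color-segmentation step described above: converting an RGB frame to the YCbCr space and thresholding the chroma channels before counting candidate blobs. The threshold values, morphology, and use of OpenCV are illustrative assumptions, not the authors' tuned pipeline.

    import cv2
    import numpy as np

    img_bgr = cv2.imread("field_frame.jpg")              # placeholder drone frame
    ycrcb = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2YCrCb)   # OpenCV orders the channels Y, Cr, Cb

    # Illustrative chroma thresholds for melon-like regions; in practice these would be tuned
    # manually or replaced by a k-nearest-neighbor classifier trained on labeled pixels.
    lower = np.array([80, 130, 90], dtype=np.uint8)
    upper = np.array([255, 180, 140], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)

    # Clean up the mask and count connected blobs as candidate melons
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    n_labels, labels = cv2.connectedComponents(mask)
    print("candidate melons:", n_labels - 1)             # label 0 is the background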

Tiebiao Zhao (speaker)
Atwater, CA 95301
US
Length (approx): 20 min