Universität Bonn

IGG | Geodesy

Applications in Agriculture

[Figure: High-resolution 4D crop point cloud creation. © IGG Geodesy]

High-Resolution 4D Crop Point Cloud Creation

The determination of phenotypic traits such as leaf area, leaf angle distribution, and plant height from 3D point clouds at the level of plots, individual plants, and individual plant organs has become increasingly important in the agricultural sciences in recent years. These phenotypic traits provide important information for plant breeding, plant stress monitoring, and yield optimization.

The focus of this PhenoRob-funded research topic is a kinematic laser scanning system mounted on a field robot (see figure) that is used to generate georeferenced, high-resolution 3D point clouds of crops in agricultural fields. To create the 3D point clouds, the laser profiles measured by two laser triangulation sensors are georeferenced using the robot's position and orientation, which are estimated by fusing the data of an IMU and a dual-antenna GNSS receiver.
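
The essential georeferencing step can be pictured in a few lines: each profile point measured in the sensor frame is rotated and translated into the world frame using the fused pose. The sketch below is only an illustration under simple assumptions (roll-pitch-yaw convention, placeholder lever-arm and boresight values), not the system's actual calibration or software:

    # Minimal sketch: georeferencing a laser profile measured in the sensor
    # frame, given the robot pose from the GNSS/IMU fusion. Frame conventions
    # and calibration values are illustrative placeholders.
    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def georeference_profile(points_sensor, position_world, rpy_body, lever_arm, boresight_rpy):
        """Transform (N, 3) points from the sensor frame into the world frame.

        points_sensor : (N, 3) laser profile in the sensor frame
        position_world: (3,)   robot position from the GNSS/IMU fusion
        rpy_body      : (3,)   roll, pitch, yaw of the robot body [rad]
        lever_arm     : (3,)   sensor origin expressed in the body frame
        boresight_rpy : (3,)   sensor-to-body mounting rotation [rad]
        """
        R_body_world = R.from_euler("xyz", rpy_body).as_matrix()
        R_sensor_body = R.from_euler("xyz", boresight_rpy).as_matrix()
        # p_world = position + R_body_world @ (lever_arm + R_sensor_body @ p_sensor)
        points_body = points_sensor @ R_sensor_body.T + lever_arm
        return points_body @ R_body_world.T + position_world

    # Example: one profile of three points, robot at (10, 5, 0.3) m heading 90 deg
    pts = np.array([[0.0, 0.2, -0.5], [0.0, 0.0, -0.5], [0.0, -0.2, -0.5]])
    world = georeference_profile(pts, np.array([10.0, 5.0, 0.3]),
                                 np.array([0.0, 0.0, np.pi / 2]),
                                 np.array([0.1, 0.0, 0.8]),
                                 np.zeros(3))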

Based on the 3D point clouds, phenotypic traits of single plants and even of individual plant organs can be tracked over time, providing a high information gain for breeders and modern agricultural sciences.

Publications:

  • Esser, F., Rosu, R. A., Cornelißen, A., Klingbeil, L., Kuhlmann, H., & Behnke, S. (2023). Field Robot for High-Throughput and High-Resolution 3D Plant Phenotyping: Towards Efficient and Sustainable Crop Production. IEEE Robotics & Automation Magazine. https://doi.org/10.1109/MRA.2023.3321402

  • Esser, F.; Klingbeil, L.; Zabawa, L.; Kuhlmann, H. (2023) Quality Analysis of a High-Precision Kinematic Laser Scanning System for the Use of Spatio-Temporal Plant and Organ-Level Phenotyping in the Field. Remote Sens.  15, 1117. https://doi.org/10.3390/rs15041117

[Figure: Point cloud processing for crop monitoring. © IGG Geodesy]

Point Cloud Processing for Crop Monitoring

Plant phenotyping is a central task in crop science and plant breeding. Since standard methods often require time-consuming manual observations, it is essential to develop automatic, sensor-driven methods that offer objective and fast information. In recent years, 3D sensing systems such as laser scanners have become increasingly popular, since even under challenging field conditions they provide structural plant parameters that can hardly be extracted with spectral sensors. To determine relevant plant parameters, such as leaf area, leaf area index, and leaf angle distribution, it is often necessary to first generate surface models from the raw point clouds.

To generate the surface model, meshing algorithms for plants have proven useful, because the meshed surface can be used as a starting point for determining the required plant parameters. Since the automated, sensor-driven methods allow measurements to be distributed throughout the season, plant growth can now be determined and observed over a longer period of time, which enables us to monitor environmental influences (e.g. drought stress).
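
To illustrate why a mesh is a convenient intermediate product, the total plant surface area follows directly from the triangle faces. The sketch below uses placeholder vertex and face arrays and is not the published estimation method:

    # Minimal sketch: total plant surface area as the sum of triangle areas
    # of a mesh generated from the point cloud.
    import numpy as np

    def mesh_surface_area(vertices, faces):
        """vertices: (V, 3) float array; faces: (F, 3) int array of vertex indices."""
        tri = vertices[faces]                      # (F, 3, 3) triangle corner coordinates
        a = tri[:, 1] - tri[:, 0]                  # first edge vector of each triangle
        b = tri[:, 2] - tri[:, 0]                  # second edge vector
        return 0.5 * np.linalg.norm(np.cross(a, b), axis=1).sum()

    # Example: a single unit right triangle -> area 0.5
    v = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
    f = np.array([[0, 1, 2]])
    print(mesh_surface_area(v, f))  # 0.5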

Publications:

  • Zabawa L, Esser F, Klingbeil L, Kuhlmann H (2021) Automated Surface Area Estimation of Plants based on 3D Point Clouds, International Conference on Computer Vision (ICCV), Workshop CVPPA, 11-17 October 2021

[Figure: 3D phenotyping in viticulture. © IGG Geodesy]

3D Phenotyping in Viticulture

Accurately characterizing the macro structure of vineyards and determining the location of individual grapevines is crucial for precise vineyard management tasks like selective harvesting, accurate spraying, fertilization and weeding, and effective crop management. Key geometric parameters of grapevine crops, such as canopy structure, height, width, volume, and leaf area, are closely connected to plant growth, health, and potential yield. Estimation of these parameters is traditionally performed by human operators collecting manual measurements; however, this task is labor-intensive and prone to error. Recently, Unmanned Aerial Vehicles (UAVs) have become a common tool for this task due to their efficient data acquisition, simplicity, and cost-effectiveness, and because they can quickly cover large vineyard areas. Although some grapevine parameters can be extracted from single images, a complete 3D vineyard model is more effective for investigating conditions under the canopy and deriving traits like biomass, canopy volume, and vine-row width and height.

The research aims to determine single plant locations in a vineyard from UAV-derived 3D point clouds. Additionally, we automatically extract geometric parameters like plant height, canopy width, and canopy volume along the row with a high spatial resolution, making it possible to assign the values to the detected single plants. This detailed geometric analysis of the canopy offers valuable insights for vineyard managers and breeders, assisting them in crucial tasks such as pruning, agrochemical spraying, and optimizing yields.
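
A simple way to picture the along-row parameter extraction is to slice a row-aligned point cloud into short segments and derive robust height and width values per slice. The sketch below makes illustrative assumptions (slice width, percentiles, coordinate alignment) and is not the published algorithm:

    # Minimal sketch: canopy height and width along a vine row from a point
    # cloud whose x-axis runs along the row and whose z-axis is height.
    import numpy as np

    def row_profile(points, slice_width=0.1):
        """points: (N, 3) array; returns per-slice (x_center, height, width)."""
        x, y, z = points[:, 0], points[:, 1], points[:, 2]
        edges = np.arange(x.min(), x.max() + slice_width, slice_width)
        profile = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            sl = (x >= lo) & (x < hi)
            if not np.any(sl):
                continue
            height = np.percentile(z[sl], 98) - np.percentile(z[sl], 2)  # robust top minus ground
            width = np.percentile(y[sl], 98) - np.percentile(y[sl], 2)   # canopy extent across the row
            profile.append((0.5 * (lo + hi), height, width))
        return np.array(profile)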

Publications:

  • Cantürk, M., Zabawa, L., Pavlic, D., Dreier, A., Klingbeil, L., & Kuhlmann, H. (2023). UAV-based individual plant detection and geometric parameter extraction in vineyards. Frontiers in Plant Science, 14, 1244384.

[Figure: UAV image-based crop height estimation. © IGG Geodesy]

UAV Image-based Crop Height Estimation

Crop height is defined as the shortest distance between the ground level and the upper boundary of the main photosynthetic tissues of a plant. It is considered a reliable trait in crop phenotyping and a good indicator for biomass, expected yield, lodging, or crop stress. The current industry standard for crop height measurement is manual measurement with a ruler. This data sampling method is time-consuming, labour-intensive, and subjective, as it depends on the specific approach of the observer in the field. Using the UAV image-based method, crop height can be estimated more efficiently and non-invasively. Besides the better efficiency, the high spatial resolution provides additional information about crop height variability, which describes the structural properties of the canopy and its link to plant health.

In general, the use of novel sensing technologies has drastically increased the amount and diversity of phenotypic data, but it still needs to be investigated how the selected sensing system influences the estimation of phenotypic traits like crop height. Hence, the well-known workflow of creating a crop surface model (CSM) and estimating crop height from it should be adjusted to the 3D sensing system used to observe the canopy surface. This is the approach we follow.
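
The basic CSM workflow can be summarized in a few lines: crop height is the per-pixel difference between the crop surface model and a terrain model of the bare ground, which can then be aggregated per plot. The rasters below are small placeholder arrays, not real data:

    # Minimal sketch: crop height as CSM minus DTM on a common raster grid.
    import numpy as np

    csm = np.array([[101.2, 101.4], [101.3, 101.5]])   # canopy surface elevation [m]
    dtm = np.array([[100.5, 100.5], [100.6, 100.6]])   # bare-ground elevation [m]

    crop_height = np.clip(csm - dtm, 0.0, None)        # negative differences treated as 0
    plot_mean_height = crop_height.mean()              # per-plot aggregation
    plot_variability = crop_height.std()               # height variability within the plot
    print(plot_mean_height, plot_variability)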

Publications:

  • Becirevic, D., Klingbeil, L., Honecker, A., Schumann, H., Rascher, U., Léon, J., and Kuhlmann, H. (2019): On the derivation of crop heights from multitemporal UAV based imagery, ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., IV-2/W5, 95-102, https://doi.org/10.5194/isprs-annals-IV-2-W5-95-2019
  • Becirevic, D.; Klingbeil, L.; Honecker, A.; Schumann, H.; Leon, J.; Kuhlmann, H. (2018). UAV-based Growth Rate Determination in Winter Wheat. 6th International Conference on Machine Control and Guidance. Bornimer Agrartechnische Berichte, Heft 101, pp 128-136.
  • Honecker, A.; Schumann, H.; Becirevic, D.; Klingbeil, L.; Volland, K.; Forberig, S.; Jensen, M.; Paulsen, H.; Kuhlmann, H. and Leon, J. (2020). Plant, space and time - linked together in an integrative and scalable data management system for phenomic approaches in agronomic field trials. Plant Methods 16, 55. https://doi.org/10.1186/s13007-020-00596-3

[Figure: High-throughput phenotyping in viticulture. © IGG Geodesy]

High-Throughput Phenotyping in Viticulture

Grapevine is a historically and economically important crop. Due to the perennial nature of grapevine, the monitoring of plants for decision making, such as leaf removal or thinning procedures, needs to be carried out in the field. Usually, small samples are investigated in detail and the results are extrapolated to the whole field, which is labor-intensive, subjective, and inaccurate. Therefore, current research focuses on objective, data-driven methods that provide large-scale data to farmers.

In our research we focus on image-based data recorded with the Phenoliner, a semi-automatic field phenotyping platform. The Phenoliner is a modified grapevine harvester equipped with a camera system in place of the harvesting equipment.

The Phenoliner records horizontal and vertical image sequences. The images are fed into a neural network that produces berry masks. These masks can be used to extract phenotypic traits such as the number and size of visible berries.
These phenotypic traits can then be used to guide management and breeding decisions based on objective and reliable information.
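
Once a berry mask is available, a simple connected-component analysis already yields counts and per-berry sizes. The toy mask below is only for illustration; the actual Phenoliner pipeline may differ:

    # Minimal sketch: counting visible berries and measuring their pixel sizes
    # from a binary segmentation mask via connected-component labelling.
    import numpy as np
    from scipy import ndimage

    mask = np.zeros((8, 8), dtype=bool)
    mask[1:3, 1:3] = True          # first "berry"
    mask[5:8, 4:7] = True          # second "berry"

    labels, n_berries = ndimage.label(mask)                      # connected components
    sizes_px = ndimage.sum(mask, labels, np.arange(1, n_berries + 1))
    print(n_berries, sizes_px)     # 2 berries with pixel areas [4., 9.]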

Publications:

  • Kierdorf J, Weber I, Kicherer A, Zabawa L, Drees L, Roscher R (2022) Behind the Leaves: Estimation of Occluded Grapevine Berries with Conditional Generative Adversarial Networks, Frontiers in Artificial Intelligence, Vol. 5
  • Zabawa L, Kicherer A, Klingbeil L, Töpfer R, Roscher R, Kuhlmann H (2022) Image-based analysis of yield parameters in viticulture, Biosystems Engineering, Vol. 218: 94-109
  • Miranda M, Zabawa L, Kicherer A, Strothmann L, Rascher U, Roscher R (2022) Detection of Anomalous Grapevine Berries Using Variational Autoencoders, Frontiers in Plant Science, 13:729097
  • Bömer J, Zabawa L, Sieren P, Kicherer A, Klingbeil L, Rascher U, Muller O, Kuhlmann H, Roscher R (2020) Automatic Differentiation of Damaged and Unharmed Grapes Using RGB Images and Convolutional Neural Networks, European Conference on Computer Vision (ECCV), Workshop CVPPP, 23-28 August 2020
  • L. Zabawa, A. Kicherer, L. Klingbeil, A. Milioto, R. Töpfer, H. Kuhlmann, R. Roscher (2019) Detection of single grapevine berries in images using fully convolutional neural networks, The IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshop CVPPP, 16-20 June 2019
  • L. Zabawa, A. Kicherer, L. Klingbeil, R. Töpfer, H. Kuhlmann, R. Roscher (2019) Counting of grapevine berries in images via semantic segmentation using convolutional neural networks, ISPRS Journal of Photogrammetry and Remote Sensing, Vol. 164, pp. 73-83
  • Kicherer, A.; Herzog, K.; Bendel, N.; Klück, H.-C.; Backhaus, A.; Wieland, M.; Rose, J.C.; Klingbeil, L.; Läbe, T.; Hohl, C.; Petry, W.; Kuhlmann, H.; Seiffert, U.; Töpfer, R. (2017): Phenoliner: A New Field Phenotyping Platform for Grapevine Research. Sensors 2017, 17, 1625, http://www.mdpi.com/1424-8220/17/7/1625
  • Rose, J.C.; Kicherer, A.; Wieland, M.; Klingbeil, L.; Töpfer, R.; Kuhlmann, H.(2016): Towards Automated Large-Scale 3D Phenotyping of Vineyards under Field Conditions. Sensors 2016, 16, 2136, http://www.mdpi.com/1424-8220/16/12/2136

[Figure: Coordinate-controlled sugar beet seeding. © IGG Geodesy]

Coordinate-Controlled Sugar Beet Seeding

In agricultural field operations, seeds are sown by seed drills. The drills either meter by volume with toothed wheels, which is common for cereals, or use singling devices that select single seeds and place them at fixed distances within a row. Current cropping systems are designed for field traffic in the longitudinal direction. When weeding with herbicides, the space between the rows is used for sprayer paths. Mechanical weeding requires sufficient space between the rows, which is treated by hoes with cutting tools. The space between the plants within a row is left untreated, which reduces the efficiency of mechanical weeding.

The goal of the project was to control the angle and rotational speed of the seeding wheel on the seeding machine in real time, in order to place each single seed exactly at the intended distance to a reference line at the border of the field. A multi-sensor system was developed and implemented on a tractor. The system integrates RTK GNSS position measurements with angular rate data from a gyroscope and velocity measurements from an odometer using a Kalman filter to determine the horizontal position and heading of the tractor in real time with high accuracy. This information is then used to control the seeding wheel.
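
The following is a minimal sketch of such a fusion filter, with an extended-Kalman-filter prediction driven by odometer speed and gyro yaw rate and an update from RTK-GNSS positions. The state layout, motion model, and noise values are illustrative assumptions, not the implemented system:

    # Minimal sketch: fusing odometer velocity and gyro yaw rate (prediction)
    # with RTK-GNSS positions (update) to estimate x, y and heading.
    import numpy as np

    x = np.zeros(3)                      # state: [x, y, heading]
    P = np.eye(3)                        # state covariance
    Q = np.diag([0.01, 0.01, 0.001])     # process noise (per step, assumed)
    R_gnss = np.diag([0.02**2, 0.02**2]) # RTK position noise (~2 cm, assumed)
    H = np.array([[1.0, 0, 0], [0, 1.0, 0]])   # GNSS observes x and y only

    def predict(x, P, v, omega, dt):
        """Propagate the state with odometer speed v and gyro yaw rate omega."""
        x_new = np.array([x[0] + v * dt * np.cos(x[2]),
                          x[1] + v * dt * np.sin(x[2]),
                          x[2] + omega * dt])
        F = np.array([[1, 0, -v * dt * np.sin(x[2])],
                      [0, 1,  v * dt * np.cos(x[2])],
                      [0, 0, 1]])                  # Jacobian of the motion model
        return x_new, F @ P @ F.T + Q

    def update(x, P, z_gnss):
        """Correct the state with an RTK-GNSS position measurement."""
        y = z_gnss - H @ x                         # innovation
        S = H @ P @ H.T + R_gnss
        K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
        return x + K @ y, (np.eye(3) - K @ H) @ P

    x, P = predict(x, P, v=1.0, omega=0.05, dt=0.1)
    x, P = update(x, P, z_gnss=np.array([0.11, 0.0]))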

After emergence of the plants, their positions were determined using a total station with sub-centimetre accuracy. An analysis of the data showed that the deviations of the plant positions from the intended seeding positions had a standard deviation of 2 cm, including the state estimation error and the rolling of the seed after it has been dropped.

Publications:

  • Schulze Lammers, P.; Schmittmann, O.; Klingbeil, L.; Wieland, M.; Kuhlmann, H. (2017): Coordinate controlled placement of sugar beet seeds. Proceedings of the 45th International Symposium on Agricultural Engineering, Actual Tasks on Agricultural Engineering, 21-24 February 2017, Opatija, Croatia, 293-301.
  • Schölderle, F.; Zeimetz, Ph.; Kuhlmann, H. (2010) A Multi-Sensor System in Precision Farming for Position Steered Seed of Sugar Beet. In: Schulze Lammers; Kuhlmann (Ed.): 2nd International Conference on Machine Control & Guidance, Schriftenreihe des Instituts für Geodäsie und Geoinformation, Heft 16, pp. 213-221, Bonn

Contact


Prof. Dr.-Ing. Heiner Kuhlmann

Head of working group

Room 1.010

Nußallee 17

53115 Bonn
