Date: Jan 10, 2022

Visual Instance Segmentation of Leaves and Plants for In-Field Plant Phenotyping IGG-Blogpost Series | Working Group Photogrammetry


Crop production provides food, feed, and fiber for our society. To keep yields high and to adapt to stresses as well as to the impacts of climate change, plant breeders continuously generate new genetic variations of crops. These variations are then planted, and their performance is assessed. For that, plant breeders are looking for effective systems to assess detailed phenotypic traits of plants at a large scale, enabling an in-depth understanding of the relationship between genotype and phenotype. Recording how individual plants develop and grow is a labor- and time-consuming process.



Crops with consistent instance segmentation over leaves and plants. (© Photo: IGG / Photogrammetry).


Scales used in crop production to describe vegetative development stages, such as the BBCH scale, are mainly defined by the number of leaves produced on the main stem, the number of tillers on a plant, or the number of nodes, depending on the plant. Thus, the leaf count is a key plant trait directly related to the plant's growth stage. It can even indicate yield potential and the right timing for herbicide treatments. In fields today, the vegetative stage is determined by manual inspection, typically sampled at a subset of locations in the field.
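As a toy illustration of that relationship (the helper below is not part of the published work, and exact BBCH codings vary by crop), an automatically obtained leaf count could be translated into a BBCH code within principal growth stage 1, where code 1n commonly denotes n leaves unfolded:

```python
def bbch_leaf_stage(n_leaves: int) -> int:
    """Map an unfolded-leaf count to a BBCH code in principal growth
    stage 1 (leaf development). For many crops, code 1n means n leaves
    unfolded, and 9 or more leaves are reported as code 19. Codings
    differ between crops, so treat this as an illustrative sketch."""
    return 10 + min(max(n_leaves, 0), 9)
```

For example, a plant with four unfolded leaves would map to BBCH 14, and any count of nine or more saturates at BBCH 19.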

Automating this process can save labor, enable more frequent assessments at a large scale in less time, and support plant breeders and precision farming on agricultural fields. Here, recent developments from computer vision, machine learning, and robotics come in handy. In a collaboration between the Cluster of Excellence PhenoRob in Bonn and researchers from Bosch, we made progress in this field. A recent work led by Jan Weyler, to be published at the WACV conference in January 2022, addresses the problem of analyzing crops in agricultural fields based on camera data recorded with small and lightweight UAVs or robots. The goal is to derive information about plant development: such systems can monitor phenotypic traits in a fully automatic manner.


Crops and weeds

Crops and weeds in a field with consistent instance segmentation over leaves and plants  (© Photo: IGG / Photogrammetry).


The work proposes a vision-based approach that performs instance segmentation of individual crop leaves and associates each leaf with its corresponding crop plant. The approach works both in the lab and in real fields. Furthermore, the source code for this work has been made public, so you can try it on your own data.



Examples of automatically extracted leaf instances (© Photo: IGG / Photogrammetry).


This enables computing relevant basic phenotypic traits on a per-plant level. The approach uses a convolutional neural network that operates directly on imagery, for example from a lightweight drone with a regular RGB camera. The network generates two different representations of the input image, which are then used to cluster individual crop leaf and plant instances. Combined with a novel way of computing cluster regions from the network's predictions, this achieves highly accurate results that directly segment and localize the plants in the field and support automatic trait computation at large scales.
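As a rough sketch of how such per-pixel predictions can be turned into instances (the function and parameter names below are illustrative and not the authors' implementation), assume the network outputs a foreground mask and, for each foreground pixel, a 2-D offset pointing at the centre of the instance that pixel belongs to. Pixels whose centre votes land close together are then grouped into one instance:

```python
import numpy as np

def cluster_instances(fg_mask, offsets, merge_radius=2.0):
    """Group foreground pixels into instances.

    Each foreground pixel votes for an instance centre via its
    predicted 2-D offset; a vote within merge_radius of an existing
    centre joins that instance, otherwise it opens a new one.
    fg_mask: (H, W) bool; offsets: (H, W, 2) float as (dy, dx).
    Returns an (H, W) instance-id map (-1 = background) and centres.
    """
    ys, xs = np.nonzero(fg_mask)
    votes = np.stack([ys, xs], axis=1) + offsets[ys, xs]
    centres, labels = [], np.zeros(len(votes), dtype=int)
    for i, vote in enumerate(votes):
        for k, centre in enumerate(centres):
            if np.linalg.norm(vote - centre) <= merge_radius:
                labels[i] = k
                break
        else:  # no nearby centre: start a new instance
            labels[i] = len(centres)
            centres.append(vote)
    instance_map = np.full(fg_mask.shape, -1, dtype=int)
    instance_map[ys, xs] = labels
    return instance_map, np.asarray(centres)
```

The same grouping idea can be applied twice, once with leaf-centre offsets and once with plant-centre offsets, to obtain leaf instances that are consistently associated with their plant.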


The network

The network behind the approach (© Photo: IGG / Photogrammetry).



... further reading:

  • Jan Weyler, Federico Magistri, Peter Seitz, Jens Behley, and Cyrill Stachniss: “In-Field Phenotyping Based on Crop Leaf and Plant Instance Segmentation,” in Proc. of the Winter Conf. on Applications of Computer Vision (WACV), 2022.

