Fusion neural networks for plant classification: learning to combine RGB, hyperspectral, and lidar data

PeerJ. 2021 Jul 29;9:e11790. doi: 10.7717/peerj.11790. eCollection 2021.


Airborne remote sensing offers unprecedented opportunities to efficiently monitor vegetation, but methods to delineate and classify individual plant species using the collected data are still actively being developed and improved. The Integrating Data science with Trees and Remote Sensing (IDTReeS) plant identification competition openly invited scientists to create and compare individual tree mapping methods. Participants were tasked with training taxon identification algorithms on data from two sites and then transferring their methods to a third, unseen site, using field-based plant observations in combination with airborne remote sensing image data products from the National Ecological Observatory Network (NEON). These data were captured by a high-resolution digital camera sensitive to red, green, and blue (RGB) light, a hyperspectral imaging spectrometer spanning visible to shortwave infrared wavelengths, and a lidar system, together capturing the spectral and structural properties of vegetation. As participants in the IDTReeS competition, we developed a two-stage deep learning approach to integrate NEON remote sensing data from all three sensors and classify individual plant species and genera. The first stage is a convolutional neural network that generates taxon probabilities from RGB images, and the second stage is a fusion neural network that "learns" how to combine these probabilities with hyperspectral and lidar data. This two-stage approach leverages the ability of neural networks to flexibly and automatically extract descriptive features from complex, high-dimensional image data. Our method achieved an overall classification accuracy of 0.51 on the training set and 0.32 on the test set, which contained data from an unseen site with unknown taxon classes.
Although transferring classification algorithms to unseen sites with unknown species and genus classes proved challenging, developing methods with openly available NEON data, which will be collected in a standardized format for 30 years, allows for continual improvement and major gains for the computational ecology community. We outline promising directions for data preparation and processing techniques for further investigation, and provide our code to contribute to open, reproducible science.
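To make the two-stage design concrete, the sketch below shows the data flow in miniature: a stage-1 model produces per-tree taxon probabilities from RGB crops, and a stage-2 fusion network concatenates those probabilities with hyperspectral and lidar features before producing final class probabilities. This is an illustrative NumPy sketch, not the authors' implementation: the stage-1 "CNN" is replaced by random logits, the fusion network is a single randomly initialized hidden layer with no training, and all input shapes (number of trees, bands, and lidar metrics) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
N_TAXA = 8  # hypothetical number of taxon classes

def softmax(x, axis=-1):
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def rgb_stage(rgb_crops):
    """Stage 1 stand-in: in the paper this is a trained convolutional
    network; here we emit random logits per crop (illustrative only)."""
    logits = rng.normal(size=(rgb_crops.shape[0], N_TAXA))
    return softmax(logits)

def fusion_stage(rgb_probs, hsi_features, lidar_features, w1, b1, w2, b2):
    """Stage 2: concatenate per-tree inputs and pass them through a
    small multilayer perceptron that outputs fused taxon probabilities."""
    x = np.concatenate([rgb_probs, hsi_features, lidar_features], axis=1)
    h = np.maximum(0.0, x @ w1 + b1)  # ReLU hidden layer
    return softmax(h @ w2 + b2)

# Hypothetical per-tree inputs: 5 trees, 3x16x16 RGB crops,
# 20 summarized hyperspectral bands, 4 lidar structure metrics.
rgb = rng.normal(size=(5, 3, 16, 16))
hsi = rng.normal(size=(5, 20))
lidar = rng.normal(size=(5, 4))

probs = rgb_stage(rgb)
d_in = N_TAXA + 20 + 4  # fused feature dimension
w1 = rng.normal(size=(d_in, 32)) * 0.1
b1 = np.zeros(32)
w2 = rng.normal(size=(32, N_TAXA)) * 0.1
b2 = np.zeros(N_TAXA)

fused = fusion_stage(probs, hsi, lidar, w1, b1, w2, b2)
```

The key design point the sketch illustrates is that stage 2 receives the stage-1 probabilities as just another feature vector, so the fusion network can learn how much weight to give the RGB prediction relative to the spectral and structural evidence.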

Keywords: Airborne remote sensing; Data science competition; Deep learning; Machine learning; National Ecological Observatory Network; Neural networks; Open science; Remote sensing; Species classification.

Grant support

Funding for this work was provided by Earth Lab, through the University of Colorado at Boulder (CU Boulder) Grand Challenge Initiative, and the Cooperative Institute for Research in Environmental Sciences (CIRES) at CU Boulder. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.