Naive Gabor Networks for Hyperspectral Image Classification

IEEE Trans Neural Netw Learn Syst. 2021 Jan;32(1):376-390. doi: 10.1109/TNNLS.2020.2978760. Epub 2021 Jan 4.

Abstract

Recently, many convolutional neural network (CNN) methods have been designed for hyperspectral image (HSI) classification, since CNNs can produce good representations of data, an ability that owes much to their huge number of parameters. However, solving such a high-dimensional optimization problem often requires a large number of training samples to avoid overfitting. Moreover, the problem is typically nonconvex, plagued by many local minima and flat regions. To address these issues, in this article we introduce naive Gabor networks, or Gabor-Nets, which, for the first time in the literature, design and learn CNN kernels strictly in the form of Gabor filters, aiming to reduce the number of parameters, constrain the solution space, and thereby improve the performance of CNNs. Specifically, we develop an innovative phase-induced Gabor kernel, carefully designed to perform Gabor feature learning via a linear combination of local low-frequency and high-frequency components of the data, controlled by the kernel phase. With this phase-induced kernel, the proposed Gabor-Nets can automatically adapt to the local harmonic characteristics of HSI data and thus yield more representative harmonic features. Moreover, the kernel realizes traditional complex-valued Gabor filtering in a real-valued manner, so Gabor-Nets can run in a standard CNN pipeline. We evaluated Gabor-Nets on three well-known HSIs; the results suggest that they can significantly improve CNN performance, particularly with small training sets.
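The abstract's key idea, a real-valued Gabor kernel whose phase blends the even (low-frequency-leaning) and odd (high-frequency-leaning) components of the complex Gabor pair, can be illustrated with a minimal NumPy sketch. This is not the paper's exact phase-induced formulation; the function name, parameterization, and values below are illustrative assumptions. In Gabor-Nets the frequency, orientation, width, and phase would be trainable network parameters rather than fixed constants.

```python
import numpy as np

def gabor_kernel(size, freq, theta, sigma, phase):
    """Real-valued 2-D Gabor kernel: a Gaussian envelope multiplied by a
    cosine harmonic. The phase term interpolates between the even (cosine)
    and odd (sine) components of the complex Gabor filter, so a single
    real-valued kernel can stand in for complex-valued Gabor filtering.
    All parameters here are fixed; in a Gabor-style network they would be
    learned. (Illustrative sketch, not the paper's exact kernel.)"""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    # Rotate coordinates so the harmonic oscillates along direction theta.
    x_t = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    harmonic = np.cos(2.0 * np.pi * freq * x_t + phase)
    return envelope * harmonic

# phase = 0 gives the even (real) component; phase = -pi/2 gives the odd one.
k_even = gabor_kernel(5, freq=0.25, theta=np.pi / 4, sigma=1.5, phase=0.0)
k_odd = gabor_kernel(5, freq=0.25, theta=np.pi / 4, sigma=1.5, phase=-np.pi / 2)
```

The even kernel is point-symmetric and the odd kernel point-antisymmetric, which is why a phase-controlled combination of the two can cover both low-frequency (smoothing-like) and high-frequency (edge-like) responses with a single real-valued filter.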