Kernel recursive least squares (KRLS) is highly sensitive to non-Gaussian noise, and robust extensions have therefore been proposed based on the maximum correntropy criterion or the generalized maximum correntropy criterion. However, because of the complex form of the resulting models, no theoretical convergence analysis exists for these filters. In this paper, we propose a new alternative, the kernel regularized robust RLS (KR3LS) filter, which uses the half-quadratic technique to simplify the form of the loss function. Our main contribution is a proof that the filter converges to the target weights and the desired output; bounds on the regularization factor are also derived. KR3LS is tested experimentally on synthetic and real data and is shown to outperform other robust alternatives.
Keywords: Half-quadratic optimization; Kernel robust recursive least squares; Non-Gaussian noise; Performance analysis.
Copyright © 2020 ISA. Published by Elsevier Ltd. All rights reserved.
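The abstract does not spell out the KR3LS recursion itself, but the half-quadratic idea it relies on can be illustrated in a simplified batch (non-recursive) setting: the correntropy-type robust loss is replaced, at each half-quadratic step, by a weighted least-squares problem whose sample weights shrink for large residuals. The sketch below is an assumption-laden illustration of that principle using kernel ridge regression with Gaussian (correntropy-induced) weights; the function names and the parameters `lam`, `sigma`, and `kwidth` are illustrative choices, not the paper's algorithm.

```python
import numpy as np

def gaussian_kernel(X, Y, width=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between row-sample arrays X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def hq_robust_krr(X, y, lam=0.1, sigma=1.0, kwidth=1.0, iters=20):
    """Half-quadratic iteratively reweighted kernel ridge regression (sketch).

    Each half-quadratic step fixes the auxiliary weights
    w_i = exp(-e_i^2 / (2 sigma^2)) induced by the correntropy loss
    and solves the resulting weighted least-squares problem in closed form.
    """
    K = gaussian_kernel(X, X, kwidth)
    n = len(y)
    alpha = np.linalg.solve(K + lam * np.eye(n), y)   # plain KRR initialization
    for _ in range(iters):
        e = y - K @ alpha                             # current residuals
        w = np.exp(-e ** 2 / (2.0 * sigma ** 2))      # HQ auxiliary weights
        W = np.diag(w)
        alpha = np.linalg.solve(W @ K + lam * np.eye(n), W @ y)
    return alpha, K

# Toy usage: a sinusoid corrupted by impulsive (non-Gaussian) outliers.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 2.0 * np.pi, 80)[:, None]
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(80)
y[::10] += 4.0                                        # impulsive noise spikes
alpha, K = hq_robust_krr(X, y, lam=0.05, sigma=0.5)
robust_err = np.abs(K @ alpha - np.sin(X[:, 0])).mean()
```

Because the weights `w_i` decay exponentially in the squared residual, the impulsive samples are effectively ignored after the first reweighting step, while inlier samples keep weights near one; this is the mechanism that makes correntropy-type criteria robust to non-Gaussian noise.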