Fractional Approximation of Broad Learning System

IEEE Trans Cybern. 2024 Feb;54(2):811-824. doi: 10.1109/TCYB.2021.3127152. Epub 2024 Jan 17.

Abstract

Approximation ability is of great importance for neural networks. The broad learning system (BLS) (Chen and Liu, 2018), widely used in industry with good performance, has been proved to be a universal approximator from the density aspect. This density-type property is important in that it establishes the existence of a desired network, but it does not provide a means of constructing it; construction is commonly addressed from the complexity aspect. Thus, the density approach lacks the advantage of determining the network architecture and its weights constructively. To the best of our knowledge, little theory exists for the BLS that provides a constructive approach to obtaining the network structure, along with its weights, while ensuring the approximation properties. By virtue of its long-term memory and nonlocality properties, fractional calculus has found many distinctive applications. The purpose of this article is to study the approximation ability of the BLS constructively, in a way that remains valid in the fractional case as well. Specifically, we first introduce two simplified BLSs by means of extending functions. For each simplified BLS, an upper bound on the approximation error is derived through the modulus of continuity of Caputo fractional derivatives. As a result, two types of fractional convergence behavior of the BLS, namely: 1) pointwise and 2) uniform convergence, are rigorously proved. Finally, numerical experiments are conducted to demonstrate the approximation capabilities of BLSs.
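For context, the two quantities in which the abstract's error bounds are stated admit the following standard definitions; the paper's exact notation and constants may differ. For an order $\alpha \in (0,1)$, the Caputo fractional derivative of $f$ on $[a,b]$ is

$$({}^{C}D_{a}^{\alpha} f)(x) = \frac{1}{\Gamma(1-\alpha)} \int_{a}^{x} \frac{f'(t)}{(x-t)^{\alpha}} \, dt, \qquad x \in (a,b],$$

and the modulus of continuity of a function $g$ is

$$\omega(g, \delta) = \sup_{|x-y| \le \delta} |g(x) - g(y)|.$$

Per the abstract, the derived upper bounds are expressed through $\omega({}^{C}D_{a}^{\alpha} f, \cdot)$, so that functions that are smoother in the fractional sense admit tighter bounds; the precise form of the bounds is given in the paper itself.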
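To make the object under study concrete, below is a minimal sketch of the flat BLS architecture of Chen and Liu (2018): feature nodes, enhancement nodes, and output weights obtained in closed form by ridge-regularized least squares. It is illustrative only; in the original BLS the feature maps are tuned by sparse autoencoders, whereas here they are left random for brevity, and names such as fit_bls and n_feature_groups are this sketch's own.

    import numpy as np

    def fit_bls(X, Y, n_feature_groups=10, n_feature_nodes=8,
                n_enhance_nodes=50, reg=1e-3, seed=0):
        """Minimal BLS sketch: random feature/enhancement nodes,
        output weights fit by regularized least squares."""
        rng = np.random.default_rng(seed)
        d = X.shape[1]

        # Feature nodes: groups of random affine maps with a tanh
        # nonlinearity (random here; autoencoder-tuned in the original).
        feat = [(rng.standard_normal((d, n_feature_nodes)),
                 rng.standard_normal(n_feature_nodes))
                for _ in range(n_feature_groups)]

        # Enhancement nodes: one nonlinear expansion of all feature nodes.
        Wh = rng.standard_normal((n_feature_groups * n_feature_nodes,
                                  n_enhance_nodes))
        bh = rng.standard_normal(n_enhance_nodes)

        def expand(Xn):
            # Stack feature nodes Z and enhancement nodes H into A = [Z | H].
            Z = np.hstack([np.tanh(Xn @ We + be) for We, be in feat])
            H = np.tanh(Z @ Wh + bh)
            return np.hstack([Z, H])

        # Output weights: ridge solution W = (A^T A + reg*I)^{-1} A^T Y,
        # the regularized form of the pseudoinverse solution.
        A = expand(X)
        W = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ Y)
        return lambda Xn: expand(Xn) @ W  # predictor for new inputs

    # Usage: approximate sin on [0, 1] from 200 samples.
    X = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
    Y = np.sin(2 * np.pi * X)
    predict = fit_bls(X, Y)
    print(np.max(np.abs(predict(X) - Y)))  # uniform error on the grid

The closed-form output-weight solve is what makes the BLS "broad" rather than deep: capacity is added by widening A with more feature or enhancement nodes, not by stacking layers, which is also the setting in which the paper's constructive error bounds are stated.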