Beyond multilayer perceptrons: Investigating complex topologies in neural networks

Neural Netw. 2024 Mar:171:215-228. doi: 10.1016/j.neunet.2023.12.012. Epub 2023 Dec 9.

Abstract

This study examines how network topology shapes the performance of artificial neural networks (NNs). To understand how network structure influences learning capability, the research contrasts traditional multilayer perceptrons (MLPs) with models built on various complex topologies produced by novel network-generation techniques. On synthetic datasets, complex NNs achieve remarkable accuracy and outperform MLPs, particularly in high-difficulty scenarios. Experiments on real-world datasets highlight the task-specific nature of the optimal network topology and reveal trade-offs: relative to MLPs, complex NNs incur higher computational demands and are less robust to graph damage. These results underscore the pivotal role of complex topologies in addressing challenging learning tasks, while also signaling the need for deeper insight into how interacting topological attributes drive NN performance. By clarifying the advantages and limitations of complex topologies, this study offers practical guidance and paves the way for designing more efficient and adaptable neural architectures across applications.
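As a minimal sketch of the general idea, not the authors' implementation, a complex-topology NN can be built by generating a graph (here a Watts-Strogatz small-world graph, one common complex-network model) and using it as a sparse connectivity mask, with edges oriented from lower- to higher-indexed nodes so activations propagate through a feed-forward DAG. All function names, parameters, and sizes below are illustrative assumptions.

```python
import numpy as np

def watts_strogatz_mask(n, k, p, rng):
    """Boolean adjacency mask of a Watts-Strogatz small-world graph:
    n nodes on a ring, each linked to its k nearest neighbours, with
    each edge rewired to a random target with probability p."""
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(1, k // 2 + 1):
            t = (i + j) % n
            if rng.random() < p:  # rewire this edge to a random target
                t = int(rng.integers(n))
                while t == i or adj[i, t]:
                    t = int(rng.integers(n))
            adj[i, t] = adj[t, i] = True
    return adj

def forward(x, adj, weights, n_in, n_out):
    """One forward pass: edges are oriented from lower- to higher-indexed
    nodes, so the masked graph acts as a feed-forward DAG of tanh units.
    The first n_in nodes hold the input; the last n_out are the output."""
    n = adj.shape[0]
    act = np.zeros(n)
    act[:n_in] = x
    for j in range(n_in, n):
        src = np.flatnonzero(adj[:j, j])  # lower-indexed predecessors only
        if src.size:
            act[j] = np.tanh(weights[src, j] @ act[src])
    return act[n - n_out:]

rng = np.random.default_rng(0)
n, n_in, n_out = 16, 4, 2
adj = watts_strogatz_mask(n, k=4, p=0.2, rng=rng)
weights = rng.normal(size=(n, n)) * adj  # weights only on existing edges
y = forward(rng.normal(size=n_in), adj, weights, n_in, n_out)
```

An MLP is the special case in which the mask is block-structured (each layer fully connected to the next); sweeping the generator, e.g. the rewiring probability `p`, varies the topology while keeping the training procedure fixed.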

Keywords: Bio-inspired computing; Complex networks; Manifold learning; Neural networks; Robustness.

MeSH terms

  • Forecasting
  • Learning*
  • Neural Networks, Computer*