Analog neuron hierarchy

Neural Netw. 2020 Aug;128:199-215. doi: 10.1016/j.neunet.2020.05.006. Epub 2020 May 11.

Abstract

To refine the analysis of the computational power of discrete-time recurrent neural networks (NNs) between the binary-state NNs, which are equivalent to finite automata (level 3 in the Chomsky hierarchy), and the analog-state NNs with rational weights, which are Turing-complete (Chomsky level 0), we study an intermediate model αANN: a binary-state NN extended with α ≥ 0 extra analog-state neurons. For rational weights, we establish an analog neuron hierarchy 0ANNs ⊂ 1ANNs ⊂ 2ANNs ⊆ 3ANNs and separate its first two levels. In particular, 0ANNs coincide with the binary-state NNs (Chomsky level 3) and form a proper subset of 1ANNs, which accept at most context-sensitive languages (Chomsky level 1), including some non-context-free ones (above Chomsky level 2). We prove that the deterministic context-free language L# = {0^n 1^n | n ≥ 1} cannot be recognized by any 1ANN, even with real weights. In contrast, we show that deterministic pushdown automata accepting deterministic context-free languages can be simulated by 2ANNs with rational weights, which thus constitute a proper superset of 1ANNs. Finally, we prove that the analog neuron hierarchy collapses to 3ANNs by showing that any Turing machine can be simulated by a 3ANN with rational weights and linear-time overhead.
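As an illustration (not part of the paper), the separating language L# = {0^n 1^n | n ≥ 1} is accepted by a deterministic pushdown automaton whose stack behaves as a unary counter; the abstract's 2ANN result rests on simulating exactly such machines. A minimal sketch of that counter automaton in Python:

```python
def accepts_L_hash(w: str) -> bool:
    """Recognize L# = {0^n 1^n | n >= 1} deterministically.

    The integer `counter` plays the role of the DPDA's stack height:
    push on each '0', pop on each '1', accept iff the stack empties
    exactly at the end of the input and at least one '0' was read.
    """
    counter = 0
    seen_one = False
    for c in w:
        if c == "0":
            if seen_one:          # a '0' after a '1' can never lead to acceptance
                return False
            counter += 1          # push
        elif c == "1":
            seen_one = True
            counter -= 1          # pop
            if counter < 0:       # more '1's than '0's
                return False
        else:
            return False          # symbol outside the alphabet {0, 1}
    return seen_one and counter == 0
```

For example, `accepts_L_hash("000111")` is `True`, while `"0101"`, `"001"`, and the empty string are all rejected. A 1ANN, by the paper's lower-bound result, cannot implement this counter behavior even with real weights.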

Keywords: Analog neuron hierarchy; Chomsky hierarchy; Deterministic context-free language; Recurrent neural network; Turing machine.

MeSH terms

  • Language*
  • Neural Networks, Computer*
  • Neurons* / physiology