Thermodynamic State Machine Network

Entropy (Basel). 2022 May 24;24(6):744. doi: 10.3390/e24060744.

Abstract

We describe a model system, a thermodynamic state machine network, comprising a network of probabilistic, stateful automata that equilibrate according to Boltzmann statistics, exchange codes over unweighted bidirectional edges, update a state transition memory to learn transitions between network ground states, and minimize an action associated with fluctuation trajectories. The model is grounded in four postulates concerning self-organizing, open thermodynamic systems: transport-driven self-organization, scale integration, input functionalization, and active equilibration. After sufficient exposure to periodically changing inputs, a diffusive-to-mechanistic phase transition emerges in the network dynamics. The evolved networks show spatial and temporal structures that resemble spiking neural networks, although no such structures were built into the model. Our main contributions are the articulation of the postulates, the development of a thermodynamically motivated methodology that addresses them, and the demonstration of the resulting phase transition. As with other machine learning methods, the model is limited in its scalability, generality, and temporality. We use these limitations to motivate the development of thermodynamic computers (engineered, thermodynamically self-organizing systems) and comment on efforts to realize them in the context of this work. We offer a different philosophical perspective, thermodynamicalism, that addresses the limitations of the model and of machine learning in general.
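To make the first postulated mechanism concrete, the following is a minimal sketch (not the authors' implementation) of stateful automata exchanging codes over unweighted bidirectional edges and relaxing toward a Boltzmann distribution over a local energy. The ring topology, disagreement-counting energy, and temperature value are illustrative assumptions; the state transition memory and action minimization described in the paper are not modeled here.

import numpy as np

rng = np.random.default_rng(0)

# Toy network: N nodes, each holding one of K discrete codes, connected by
# unweighted bidirectional edges on a ring (hypothetical topology).
N, K, T = 8, 4, 0.5
edges = [(i, (i + 1) % N) for i in range(N)]
neigh = {i: [] for i in range(N)}
for a, b in edges:
    neigh[a].append(b)
    neigh[b].append(a)

state = rng.integers(K, size=N)

def local_energy(i, s):
    """Energy of node i holding code s: count of neighbors holding a different code."""
    return sum(1 for j in neigh[i] if state[j] != s)

# Gibbs-style sweeps: each node resamples its code from Boltzmann weights
# over its candidate local energies, so the network equilibrates toward
# low-energy (agreeing) configurations.
for sweep in range(200):
    for i in rng.permutation(N):
        e = np.array([local_energy(i, s) for s in range(K)], dtype=float)
        p = np.exp(-e / T)
        p /= p.sum()
        state[i] = rng.choice(K, p=p)

print("equilibrated codes:", state)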

Keywords: active equilibration; input functionalization; machine learning; scale integration; thermodynamic computing; thermodynamicalism.

Grants and funding

This research received no external funding.