On Explaining the Surprising Success of Reservoir Computing Forecaster of Chaos and Other Random Neural Network Architectures
Machine learning has become a widely popular and successful paradigm, including for data-driven science. A major application problem is forecasting complex dynamical systems. Artificial neural networks (ANNs) have emerged as a clear leading approach, and recurrent neural networks (RNNs) are considered especially well suited for this task. In this setting, the reservoir computer (RC) has emerged for its simplicity and computational advantages. Instead of training a full network, an RC trains only the read-out weights, typically by linear regression, while the internal weights are fixed at randomly selected values. However, why and how an RC works at all, despite its randomly selected internal weights, is perhaps a surprise. To this end, we offer some simple explanations of why it works, connections to classical methods in time-series forecasting and to operator-theoretic methods, and also improvements such as our recent next-generation reservoir computing (NG-RC) formalism, which sidesteps much of the randomness while allowing an exactly equivalent, yet simpler and more successful, approach. Furthermore, we offer, for the first time, a geometric interpretation of another random network architecture.
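To make the "train only the read-out weights" idea concrete, the following is a minimal sketch of an echo-state-style RC in Python/NumPy, not the paper's exact construction: the reservoir size, spectral radius, input scaling, ridge parameter, and the logistic-map test series are all illustrative assumptions. The random internal weights W and input weights W_in are fixed once and never trained; only the linear read-out W_out is fit.

```python
import numpy as np

# Toy data: a chaotic scalar series from the logistic map.
T = 2000
u = np.empty(T)
u[0] = 0.5
for t in range(T - 1):
    u[t + 1] = 3.9 * u[t] * (1.0 - u[t])

# Fixed random reservoir -- these weights are never trained.
rng = np.random.default_rng(42)
N = 200                                  # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, size=N)    # random input weights
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius to 0.9

# Drive the reservoir with the input series and record its states.
r = np.zeros(N)
states = np.empty((T - 1, N))
for t in range(T - 1):
    r = np.tanh(W @ r + W_in * u[t])
    states[t] = r

# Train ONLY the linear read-out, by ridge regression:
# minimize ||X @ W_out - Y||^2 + lam * ||W_out||^2.
washout = 100                            # discard transient reservoir states
X = states[washout:]                     # state after seeing u[t]
Y = u[washout + 1:]                      # one-step-ahead target u[t+1]
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ Y)

# One-step prediction error on the training series.
print("RMS one-step error:", np.sqrt(np.mean((X @ W_out - Y) ** 2)))
```

The NG-RC idea mentioned above would, roughly, replace the random reservoir states with a small library of polynomial functions of time-delayed inputs and fit the same kind of linear read-out, removing the random weights entirely.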