Deep Learning Recipes | RNNs, Echo State Networks

This page includes an introduction to echo state networks and their applications, which I worked on during my PhD research.

During my PhD research, I spent a considerable amount of time studying and employing reservoir computing approaches, including different variants of the echo state network (ESN). ESNs have been shown to be effective and efficient at forecasting highly nonlinear time series. Alongside my research, I implemented several ESN architectures, including the baseline ESN, Deep ESN, Clustered ESN, and a physics-informed variant called Hybrid ESN. Recently, I combined these architectures with LSTM autoencoders and with convolutional layers, which resulted in robust predictive models for chaotic time series and for reconstructing complex spatiotemporal image data, respectively. The packages I developed are written in both Python and MATLAB, and both versions will be available for download once the unit tests and documentation are complete.
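To make the baseline ESN concrete, here is a minimal NumPy sketch of the standard recipe: a fixed random sparse reservoir rescaled to a target spectral radius, a leaky-tanh state update, and a linear readout trained by ridge regression. This is an illustrative toy implementation, not the author's package; all function names and hyperparameter values are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_inputs, n_reservoir, spectral_radius=0.9, density=0.1):
    """Random input and reservoir weights, rescaled to the target spectral radius."""
    W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_inputs))
    W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
    W[rng.random(W.shape) > density] = 0.0            # sparsify the reservoir
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(W_in, W, inputs, leak=1.0):
    """Drive the reservoir with the input sequence and collect its states."""
    x = np.zeros(W.shape[0])
    states = []
    for u in inputs:
        # Leaky integrator update: x <- (1 - a) x + a tanh(W_in u + W x)
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    """Ridge-regression readout: W_out = Y S^T (S S^T + ridge * I)^-1."""
    S, Y = states.T, targets.T
    return Y @ S.T @ np.linalg.inv(S @ S.T + ridge * np.eye(S.shape[0]))

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t)[:, None]
W_in, W = make_reservoir(n_inputs=1, n_reservoir=100)
states = run_reservoir(W_in, W, series[:-1])
W_out = train_readout(states[200:], series[1:][200:])  # discard a washout period
pred = states @ W_out.T                                # one-step-ahead predictions
```

Only the readout weights are trained; the input and reservoir matrices stay fixed after initialization, which is what makes ESN training so cheap compared with backpropagation through time.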

Moreover, an ESN cell and an ESN layer have recently been added to TensorFlow Addons, but documentation and examples have not yet been provided. I will add a well-documented example of a benchmark solved with this newly added ESN layer to this section.

A simple architecture of the baseline ESN (top) and the hybrid ESN (bottom)