The reservoir computing paradigm of information processing has emerged as a natural response to the problem of training recurrent neural networks. It has been realized that the training phase can be avoided provided a network has some well-defined properties, e.g. the echo state property. This idea has been generalized to arbitrary artificial dynamical systems. In principle, any dynamical system could be used for advanced information processing applications, provided that the system has the separation and approximation properties. To carry out this idea in practice, the only auxiliary equipment needed is a simple read-out layer that can be used to access the internal states of the system. In the following, several application scenarios of this generic idea are discussed, together with some related engineering aspects. We cover practical problems one might meet when trying to implement the idea, and discuss several strategies for solving them.
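The core idea summarized above — a fixed dynamical system whose internal states are harvested by a trained read-out layer — can be illustrated with a minimal echo state network sketch. This is an assumption-laden toy (the network size, spectral-radius scaling, toy task, and ridge parameter are all illustrative choices, not taken from the chapter), shown only to make the "train only the read-out" principle concrete:

```python
import numpy as np

# Minimal echo state network sketch (illustrative; all sizes and constants
# are assumptions, not from the chapter). The recurrent reservoir weights
# stay fixed; only the linear read-out layer is trained.

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))   # fixed input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))     # fixed reservoir weights
# Scale the spectral radius below 1 -- a common heuristic aimed at the
# echo state property mentioned in the text.
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)

# The read-out layer is the only trained component (ridge regression).
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
print("train MSE:", np.mean((pred - y) ** 2))
```

The expensive recurrent training loop is replaced by a single linear solve for `W_out`, which is precisely why the paradigm generalizes to physical substrates: any system with rich, input-separating dynamics can play the role of the reservoir.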
@inproceedings{Konkoli-CompMatter-2018-RC,
  author = "Zoran Konkoli and Stefano Nichele and Matthew Dale and Susan Stepney",
  title = "Reservoir computing with computational matter",
  chapter = 14,
  pages = "269--292",
  crossref = "CompMatter-2018"
}

@proceedings{CompMatter-2018,
  editor = "Susan Stepney and Steen Rasmussen and Martyn Amos",
  title = "Computational Matter",
  booktitle = "Computational Matter",
  publisher = "Springer",
  year = 2018
}