
Reservoir computing

What is reservoir computing?

Reservoir computing is a computational framework built on a recurrent neural network (RNN) in which not all of the network's parameters are trained: only some parameters are updated, while the rest are chosen at random and left fixed.

It uses the dynamics of a fixed, nonlinear system known as a reservoir to map input signals into higher-dimensional computational spaces. The reservoir is treated as a black box: an input signal is fed into it, and a simple readout mechanism is trained to read the reservoir's state and map it to the desired output.

Since the reservoir dynamics are fixed, training is only carried out at the readout stage. 
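
Because only the readout is trained, the whole pipeline fits in a few lines. Here is a minimal sketch in Python/NumPy; the reservoir size, weight scales, ridge penalty, and the sine-prediction toy task are all illustrative assumptions, not a reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

# The reservoir: random, fixed weights that are never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.1, 0.1, (n_res, n_res))

def run_reservoir(inputs):
    """Drive the fixed nonlinear reservoir and record its states."""
    x = np.zeros(n_res)
    states = np.empty((len(inputs), n_res))
    for t, u in enumerate(inputs):
        x = np.tanh(W_in @ u + W @ x)     # fixed nonlinear dynamics
        states[t] = x
    return states

# Toy task: predict the next sample of a sine wave.
u = np.sin(np.linspace(0, 8 * np.pi, 500)).reshape(-1, 1)
X, y = run_reservoir(u[:-1]), u[1:]

# Training happens only here, at the readout: plain ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
print("train MSE:", float(np.mean((X @ W_out - y) ** 2)))
```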

In classical reservoir computing, the reservoir needs to have two properties: first, it needs to consist of individual, nonlinear units, and second, it has to be able to store information.

What are the types of classical reservoir computing?

1. Context reverberation network

In a context reverberation network, an input layer feeds into a high-dimensional dynamical system, and a trainable single-layer perceptron reads the system's state out. The dynamical system takes one of two forms: an RNN with randomized, fixed weights, or a continuous reaction-diffusion system.

The perceptron associates current inputs with the signals that reverberate in the dynamical system.
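
As a rough sketch of this idea (with an invented toy task and made-up parameters), the fixed RNN below "reverberates" a binary sequence, and only a single-layer perceptron readout is trained on the final state:

```python
import numpy as np
from sklearn.linear_model import Perceptron

rng = np.random.default_rng(1)
n_res = 50
W_in = rng.normal(0, 1.0, (n_res, 1))       # fixed input weights
W = rng.normal(0, 0.1, (n_res, n_res))      # randomized and fixed

def reverberate(seq):
    """Final reservoir state after feeding a binary sequence."""
    x = np.zeros(n_res)
    for u in seq:
        x = np.tanh(W_in @ np.array([u]) + W @ x)
    return x

# Toy context task: recover a symbol seen a few steps earlier, which the
# reverberating state has to retain after the input has passed.
seqs = rng.integers(0, 2, (200, 10))
states = np.array([reverberate(s) for s in seqs])
labels = seqs[:, -3]

clf = Perceptron().fit(states, labels)      # only the readout is trained
print("readout accuracy:", clf.score(states, labels))
```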

2. Echo state network

Echo state networks have a sparsely connected hidden layer, usually with less than 10% connectivity. They drive a large, random, fixed RNN with the input signal, inducing a nonlinear response signal in each neuron of the reservoir, and then connect those responses to the desired output signal through a trainable linear combination of all of the response signals.
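
Here is a hedged sketch of how such a reservoir might be initialized. The ~10% connectivity matches the text; the spectral-radius rescaling and leaky update are common ESN practice, and the exact constants are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n_res = 300
connectivity = 0.10                         # sparsely connected hidden layer

W = rng.uniform(-1, 1, (n_res, n_res))
W[rng.random((n_res, n_res)) > connectivity] = 0.0    # keep ~10% of weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))       # spectral radius 0.9
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))

def esn_states(u_seq, leak=0.3):
    """Leaky-integrator ESN update; states are the 'response signals'."""
    x = np.zeros(n_res)
    X = np.empty((len(u_seq), n_res))
    for t, u in enumerate(u_seq):
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        X[t] = x
    return X

# The trainable part is just a linear combination of these response
# signals, fitted against the desired output as in the sketch above.
X = esn_states(np.sin(np.linspace(0, 20, 300)).reshape(-1, 1))
print(X.shape)      # (300, 300) response signals
```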

3. Liquid-state machine

A liquid-state machine (LSM) employs a spiking neural network. The LSM is made up of a huge collection of nodes (or neurons). Every neuron gets time-varying inputs from other neurons and from external sources.

Due to the recurrent nature of the connections, the time-varying input turns into a spatio-temporal pattern of activations across the network's nodes. These spatio-temporal activation patterns are then read out by linear discriminant units.
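
Below is a heavily simplified sketch of the idea using leaky integrate-and-fire neurons; the neuron model, constants, and random wiring are all assumptions for illustration, not a calibrated LSM:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100
# Sparse random synapses; the external input reaches ~30% of the neurons.
W = rng.normal(0, 0.6, (n, n)) * (rng.random((n, n)) < 0.1)
W_in = (rng.random(n) < 0.3).astype(float)

v_thresh, leak = 1.0, 0.95
v = np.zeros(n)
spike_record = []

for t in range(200):
    ext = float(rng.random() < 0.2)               # external spike source
    spikes = (v >= v_thresh).astype(float)        # neurons that just fired
    v = np.where(spikes > 0, 0.0, v)              # reset fired neurons
    v = leak * v + W @ spikes + W_in * ext        # recurrent + external drive
    spike_record.append(spikes)

# The spatio-temporal activation pattern a linear readout would be trained on:
liquid_state = np.array(spike_record)             # shape (200, 100)
print("spikes per step:", liquid_state.sum(axis=1)[:10])
```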

4. Nonlinear transient computation

This approach is most relevant when time-varying input signals push the mechanism away from its internal dynamics. The resulting transients (temporary alterations of the system's state) are reflected in the output of the device.
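
A tiny illustration of the principle (the node equation and constants are assumptions): a brief input pulse knocks a single nonlinear node off its resting state, and the information lives in the decaying transient that follows.

```python
import numpy as np

def driven_node(u_seq, a=0.8, dt=0.1):
    """x' = -x + tanh(a*x + u): the input u perturbs the node's dynamics."""
    x, trace = 0.0, []
    for u in u_seq:
        x += dt * (-x + np.tanh(a * x + u))
        trace.append(x)
    return np.array(trace)

# A short pulse around step 50 leaves a long, slowly decaying transient;
# different inputs leave different, separable transients in the output.
pulse = np.zeros(300)
pulse[50:55] = 2.0
print(driven_node(pulse)[45:120:5].round(3))
```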

5. Deep reservoir computing

This makes it possible to build efficiently trained models that process temporal data in a hierarchical manner. It also allows the role of layered composition in RNNs to be investigated.
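
A minimal sketch of the hierarchical idea (layer sizes and weight scales are invented): each fixed reservoir layer is driven by the states of the layer below, and a single linear readout can then be trained on all layers' states.

```python
import numpy as np

rng = np.random.default_rng(4)

def make_layer(n_in, n_res=100, scale=0.1):
    """One fixed reservoir layer: random input and recurrent weights."""
    return (rng.uniform(-scale, scale, (n_res, n_in)),
            rng.uniform(-scale, scale, (n_res, n_res)))

def run_layer(W_in, W, inputs):
    x = np.zeros(W.shape[0])
    out = np.empty((len(inputs), W.shape[0]))
    for t, u in enumerate(inputs):
        x = np.tanh(W_in @ u + W @ x)
        out[t] = x
    return out

layers = [make_layer(1), make_layer(100), make_layer(100)]

h = np.sin(np.linspace(0, 20, 400)).reshape(-1, 1)
all_states = []
for W_in, W in layers:
    h = run_layer(W_in, W, h)      # layer t's states drive layer t+1
    all_states.append(h)

# Train the (linear) readout on the concatenated hierarchy of states.
features = np.concatenate(all_states, axis=1)
print(features.shape)              # (400, 300)
```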

What is Quantum reservoir computing?

In quantum reservoir computing, the nonlinear nature of quantum mechanical interactions or processes can be used to form the characteristic nonlinear reservoirs. It can also be performed with linear reservoirs, in which case the nonlinearity comes from the way the input is injected into the reservoir. (A toy simulation of this idea appears after the list below.)

The types of quantum reservoir computing are:

  • Gaussian states of interacting quantum harmonic oscillators
  • 2-D quantum dot lattices
  • Nuclear spins in a molecular solid
  • Reservoir computing on gate-based near-term superconducting quantum computers
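
For intuition only, here is a small classical simulation of the quantum-reservoir idea: a few qubits evolve under a fixed random Hamiltonian, the input is injected by re-preparing one qubit each step, and single-qubit ⟨Z⟩ expectations serve as features for a classical linear readout. The Hamiltonian, time step, and injection scheme are all simplified assumptions.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(5)
n_qubits = 4
dim = 2 ** n_qubits

# Fixed random Hermitian "reservoir" Hamiltonian; U evolves one time step.
H = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
H = (H + H.conj().T) / 2
U = expm(-0.5j * H)

def inject(rho, u):
    """Re-prepare qubit 0 as sqrt(1-u)|0> + sqrt(u)|1>; keep the rest."""
    r = rho.reshape(2, dim // 2, 2, dim // 2)
    rho_rest = r[0, :, 0, :] + r[1, :, 1, :]        # trace out qubit 0
    amp = np.array([np.sqrt(1 - u), np.sqrt(u)])
    return np.kron(np.outer(amp, amp.conj()), rho_rest)

def z_expectations(rho):
    """<Z_q> for each qubit: classical features read off the reservoir."""
    diag = np.real(np.diag(rho))
    return [sum(diag[k] * (1 - 2 * ((k >> (n_qubits - 1 - q)) & 1))
                for k in range(dim)) for q in range(n_qubits)]

rho = np.zeros((dim, dim), dtype=complex)
rho[0, 0] = 1.0                                     # start in |0...0>
for u in rng.random(20):                            # random input sequence
    rho = U @ inject(rho, u) @ U.conj().T           # inject, then evolve
print(np.round(z_expectations(rho), 3))
```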
