BxD Primer Series: Liquid State Machine (LSM) Neural Networks
Hey there 👋
Welcome to the BxD Primer Series, where we cover topics such as Machine Learning models, Neural Nets, GPT, Ensemble models, and Hyper-automation in a ‘one-post-one-topic’ format. Today’s post is on Liquid State Machine (LSM) Neural Networks. Let’s get started:
The What:
A Liquid State Machine (LSM) is a type of recurrent neural network used for processing sequential data.
In an LSM, the input is fed into a large pool of neurons called the ‘liquid reservoir’. The reservoir is randomly connected and has rich internal dynamics: its neurons constantly fire and interact with each other in a nonlinear way. This produces a high-dimensional representation of the input signal, which is then fed into a readout layer for classification or regression tasks.
The readout layer receives the output of the liquid reservoir and performs a classification or regression task. It is trained to map the output of the liquid reservoir to the desired output of the network.
LSMs are able to learn quickly and adapt to changing environments. They perform computation in a distributed and parallel manner, which makes them more efficient than traditional sequential processing models.
Additionally, LSMs require relatively little training data compared to other types of neural networks.
Applications of LSMs:
LSMs have been successfully applied to speech recognition, gesture recognition, time-series prediction, music analysis, and language modeling tasks.
An Analogy:
Say you are randomly throwing stones into water. Depending on what kind of stones you have thrown in, there is a wave pattern on the surface that changes with each timestep.
From this wave pattern you can draw conclusions about the features of the different stones. Out of this pattern you can tell what kind of stones you threw in.
Similarly, the LSM’s liquid reservoir has a pool of neurons that fire differently for different input sequences. From this pattern of firing neurons, the readout layer is able to make a prediction or classification about the input.
Note: Training of the readout layer is separate from the training of the liquid reservoir.
Anatomy of a Liquid State Machine:
Echo State Property of LSM:
The liquid reservoir of an LSM can be thought of as a ‘memory’ that retains information about past input signals. When a new input signal is presented, the reservoir’s response is influenced by its previous states, producing an output that reflects both the current input and the recent past. It is as if the LSM is echoing past inputs.
The echo state property is important for time-series prediction tasks because it enables the LSM to capture complex temporal dependencies in the input. By "remembering" past states of the input, the LSM is able to predict future states with high accuracy.
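To make this concrete, here is a minimal numpy sketch of the echo state property (the reservoir size, spectral radius, and sine-wave input are illustrative assumptions, not values from this post). Two copies of the same reservoir start from different random states and are driven by an identical input; their states converge, showing that the reservoir ‘echoes’ the input history rather than its initial condition:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100  # reservoir size (illustrative)

# Random recurrent weights, rescaled to spectral radius 0.9
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.normal(size=(N, 1))

def step(r, x):
    # Simple tanh reservoir update driven by input x
    return np.tanh(W @ r + W_in @ x)

# Two different initial states, identical input sequence
r1 = rng.normal(size=(N, 1))
r2 = rng.normal(size=(N, 1))
for t in range(200):
    x = np.array([[np.sin(0.1 * t)]])
    r1, r2 = step(r1, x), step(r2, x)

# The gap between the two states shrinks toward zero
print(np.linalg.norm(r1 - r2))
```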
Time Constant in LSM:
A time constant is typically set by the user and controls the rate at which the internal state of the reservoir forgets past inputs and adapts to new ones.
The choice of time constant depends on the characteristics of the input signal and the task at hand.
The time constant also affects the performance of the LSM: a small time constant makes the reservoir adapt quickly but forget past inputs sooner, while a large time constant preserves a longer memory at the cost of slower adaptation. The sketch below illustrates this trade-off.
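Here is a minimal leaky-integrator sketch of that trade-off (the tau values and single input pulse are illustrative assumptions). A pulse at t = 0 decays quickly in a unit with a small time constant and lingers in one with a large time constant:

```python
import numpy as np

def leaky_update(r, drive, tau, dt=1.0):
    # Leaky integration: larger tau -> slower forgetting of past inputs
    leak = dt / tau
    return (1 - leak) * r + leak * np.tanh(drive)

r_fast, r_slow = 0.0, 0.0
for t in range(11):
    pulse = 1.0 if t == 0 else 0.0  # single input pulse at t = 0
    r_fast = leaky_update(r_fast, pulse, tau=2.0)   # forgets quickly
    r_slow = leaky_update(r_slow, pulse, tau=20.0)  # remembers longer

# After 10 steps the slow unit still carries a trace of the pulse
print(r_fast, r_slow)
```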
Use of Spiking Neurons:
Spiking neurons are commonly used in the liquid reservoir of an LSM. They mimic the behavior of biological neurons, which communicate through the generation and propagation of electrical impulses called spikes.
Consider the basic model for spiking neurons called the Leaky Integrate-and-Fire (LIF) neuron model, which is defined by the following equations:
✪ Membrane potential dynamics:
tau_m · dV/dt = −(V − V_rest) + R·I(t)
Where, V is the membrane potential, tau_m is the membrane time constant, V_rest is the resting potential, R is the membrane resistance, and I(t) is the input current at time t.
✪ Spike generation: If the membrane potential V reaches a threshold value V_th, a spike is generated and the membrane potential is reset to a reset value V_reset.
✪ Refractory period: After a spike is generated, the neuron enters a refractory period of duration t_ref, during which it is unable to generate additional spikes.
The liquid reservoir is formed by a large number of randomly interconnected spiking neurons. A minimal simulation of a single LIF neuron is sketched below.
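This sketch simulates one LIF neuron under a constant input current; all parameter values (tau_m, V_th, t_ref, the current, the units) are illustrative assumptions chosen just to make the neuron spike:

```python
# Illustrative LIF parameters (arbitrary units)
tau_m, V_rest, V_th, V_reset = 10.0, 0.0, 1.0, 0.0
R, t_ref, dt = 1.0, 2.0, 0.1

V, refractory = V_rest, 0.0
spikes = []
for step in range(1000):
    t = step * dt
    I = 1.5  # constant input current
    if refractory > 0:
        refractory -= dt  # silent during the refractory period
    else:
        # Membrane potential dynamics: tau_m * dV/dt = -(V - V_rest) + R*I
        V += dt / tau_m * (-(V - V_rest) + R * I)
        if V >= V_th:           # threshold crossed: emit a spike
            spikes.append(t)
            V = V_reset         # reset membrane potential
            refractory = t_ref  # enter refractory period

print(f"{len(spikes)} spikes, first at t = {spikes[0]:.1f}")
```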
The How:
Understanding and developing a working LSM typically involves the steps below:
✪ Architecture: An LSM consists of three main components: the input, the liquid reservoir, and the readout.
✪ Input: Denote the input signal as x(t), where t represents the time index.
✪ Liquid Reservoir is a randomly connected network of N_r neurons. Denote the reservoir state at time t as r(t). The dynamics of the reservoir neurons can be described by the following leaky update equation:
r(t) = (1 − λ)·r(t−1) + λ·f(W_in·x(t) + W·r(t−1))
Where, W_in is the input weight matrix of size N_r × N_in (N_in is the input dimensionality), W is the recurrent weight matrix of size N_r × N_r, f is a nonlinear activation function such as tanh, and λ is a leak rate in (0, 1] determined by the time constant.
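A minimal numpy sketch of this reservoir update (the sizes, leak rate, and spectral radius are illustrative assumptions; the helper names reservoir_step and run_reservoir are mine, not from this post):

```python
import numpy as np

rng = np.random.default_rng(1)
N_in, N_r = 1, 200  # illustrative sizes

W_in = rng.uniform(-0.5, 0.5, size=(N_r, N_in))
W = rng.normal(size=(N_r, N_r))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9

def reservoir_step(r, x, leak=0.3):
    # r(t) = (1 - leak) * r(t-1) + leak * tanh(W_in x(t) + W r(t-1))
    return (1 - leak) * r + leak * np.tanh(W_in @ x + W @ r)

def run_reservoir(inputs):
    # Drive the reservoir and collect its state at every time step
    r = np.zeros(N_r)
    states = []
    for x in inputs:
        r = reservoir_step(r, np.atleast_1d(x))
        states.append(r)
    return np.array(states)  # shape (T, N_r)
```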
✪ Readout is responsible for mapping the reservoir states to the desired output. Denote the readout weights as W_out, a matrix of size N_out × N_r, where N_out is the dimensionality of the output. The readout computes its output y(t) from the current reservoir state r(t):
y(t) = W_out·r(t)
✪ Training: The readout is trained using supervised learning. Given a training dataset of input-output pairs x(t), y(t), where y(t) is the desired output at time t, the readout weights are learned to minimize a regularized mean squared error:
E(W_out) = Σ_t ||y(t) − W_out·r(t)||² + β·||W_out||²
where β is a regularization parameter that prevents overfitting. This optimization problem has the closed-form ridge regression solution:
W_out = Y·R^T·(R·R^T + β·I)^(−1)
where R collects the reservoir states r(t) as columns, Y collects the desired outputs y(t) as columns, and I is the identity matrix.
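Here is a direct numpy transcription of that ridge regression solution (the function name train_readout and the default β are illustrative assumptions):

```python
import numpy as np

def train_readout(states, targets, beta=1e-4):
    # states: (T, N_r) reservoir states; targets: (T, N_out) desired outputs
    R = states.T   # (N_r, T)
    Y = targets.T  # (N_out, T)
    # Ridge regression: W_out = Y R^T (R R^T + beta * I)^(-1)
    N_r = R.shape[0]
    return Y @ R.T @ np.linalg.inv(R @ R.T + beta * np.eye(N_r))
```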
✪ Prediction: Once the LSM is trained, it can be used for prediction. Given a new input signal x_test(t), the reservoir state r_test(t) is computed using the same reservoir dynamics equation:
r_test(t) = (1 − λ)·r_test(t−1) + λ·f(W_in·x_test(t) + W·r_test(t−1))
The readout then generates predictions y_test(t) from the current reservoir state:
y_test(t) = W_out·r_test(t)
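Putting the pieces together, this usage example reuses run_reservoir and train_readout from the sketches above on an illustrative one-step-ahead sine-wave prediction task (the signal, washout length, and β are assumptions):

```python
import numpy as np

t = np.arange(500)
signal = np.sin(0.1 * t)

states = run_reservoir(signal[:-1])   # drive reservoir with x(t)
targets = signal[1:].reshape(-1, 1)   # desired output: x(t + 1)

# Discard the first 100 'washout' states so the initial transient
# does not bias the readout weights
W_out = train_readout(states[100:], targets[100:])

y_pred = states @ W_out.T             # y(t) = W_out r(t)
print("MSE:", np.mean((y_pred[100:, 0] - targets[100:, 0]) ** 2))
```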
✪ Adaptation: The readout can be fine-tuned periodically to maintain prediction accuracy. This is useful in online-learning settings.
Effect of Parameters on Performance:
✪ Spectral Radius refers to the maximum absolute eigenvalue of the recurrent connection weights in the reservoir. It governs stability and memory: values below 1 generally preserve the echo state property, while larger values give longer memory but risk unstable dynamics.
✪ Sparsity of the Liquid Reservoir refers to the percentage of connections that are present between neurons in the reservoir. It is set by randomly selecting a fixed percentage of connections and setting the weights of the remaining connections to zero (see the sketch after this list).
✪ Size of Input Window refers to the number of consecutive time steps of input that are presented to the network as a single input vector.
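As a rough illustration of how the first two parameters are set in practice, here is a small helper that builds a reservoir weight matrix with a chosen sparsity and spectral radius (the function name and default values are assumptions):

```python
import numpy as np

def make_reservoir_weights(N_r, spectral_radius=0.9, sparsity=0.1, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(N_r, N_r))
    # Sparsity: keep only a fixed fraction of connections, zero the rest
    mask = rng.random((N_r, N_r)) < sparsity
    W = W * mask
    # Rescale so the largest absolute eigenvalue equals spectral_radius
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W

W = make_reservoir_weights(200, spectral_radius=0.95, sparsity=0.05)
```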
The Why:
Reasons for using Liquid State Machine:
✪ LSMs learn quickly and adapt to changing environments.
✪ They perform computation in a distributed and parallel manner, which is efficient for sequential data.
✪ Only the readout is trained; the randomly connected reservoir requires no training, which keeps training simple and fast.
✪ They require relatively little training data compared to other types of neural networks.
The Why Not:
Reasons for not using Liquid State Machine:
✪ Performance is sensitive to reservoir hyperparameters (spectral radius, sparsity, time constant) and to the random initialization, so results can vary between runs.
✪ The reservoir itself is not trained, so a fixed random reservoir may extract suboptimal features compared to fully trained recurrent networks.
✪ Simulating large pools of spiking neurons is computationally expensive on conventional (non-neuromorphic) hardware.
Time for you to support:
In the next edition, we will cover Extreme Learning Machine Neural Networks.
Let us know your feedback!
Until then,
Have a great time! 😊