The three most common types of recurrent neural networks are the vanilla recurrent neural network (RNN), long short-term memory (LSTM), and the gated recurrent unit (GRU).

There are many illustrated diagrams for recurrent neural networks out there. My personal favourite is the one by Michael Nguyen in this article published in Towards Data Science, because he provides intuition on these models and, more importantly, beautiful illustrations that make them easy to understand. The motivation behind my post is to visualise better what happens in these cells: how the nodes are shared, and how they transform to give the output nodes. I was also inspired by Michael's nice animations.

This article looks into vanilla RNN, LSTM and GRU cells. It is a short read and is for those who have read up on these topics. (I recommend reading Michael's article before reading this post.) It is important to note that the following animations are sequential to guide the human eye, but do not reflect the chronological order during vectorised machine computation.

Here is the legend that I have used for the illustrations.

Fig. 0: Legend for animations

In my animations, I have used an input size of 3 (green) and 2 hidden units (red) with a batch size of 1.

Let's begin!

RNN

Fig. 1: Animated RNN cell

t — time step
X — input
h — hidden state
length of X — size/dimension of input
length of h — no. of hidden units. Note that different libraries call this differently, but they mean the same:
- Keras — state_size, units
- PyTorch — hidden_size
- TensorFlow — num_units

LSTM

Fig. 2: Animated LSTM cell

C — cell state

Note that the dimension of the cell state is the same as that of the hidden state.

GRU

Fig. 3: Animated GRU cell

Hope these animations helped you in one way or another!
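The RNN cell from Fig. 1 can be sketched in code with PyTorch (one of the libraries named above), using the same sizes as the animations: input size 3, 2 hidden units, batch size 1. This is a minimal sketch, assuming PyTorch's default `tanh` parameterisation of `nn.RNNCell`; the manual line writes out the same update the cell computes.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Sizes from the animations: input size 3 (green), 2 hidden units (red)
cell = nn.RNNCell(input_size=3, hidden_size=2)

x = torch.randn(1, 3)   # X at one time step t, batch size 1
h = torch.zeros(1, 2)   # initial hidden state

h_next = cell(x, h)

# The same update written out: h' = tanh(W_ih @ x + b_ih + W_hh @ h + b_hh)
manual = torch.tanh(x @ cell.weight_ih.T + cell.bias_ih
                    + h @ cell.weight_hh.T + cell.bias_hh)

print(torch.allclose(h_next, manual))  # prints True
```

The `hidden_size=2` argument is exactly the "length of h" in the legend; the new hidden state has shape `(batch, hidden_size)`.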
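The LSTM cell from Fig. 2 differs from the RNN cell in carrying a second state, the cell state C. A minimal PyTorch sketch (assuming `nn.LSTMCell` with the same sizes as the animations) shows that C has the same dimension as the hidden state, as noted above:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Same sizes as the animations: input size 3, 2 hidden units, batch size 1
cell = nn.LSTMCell(input_size=3, hidden_size=2)

x = torch.randn(1, 3)
h = torch.zeros(1, 2)   # hidden state
c = torch.zeros(1, 2)   # cell state C, same dimension as h

h_next, c_next = cell(x, (h, c))

print(h_next.shape, c_next.shape)  # both are torch.Size([1, 2])
```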
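Finally, the GRU cell from Fig. 3 keeps only a hidden state, with no separate cell state. A short sketch (again assuming PyTorch's `nn.GRUCell` with the animation's sizes) unrolls the cell over a few time steps, mirroring how the animations step through t:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

cell = nn.GRUCell(input_size=3, hidden_size=2)

# A short sequence of 4 time steps, batch size 1
xs = torch.randn(4, 1, 3)
h = torch.zeros(1, 2)          # GRU has no cell state C, only h

for x in xs:                   # the loop is the unrolled recurrence over t
    h = cell(x, h)

print(h.shape)  # torch.Size([1, 2])
```

The same hidden state tensor is fed back in at every step, which is exactly the node sharing the animations try to make visible.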