Recurrent Neural Networks

LSTM

RNNs suffer from memory problems: gradients vanish over long sequences, so information from early time steps is quickly forgotten. The solution is to use gated units, i.e., neurons with a system of masks/filters that regulate the information flowing through them. A classical first read on the subject is Colah’s blog post. The main difference with a typical cell is the presence, in addition to the input and the hidden state, of a cell state, intended to capture the long-term memory of the network. That cell state is then modified through 3 gates:

- the forget gate, which decides what to discard from the cell state;
- the input gate, which decides what new information to write into it;
- the output gate, which decides which part of the cell state to expose as the new hidden state.
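As a concrete illustration, here is a minimal NumPy sketch of one LSTM time step showing the three gates acting on the cell state (the weight layout and names are my own convention, not from any particular library):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step.

    x: input vector, h_prev/c_prev: previous hidden and cell states.
    W stacks the four weight matrices (forget, input, output, candidate)
    into shape (4 * hidden, input_dim + hidden); b has shape (4 * hidden,).
    """
    hidden = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0 * hidden:1 * hidden])   # forget gate: what to erase from c
    i = sigmoid(z[1 * hidden:2 * hidden])   # input gate: what to write to c
    o = sigmoid(z[2 * hidden:3 * hidden])   # output gate: what to expose
    g = np.tanh(z[3 * hidden:4 * hidden])   # candidate cell update
    c = f * c_prev + i * g                  # new cell state (long-term memory)
    h = o * np.tanh(c)                      # new hidden state
    return h, c
```

The key design point is that the cell state `c` is only touched by elementwise multiplications and additions, which is what lets gradients flow through many time steps without vanishing.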

Variants of course exist: peephole connections let the gates also look at the cell state, and the GRU (Gated Recurrent Unit) merges the cell and hidden states while combining the forget and input gates into a single update gate.
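One widely used variant, the GRU, can be sketched in the same style (again, the parameter names and shapes are my own illustrative convention):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, Wz, Wr, Wh, bz, br, bh):
    """One GRU step: no separate cell state, only a hidden state h."""
    xh = np.concatenate([x, h_prev])
    z = sigmoid(Wz @ xh + bz)   # update gate: blends old state and candidate
    r = sigmoid(Wr @ xh + br)   # reset gate: how much past state to use
    h_tilde = np.tanh(Wh @ np.concatenate([x, r * h_prev]) + bh)  # candidate
    h = (1 - z) * h_prev + z * h_tilde  # interpolate old and new
    return h
```

With one fewer state vector and one fewer gate, the GRU has fewer parameters than the LSTM while often performing comparably.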

[ deeplearning  rnn  lstm  ]