Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) are among the hottest areas in Deep Learning.

 

1. Recurrent Neural Network (RNN)

An RNN can handle sequence input as well as sequence output: it reads the input one step at a time, carries a hidden state between steps, and can emit one output per step. A minimal sketch of this forward pass follows below.
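This sketch assumes a plain (vanilla) RNN with a tanh activation; the weight names (W_xh, W_hh, W_hy) and sizes are illustrative choices, not from any particular library.

```python
import numpy as np

def rnn_forward(xs, W_xh, W_hh, W_hy, b_h, b_y):
    """xs: list of input vectors, one per time step (sequence input)."""
    h = np.zeros(W_hh.shape[0])  # hidden state starts at zero
    ys = []
    for x in xs:
        # new hidden state mixes the current input with the previous state
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        # one output per time step -> sequence output
        ys.append(W_hy @ h + b_y)
    return ys

rng = np.random.default_rng(0)
input_size, hidden_size, output_size = 3, 5, 2
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
W_hy = rng.normal(scale=0.1, size=(output_size, hidden_size))
b_h = np.zeros(hidden_size)
b_y = np.zeros(output_size)

xs = [rng.normal(size=input_size) for _ in range(4)]  # a length-4 input sequence
ys = rnn_forward(xs, W_xh, W_hh, W_hy, b_h, b_y)
print(len(ys))  # 4 outputs: one per input step
```

Because the same weights are reused at every step, the network can process sequences of any length; the recurrence through W_hh is also what makes gradients explode or vanish over long sequences, which motivates the LSTM below.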

 

2. Long Short-Term Memory (LSTM)

LSTM is one of the RNN architectures that provides a solution to the exploding and vanishing gradient problems. LSTM also provides a way to handle the long-term dependencies in sequential data that a plain RNN struggles to learn. This is the structure of the LSTM given by Alex Graves of Google DeepMind (Generating Sequences With Recurrent Neural Networks, June 2014, p. 5).

The following formulas describe the structure of the LSTM. c_t is the cell, or memory, state: when the forget gate f_t is 1, the information from time t-1 is kept, and when f_t is 0, that past information is deleted. The LSTM is one of the de facto standard models for natural language processing and other sequential data analysis.
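For reference, these are the LSTM equations as written in Graves's paper (with peephole connections); σ is the logistic sigmoid, and each W term is a weight matrix, e.g. W_{xi} connects the input x to the input gate i:

$$
\begin{aligned}
i_t &= \sigma(W_{xi} x_t + W_{hi} h_{t-1} + W_{ci} c_{t-1} + b_i) \\
f_t &= \sigma(W_{xf} x_t + W_{hf} h_{t-1} + W_{cf} c_{t-1} + b_f) \\
c_t &= f_t c_{t-1} + i_t \tanh(W_{xc} x_t + W_{hc} h_{t-1} + b_c) \\
o_t &= \sigma(W_{xo} x_t + W_{ho} h_{t-1} + W_{co} c_t + b_o) \\
h_t &= o_t \tanh(c_t)
\end{aligned}
$$

The cell update c_t = f_t c_{t-1} + i_t tanh(...) makes the forget-gate behavior explicit: f_t = 1 carries c_{t-1} forward unchanged, while f_t = 0 erases it.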

[Figure: LSTM structure. i: input gate, f: forget gate, o: output gate, c: memory cell]