Long Short-Term Memory Network Tutorials and Examples for Beginners
The Long Short-Term Memory network (LSTM) was first introduced by Hochreiter and Schmidhuber in 1997. It is a variant of the RNN that contains three gates: a forget gate, an input gate and an output gate.
On this page, we collect tutorials and examples on the Long Short-Term Memory network. By following them, you can learn how to use this network.
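Before diving into the tutorials, here is a minimal sketch of one LSTM step written out by hand in PyTorch, to make the three gates concrete. The sizes, weight shapes and gate ordering are our own illustrative choices, not a fixed convention.

```python
import torch

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM step written out by hand to show the three gates.
    W: (4*hidden, input), U: (4*hidden, hidden), b: (4*hidden,)."""
    hidden = h_prev.shape[-1]
    gates = x_t @ W.T + h_prev @ U.T + b
    i, f, g, o = gates.split(hidden, dim=-1)
    i = torch.sigmoid(i)           # input gate: how much new information to write
    f = torch.sigmoid(f)           # forget gate: how much old cell state to keep
    o = torch.sigmoid(o)           # output gate: how much cell state to expose
    g = torch.tanh(g)              # candidate cell state
    c_t = f * c_prev + i * g       # update the cell state
    h_t = o * torch.tanh(c_t)      # new hidden state
    return h_t, c_t

# Illustrative sizes: batch 4, input 8, hidden 16.
input_size, hidden_size = 8, 16
W = torch.randn(4 * hidden_size, input_size)
U = torch.randn(4 * hidden_size, hidden_size)
b = torch.zeros(4 * hidden_size)
x_t = torch.randn(4, input_size)
h, c = torch.zeros(4, hidden_size), torch.zeros(4, hidden_size)
h, c = lstm_step(x_t, h, c, W, U, b)
```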
IndRNN is proposed in the paper Independently Recurrent Neural Network (IndRNN): Building A Longer and Deeper RNN. In this tutorial, we use some examples to introduce this model.
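As a rough illustration (not the paper's reference code), the key idea is that the recurrent weight is a vector applied element-wise, so each hidden unit only receives its own previous state. The class name and sizes below are our own.

```python
import torch
import torch.nn as nn

class IndRNNCell(nn.Module):
    # The recurrent weight u is a vector, so each hidden unit only sees
    # its own previous state (element-wise recurrence), unlike a vanilla RNN.
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.w = nn.Linear(input_size, hidden_size)
        self.u = nn.Parameter(torch.ones(hidden_size))  # per-unit recurrent weight

    def forward(self, x_t, h_prev):
        return torch.relu(self.w(x_t) + self.u * h_prev)

cell = IndRNNCell(8, 16)               # illustrative sizes
h = torch.zeros(4, 16)                 # batch of 4
for x_t in torch.randn(10, 4, 8):      # sequence length 10
    h = cell(x_t, h)
```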
We often use RNN/GRU/LSTM/BiLSTM models to encode a sequence. To turn their per-step outputs into a single representation, we can average the outputs over time or combine them with attention. In this tutorial, we introduce how to average their outputs.
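A minimal sketch of the averaging idea in PyTorch, with illustrative sizes; the masked variant assumes padded sequences with a 0/1 padding mask.

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)
x = torch.randn(4, 10, 8)              # (batch, seq_len, input_size)
outputs, _ = lstm(x)                   # (batch, seq_len, hidden_size)

# Simple mean over the time dimension gives one vector per sequence.
encoded = outputs.mean(dim=1)          # (batch, hidden_size)

# With padded sequences, mask out padding positions before averaging.
mask = torch.ones(4, 10)               # 1 = real token, 0 = padding (assumed)
encoded_masked = (outputs * mask.unsqueeze(-1)).sum(dim=1) / mask.sum(dim=1, keepdim=True)
```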
Advanced LSTM is a variant of LSTM proposed in the paper <>. In this tutorial, we compare it with a conventional LSTM, which helps us understand it.
LSTM is a good method for processing sequences in NLP. However, how long a sequence can it handle effectively? In this tutorial, we discuss this topic.
The highway network is proposed in the paper Highway Networks. Its gating mechanism is inspired by LSTM. In this tutorial, we introduce it for machine learning beginners.
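A minimal sketch of a single highway layer in PyTorch: a transform gate decides how much of the transformed input versus the raw input passes through. Using ReLU for the transform function is our own choice here.

```python
import torch
import torch.nn as nn

class HighwayLayer(nn.Module):
    # y = t * H(x) + (1 - t) * x, where the transform gate t plays a role
    # similar to an LSTM gate; the carry gate is (1 - t).
    def __init__(self, size):
        super().__init__()
        self.transform = nn.Linear(size, size)
        self.gate = nn.Linear(size, size)

    def forward(self, x):
        h = torch.relu(self.transform(x))
        t = torch.sigmoid(self.gate(x))    # transform gate in [0, 1]
        return t * h + (1.0 - t) * x

layer = HighwayLayer(16)                   # illustrative size
y = layer(torch.randn(4, 16))
```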
LSTMP (LSTM with a Recurrent Projection Layer) is an improvement over the LSTM with peephole connections. In this tutorial, we introduce this model for LSTM beginners.
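A minimal sketch of the projection idea: the hidden state is projected to a smaller size before being fed back into the recurrence, which shrinks the recurrent weight matrices. Recent PyTorch versions expose this through the proj_size argument of nn.LSTM, which we assume is available here; the sizes are illustrative.

```python
import torch
import torch.nn as nn

# hidden_size=64 is projected down to proj_size=32 before recurrence.
lstmp = nn.LSTM(input_size=8, hidden_size=64, proj_size=32, batch_first=True)
x = torch.randn(4, 10, 8)
outputs, (h_n, c_n) = lstmp(x)
print(outputs.shape)   # torch.Size([4, 10, 32]) -> projected hidden size
print(c_n.shape)       # torch.Size([1, 4, 64])  -> cell state keeps the full size
```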
The Tree-LSTM model is widely used in many deep learning fields, often to process tree-structured data. In this tutorial, we introduce it for deep learning beginners.
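As a rough sketch of the Child-Sum variant (the class name, sizes and leaf handling below are our own assumptions), each node combines its input with the summed hidden states of its children and applies one forget gate per child.

```python
import torch
import torch.nn as nn

class ChildSumTreeLSTMCell(nn.Module):
    # Each node combines its input with the summed hidden states of its
    # children and applies one forget gate per child.
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.iou = nn.Linear(input_size + hidden_size, 3 * hidden_size)
        self.f = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x, child_h, child_c):
        # x: (input_size,); child_h / child_c: (num_children, hidden_size)
        h_sum = child_h.sum(dim=0)
        i, o, u = self.iou(torch.cat([x, h_sum])).chunk(3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        x_rep = x.expand(child_h.size(0), -1)
        f = torch.sigmoid(self.f(torch.cat([x_rep, child_h], dim=1)))
        c = i * u + (f * child_c).sum(dim=0)
        h = o * torch.tanh(c)
        return h, c

cell = ChildSumTreeLSTMCell(8, 16)                 # illustrative sizes
zero = torch.zeros(1, 16)                          # stand-in child states for a leaf
left_h, left_c = cell(torch.randn(8), zero, zero)
right_h, right_c = cell(torch.randn(8), zero, zero)
parent_h, parent_c = cell(torch.randn(8),
                          torch.stack([left_h, right_h]),
                          torch.stack([left_c, right_c]))
```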