Long Short-Term Memory Network Tutorials and Examples for Beginners
The Long Short-Term Memory network (LSTM) was first introduced
by Hochreiter and Schmidhuber in 1997. It is a variant of the recurrent neural network (RNN) and contains three gates: a forget gate, an input gate and an output gate.
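As a concrete illustration (not part of the original tutorials), here is a minimal NumPy sketch of a single LSTM step showing how the three gates interact; the weight names W, U and b are hypothetical placeholders.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b are dicts keyed by 'f', 'i', 'o', 'c'
    (hypothetical names used only for this sketch)."""
    f_t = sigmoid(W['f'] @ x_t + U['f'] @ h_prev + b['f'])    # forget gate
    i_t = sigmoid(W['i'] @ x_t + U['i'] @ h_prev + b['i'])    # input gate
    o_t = sigmoid(W['o'] @ x_t + U['o'] @ h_prev + b['o'])    # output gate
    c_hat = np.tanh(W['c'] @ x_t + U['c'] @ h_prev + b['c'])  # candidate cell state
    c_t = f_t * c_prev + i_t * c_hat                          # new cell state
    h_t = o_t * np.tanh(c_t)                                  # new hidden state
    return h_t, c_t
```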
On this page, we collect tutorials and examples on the Long Short-Term Memory network; you can learn how to use this network by following them.
Many models improve on the LSTM; the GRU (Gated Recurrent Unit) is one of them. In this tutorial, we introduce the GRU and compare it with the LSTM.
The GRU contains a reset gate. Can we remove this reset gate, and will the performance of the GRU decrease if we do? The answer is that we can remove it.
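For illustration, here is a minimal NumPy sketch of one GRU step using one common formulation; "removing the reset gate" simply amounts to fixing it to 1, as the hypothetical use_reset_gate flag shows.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, W, U, b, use_reset_gate=True):
    """One GRU time step (sketch). W, U, b are dicts keyed by 'z', 'r', 'h'."""
    z_t = sigmoid(W['z'] @ x_t + U['z'] @ h_prev + b['z'])        # update gate
    if use_reset_gate:
        r_t = sigmoid(W['r'] @ x_t + U['r'] @ h_prev + b['r'])    # reset gate
    else:
        r_t = np.ones_like(h_prev)                                # reset gate removed
    h_hat = np.tanh(W['h'] @ x_t + U['h'] @ (r_t * h_prev) + b['h'])  # candidate state
    h_t = (1.0 - z_t) * h_prev + z_t * h_hat                      # new hidden state
    return h_t
```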
We have created a customized LSTM model (lstm.py) using TensorFlow. In this tutorial, we use this customized LSTM model to train on the MNIST dataset and classify handwritten digits.
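The customized lstm.py is not reproduced here; as a rough reference point, the following minimal sketch (assuming TensorFlow 2.x and the built-in tf.keras LSTM layer rather than the custom model) classifies MNIST digits by treating each 28x28 image as a sequence of 28 rows.

```python
import tensorflow as tf

# Load MNIST; each 28x28 image is treated as a sequence of 28 rows of 28 pixels.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28)),
    tf.keras.layers.LSTM(128),                        # 128 hidden units (arbitrary choice)
    tf.keras.layers.Dense(10, activation='softmax'),  # one output per digit class
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

model.fit(x_train, y_train, epochs=3, batch_size=64,
          validation_data=(x_test, y_test))
```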
In this tutorial, we list some tips on the LSTM kernel, which will help you understand and use LSTM in your TensorFlow applications; you can find more details in the tutorial.
In this tutorial, we discuss the weights in an LSTM cell and how to get them. The shape of the kernel is important because it helps us understand how an LSTM works; you can follow the tutorial for more detail.
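For example, assuming TensorFlow 2.x and the built-in tf.keras LSTM layer, the kernel shapes can be inspected like this:

```python
import tensorflow as tf

units, timesteps, input_dim = 64, 28, 28
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(timesteps, input_dim)),
    tf.keras.layers.LSTM(units),
])

kernel, recurrent_kernel, bias = model.layers[0].get_weights()
print(kernel.shape)            # (28, 256): (input_dim, 4 * units)
print(recurrent_kernel.shape)  # (64, 256): (units, 4 * units)
print(bias.shape)              # (256,): (4 * units,)
# The factor of 4 comes from concatenating the input, forget, cell-candidate
# and output gate blocks along the last axis.
```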
In this tutorial, we discuss what the adaptive gating mechanism in deep learning is and how to use it in AI applications; the key idea is that a gate is a sigmoid function of the inputs. You can learn how to apply it in your research by following the tutorial.
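Here is a small NumPy sketch of that idea, with hypothetical parameter names W_g and b_g: the gate is a sigmoid of the inputs, producing values in (0, 1) that scale another signal elementwise.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
x = rng.normal(size=(4,))                          # input vector
W_g, b_g = rng.normal(size=(4, 4)), np.zeros(4)    # gate parameters (hypothetical)

gate = sigmoid(W_g @ x + b_g)    # gate values in (0, 1), a sigmoid function of the inputs
candidate = np.tanh(x)           # some candidate signal to be gated
gated_output = gate * candidate  # the gate adaptively controls how much information flows
print(gate, gated_output)
```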
In this tutorial, we introduce some basic knowledge of LSTM, including its three gates and its structure; you can learn how an LSTM works and try to create one yourself by following this tutorial.