I'm working through a TensorFlow tutorial on LSTMs: Truncated Backpropagation.
The section says the code uses "truncated backpropagation" — what exactly does that mean?
In a typical neural network training setup you perform two steps on each iteration:
FORWARD PASS
BACKWARD PASS
In the backward pass you might, for some reason, only want to train part of the network — say, the top layer. In that case you stop the backward flow of gradients at a chosen point, and that is what truncating backpropagation means (in TensorFlow this is often done via https://www.tensorflow.org/versions/r0.9/api_docs/python/train.html#stop_gradient). In the LSTM tutorial the truncation happens along the time axis: the network is unrolled for a fixed number of time steps (the tutorial's `num_steps` parameter), and gradients are propagated back only through that window rather than through the entire sequence. This keeps the memory and compute cost of training bounded, at the price of not learning dependencies longer than the window.
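To make the time-axis truncation concrete, here is a minimal pure-Python sketch (a hypothetical toy model, not the tutorial's actual code): a scalar linear "RNN" `h_t = w * h_{t-1} + x_t`, where the loss is simply the final hidden state, and the backward pass is cut off after `k` steps.

```python
def truncated_grad(w, xs, k):
    """Gradient of the final hidden state h_T w.r.t. w for the toy
    recurrence h_t = w * h_{t-1} + x_t (h_0 = 0), backpropagating
    through at most k time steps. With k = len(xs) this is the full
    (untruncated) gradient."""
    # forward pass: record every hidden state
    h = 0.0
    hs = [h]
    for x in xs:
        h = w * h + x
        hs.append(h)

    # backward pass, truncated after k steps
    grad = 0.0  # accumulates dL/dw, with L = h_T
    dh = 1.0    # dL/dh_t, starting at the last step t = T
    for t in range(len(xs), max(len(xs) - k, 0), -1):
        grad += dh * hs[t - 1]  # local contribution of w at step t
        dh *= w                 # propagate the gradient one step back in time
    return grad

# With w=0.5 and xs=[1, 1, 1]: the full gradient (k=3) is 2.0,
# while truncating to a single step (k=1) keeps only the last
# contribution, 1.5 — earlier time steps are simply ignored.
print(truncated_grad(0.5, [1.0, 1.0, 1.0], 3))  # 2.0
print(truncated_grad(0.5, [1.0, 1.0, 1.0], 1))  # 1.5
```

The truncated gradient is a biased estimate of the true gradient, but it is what makes training on long sequences tractable: each parameter update only ever touches a fixed-size window of the computation graph.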