Backpropagation through time makes RNNs feasible

By unfolding an RNN in time into a feedforward network, with one clone of the network for each time step, one can apply ordinary backpropagation to RNNs. This makes these notoriously difficult models somewhat easier to train. However, there are drawbacks. In practice the unrolled network can only span a limited number of time steps (truncated backpropagation through time), which limits how far back in a sequence the RNN can learn dependencies. A sketch of the idea follows below.
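A minimal sketch of the unroll-and-backpropagate idea, assuming a vanilla tanh RNN with a linear readout and squared-error loss; the names, dimensions, and the `truncate` parameter here are illustrative choices, not prescribed anywhere above. The forward pass builds one clone of the network per time step, and the backward pass lets each gradient flow through at most `truncate` clones:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 8, 2
Wxh = rng.normal(0.0, 0.1, (n_hid, n_in))   # input -> hidden
Whh = rng.normal(0.0, 0.1, (n_hid, n_hid))  # hidden -> hidden (recurrence)
Why = rng.normal(0.0, 0.1, (n_out, n_hid))  # hidden -> output

def bptt(xs, ys, h0, truncate=4):
    """Unroll the RNN over the sequence, then backpropagate.

    `truncate` caps how many clones each gradient flows back through,
    i.e. truncated backpropagation through time.
    """
    T = len(xs)
    hs = {-1: h0}
    # Forward pass: one clone of the network per time step.
    for t in range(T):
        hs[t] = np.tanh(Wxh @ xs[t] + Whh @ hs[t - 1])
    dWxh = np.zeros_like(Wxh)
    dWhh = np.zeros_like(Whh)
    dWhy = np.zeros_like(Why)
    # Backward pass over the unrolled graph.
    for t in reversed(range(T)):
        dy = (Why @ hs[t]) - ys[t]           # squared-error gradient at step t
        dWhy += np.outer(dy, hs[t])
        dh = Why.T @ dy
        # Flow the hidden-state gradient through at most `truncate` clones.
        for step in reversed(range(max(0, t - truncate + 1), t + 1)):
            draw = dh * (1.0 - hs[step] ** 2)  # backprop through tanh
            dWxh += np.outer(draw, xs[step])
            dWhh += np.outer(draw, hs[step - 1])
            dh = Whh.T @ draw
    return dWxh, dWhh, dWhy

# Hypothetical usage on a random 10-step sequence.
xs = [rng.normal(size=n_in) for _ in range(10)]
ys = [rng.normal(size=n_out) for _ in range(10)]
grads = bptt(xs, ys, np.zeros(n_hid))
```

Note how `truncate` is exactly the drawback mentioned above: gradients from step `t` never reach clones earlier than `t - truncate + 1`, so dependencies longer than the truncation window cannot be learned this way.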

Resources

Backlinks