[D] Can Recurrent Neural Networks have loops that go backward?
https://youtu.be/oJNHXPs0XDk?t=333
I was watching this video, where the presenter says that RNNs can have loopback connections (at 5:33).
I always thought they were called “recurrent” because they have a unit that is applied over and over, unrolled across the input sequence.
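Just to make my mental model concrete (this is my own toy sketch, not from the video — names, sizes, and the `step` function are placeholders I made up), here is what I understood an Elman-style RNN to be, where the only “loop” is the hidden state feeding into the next time step:

```python
import numpy as np

# Minimal Elman-style RNN cell, unrolled over a sequence.
# The recurrence is only through time: h_t depends on h_{t-1},
# not on the output of any later layer.

rng = np.random.default_rng(0)
hidden_size, input_size = 4, 3

W_h = rng.normal(size=(hidden_size, hidden_size))  # hidden-to-hidden weights
W_x = rng.normal(size=(hidden_size, input_size))   # input-to-hidden weights
b = np.zeros(hidden_size)

def step(h_prev, x_t):
    # h_t = tanh(W_h h_{t-1} + W_x x_t + b)
    return np.tanh(W_h @ h_prev + W_x @ x_t + b)

xs = rng.normal(size=(5, input_size))  # a sequence of 5 input vectors
h = np.zeros(hidden_size)              # initial hidden state
for x_t in xs:                         # the same cell applied at every step
    h = step(h, x_t)
print(h)
```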
Is it really correct to say that they can have feedback loops back to a previous layer, and that an RNN “is not just a feedforward network”?
Thanks