https://youtu.be/oJNHXPs0XDk?t=333
I was watching this video, where the presenter says that RNNs can have feedback loops (at 5:33).
I always thought they were called "recurrent" because the same unit can be appended over and over to form a sequence.
Is it really correct to say that they can have feedback loops to a previous layer, and that an RNN "is not just a feedforward network"?
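For context on what "loopback" usually means here: in a vanilla RNN the hidden state from the previous time step is fed back into the same cell at the next step, with the same weights reused every step. A minimal NumPy sketch (names and sizes are my own, not from the video):

```python
import numpy as np

# Toy "vanilla" RNN step. The hidden state h is fed back into the
# cell at every time step -- that feedback of h through W_hh is the
# recurrence, and the SAME weights are reused at each step.
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W_xh = rng.normal(size=(n_hid, n_in))   # input -> hidden
W_hh = rng.normal(size=(n_hid, n_hid))  # hidden -> hidden (the feedback loop)
b = np.zeros(n_hid)

def rnn_step(x, h):
    return np.tanh(W_xh @ x + W_hh @ h + b)

xs = rng.normal(size=(5, n_in))  # a sequence of 5 input vectors
h = np.zeros(n_hid)
for x in xs:
    h = rnn_step(x, h)  # same cell reused; previous h loops back in
print(h.shape)
```

When the loop is "unrolled" over the sequence, it looks like a deep feedforward stack with tied weights, which may be why both descriptions (feedback loop vs. repeated units) circulate.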
Thanks
submitted by /u/adkyary