
[D] Effect of chaining multiple transformers (attention)

For recurrent neural networks (RNNs), increasing the number of units allows the network to better model relationships over more distant inputs in an input sequence.

However, what's the effect of increasing the number of layers in a transformer? Since a transformer attends to multiple positions of the sequence simultaneously at each layer, it has no direct analogue in RNNs.
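To make the contrast concrete, here is a minimal sketch (PyTorch is my assumption; the post names no framework): an RNN must carry information from distant inputs step by step through its hidden state, while each transformer encoder layer's self-attention already spans the full sequence, so stacking layers composes attention patterns and re-representations rather than extending a fixed "reach".

```python
import torch
import torch.nn as nn

seq_len, batch, d_model = 16, 2, 32
x = torch.randn(seq_len, batch, d_model)  # (seq, batch, feature)

# RNN: distant positions interact only via the recurrent hidden state,
# so information must survive seq_len sequential updates.
rnn = nn.RNN(input_size=d_model, hidden_size=d_model)
rnn_out, _ = rnn(x)

# Transformer: self-attention in *every* layer spans all positions at once;
# num_layers controls depth of composition, not attention range.
layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4)
encoder = nn.TransformerEncoder(layer, num_layers=6)
enc_out = encoder(x)

print(rnn_out.shape, enc_out.shape)  # both (seq_len, batch, d_model)
```

The layer count (6 here) is arbitrary, chosen only to show that depth is a free hyperparameter independent of sequence length.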

submitted by /u/mellow54