I have heard that GPUs love powers of 2, and that's why embedding dimensions and batch sizes are so often powers of 2 (64, 128, 256, 512, 1024, etc.).
But I have never seen a concrete explanation of why this is.
Also, should the max batch size be the largest even number that the GPU memory can handle, or the largest power of 2 that the GPU memory can handle?
submitted by /u/BatmantoshReturns