I understand that in A3C (https://arxiv.org/abs/1602.01783), multiple workers apply their gradient updates to the global network asynchronously.
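For concreteness, here is a minimal sketch of that worker loop in a PyTorch-style setup (the toy `nn.Linear` model, dummy loss, and hyperparameters are placeholders of mine, not the paper's setup): each worker syncs from the global network, computes gradients on its local copy, then applies them to the shared parameters without taking any lock, in the Hogwild!-style fashion the paper describes.

    import torch
    import torch.nn as nn
    import torch.multiprocessing as mp


    def worker(global_model: nn.Module, steps: int = 100) -> None:
        # Each worker keeps a private local copy of the network and an
        # optimizer over the *global* parameters, which live in shared memory.
        local_model = nn.Linear(4, 2)
        optimizer = torch.optim.SGD(global_model.parameters(), lr=1e-3)
        for _ in range(steps):
            # 1. Sync: pull the current global parameters into the local copy.
            local_model.load_state_dict(global_model.state_dict())

            # 2. Compute gradients on the local copy (dummy inputs and loss
            #    here; in A3C this would be the actor-critic loss of a rollout).
            x = torch.randn(8, 4)
            loss = local_model(x).pow(2).mean()
            local_model.zero_grad()
            loss.backward()

            # 3. Hand the local gradients to the shared parameters and step.
            #    No lock is taken: reads and writes from different workers
            #    can interleave freely.
            for gp, lp in zip(global_model.parameters(), local_model.parameters()):
                gp.grad = lp.grad
            optimizer.step()


    if __name__ == "__main__":
        global_model = nn.Linear(4, 2)
        global_model.share_memory()  # place parameters in shared memory
        procs = [mp.Process(target=worker, args=(global_model,)) for _ in range(4)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()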
But how do the workers ensure that they won't retrieve the same parameters from the global network that they just updated (step 1 of the sketch above)?
Thank you.
submitted by /u/ml4564