Join our meetup, learn, connect, share, and get to know your Toronto AI community.
Browse through the latest deep learning, AI, and machine learning postings from Indeed for the GTA.
If you are looking to sponsor space, be a speaker, or volunteer, feel free to give us a shout.
Interesting research from DeepMind:
“Our new work on memory uses a neural network’s weights as fast and compressive associative storage. Reading from the memory is performed by approximate minimization of the energy modeled by the network.”
“Unlike classical associative memory models such as Hopfield networks, we are not limited in the expressivity of our energy model, and can make use of deep architectures with fully-connected, convolutional, and recurrent layers.”
“For this to work, stored patterns must be local minima of the energy. We use recent advances in gradient-based meta-learning to write into the memory such that this requirement approximately holds.”
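The quoted description — writing stores patterns as local minima of an energy, and reading descends that energy from a query — can be illustrated with a toy example. The sketch below deliberately uses a classical Hopfield-style quadratic energy (the very model the authors say their deep, meta-learned version goes beyond) because it makes the read-as-gradient-descent idea concrete in a few lines; the names `energy`, `write`, and `read` are illustrative, not from the paper.

```python
import numpy as np

def energy(x, W):
    # Hopfield-style quadratic energy: E(x) = -0.5 * x^T W x.
    # Stored patterns sit at local minima of this surface.
    return -0.5 * x @ W @ x

def write(patterns):
    # "Writing" here is the classical outer-product rule; in the quoted
    # work this step is replaced by gradient-based meta-learning that
    # shapes a deep network's weights so patterns become energy minima.
    W = sum(np.outer(p, p) for p in patterns) / len(patterns)
    np.fill_diagonal(W, 0)  # no self-connections
    return W

def read(query, W, steps=100, lr=0.1):
    # "Reading" = approximate minimization of the energy by gradient
    # descent, starting from the (possibly corrupted) query.
    x = query.copy()
    for _ in range(steps):
        grad = -W @ x            # dE/dx for the quadratic energy above
        x -= lr * grad
        x = np.clip(x, -1, 1)    # keep states bounded
    return x

# Store two patterns, then recall the first from a corrupted query.
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1]], dtype=float)
W = write(patterns)

noisy = patterns[0].copy()
noisy[0] *= -1                   # flip one bit of the stored pattern
recalled = np.sign(read(noisy, W))
```

Descending the energy pulls the corrupted query back into the nearest stored minimum, so `recalled` matches the original pattern and `energy(recalled, W)` is lower than `energy(noisy, W)`. The deep version in the quoted work keeps this read procedure but swaps the quadratic energy for an arbitrary neural network, which is why writing then requires meta-learning rather than a closed-form rule.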
submitted by /u/Quantum_Network