Learn About Our Meetup

5000+ Members

Join our meetup, learn, connect, share, and get to know your Toronto AI community.

Browse through the latest deep learning, AI, and machine learning job postings from Indeed for the GTA.

If you are looking to sponsor space, be a speaker, or volunteer, feel free to give us a shout.

[D] nn2vec

How can you encode a whole neural net into a single vector from which it can be recreated? I am trying to think about the problem of how to simulate one neural net on another. Just as a universal Turing machine (UTM) can simulate any Turing machine (TM), there should be a way to translate this concept to differentiable programming. Adversarial reprogramming might be one attack on this, but the injected task would have to encode both the weights of the simulated neural net and an algorithm for how to use those weights. This is a problem because there is simply too much data to feed in. The best scenario would be a universal neural net (UNN) that can simulate all networks that are "smaller" than it (by the description length of the simulated net). Also, I don't think there can be a perfect simulation: the UNN can probably compute the hosted NN only as an approximation of the real one.

Note: The edibles are too strong.

Anyway, does anybody have an idea how to force any NN to compute "(+ 1 1)"?

submitted by /u/AsIAm
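For concreteness, the easy half of the question above, serializing a net's parameters into one flat vector and recreating the net from it, can be sketched as follows. This is a toy example, not an answer to the harder UNN/simulation question: the layer sizes, tanh activations, and use of NumPy are all assumptions for illustration.

```python
# Sketch: flatten all parameters of a small feed-forward net into one
# vector ("nn2vec" in the trivial sense), then rebuild the net from it.
import numpy as np

rng = np.random.default_rng(0)

def init_net(sizes):
    """Create a list of (weight, bias) pairs for the given layer sizes."""
    return [(rng.standard_normal((m, n)), rng.standard_normal(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def net_to_vec(net):
    """Concatenate every parameter into a single flat vector."""
    return np.concatenate([p.ravel() for w, b in net for p in (w, b)])

def vec_to_net(vec, sizes):
    """Rebuild the (weight, bias) pairs from the flat vector."""
    net, i = [], 0
    for m, n in zip(sizes[:-1], sizes[1:]):
        w = vec[i:i + m * n].reshape(m, n); i += m * n
        b = vec[i:i + n]; i += n
        net.append((w, b))
    return net

def forward(net, x):
    """Run the net: tanh hidden layers, linear output layer."""
    for w, b in net[:-1]:
        x = np.tanh(x @ w + b)
    w, b = net[-1]
    return x @ w + b

sizes = [4, 8, 2]                       # arbitrary toy architecture
net = init_net(sizes)
vec = net_to_vec(net)                   # the whole net as one vector
rebuilt = vec_to_net(vec, sizes)
x = rng.standard_normal((3, 4))
assert np.allclose(forward(net, x), forward(rebuilt, x))
```

Note that the round trip needs the architecture (`sizes`) as side information; a self-describing encoding would have to prepend that description to the vector, which is where the "description length" ordering the post mentions comes in.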

Toronto AI is a social and collaborative hub to unite AI innovators of Toronto and surrounding areas. We explore AI technologies in digital art and music, healthcare, marketing, fintech, VR, robotics, and more. Toronto AI was founded by Dave MacDonald and Patrick O'Mara.