I will take two examples to illustrate the discussion.
From the hardware accelerator design perspective, take a look at the BitFusion paper:
GitHub: https://github.com/hsharma35/bitfusion
arXiv: https://arxiv.org/pdf/1712.01507
For the quantized neural networks example, look at BinaryNet:
GitHub: https://github.com/MatthieuCourbariaux/BinaryNet and https://github.com/itayhubara/BinaryNet
arXiv: https://arxiv.org/pdf/1609.07061
Reading the papers alongside the code shows what tools were used for the experimentation. The first uses hardware modelling tools (such as CACTI) to get preliminary results, while the latter modifies Torch and Theano to get its results (as far as I understand).
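For context on what the BinaryNet-style experimentation actually computes, here is a minimal NumPy sketch of the deterministic binarization and straight-through gradient estimator described in the BinaryNet line of work. This is my own illustrative sketch, not code from either repo; the function names are hypothetical.

```python
import numpy as np

def binarize(w):
    """Deterministic binarization: map each real-valued weight
    to +1 or -1 by its sign (zero maps to +1 here)."""
    return np.where(w >= 0, 1.0, -1.0)

def ste_grad(w, grad_out):
    """Straight-through estimator for the sign function:
    pass the incoming gradient through unchanged where |w| <= 1,
    and zero it elsewhere (the hard-tanh derivative)."""
    return grad_out * (np.abs(w) <= 1.0)

w = np.array([0.3, -0.7, 1.5, -0.1])
print(binarize(w))               # [ 1. -1.  1. -1.]
print(ste_grad(w, np.ones(4)))   # [1. 1. 0. 1.]
```

The binarized weights are what make the bit-level hardware tricks (XNOR-popcount arithmetic, narrow datapaths) possible, which is exactly the design space the BitFusion-style accelerator work explores.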
So, are there any other tools or approaches the machine learning community would suggest?
submitted by /u/elhiruko