

Category: Reddit MachineLearning

[R] Acoustic, optical, and other types of waves are recurrent neural networks!

Paper: Open access in Science Advances

Code: Available on GitHub

Lately, there has been a lot of cross-pollination of ideas between different areas of physical and numerical science and the field of machine learning. This has led to interesting demonstrations of optimizing physical models using machine learning frameworks, but also to the development of a number of exciting new machine learning models (e.g. neural ODEs, Hamiltonian neural networks, etc.) that borrow concepts from physics.

My group has been particularly interested in the viewpoint that physics itself can be used as a computational engine. In other words, we’re interested in physical systems that can serve as hardware accelerators or as specialized analog processors for fast and efficient machine learning computations.

In our paper, recently published in Science Advances (open access), we show that the physics of waves maps directly onto the time dynamics of recurrent neural networks (RNNs). Using this connection, we demonstrated that an acoustic/optical system (through a numerical model developed in PyTorch) could be trained to accurately classify vowels from recordings of human speakers. We launched the vowel waveforms into the physical model and allowed the optimizer to add and remove material at thousands of individual points within the domain; this material distribution essentially acts as the weights of the model.
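To make the wave-as-RNN correspondence concrete, here is a minimal PyTorch sketch (not the released code; the class name, grid size, and source/probe locations are illustrative assumptions): a scalar wave field stepped with finite differences plays the role of the recurrent hidden state, and the trainable wave-speed map plays the role of the weights.

```python
import torch

class WaveRNN(torch.nn.Module):
    """Toy wave-equation-as-RNN: the hidden state is the field at the current
    and previous time steps; the trainable "weights" are the wave-speed map c."""
    def __init__(self, nx=64, ny=64, dt=0.05, h=1.0):
        super().__init__()
        self.dt, self.h = dt, h
        # Material distribution: the optimizer effectively adds or removes
        # material by adjusting the wave speed at every grid point.
        self.c = torch.nn.Parameter(torch.ones(nx, ny))
        src = torch.zeros(nx, ny); src[nx // 2, 2] = 1.0    # where the waveform enters
        prb = torch.zeros(nx, ny); prb[nx // 2, -3] = 1.0   # where the output is read
        self.register_buffer("src", src)
        self.register_buffer("prb", prb)

    def _laplacian(self, u):
        return (torch.roll(u, 1, 0) + torch.roll(u, -1, 0)
                + torch.roll(u, 1, 1) + torch.roll(u, -1, 1) - 4.0 * u) / self.h ** 2

    def forward(self, waveform):
        # waveform: 1-D tensor of audio samples injected at the source point.
        u_prev = torch.zeros_like(self.c)
        u = torch.zeros_like(self.c)
        power = torch.zeros(())
        for s_t in waveform:
            # Leapfrog update of the scalar wave equation -- the RNN recurrence.
            u_next = 2 * u - u_prev + (self.c * self.dt) ** 2 * self._laplacian(u)
            u_next = u_next + s_t * self.src
            power = power + (u_next * self.prb).pow(2).sum()
            u_prev, u = u, u_next
        # For classification, one probe per vowel class would give a vector of powers.
        return power

# Toy usage: gradients flow back to the material distribution c.
model = WaveRNN()
out = model(torch.randn(300))  # stand-in for a vowel waveform
out.backward()
```

Training then amounts to nudging c so that, for each input vowel, the most power arrives at the probe assigned to that vowel’s class.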

Because this machine learning model corresponds to an actual physical system, we could take the trained material distribution and “print it” into a real physical device. The result would be something like an ASIC (application-specific integrated circuit), but for a specific RNN computation. We’re really excited about these results because they point to being able to perform complex recurrent machine learning calculations completely passively, with no energy consumption aside from the energy carried by the pulse itself.

submitted by /u/ian_williamson

[D] AI Residency 2020: All You Need to Know (+ examples)


Hi folks! I’m a current AI Resident at Facebook. I know you have a lot of questions about this program, so I made a video answering most of them. I’ve also added my examples and some resources. Hope it helps!

(I’m not really into Reddit, so it’s better to ask your questions in the YouTube comments. Good luck with your applications!)

submitted by /u/acecreamu

[Discussion] Resources for data architecture design for enterprise to support data science/ML

Hello,

I am looking for resources (books, blogs, courses) to learn more about designing an enterprise data architecture that supports executing data science projects at scale. I realize the importance of having an architecture that enables rapid experimentation and data access/abstraction, has security baked in, and is scale-ready whenever projects move to production.

Are you aware of any such resources? Have you read a book/blog that helped you with these questions for the enterprise you support? Thanks for your help!


submitted by /u/ucancallmebiru

[P] Building a community on Artificial Intelligence for Life Sciences

It’s no secret that ML and AI can deliver great value in the life sciences.

I have spent the past 10 years as a researcher in computational biology, working very close to the wet lab.

Now, together with a lovely bunch of people, I am building a virtual community dedicated to AI in Life Sciences, organized as a not-for-profit. We currently unite several people working in the field in the UK and Switzerland. The mission: share technical expertise, code, links, and papers.

As an experiment, we are currently running entirely from a Telegram channel. Come have a look at https://ails.institute and join us! All ideas are welcome!

submitted by /u/ayakimovich

[D] Please suggest material for maths

I want to deeply understand the maths behind deep learning and machine learning. I am describing my background so someone can suggest suitable learning materials.

Background: Undergrad completed. I have finished the Deep Learning Specialization. I can calculate derivatives of functions like the sigmoid, and thanks to the specialization I also know how a neural network actually works. I have also implemented a few research papers on my own and open-sourced the code.

But now I want to re-learn calculus, linear algebra, and probability for a better understanding of DL and ML methods: why we choose sigmoid or tanh, how ReLU is actually implemented, how autograd actually works, things like that.

I searched for books on calculus and linear algebra, but the recommendations seem off the mark. Many people recommended Spivak, and I started reading it, but the opening chapters feel dull and purely theoretical. I want materials that let me understand things practically, ideally with programming exercises.

Please suggest materials for calculus, linear algebra, and stats.

submitted by /u/canntdecode

[P] Fast Face Aging GAN

Hello people,

This project is basically my attempt to replicate FaceApp’s aging feature: it can make you look older. It supports aging a face into different age ranges (it can make you look younger or older). It also uses identity-preserving techniques from the paper “Identity preserved face aging with CGANs” so that the aged face is not too dissimilar from the original. It’s fast: benchmarked with the Fritz benchmarking utility, the iPhone X runtime comes out to about 30 fps on a 512×512 image.

You can also train the network with a higher weight on the age loss to make the effect more drastic; I kept it small to keep the effect subtle. From what I have observed, it can be used on non-cropped, raw images of humans in the wild, and it will still do a pretty decent job of aging the face while leaving everything else unchanged (as you can see in the samples on the GitHub page).
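As a rough illustration of that weighting (a hedged sketch, not the repository’s actual code: the function name, the embedding/age-classifier inputs, and the default weights are all assumptions), a generator loss for identity-preserving aging typically combines an adversarial term, an identity term, and an age term whose weight controls how drastic the aging looks:

```python
import torch
import torch.nn.functional as F

def generator_loss(d_fake_logits, id_embed_input, id_embed_aged,
                   age_logits, target_age, w_id=1.0, w_age=0.1):
    # Adversarial term: the generator wants the discriminator to call its
    # aged faces "real".
    adv = F.binary_cross_entropy_with_logits(
        d_fake_logits, torch.ones_like(d_fake_logits))
    # Identity-preservation term: keep the face embedding of the aged image
    # close to that of the input image (as in identity-preserving CGANs).
    ident = 1.0 - F.cosine_similarity(id_embed_input, id_embed_aged, dim=-1).mean()
    # Age term: push the aged image toward the target age bin; raising w_age
    # makes the aging effect more drastic, lowering it keeps it subtle.
    age = F.cross_entropy(age_logits, target_age)
    return adv + w_id * ident + w_age * age
```

Here `id_embed_*` would come from a pretrained face-recognition network and `age_logits` from an age classifier run on the generated image.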

For those interested, here’s a link: Fast Face Aging GAN

Suggestions and improvements are more than welcome. There’s also a demo script to try out the pretrained model on your images.

submitted by /u/abnormdist