
[R] Acoustic, optical, and other types of waves are recurrent neural networks!

Paper: Open access in Science Advances

Code: Available on GitHub

Lately, there has been a lot of cross-pollination of ideas between different areas of physical and numerical science and the field of machine learning. This has led to interesting demonstrations of optimizing physical models using machine learning frameworks, but also to the development of a number of exciting new machine learning models (e.g. neural ODEs, Hamiltonian neural networks, etc.) that borrow concepts from physics.

My group has been particularly interested in the viewpoint that physics itself can be used as a computational engine. In other words, we’re interested in physical systems that can serve as hardware accelerators or as specialized analog processors for fast and efficient machine learning computations.

In our paper, recently published in Science Advances (open access), we have shown that the physics of waves maps directly onto the time dynamics of recurrent neural networks (RNNs). Using this connection, we demonstrated that an acoustic / optical system (through a numerical model developed in PyTorch) could be trained to accurately classify vowels from recordings of human speakers. We launched the vowel waveforms into the physical model and allowed the optimizer to add and remove material at thousands of individual points within the domain; this material distribution essentially acts as the weights of the model.
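
To make the wave-to-RNN mapping concrete, here is a minimal, hypothetical PyTorch sketch (not the released code linked above): a scalar wave equation is stepped forward with finite differences, the two most recent field snapshots play the role of the RNN hidden state, and the wave-speed distribution `c(x, y)` is the trainable parameter. The class name `WaveCell`, the grid size, and the source/probe positions are illustrative assumptions.

```python
import torch

class WaveCell(torch.nn.Module):
    """One finite-difference time step of the scalar wave equation,
    viewed as an RNN cell. The trainable parameter is the wave-speed
    distribution c(x, y), which plays the role of the RNN weights."""
    def __init__(self, nx, ny, dt=1.0, dx=1.0):
        super().__init__()
        self.dt, self.dx = dt, dx
        # Trainable material distribution (illustrative uniform initialization).
        self.c = torch.nn.Parameter(torch.full((nx, ny), 1.0))

    def laplacian(self, u):
        # 5-point finite-difference Laplacian (zero-padded boundaries).
        kernel = torch.tensor([[0., 1., 0.],
                               [1., -4., 1.],
                               [0., 1., 0.]], device=u.device).view(1, 1, 3, 3)
        return torch.nn.functional.conv2d(u.unsqueeze(1), kernel,
                                          padding=1).squeeze(1) / self.dx ** 2

    def forward(self, u_prev, u_curr, source):
        # Leapfrog update: u_next = 2 u_t - u_{t-1} + (c dt)^2 ∇² u_t + source
        u_next = (2 * u_curr - u_prev
                  + (self.c * self.dt) ** 2 * self.laplacian(u_curr)
                  + source)
        return u_curr, u_next

# Usage sketch: unroll the cell over an input waveform, just like an RNN.
nx, ny, T, batch = 64, 64, 200, 4
cell = WaveCell(nx, ny)
u_prev = torch.zeros(batch, nx, ny)
u_curr = torch.zeros(batch, nx, ny)
waveform = torch.randn(batch, T)                              # stand-in for a vowel recording
src_mask = torch.zeros(nx, ny); src_mask[32, 2] = 1.0         # injection point (assumed)
probe_mask = torch.zeros(nx, ny); probe_mask[32, 60] = 1.0    # one probe shown for brevity

energy = torch.zeros(batch)
for t in range(T):
    source = waveform[:, t, None, None] * src_mask
    u_prev, u_curr = cell(u_prev, u_curr, source)
    energy = energy + (u_curr * probe_mask).pow(2).sum(dim=(1, 2))

# In a full setup, the integrated energy at one probe per vowel class would feed a
# softmax/cross-entropy loss, and backprop through the unrolled steps would update
# the material distribution cell.c, i.e. the "weights" of the physical RNN.
```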

Because this machine learning model actually corresponds to a physical system, we could take the trained material distribution and “print it” into a real physical device. The result would be something like an ASIC (application-specific integrated circuit), but for a specific RNN computation. We’re really excited about these results because they suggest that complex recurrent machine learning calculations could be performed completely passively, with no energy consumption beyond the energy carried by the pulse itself.

submitted by /u/ian_williamson