[P] ‘ceviche’ — Simulating Maxwell’s Equations using Automatic Differentiation.

We recently released our ceviche package on GitHub, which simulates electromagnetic physics using automatic differentiation. We thought it might be interesting to this community as an application of backpropagation techniques to science & engineering problems outside of ML.

https://github.com/twhughes/ceviche

https://i.redd.it/rn724c41mjk31.png
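
For a feel of the basic workflow before opening the repo, here is a minimal sketch of setting up and running one simulation. The function names and argument order (fdfd_ez, solve, the PML argument) follow the README examples as closely as possible, but treat them as assumptions rather than the definitive API and check the repo for the current interface.

```python
import numpy as np
import ceviche

# NOTE: fdfd_ez / solve and their argument order are assumptions based on the
# README examples; check the repository for the exact, current API.

# simulation parameters
omega = 2 * np.pi * 200e12     # angular frequency (rad/s), roughly 200 THz
dl = 5e-8                      # grid spacing (m)
Nx, Ny = 120, 120              # grid size (cells)
npml = 20                      # thickness of the PML absorbing boundary (cells)

# relative permittivity distribution -- this is the "design"
eps_r = np.ones((Nx, Ny))
eps_r[40:80, 40:80] = 12.0     # a dielectric block in the middle of the domain

# current source that excites the domain
source = np.zeros((Nx, Ny))
source[60, 20] = 1.0

# frequency-domain (FDFD) simulation of the Ez polarization
F = ceviche.fdfd_ez(omega, dl, eps_r, [npml, npml])
Hx, Hy, Ez = F.solve(source)   # Ez is the out-of-plane electric field
```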

Using automatic differentiation, you can effortlessly differentiate the results of the simulation with respect to the design parameters that define it. This lets you do a lot of interesting things, for example:

– Perform automated, gradient-based optimization of photonic devices (a rough sketch of a single gradient step follows this list).

– Wrap the E&M solver in a machine learning model and do end-to-end training of physical hardware, as we did in this paper.
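
As a concrete (and again hedged) illustration of the first bullet, here is what one gradient step of such an optimization could look like, continuing the sketch above. ceviche.jacobian and its mode argument, and the use of autograd.numpy inside the objective, are assumptions based on the README; a real inverse-design loop would use a proper optimizer and fabrication constraints.

```python
import numpy as np
import autograd.numpy as npa   # assumption: objectives are written with autograd.numpy
import ceviche

# NOTE: ceviche.jacobian and its 'mode' argument are assumptions based on the
# README; F, source, eps_r, Nx, Ny come from the sketch above.

probe = np.zeros((Nx, Ny))
probe[60, 100] = 1.0           # measure the field intensity at an "output port"

def objective(eps_flat):
    """Power at the probe as a function of the (flattened) design permittivity."""
    F.eps_r = eps_flat.reshape((Nx, Ny))
    Hx, Hy, Ez = F.solve(source)
    return npa.sum(npa.abs(Ez * probe) ** 2)

# reverse-mode gradient of the scalar objective w.r.t. every pixel of the design
grad_fn = ceviche.jacobian(objective, mode='reverse')
g = np.asarray(grad_fn(eps_r.flatten())).reshape((Nx, Ny))

# one step of plain gradient ascent on the design; a real optimization would use
# Adam / L-BFGS and keep the permittivity within fabricable bounds
step_size = 1e-2
eps_r = np.clip(eps_r + step_size * g, 1.0, 12.0)
```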

Most importantly, and in contrast with common practice in the field of photonics, all of this can be done *without* any tedious analytical calculations by hand, and you can rest assured that the derivatives are accurate and efficiently computed.
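
That claim is also easy to spot-check numerically: a standard sanity test is to compare a few entries of the autodiff gradient against brute-force finite differences. Continuing the hypothetical snippet above:

```python
import numpy as np

# finite-difference check of a few entries of the autodiff gradient
# (objective and grad_fn are the hypothetical ones from the sketch above)
x0 = eps_r.flatten().astype(float)
g_auto = np.asarray(grad_fn(x0)).flatten()

delta = 1e-5
rng = np.random.default_rng(0)
for i in rng.choice(x0.size, size=5, replace=False):
    x_plus, x_minus = x0.copy(), x0.copy()
    x_plus[i] += delta
    x_minus[i] -= delta
    g_fd = (objective(x_plus) - objective(x_minus)) / (2 * delta)
    print(f"pixel {i}: autodiff {g_auto[i]:+.4e}   finite-diff {g_fd:+.4e}")
```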

If you’re interested in the nitty-gritty details of reverse- vs. forward-mode differentiation in electromagnetic simulations, check out our preprint as well, linked here.
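
As a very rough pointer to what that distinction is about: reverse mode costs roughly one extra pass per output of the function being differentiated, and forward mode one per input, so inverse design (many parameters, one scalar objective) strongly favours reverse mode. The toy example below just shows the two modes producing the same Jacobian at very different cost; JAX is used here only because it conveniently exposes both modes, not because ceviche is built on it.

```python
import jax
import jax.numpy as jnp

# Toy map from N design parameters to M simulation outputs (here N=1000, M=3).
# Reverse mode needs roughly one backward pass per output; forward mode needs
# one forward pass per input, so their costs differ drastically when N != M.
def fields(params):
    return jnp.sin(jnp.outer(jnp.arange(1.0, 4.0), params)).sum(axis=1)

params = jnp.linspace(0.1, 1.0, 1000)

J_rev = jax.jacrev(fields)(params)   # 3 backward passes -> cheap here
J_fwd = jax.jacfwd(fields)(params)   # 1000 forward passes -> expensive here
print(J_rev.shape, bool(jnp.allclose(J_rev, J_fwd, atol=1e-5)))
```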

submitted by /u/BarnyardPuer