

Learn About Our Meetup

5000+ Members



Join our meetup, learn, connect, share, and get to know your Toronto AI community. 



Browse the latest deep learning, AI, and machine learning job postings from Indeed for the GTA.



Are you looking to sponsor space, be a speaker, or volunteer? Feel free to give us a shout.

[D] Explaining Feedforward, Backpropagation and Optimization: The Math Explained Clearly with Visualizations. I took the time to write this long article (>5k words), and I hope it helps someone understand neural networks better.

I have been studying machine learning for the last few months, and I wanted to really understand everything that goes on in a basic neural network (excluding the many architectures). Therefore, I took the time to write this long article to explain what I have learned. The post is intentionally very extensive and goes into the smaller details, so that everything is in one place. As the site says, it is machine learning from scratch, and I share what I have learned.
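To make the "basic neural network" concrete, here is a minimal sketch (not from the article itself; the layer sizes and sigmoid activation are my own assumptions) of the feedforward pass: each layer applies an affine transform followed by a nonlinearity.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation: squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def feedforward(x, W1, b1, W2, b2):
    # Hidden layer: affine transform, then nonlinearity
    a1 = sigmoid(W1 @ x + b1)
    # Output layer: same pattern applied to the hidden activations
    a2 = sigmoid(W2 @ a1 + b2)
    return a1, a2

# Hypothetical tiny network: 3 inputs -> 4 hidden units -> 2 outputs
rng = np.random.default_rng(0)
x = rng.normal(size=3)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)
a1, a2 = feedforward(x, W1, b1, W2, b2)
print(a2.shape)  # (2,)
```

Every activation lands strictly inside (0, 1) because of the sigmoid, which is the property the backpropagation derivatives later rely on.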

The particular reason for posting here is that I hope someone else can learn from this. The goal is to share the knowledge in the most easily digestible way possible. I tried to visualize much of what goes on in neural networks, but I also went through the math, down to the level of the partial derivatives.
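As a flavor of those partial derivatives, here is a hedged sketch for the simplest case I can think of (a single sigmoid neuron with squared-error loss — my own simplification, not the article's setup): the chain rule gives dL/dw = (a - y) · a(1 - a) · x, and a finite-difference check confirms it.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w, b, x, y):
    # Squared-error loss of a single sigmoid neuron
    a = sigmoid(w @ x + b)
    return 0.5 * (a - y) ** 2

def grads(w, b, x, y):
    # Chain rule: dL/da = (a - y), da/dz = a * (1 - a),
    # dz/dw = x, dz/db = 1
    a = sigmoid(w @ x + b)
    delta = (a - y) * a * (1.0 - a)
    return delta * x, delta

w, b = np.array([0.5, -0.3]), 0.1
x, y = np.array([1.0, 2.0]), 1.0
gw, gb = grads(w, b, x, y)

# Central-difference check on dL/db
eps = 1e-6
num_gb = (loss(w, b + eps, x, y) - loss(w, b - eps, x, y)) / (2 * eps)
print(abs(gb - num_gb) < 1e-8)  # True
```

Backpropagation in a full network is exactly this chain-rule bookkeeping repeated layer by layer, reusing each layer's delta to compute the one before it.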

This was quite a journey: it took about a month to read everything I read, write it down, make it all make sense, and create the graphics.

Regardless, here is the link. Any constructive feedback is appreciated.

submitted by /u/permalip