


[D]Fathoming the Deep in Deep Learning – A Practical Approach


The “deep” in deep learning is elusive, yet approachable with a bit of mathematics. This begs a practical question: is elementary calculus sufficient to unravel deep learning? The answer is yes indeed. Armed with an unbounded curiosity to learn and re-learn, new and old alike, and by methodically following the sections below, I reckon you’ll cross the chasm to intuitively understand and apply every concept, calculus included, in its full glory, and de-clutter the intricacies of deep learning. I’m covering the steps I took and what I researched, read and understood, captured to reveal each concept as intuitively as possible, along with additional topics that may pique your interest:

Read the full article @ and share your thoughts.

Steps to fathom the depth:

- The Beginnings – Modelling Decisions with Perceptrons
- Workhorses inside Nodes – Activation Functions
- A Gentle Detour on Basics – Differential Calculus
- The Underpinnings – Essential Statistics and Loss Reduction
- The Grand Optimization – Gradient Descent
- Intuitive Examples to the Rescue – Descent Demystified
- Ensemble-directed Back & Forth – Feed Forward & Back Propagation
- Inner Workings of a Bare NeuralNet – Matrices Matched to Code
- Learning Curve Retraced – References & Acknowledgements
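To give a flavour of how the steps above fit together, here is a minimal sketch (not taken from the article itself) of a single sigmoid neuron trained by gradient descent on the logical AND function. It touches the perceptron, the activation function, a squared loss, and the chain-rule update that back-propagation generalises; the dataset, learning rate, and variable names are all illustrative choices.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy dataset: logical AND, (inputs, target) pairs.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # weights
b = 0.0          # bias
lr = 0.5         # learning rate

for epoch in range(5000):
    for (x1, x2), target in data:
        z = w[0] * x1 + w[1] * x2 + b
        y = sigmoid(z)                      # forward pass
        # dL/dz for squared loss L = (y - t)^2 / 2, via the chain rule:
        dz = (y - target) * y * (1 - y)     # backward pass
        w[0] -= lr * dz * x1                # gradient-descent updates
        w[1] -= lr * dz * x2
        b    -= lr * dz

predictions = [round(sigmoid(w[0] * x1 + w[1] * x2 + b))
               for (x1, x2), _ in data]
print(predictions)
```

A full network repeats exactly this forward/backward pattern layer by layer, with the per-node scalars replaced by matrices, which is the matrices-matched-to-code view the article builds up to.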

submitted by /u/avanttech
[link] [comments]