[D] Fathoming the Deep in Deep Learning – A Practical Approach
The ‘Deep’ in ‘Deep Learning’ is elusive, yet approachable with a bit of mathematics. That raises a practical question: is elementary calculus sufficient to unravel deep learning? The answer is yes indeed. Armed with an unbounded curiosity to learn and re-learn, the new and the old alike, and if you methodically follow the sections below, I reckon you'll cross the chasm to intuitively understand and apply every concept, calculus included, and de-clutter the intricacies of deep learning. I'm covering the steps I took and what I researched, read and understood, captured to reveal each concept as intuitively as possible, along with additional topics that may pique your interest.

Read the full article @ https://avantlive.wordpress.com/2019/04/29/fathoming-the-deep-in-deep-learning-a-practical-approach/ and share your thoughts.

https://i.redd.it/tx1lqu3dp8v21.jpg

Steps to fathom the depth (a minimal code sketch tying several of these together follows the list):

- The Beginnings – Modelling Decisions with Perceptrons
- Workhorses inside Nodes – Activation Functions
- A Gentle Detour on Basics – Differential Calculus
- The Underpinnings – Essential Statistics and Loss Reduction
- The Grand Optimization – Gradient Descent
- Intuitive Examples to the Rescue – Descent Demystified
- Ensemble directed Back & Forth – Feed Forward & Back Propagation
- Inner Workings of Bare NeuralNet – Matrices matched to Code
- Learning Curve Retraced – References & Acknowledgements
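As an illustrative taste of how these steps fit together (this is my own sketch, not code from the article), here is a minimal NumPy example: a single sigmoid "perceptron" trained by gradient descent on the OR truth table. The toy data, learning rate, and variable names are assumptions for illustration only.

```python
import numpy as np

# Toy data: the OR truth table (an illustrative choice, not from the article).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 1.0])

def sigmoid(z):
    """Activation function: squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # weights, one per input
b = 0.0                  # bias
lr = 0.5                 # learning rate: the step size of the descent

for epoch in range(1000):
    # Feed forward: weighted sum of inputs, then activation.
    z = X @ w + b
    p = sigmoid(z)

    # Loss: mean squared error between prediction and target.
    loss = np.mean((p - y) ** 2)

    # Back propagation via the chain rule:
    # dL/dp = 2(p - y)/n,  dp/dz = p(1 - p),  dz/dw = x,  dz/db = 1
    grad_z = 2 * (p - y) / len(y) * p * (1 - p)
    grad_w = X.T @ grad_z
    grad_b = grad_z.sum()

    # Gradient descent: step against the gradient to reduce the loss.
    w -= lr * grad_w
    b -= lr * grad_b

print("final loss:", loss)
print("predictions:", sigmoid(X @ w + b).round(2))
```

Swapping the squared-error loss for cross-entropy would simplify the gradient, but MSE keeps each chain-rule factor visible, which matches the step-by-step spirit of the list above.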