[D] Statistical Physics and Neural Networks question.

If you look at the theoretical physics literature, there's a ton of research on the statistical physics of neural networks, the statistical physics of deep learning, and so on, where analogies between spin glasses and condensed-matter models are used to derive all sorts of theoretical results about neural networks.
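(For readers who haven't seen the analogy: as a minimal sketch, my wording rather than any particular paper's, the textbook correspondence is between the Sherrington-Kirkpatrick spin-glass Hamiltonian and the Hopfield network energy:

```latex
% SK spin glass: Ising spins \sigma_i = \pm 1, quenched random couplings J_{ij}
H_{\mathrm{SK}}(\sigma) = -\sum_{i<j} J_{ij}\,\sigma_i \sigma_j
% Hopfield network: binary units s_i = \pm 1, learned weights w_{ij}
E(s) = -\frac{1}{2}\sum_{i \neq j} w_{ij}\, s_i s_j
```

The two energies are the same quadratic form, and only the statistics of the couplings differ, which is why replica methods and phase-diagram arguments from spin-glass theory carry over to networks of this type.)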

To be clear, I'm not talking about studies where neural nets were used to model and solve problems in statistical physics. I mean the line of research where the mathematics of statistical physics and spin glasses is used as a framework to analyze the behavior of neural nets, arriving at conclusions like "the loss surface of neural nets has this particular topological property" or "CNNs show a phase transition when the number of classes jumps from x to y." A toy illustration of what this style of analysis computes is sketched below.
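(To make "topological property of the loss surface" concrete: here is my own toy sketch, not taken from any of those papers. For a small spin glass you can brute-force enumerate all configurations and count local minima of the Hamiltonian, which is the quantity the physics papers compute analytically via replica methods. The system size `N`, the Gaussian couplings `J`, and the single-spin-flip neighborhood are all illustrative assumptions:

```python
import itertools

import numpy as np

rng = np.random.default_rng(0)
N = 12                            # tiny system so we can enumerate all 2^N states
J = rng.standard_normal((N, N))
J = (J + J.T) / 2                 # symmetric Gaussian couplings, SK-style
np.fill_diagonal(J, 0.0)

def energy(s):
    """Quadratic spin-glass Hamiltonian H(s) = -1/2 * s^T J s."""
    return -0.5 * s @ J @ s

# A state is a local minimum if no single-spin flip strictly lowers the energy.
n_minima = 0
for bits in itertools.product((-1.0, 1.0), repeat=N):
    s = np.array(bits)
    e = energy(s)
    is_min = True
    for i in range(N):
        s[i] = -s[i]              # flip spin i
        if energy(s) < e:
            is_min = False
        s[i] = -s[i]              # flip it back
        if not is_min:
            break
    n_minima += is_min

print(f"{n_minima} local minima among {2**N} states")
```

Even with purely random couplings the landscape is riddled with local minima, and that qualitative picture is what the spin-glass analyses transfer to neural-network loss surfaces.)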

My question is: did any of these theoretical results from analyzing neural nets with methods from physics ever lead to practical results, such as a faster training algorithm or improved generalization?

As far as I can tell: no. None of the popular NNet models incorporate results from these physics-inspired studies. All the improvements come from purely mathematical insights, or originally from biological insights.

But I might be wrong: did any of the significant practical developments in NNets and deep learning (better activation functions, training algorithms, regularization methods, ...) stem from the statistical-physics approaches?

submitted by /u/AlexSnakeKing