
[R] Gap between the actual and theoretical neural net capacity?

Intuitively, larger networks have higher capacity than smaller ones. However, the theoretical capacity of a huge network would never be reached in practice, due to inefficiencies in the optimization procedure, limited data, and so on. So if we scale a network up by 10 times, its actual capacity might only increase by, say, 5 times, and if we scale it by 100 times, the actual capacity might increase by only 20 times.
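One way to probe this kind of gap empirically (in the spirit of random-label memorization experiments) is to ask how many randomly labeled points a network of a given width can actually fit after training, and compare that to how fast its parameter count grows. The sketch below is a minimal, hypothetical setup, not a method from any particular paper: all function names are made up, the model is a one-hidden-layer tanh network trained with plain gradient descent on squared error, and training accuracy on random labels serves as a crude proxy for *effective* capacity, while the parameter count serves as a proxy for *theoretical* capacity.

```python
import numpy as np

def param_count(d, h):
    # Theoretical-capacity proxy: parameters of a d -> h -> 1 MLP with biases.
    return d * h + h + h + 1

def memorization_probe(n, d, h, epochs=300, lr=0.1, seed=0):
    """Train a one-hidden-layer tanh MLP on n random points with random
    binary labels; return final training accuracy.

    The fraction of random labels fit is a rough proxy for effective
    capacity: a model with more usable capacity memorizes more of them.
    """
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n, d))
    y = rng.integers(0, 2, n) * 2 - 1  # random labels in {-1, +1}

    # Standard 1/sqrt(fan_in) initialization.
    W1 = rng.standard_normal((d, h)) / np.sqrt(d)
    b1 = np.zeros(h)
    w2 = rng.standard_normal(h) / np.sqrt(h)
    b2 = 0.0

    for _ in range(epochs):
        a = np.tanh(X @ W1 + b1)   # hidden activations, shape (n, h)
        out = a @ w2 + b2          # scalar score per example, shape (n,)

        # Gradients of mean squared error, loss = mean((out - y)^2).
        g = 2.0 * (out - y) / n                 # dL/dout, shape (n,)
        gw2 = a.T @ g                           # dL/dw2
        gb2 = g.sum()                           # dL/db2
        ga = np.outer(g, w2) * (1.0 - a ** 2)   # backprop through tanh
        gW1 = X.T @ ga                          # dL/dW1
        gb1 = ga.sum(axis=0)                    # dL/db1

        W1 -= lr * gW1; b1 -= lr * gb1
        w2 -= lr * gw2; b2 -= lr * gb2

    # Recompute predictions with the final weights.
    a = np.tanh(X @ W1 + b1)
    return float(np.mean(np.sign(a @ w2 + b2) == y))
```

Under this setup, one would compare, say, `memorization_probe(200, 10, 4)` against `memorization_probe(200, 10, 64)`: the parameter count grows 16x in width, and the question is whether the fraction of random labels memorized grows proportionally or saturates well below that, which is exactly the actual-vs-theoretical gap asked about above.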

Is such a claim correct? Are there any papers that study the gap between the actual and theoretical network capacity, or related topics?

submitted by /u/vernunftig