Intuitively, larger networks have higher capacity than smaller ones. However, the theoretical capacity of a huge network is never reached in practice, due to inefficient optimization procedures, limited dataset size, etc. So if we scale a network up by 10 times, its actual capacity might increase by only, e.g., 5 times, and if we scale it up by 100 times, the actual capacity could increase by only 20 times.
Is such a claim correct? Are there any papers that study the gap between the actual and theoretical capacity of a network, or related topics?
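One hedged way to make the question concrete: the "actual" capacity of a network can be probed empirically by how much of a randomly labeled dataset it can memorize after training (in the spirit of random-label fitting experiments), while a rough proxy for "theoretical" capacity is the parameter count. The sketch below (my own illustrative setup, not from the post: a one-hidden-layer tanh network trained by full-batch gradient descent on random binary labels) compares the two as width grows; the names and hyperparameters are arbitrary choices.

```python
import numpy as np

def train_memorization(width, X, y, steps=2000, lr=0.5, seed=0):
    """Train a one-hidden-layer tanh network to memorize random labels.

    Returns (param_count, train_accuracy): parameter count as a proxy for
    theoretical capacity, and the fraction of random labels actually fit
    as a crude probe of capacity reached by optimization.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    n = len(y)
    # Small random initialization, scaled by fan-in.
    W1 = rng.normal(0, 1.0 / np.sqrt(d), (d, width))
    b1 = np.zeros(width)
    w2 = rng.normal(0, 1.0 / np.sqrt(width), width)
    b2 = 0.0
    for _ in range(steps):
        h = np.tanh(X @ W1 + b1)                   # hidden activations
        p = 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))   # sigmoid output
        g = (p - y) / n                            # grad of logistic loss wrt logits
        gw2, gb2 = h.T @ g, g.sum()
        gh = np.outer(g, w2) * (1 - h**2)          # backprop through tanh
        gW1, gb1 = X.T @ gh, gh.sum(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        w2 -= lr * gw2; b2 -= lr * gb2
    acc = float(((p > 0.5) == y).mean())
    params = W1.size + b1.size + w2.size + 1
    return params, acc

rng = np.random.default_rng(1)
X = rng.normal(size=(64, 8))                       # 64 points in 8 dims
y = rng.integers(0, 2, 64).astype(float)           # random binary labels
for width in (4, 64):
    params, acc = train_memorization(width, X, y)
    print(f"width={width:3d}  params={params:4d}  memorized={acc:.2f}")
```

Scaling the width from 4 to 64 multiplies the parameter count by roughly 16, and comparing that ratio against the change in memorized fraction gives one concrete (if toy) reading of the gap the question is asking about.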
submitted by /u/vernunftig