[R] Gap between the actual and theoretical neural net capacity?
Intuitively, larger networks have higher capacity than smaller ones. In practice, however, the theoretical capacity of a very large network is never reached, due to inefficient optimization procedures, limited datasets, and so on. So if we scale a network up by 10x, its actual capacity might only increase by, say, 5x, and if we scale it by 100x, the actual capacity might only increase by 20x.
Is such a claim correct? Are there any papers that study the gap between the actual and theoretical capacity of a network, or related topics?
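To make "actual capacity" concrete, here is a minimal sketch (not from the post, just an illustration) of one common empirical probe: a random-label memorization test in the spirit of Zhang et al. (2017), "Understanding deep learning requires rethinking generalization". The idea is to measure how many randomly labeled points a network of a given width can fit, and see how that count grows when the width is scaled up. The function name, sample grid, and hyperparameters below are arbitrary choices for the example.

```python
# Hypothetical sketch: probe "effective" capacity via random-label memorization.
import torch
import torch.nn as nn

def memorization_capacity(width, dim=32, epochs=500, tol=0.99):
    """Largest sample count (from a small grid) the net fits to >= tol train accuracy."""
    fitted = 0
    for n in (128, 256, 512, 1024, 2048):
        x = torch.randn(n, dim)
        y = torch.randint(0, 2, (n,))  # random labels: pure memorization, no structure
        net = nn.Sequential(nn.Linear(dim, width), nn.ReLU(), nn.Linear(width, 2))
        opt = torch.optim.Adam(net.parameters(), lr=1e-3)
        for _ in range(epochs):
            opt.zero_grad()
            loss = nn.functional.cross_entropy(net(x), y)
            loss.backward()
            opt.step()
        acc = (net(x).argmax(dim=1) == y).float().mean().item()
        if acc >= tol:
            fitted = n
        else:
            break  # stop at the first sample count the net fails to memorize
    return fitted

# Compare how the memorized-sample count grows when width is scaled 10x:
for w in (16, 160):
    print(w, memorization_capacity(w))
```

If the measured count grows much more slowly than the parameter count (the "theoretical" capacity scaling), that would be one operational version of the gap the post asks about.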
submitted by /u/vernunftig