Most weight initialization strategies for neural networks depend on controlling the variance (and mean) of the propagating signals. However, they rely on assumptions that do not always hold in practice.
For example, applying ReLU immediately shifts the mean of the signal.
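To make that concrete, here is a minimal sketch (assuming PyTorch) showing how ReLU breaks the zero-mean, variance-preserving assumption: a standard normal input comes out with a mean near 1/sqrt(2*pi) and a variance of roughly (1 - 1/pi)/2.

```python
# Minimal sketch (PyTorch assumed): ReLU shifts the mean and shrinks the
# variance of a zero-mean, unit-variance signal.
import torch

torch.manual_seed(0)
x = torch.randn(100_000)   # zero-mean, unit-variance input
y = torch.relu(x)          # apply ReLU

print(f"pre-ReLU : mean={x.mean():.3f}, var={x.var():.3f}")
print(f"post-ReLU: mean={y.mean():.3f}, var={y.var():.3f}")
# Roughly: mean ~ 0.40 and var ~ 0.34 after ReLU, so the signal is no longer
# zero-mean and its variance is no longer preserved.
```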
I am thinking it might be possible to learn the initialization of the network with an appropriate cost function: one that adjusts the weights so the input variance smoothly transforms into the output variance.
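A rough, hypothetical sketch of that idea is below (the layer sizes, the probe data, and the exact form of the loss are all my assumptions, not a worked-out method): before ordinary training, optimize the weights on random inputs so that each layer's output variance matches the input variance and the mean stays near zero.

```python
# Hypothetical "learn the initialization" sketch: pre-optimize the weights so
# each layer roughly preserves variance and keeps the mean near zero.
import torch
import torch.nn as nn

torch.manual_seed(0)
layers = nn.ModuleList([nn.Linear(512, 512) for _ in range(8)])
act = nn.ReLU()
opt = torch.optim.Adam(layers.parameters(), lr=1e-3)

for step in range(200):
    x = torch.randn(256, 512)   # random probe batch, not real data
    loss = 0.0
    h = x
    for layer in layers:
        h = act(layer(h))
        # Penalize drift of the per-layer variance away from the input's
        # variance, plus any shift of the mean away from zero.
        loss = loss + (h.var() - x.var()).pow(2) + h.mean().pow(2)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The resulting weights could then serve as the starting point for normal
# supervised training.
```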
Does anybody want to collaborate on a quick project to test this, perhaps somebody who has worked on a problem like this before?
I have tried something preliminary, but I need advice from an experienced NN experimenter :)
submitted by /u/fbtek