
Collaborate on an idea for initialization of neural networks [R] [P]

Most weight initialization strategies for neural networks work by controlling the variance (and mean) of the propagating signals. However, they rely on assumptions that do not always hold in practice.

For example, applying ReLU immediately shifts the mean: it zeroes out the negative half of the distribution, so the output mean is positive even when the input mean is zero.
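
To make this concrete, here is a minimal sketch of the effect (my own illustration, not from the post): even with He initialization, which is designed to preserve variance through ReLU layers, the post-activation mean drifts away from zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean, unit-variance inputs and He-initialized weights
# (He init targets variance preservation under ReLU).
fan_in = 512
x = rng.standard_normal((10_000, fan_in))
w = rng.standard_normal((fan_in, fan_in)) * np.sqrt(2.0 / fan_in)

pre = x @ w                 # pre-activations: mean stays near 0
post = np.maximum(pre, 0)   # ReLU

print(f"pre-activation:  mean={pre.mean():+.3f}, var={pre.var():.3f}")
print(f"post-activation: mean={post.mean():+.3f}, var={post.var():.3f}")
# The post-activation mean is clearly positive: ReLU discards the
# negative half of the distribution, shifting the mean even though
# the variance was controlled at initialization.
```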

I am thinking it might be possible to learn the initialization of the network with an appropriate cost function: one that adjusts the weights so the input variance smoothly transforms into the output variance.
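
One way such a pre-training step could look (a rough sketch under my own assumptions, not an established method; the function name, cost terms, and hyperparameters are all illustrative): before regular training, run gradient descent on random probe batches with a loss that penalizes the mismatch between input and output variance, plus any drift in the mean.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical "learned initialization" pass: tune the weights so the
# network's output variance matches its input variance (and the output
# mean stays near zero) on random probe batches, before real training.
def variance_matching_init(model, steps=200, batch=256, dim=512, lr=1e-2):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        x = torch.randn(batch, dim)          # random probe input
        y = model(x)
        # Cost: output variance should equal input variance,
        # and the output mean should stay near the input mean (0).
        loss = (y.var() - x.var()).pow(2) + y.mean().pow(2)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model

model = nn.Sequential(
    nn.Linear(512, 512), nn.ReLU(),
    nn.Linear(512, 512), nn.ReLU(),
    nn.Linear(512, 512),
)
variance_matching_init(model)

x = torch.randn(1024, 512)
y = model(x)
print(f"in:  mean={x.mean().item():+.3f}, var={x.var().item():.3f}")
print(f"out: mean={y.mean().item():+.3f}, var={y.var().item():.3f}")
```

This only matches the statistics end to end; one open design choice is whether to apply the cost per layer instead, so that every intermediate signal keeps its variance rather than just the final output.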

Does anybody want to collaborate on a quick project to test this? Perhaps somebody who has worked on a problem like this before.

I have tried something preliminary, but I need advice from an experienced NN experimenter. :)

submitted by /u/fbtek