
[D] Neural Network Implementation

Hello Reddit,

I’m following this lecture series: https://www.youtube.com/watch?v=SGZ6BttHMPw&list=PL6Xpj9I5qXYEcOhn7TqghAJ6NAPrNmUBH and am trying to implement a neural network from scratch.

This is my forward pass:

# hidden layer: pre-activation, then sigmoid
res_1 = np.dot(W_1.T, x) + b1
res_1_activation = sigmoid(res_1)

# output layer: pre-activation, then sigmoid
output = np.dot(W_2.T, res_1_activation) + b2
output_activation = sigmoid(output)
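
Here sigmoid and d_sigmoid are the usual logistic function and its derivative, defined along these lines:

import numpy as np

def sigmoid(z):
    # logistic function
    return 1.0 / (1.0 + np.exp(-z))

def d_sigmoid(z):
    # derivative of the sigmoid, evaluated at the pre-activation z
    s = sigmoid(z)
    return s * (1.0 - s)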

This is my backward pass:

# error at the output: output_activation - y[i]
grad_pre_output = -(y[i] - output_activation)

# gradients for the output layer
grad_W_2 = np.dot(np.expand_dims(res_1_activation, axis=1), np.expand_dims(grad_pre_output, 1).T)
grad_b2 = np.expand_dims(grad_pre_output, 1)

# backpropagate the error through the hidden layer
grad_post_1 = np.dot(W_2, np.expand_dims(grad_pre_output, axis=1))
grad_pre_1 = np.multiply(grad_post_1.T, d_sigmoid(res_1))
grad_W_1 = np.dot(np.expand_dims(x, axis=1), grad_pre_1)
grad_b1 = grad_pre_1.T

# parameter updates with L2 regularization
W_1 = np.add(W_1, lr * (grad_W_1 - lamda * 2 * W_1))
W_2 = np.add(W_2, lr * (grad_W_2 - lamda * 2 * W_2))
b1 = np.add(b1, np.reshape(grad_b1, (grad_b1.shape[0],)))
b2 = np.add(b2, np.reshape(grad_b2, (grad_b2.shape[0],)))
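
For comparison, the plain gradient-descent update I see in most references moves the parameters against the loss gradient, with the L2 penalty added to (not subtracted from) each weight gradient; reusing the variables above, something like:

# standard gradient-descent step: subtract lr times the loss gradient;
# the L2 penalty lamda * ||W||^2 contributes 2 * lamda * W to each weight gradient
W_1 = W_1 - lr * (grad_W_1 + lamda * 2 * W_1)
W_2 = W_2 - lr * (grad_W_2 + lamda * 2 * W_2)
b1 = b1 - lr * np.reshape(grad_b1, (grad_b1.shape[0],))
b2 = b2 - lr * np.reshape(grad_b2, (grad_b2.shape[0],))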

I think this implementation should work, but when I run it for more than 2 epochs, all the output neurons saturate and always return 1. I tried adding L2 regularization, but it still saturates. Can anyone tell me what I'm doing wrong?

Any help would be appreciated.

Thank you!

submitted by /u/cronoz30