
[D] how can I get a global minimum


Here is my code:

```python
# Imports assumed for this snippet (standalone Keras shown; tensorflow.keras also works)
from keras import backend as K
from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout
from sklearn.metrics import mean_squared_error
import matplotlib.pyplot as plt

for _ in range(10):
    K.clear_session()  # reset the session so each run builds a freshly initialized model

    model = Sequential()
    model.add(LSTM(256, input_shape=(None, 1)))
    model.add(Dropout(0.2))
    model.add(Dense(256))
    model.add(Dropout(0.2))
    model.add(Dense(1))
    model.compile(loss='mean_squared_error', optimizer='adam')

    hist = model.fit(x_train, y_train, epochs=20, batch_size=64, verbose=0)

    # Evaluate and plot predictions against the test targets
    p = model.predict(x_test)
    print(mean_squared_error(y_test, p))
    plt.plot(y_test)
    plt.plot(p)
    plt.legend(['testY', 'p'], loc='upper right')
    plt.show()

    ...
    plt.plot(hist.history['loss'])
```

`Total params`: 330,241

`samples`: 2264
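
Just to double-check the reported parameter count, here is how the 330,241 breaks down (this is my own sanity check using the standard Keras parameter formulas, not something from the gist):

```python
units, input_dim = 256, 1

# LSTM: 4 gates, each with an input kernel, a recurrent kernel, and a bias
lstm_params = 4 * (units * input_dim + units * units + units)  # 264,192

# Dense(256) on the 256-dim LSTM output, then Dense(1); Dropout adds no parameters
dense1_params = 256 * 256 + 256                                # 65,792
dense2_params = 256 * 1 + 1                                    # 257

print(lstm_params + dense1_params + dense2_params)             # 330,241
```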

Below is the result:

https://i.redd.it/2gjgr2uonig31.png

I haven't changed anything between runs; I only re-ran the for loop. As you can see in the picture, the MSE is huge on some runs, even though all I did was run the same loop again.

I think the fundamental cause of this problem is that the optimizer cannot find the global minimum; it converges to a local minimum instead.

The reason I think so is that, after checking all the loss graphs, the loss is no longer decreasing significantly by the end of the 20 epochs (a sketch of how I overlay the loss curves across runs is below).
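
By "checking all the loss graphs" I mean something like this minimal rewrite of the loop above, which keeps each run's loss history and overlays them on one figure (it assumes the same x_train/y_train and is not the exact code from the gist):

```python
from keras import backend as K
from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout
import matplotlib.pyplot as plt

loss_histories = []
for i in range(10):
    K.clear_session()
    model = Sequential()
    model.add(LSTM(256, input_shape=(None, 1)))
    model.add(Dropout(0.2))
    model.add(Dense(256))
    model.add(Dropout(0.2))
    model.add(Dense(1))
    model.compile(loss='mean_squared_error', optimizer='adam')
    hist = model.fit(x_train, y_train, epochs=20, batch_size=64, verbose=0)
    loss_histories.append(hist.history['loss'])

# Overlay every run's training-loss curve on one figure to compare convergence
for i, losses in enumerate(loss_histories):
    plt.plot(losses, label='run %d' % i)
plt.xlabel('epoch')
plt.ylabel('training loss')
plt.legend(loc='upper right')
plt.show()
```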

So in order to solve this problem, I have to find the global minimum. How should I do this?

I tried adjusting the batch_size and the number of epochs.

I also tried changing the hidden layer size and the number of LSTM units, adding a kernel_initializer, switching optimizers, etc., but could not get any meaningful improvement; the sketch below shows the kind of changes I mean.
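
These are example values only, not my exact configurations:

```python
from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout

# Example variations: more LSTM units, a wider hidden layer,
# an explicit kernel_initializer, and a different optimizer.
model = Sequential()
model.add(LSTM(512, input_shape=(None, 1)))
model.add(Dropout(0.2))
model.add(Dense(512, kernel_initializer='he_normal'))
model.add(Dropout(0.2))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='rmsprop')

hist = model.fit(x_train, y_train, epochs=50, batch_size=32, verbose=0)
```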

I wonder how I can solve this problem.

Your valuable opinions and thoughts will be very much appreciated.

If you want to see the full source, here is the link: https://gist.github.com/Lay4U/e1fc7d036356575f4d0799cdcebed90e

submitted by /u/GoBacksIn