
[D] Reasons for an increasing loss while accuracy stays steady?

I'm seeing a phenomenon that I don't understand.

My loss is the cross entropy of the softmax prediction against one-hot encoded labels. Accuracy is a simple argmax comparison between the prediction and the label.
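For reference, a minimal numpy sketch of the setup as described (names like `loss_and_accuracy` are mine, not from the original):

```python
import numpy as np

def softmax(logits):
    z = logits - logits.max(axis=1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def loss_and_accuracy(logits, one_hot):
    """logits, one_hot: arrays of shape (batch, classes)."""
    probs = softmax(logits)
    # cross entropy against one-hot labels: -sum(one_hot * log(probs)), averaged over the batch
    loss = -(one_hot * np.log(probs)).sum(axis=1).mean()
    # accuracy: argmax of the prediction vs. argmax of the label
    acc = (probs.argmax(axis=1) == one_hot.argmax(axis=1)).mean()
    return loss, acc
```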

While the accuracy stays the same, the loss increases sharply. My guess is that the output probabilities are becoming more uniform, while the correct class still has the highest value.
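To illustrate that guess with made-up numbers: two predictions can share the same argmax (so accuracy is unchanged) while the less confident one has a much higher cross entropy.

```python
import numpy as np

label = 0  # true class
# Hypothetical 3-class softmax outputs at an earlier and a later training step:
# both still rank class 0 highest, but the later one is far less confident.
earlier = np.array([0.90, 0.05, 0.05])
later   = np.array([0.40, 0.30, 0.30])

for probs in (earlier, later):
    correct = probs.argmax() == label       # accuracy is based only on the argmax
    loss = -np.log(probs[label])            # cross entropy against a one-hot label
    print(f"correct={correct}, loss={loss:.3f}")
# correct=True, loss=0.105
# correct=True, loss=0.916
```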

But what could cause such behavior? Why is it happening, and what can I do about it?

submitted by /u/Spenhouet