[D] Reasons for an increasing loss while accuracy stays steady?

I'm seeing a phenomenon that I don't understand.

My loss is the cross-entropy between the softmax predictions and the one-hot encoded labels. Accuracy is a simple argmax comparison.
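For concreteness, here is a minimal NumPy sketch of the two metrics as I compute them (the function names are just illustrative; labels are integer class indices rather than explicit one-hot vectors):

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, labels):
    # Mean negative log-probability of the true class.
    return -np.log(probs[np.arange(len(labels)), labels]).mean()

def accuracy(probs, labels):
    # Fraction of samples where the argmax matches the true class.
    return (probs.argmax(axis=-1) == labels).mean()
```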

While the accuracy stays the same, the loss increases sharply. My guess is that the output probabilities are becoming more uniform while the correct label still has the highest value, so the argmax (and hence the accuracy) is unchanged but the cross-entropy grows.
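A self-contained toy example of what I mean (hypothetical 3-class probabilities; the correct class, index 0, stays the argmax, but the distribution flattens):

```python
import numpy as np

labels = np.array([0])
confident = np.array([[0.90, 0.05, 0.05]])
flatter   = np.array([[0.40, 0.30, 0.30]])

for probs in (confident, flatter):
    acc = (probs.argmax(axis=-1) == labels).mean()
    loss = -np.log(probs[np.arange(len(labels)), labels]).mean()
    print(f"accuracy={acc:.2f}  cross-entropy={loss:.3f}")
# accuracy=1.00  cross-entropy=0.105
# accuracy=1.00  cross-entropy=0.916
```

Both cases are classified correctly, yet the loss of the flatter distribution is almost nine times higher, which matches what I observe.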

But what could cause such behavior? Why is it happening, and what can I do about it?

submitted by /u/Spenhouet
