
[D] Instance weighting with soft labels.

Suppose you are given training instances with soft labels, i.e., your training instances are of the form (x, y, p), where x is the input, y is the class, and p is the probability that x is of class y.

Some classifiers allow you to specify an instance weight for each example in the training set. The idea is that a misprediction on a particular example is penalized proportionally to its weight, so instances with a high weight are more important to get right and instances with a low weight are less important.

When examples are of the form (x,y,p), it’s clear that the class probabilities could be used as instance weights. A simple way to do this is to weight the loss for each instance by its probability, as suggested here:

https://stats.stackexchange.com/questions/277435/how-can-i-integrate-confidence-of-class-labels-into-my-classifier
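For concreteness, here is a minimal sketch of that weighting approach using scikit-learn, whose estimators accept a `sample_weight` argument to `fit` that scales each instance's contribution to the loss. The toy data below is hypothetical, just to illustrate passing the label confidences p as the weights:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy instances (x, y, p): inputs x, hard labels y, and a
# confidence p in (0.5, 1.0] that each label y is correct.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
p = rng.uniform(0.5, 1.0, size=200)

# sample_weight multiplies each instance's log-loss term by p,
# so low-confidence labels influence the fit less.
clf = LogisticRegression().fit(X, y, sample_weight=p)
print(clf.score(X, y))
```

The same effect can be had in most frameworks by computing a per-example loss (e.g. with `reduction='none'` in PyTorch) and taking its weighted mean with the p values.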

Does anyone know of a paper/book where this simple weighting approach is discussed? I can’t find references on this simple idea.

submitted by /u/ockidocki