[D] Instance weighting with soft labels.
Suppose you are given training instances with soft labels, i.e., your training instances are of the form (x, y, p), where x is the input, y is the class, and p is the probability that x is of class y.
Some classifiers allow you to specify an instance weight for each example in the training set. The idea is that a misprediction on a particular example is penalized proportionally to its weight, so instances with high weight are more important to get right and instances with low weight are less important.
When examples are of the form (x, y, p), it seems natural to use the class probabilities as instance weights. A simple way to do this is to weight the loss for each instance by its probability, i.e. minimize sum_i p_i * loss(f(x_i), y_i).
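For concreteness, here is a minimal sketch of what I mean, assuming scikit-learn and its sample_weight argument; the data and probabilities are made up for illustration.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Toy soft-labeled data: (x, y, p) triples (hypothetical values).
    X = np.array([[0.2, 1.1], [1.5, 0.3], [0.9, 0.8], [2.0, 1.7]])
    y = np.array([0, 1, 0, 1])            # hard class assignments
    p = np.array([0.9, 0.6, 0.55, 0.95])  # probability that x really is of class y

    # sample_weight scales each instance's contribution to the loss,
    # so confidently labeled instances count more than uncertain ones.
    clf = LogisticRegression()
    clf.fit(X, y, sample_weight=p)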
Does anyone know of a paper or book where this weighting approach is discussed? I can't find any references for such a simple idea.
submitted by /u/ockidocki