[D] How to handle noisy training labels in supervised learning?

In machine learning, training labels are often subject to noise such as mislabelling. For neural networks that require large quantities of training data, this manifests as a trade-off between dataset quality and quantity. For instance, a model may achieve good performance on a noisily labelled training set, yet appear to generalize poorly when evaluated on a manually annotated test set.

What are some ways a machine learning practitioner can better deal with this problem?
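One common mitigation (not from the original post, offered here as an illustration) is label smoothing: instead of training against hard one-hot targets, blend them with a uniform distribution so the model is never asked to be fully confident in a possibly wrong label. A minimal NumPy sketch, where `epsilon` is the smoothing strength (a hyperparameter you would tune):

```python
import numpy as np

def smooth_labels(one_hot, epsilon=0.1):
    """Blend one-hot targets with a uniform distribution over classes.

    With noisy labels, a mislabelled example trained against a hard 1.0
    target can distort the decision boundary; smoothing caps the target
    confidence at (1 - epsilon) + epsilon / num_classes.
    """
    num_classes = one_hot.shape[-1]
    return one_hot * (1.0 - epsilon) + epsilon / num_classes

def cross_entropy(probs, targets, eps=1e-12):
    """Mean cross-entropy between predicted probabilities and soft targets."""
    return float(-np.mean(np.sum(targets * np.log(probs + eps), axis=-1)))

# Example: a 3-class problem with two (possibly noisy) hard labels.
hard = np.array([[1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0]])
soft = smooth_labels(hard, epsilon=0.1)
# Each 1.0 target becomes 0.9 + 0.1/3, each 0.0 becomes 0.1/3,
# and each row still sums to 1.
```

Other directions practitioners mention for this problem include robust loss functions, loss-based sample filtering, and modelling the label-noise transition matrix; the sketch above only shows the simplest of these.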

submitted by /u/ProjectPsygma