
[R] A General and Adaptive Robust Loss Function

Hi r/ML, I presented a paper at CVPR last week that seemed to go over well, so I thought I'd promote it beyond the vision community.

Video (much more approachable than the paper, and identical to the talk): https://www.youtube.com/watch?v=BmNKbnF69eY

Abstract: We present a generalization of the Cauchy/Lorentzian, Geman-McClure, Welsch/Leclerc, generalized Charbonnier, Charbonnier/pseudo-Huber/L1-L2, and L2 loss functions. By introducing robustness as a continuous parameter, our loss function allows algorithms built around robust loss minimization to be generalized, which improves performance on basic vision tasks such as registration and clustering. Interpreting our loss as the negative log of a univariate density yields a general probability distribution that includes normal and Cauchy distributions as special cases. This probabilistic interpretation enables the training of neural networks in which the robustness of the loss automatically adapts itself during training, which improves performance on learning-based tasks such as generative image synthesis and unsupervised monocular depth estimation, without requiring any manual parameter tuning.
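For readers who want something concrete before the paper, here is a minimal NumPy sketch of the loss family the abstract describes, following the form given in the arXiv paper. The removable singularities at alpha = 0, alpha = 2, and alpha = -inf are handled as explicit special cases here; the official implementations linked below handle the limits and numerics more carefully.

```python
import numpy as np

def general_robust_loss(x, alpha, c):
    """Sketch of the general robust loss rho(x, alpha, c) from arXiv:1701.03077.

    alpha sets the robustness: 2 ~ L2, 1 ~ Charbonnier/pseudo-Huber,
    0 ~ Cauchy/Lorentzian, -2 ~ Geman-McClure, -inf ~ Welsch/Leclerc.
    c > 0 is the scale (the width of the loss's quadratic bowl near x = 0).
    """
    sq = (x / c) ** 2
    if alpha == 2.0:           # L2 limit: quadratic everywhere
        return 0.5 * sq
    if alpha == 0.0:           # Cauchy/Lorentzian limit
        return np.log1p(0.5 * sq)
    if np.isneginf(alpha):     # Welsch/Leclerc limit: bounded loss
        return 1.0 - np.exp(-0.5 * sq)
    b = abs(alpha - 2.0)       # general case for all other alpha
    return (b / alpha) * ((sq / b + 1.0) ** (alpha / 2.0) - 1.0)
```

Exponentiating the negative of this loss and normalizing over x yields the general distribution mentioned in the abstract; its partition function has no closed form for most alpha, which is why the released code approximates it when alpha is treated as a learnable parameter.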

arXiv: https://arxiv.org/abs/1701.03077

TensorFlow Code: https://github.com/google-research/google-research/tree/master/robust_loss

PyTorch Code: https://github.com/jonbarron/robust_loss_pytorch

submitted by /u/jnbrrn