Learn About Our Meetup

5000+ Members



Join our meetup, learn, connect, share, and get to know your Toronto AI community. 



Browse through the latest deep learning, AI, and machine learning postings from Indeed for the GTA.



If you are looking to sponsor space, be a speaker, or volunteer, feel free to give us a shout.

[D] Conditional GANs and class imbalance

I have a small, highly imbalanced, low-resolution image dataset with 6 classes, where 50% of the observations come from a single class. With unconditional GANs, I can stably obtain samples with seemingly sufficient diversity under most setups (architecture, losses, etc.).

In the conditional case, there is significant class leakage in the generated samples. I’ve tried several standard class-conditioning techniques: (1) conditional batch norm in the generator; (2) a projection layer in the critic; and (3) significantly increasing the batch size to cover more modes per batch. (1) and (2) seem to be helping, while (3) doesn’t appear to improve sample diversity and is making sample quality worse. Training is still in its early stages though, so maybe things change (or modes collapse).
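For readers unfamiliar with technique (1), conditional batch norm normalizes activations as usual but swaps in a per-class scale and shift. A minimal numpy sketch (function and parameter names are illustrative, not from any specific library):

```python
import numpy as np

def conditional_batchnorm(x, y, gammas, betas, eps=1e-5):
    """Conditional batch norm sketch: normalize each feature over the
    batch, then scale/shift with parameters selected by class label.

    x: (N, C) activations; y: (N,) integer class labels;
    gammas, betas: (num_classes, C) per-class scale and shift.
    """
    mean = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Index per-class parameters by label; each sample gets the
    # affine transform of its own class.
    return gammas[y] * x_hat + betas[y]

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))
y = rng.integers(0, 6, size=8)
gammas = np.ones((6, 4))   # identity scale for the demo
betas = np.zeros((6, 4))   # zero shift for the demo
out = conditional_batchnorm(x, y, gammas, betas)
```

With identity parameters this reduces to plain batch norm; in a real generator, `gammas` and `betas` would be learned embeddings (or outputs of a small network) per class.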

Are there any strategies or heuristics for conditional GANs that specifically deal with class imbalance? At this point, I’m considering using balanced subsampled batches in each iteration, or weighting the hinge loss by the class distribution, and hoping for the best.
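Both ideas floated above are easy to prototype. A hedged numpy sketch of (a) class-balanced batch sampling via inverse-frequency weights and (b) a per-sample-weighted discriminator hinge loss (names and the exact weighting scheme are my assumptions, not an established recipe):

```python
import numpy as np

def balanced_batch_indices(labels, batch_size, rng):
    """Sample a batch where each class has (in expectation) equal
    representation: each example's probability is inversely
    proportional to its class frequency."""
    classes, counts = np.unique(labels, return_counts=True)
    freq = dict(zip(classes, counts))
    w = np.array([1.0 / freq[c] for c in labels])
    w /= w.sum()  # per-class total mass is now uniform
    return rng.choice(len(labels), size=batch_size, replace=True, p=w)

def weighted_hinge_d_loss(real_scores, fake_scores, real_weights):
    """Discriminator hinge loss with per-sample weights on the real
    term (e.g. inverse class frequency); fake term left unweighted
    in this sketch."""
    real_term = real_weights * np.maximum(0.0, 1.0 - real_scores)
    fake_term = np.maximum(0.0, 1.0 + fake_scores)
    return real_term.mean() + fake_term.mean()

rng = np.random.default_rng(0)
# 50% of examples in class 0, the rest spread over classes 1-5,
# mirroring the imbalance described in the post.
labels = np.concatenate([np.zeros(50, dtype=int),
                         np.tile(np.arange(1, 6), 10)])
idx = balanced_batch_indices(labels, 64, rng)
```

Whether rebalancing the batches like this actually helps conditional sample quality is an empirical question; it changes the effective data distribution the critic sees, which is exactly the gamble described above.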

submitted by /u/ligamentouscreep

Toronto AI is a social and collaborative hub to unite AI innovators of Toronto and surrounding areas. We explore AI technologies in digital art and music, healthcare, marketing, fintech, VR, robotics, and more. Toronto AI was founded by Dave MacDonald and Patrick O'Mara.