
[P] Batch Normalization in GANs

Hello everyone. I’ve been working on generating paintings for my Master’s thesis. So far I’ve been having a really difficult time training GANs, which is par for the course.

One of the issues I’ve run into is that outputs seem to share the same characteristics, i.e. all the paintings the model produces use the same color palette. I’m using a Least Squares GAN with Spectral Normalization. I’ve read that one way of combating this issue is minibatch discrimination, but that seems to make results worse for whatever reason (maybe there’s an optimal number of features you’re supposed to concatenate?).
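For reference, a common simplified variant of minibatch discrimination is the minibatch standard-deviation feature (popularized by Progressive GAN): append one extra channel holding the batch-wide activation stddev, so the discriminator can penalize low-diversity batches. This is a minimal NumPy sketch of that idea (my illustration, not the poster’s code; function name and NHWC layout are assumptions):

```python
import numpy as np

def minibatch_stddev_feature(x, eps=1e-8):
    """Append a minibatch standard-deviation channel to x.

    x: activations of shape (batch, height, width, channels).
    Returns x with one extra channel holding a single batch-wide
    stddev value, giving the discriminator a diversity signal.
    """
    # Per-position standard deviation across the batch axis.
    std = np.std(x, axis=0)            # shape (H, W, C)
    # Collapse to one scalar summarizing batch diversity.
    mean_std = np.mean(std) + eps
    # Broadcast that scalar as a constant feature map per sample.
    n, h, w, _ = x.shape
    feat = np.full((n, h, w, 1), mean_std, dtype=x.dtype)
    return np.concatenate([x, feat], axis=-1)
```

If all samples in the batch collapse to the same image, the extra channel goes to (near) zero, which the discriminator can learn to flag; this sidesteps tuning the number of concatenated features that full minibatch discrimination requires.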

So my question is about Batch Normalization, which seems to be the culprit behind this same-palette/texture issue: is it better just to use Instance Norm, Pixel Norm, or Layer Norm instead of Batch Norm? Do those produce good results? I’ve been having a lot of issues with TensorFlow, so I’d like to know if anybody else has tried these and gotten results out of them. Let’s imagine we’re talking about just a DCGAN with batch norm replaced by any of the above.
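The key difference between these layers is just which axes the statistics are computed over: batch norm shares statistics across samples (which is how one sample’s palette can bleed into another’s), while instance norm and pixel norm are purely per-sample. A framework-agnostic NumPy sketch of the three (my illustration, without the learnable scale/shift parameters, assuming NHWC layout):

```python
import numpy as np

def batch_norm(x, eps=1e-8):
    # Statistics over (batch, H, W): shared across samples, so all
    # outputs in a batch are coupled through the same mean/variance.
    mean = x.mean(axis=(0, 1, 2), keepdims=True)
    var = x.var(axis=(0, 1, 2), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def instance_norm(x, eps=1e-8):
    # Statistics over (H, W) only: each sample and channel is
    # normalized independently, so no cross-sample coupling.
    mean = x.mean(axis=(1, 2), keepdims=True)
    var = x.var(axis=(1, 2), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def pixel_norm(x, eps=1e-8):
    # ProGAN-style: normalize each spatial position across its
    # channels; again purely per-sample.
    return x / np.sqrt(np.mean(x ** 2, axis=-1, keepdims=True) + eps)
```

In a DCGAN generator you would swap the norm layer after each conv; whether the per-sample variants actually fix the palette collapse is exactly the empirical question being asked here.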

As a bonus question, the network doesn’t seem to be picking up fine detail. Any tips? (I’ve tried self-attention; it wasn’t a big help either.)

P.S. Resources are limited; I’m running on a Quadro P1000 for a day at most.

submitted by /u/96meep96