[D] Batchnorm inference is a function of batch size / batch data
Why is it that in TensorFlow, the batchnorm results at inference time (e.g. using model.predict()) depend on the other data in the batch? I would expect the network's computation graph to be completely frozen, so that each example's output is independent of the rest of the batch.
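For reference, the frozen behavior the question expects comes from batchnorm's two modes: in training mode the layer normalizes with the current batch's statistics, while in inference mode it should use stored moving statistics, making each row's output independent of its batchmates. A minimal NumPy sketch (not TensorFlow's actual implementation; `batchnorm` and its arguments are illustrative, with scale/shift omitted):

```python
import numpy as np

def batchnorm(x, moving_mean, moving_var, training, eps=1e-5):
    """Illustrative batchnorm without learnable scale/shift."""
    if training:
        mean, var = x.mean(axis=0), x.var(axis=0)  # batch statistics
    else:
        mean, var = moving_mean, moving_var        # frozen moving statistics
    return (x - mean) / np.sqrt(var + eps)

mm, mv = np.array([0.0]), np.array([1.0])          # stored moving stats
a = np.array([[1.0], [2.0], [3.0]])
b = np.array([[1.0], [2.0], [100.0]])              # same first row, different batch

# Inference mode: the first row normalizes identically in both batches,
# because only the frozen moving statistics are used.
print(batchnorm(a, mm, mv, training=False)[0])
print(batchnorm(b, mm, mv, training=False)[0])

# Training mode: the same first row normalizes differently, because the
# batch mean/variance change with the batch contents.
print(batchnorm(a, mm, mv, training=True)[0])
print(batchnorm(b, mm, mv, training=True)[0])
```

If inference outputs vary with the batch, a common cause is the layer still running in training mode (e.g. a `training=True` call argument somewhere in the model).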
submitted by /u/idg101