[R] [OC] Intrinsic Multi-scale Evaluation of Generative Models

Generative models are often used to sample high-dimensional data points from a manifold with small intrinsic dimension. Existing techniques for comparing generative models focus on global data properties such as mean and covariance; in that sense, they are extrinsic and uni-scale. We develop the first, to our knowledge, intrinsic and multi-scale method for characterizing and comparing underlying data manifolds, based on comparing all data moments by lower-bounding the spectral notion of the Gromov-Wasserstein distance between manifolds. In a thorough experimental study, we demonstrate that our method effectively evaluates the quality of generative models; further, we showcase its efficacy in discerning the disentanglement process in neural networks.

TL;DR: We introduce a metric based on heat kernels that can compare data manifolds living in unaligned spaces. We test it on GAN evaluation (it works), and on tracking the disentanglement and training progress of NNs (it also works!).
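To make the idea concrete, here is a minimal sketch of a heat-kernel-based comparison in the spirit the TL;DR describes (not the authors' implementation): approximate each data manifold by a k-NN graph, take the spectrum of its normalized graph Laplacian, and compare the resulting heat traces across diffusion scales. Since only spectra are compared, the two point clouds never need to live in the same (or aligned) ambient space. The function names, the choice of k, and the scale grid are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse.csgraph import laplacian

def heat_trace(X, ts, k=5):
    """Heat trace of a k-NN graph built over the samples X, at scales ts.

    The normalized graph Laplacian is a standard discrete proxy for the
    Laplace-Beltrami operator of the underlying manifold.
    """
    D = cdist(X, X)
    n = len(X)
    A = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]  # k nearest neighbors, excluding self
        A[i, nbrs] = 1.0
    A = np.maximum(A, A.T)                # symmetrize the adjacency
    L = laplacian(A, normed=True)
    lam = np.linalg.eigvalsh(L)
    # Heat trace at diffusion scale t: sum_i exp(-t * lambda_i).
    return np.array([np.exp(-t * lam).sum() for t in ts])

def spectral_manifold_distance(X, Y, ts=np.logspace(-1, 1, 10)):
    """Compare two point clouds via their (per-sample) heat traces.

    This is an intrinsic, multi-scale comparison: small t probes local
    structure, large t probes global structure. A toy stand-in for the
    paper's lower bound on spectral Gromov-Wasserstein distance.
    """
    hx, hy = heat_trace(X, ts), heat_trace(Y, ts)
    # Normalize by sample count so clouds of different sizes are comparable.
    return np.max(np.abs(hx / len(X) - hy / len(Y)))
```

Note the key design point this illustrates: the distance depends only on Laplacian eigenvalues, so it is invariant to rotations, translations, and even to the two sample sets having different ambient dimensions.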

ArXiv page:

Code is available here:

I am here to answer your questions about the paper! 🙂

Extra shoutout to /u/augustushimself who wrote a brilliant GAN post, and the discussion that followed on this sub. It gave a lot of inspiration for writing this.

submitted by /u/olBaa