Learn About Our Meetup

5000+ Members



Join our meetup, learn, connect, share, and get to know your Toronto AI community. 



Browse the latest deep learning, AI, and machine learning job postings from Indeed for the GTA.



If you are looking to sponsor space, be a speaker, or volunteer, feel free to give us a shout.

[Discussion] Is MINE (Mutual Information Neural Estimation) suitable for reducing the mutual information?

Hello, I have a somewhat basic but confusing question about Belghazi et al., Mutual Information Neural Estimation, ICML 2018.

In the paper, a lower bound on mutual information is computed by a neural-net-parameterized function (what they call the 'statistics network'), and various experiments are conducted, including an information bottleneck experiment, which is a case of *reducing* I(X; Z).
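For concreteness, the quantity the statistics network maximizes is the Donsker-Varadhan lower bound, which can be estimated from samples like this (a toy NumPy sketch; the hand-picked statistic `T` stands in for the trained statistics network, and the names `dv_lower_bound` and `T` are my own, not from the paper):

```python
import numpy as np

def dv_lower_bound(T, x, z, rng):
    """Empirical Donsker-Varadhan bound used by MINE:
    I(X;Z) >= E_joint[T(x,z)] - log E_marginal[exp T(x,z)].
    MINE parameterizes T with a neural net and maximizes this
    quantity over the net's weights; here T is any scalar statistic."""
    t_joint = T(x, z)                   # pairs sampled from p(x, z)
    t_marg = T(x, rng.permutation(z))   # shuffle z to mimic p(x)p(z)
    return t_joint.mean() - np.log(np.exp(t_marg).mean())

rng = np.random.default_rng(0)
x = rng.normal(size=10000)
z = x + rng.normal(size=10000)          # z depends on x
T = lambda x, z: 0.2 * x * z            # hand-picked stand-in statistic
print(dv_lower_bound(T, x, z, rng))     # positive for this dependent pair
```

With a trained statistics network in place of the fixed `T`, the bound tightens toward the true I(X; Z).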

I'm quite interested in reducing mutual information, so I started to reproduce their results, but I'm quite stuck.

Unfortunately, not many details about the IB implementation are included in the paper, so if you have any experience using MINE to reduce mutual information, please share your approach; it would be much appreciated.

The paper is well written with a clear theoretical background, but I'm not sure how lowering the *approximated lower bound* helps reduce the actual mutual information. For these kinds of lower-bound mutual information estimators: do you think they are also practically useful for *reducing* MI?
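One way to see why this can work, on a toy example where the true MI is known: for a Gaussian channel Z = w*X + noise, shrinking the encoder weight w lowers the true I(X; Z), and the DV estimate tracks it, provided the statistic stays reasonable for the current encoder. This is only a sketch under strong assumptions (a fixed hand-picked statistic instead of a continually retrained statistics network, which is what an actual adversarial min-max setup would maintain):

```python
import numpy as np

def dv_estimate(w, n=20000, seed=0):
    """DV-bound estimate of I(X; Z) for the toy channel Z = w*X + N(0,1),
    using the fixed statistic T(x, z) = 0.2*x*z as a stand-in for a
    trained statistics network (an assumption, not the paper's setup)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n)
    z = w * x + rng.normal(size=n)
    t_joint = 0.2 * x * z
    t_marg = 0.2 * x * rng.permutation(z)   # shuffled pairs ~ p(x)p(z)
    return t_joint.mean() - np.log(np.exp(t_marg).mean())

# True MI for this Gaussian channel is 0.5*log(1 + w**2); as the
# encoder weight w shrinks, both the true MI and the DV estimate fall:
for w in (2.0, 1.0, 0.25):
    print(w, 0.5 * np.log1p(w**2), round(dv_estimate(w), 3))
```

The caveat, which I think is the heart of your question: the estimate is only a *lower* bound, so pushing the estimate down does not by itself force the true MI down. In practice the inner maximization (keeping the statistics network near-optimal for the current encoder) is what makes the estimate track the true MI closely enough that minimizing it through the encoder is meaningful.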

submitted by /u/pky3436

Toronto AI is a social and collaborative hub to unite AI innovators of Toronto and surrounding areas. We explore AI technologies in digital art and music, healthcare, marketing, fintech, VR, robotics and more. Toronto AI was founded by Dave MacDonald and Patrick O'Mara.