[R] Beyond Vector Spaces: Compact Data Representation as Differentiable Weighted Graphs

Paper: https://arxiv.org/abs/1910.03524 (NeurIPS 2019)

Code: https://github.com/stanis-morozov/prodige

The paper proposes an embedding layer based on a weighted graph instead of vectors. Intuitively, this layer learns to represent concepts/words by their relations to other objects. It trains by backprop w.r.t. the graph edge weights.

(Left) PRODIGE learned on a subset of MNIST. (Right) zoom-in of some clusters.

  • + Learns interpretable hierarchies from raw objects;
  • + The model is much smaller than typical vector embeddings;
  • − The official code is CPU-only, so it is not very fast.
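To make the idea concrete, below is a minimal PyTorch sketch of a graph-based embedding in the same spirit: objects are nodes in a fully connected graph, edge weights are learnable parameters, and the distance between two objects is the shortest-path length, which is differentiable w.r.t. the weights of the edges on the chosen path. This is a simplified illustration under my own assumptions, not the paper's PRODIGE implementation (which also learns sparse edge-existence probabilities); the class name ToyGraphEmbedding, the dense graph, and the training target are made up for brevity.

    import networkx as nx
    import torch
    import torch.nn as nn


    class ToyGraphEmbedding(nn.Module):
        """Toy graph-based embedding (illustrative, not the official PRODIGE code).

        Each object is a node in a fully connected graph; edge weights are
        learnable and kept positive via an exp() parameterization. The distance
        between two objects is the shortest-path length, so gradients flow only
        through the edges that lie on the chosen path.
        """

        def __init__(self, num_nodes: int):
            super().__init__()
            # One learnable log-weight per node pair (upper triangle is used).
            self.log_w = nn.Parameter(torch.zeros(num_nodes, num_nodes))

        def edge_weight(self, i: int, j: int) -> torch.Tensor:
            i, j = min(i, j), max(i, j)          # undirected edge
            return torch.exp(self.log_w[i, j])   # strictly positive weight

        def distance(self, src: int, dst: int) -> torch.Tensor:
            # Route with detached weights (plain floats), then re-sum the path
            # edges as tensors so the result stays differentiable.
            n = self.log_w.shape[0]
            g = nx.complete_graph(n)
            with torch.no_grad():
                for i, j in g.edges:
                    g[i][j]["weight"] = float(torch.exp(self.log_w[i, j]))
            path = nx.shortest_path(g, src, dst, weight="weight")
            return sum(self.edge_weight(i, j) for i, j in zip(path, path[1:]))


    # Example: pull the graph distance between nodes 0 and 3 toward a target value.
    model = ToyGraphEmbedding(num_nodes=10)
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for _ in range(100):
        loss = (model.distance(0, 3) - 2.0) ** 2
        opt.zero_grad()
        loss.backward()
        opt.step()

The actual method scales this idea up by keeping the graph sparse, which is where the memory savings over vector embeddings come from.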

Interactive version of the plot above: https://neurips-anonymous.github.io/index.html

submitted by /u/justheuristic