Hyperbolic Graph Convolutional Networks | Geometric ML Paper Explained

  • Published Oct 20, 2024

Comments • 17

  • @alexanderchernyavskiy9538
    @alexanderchernyavskiy9538 2 years ago +8

    I think sometimes we get tired of all these large language models and tons of data, so it's great to watch how mathematically inspired ideas gracefully help in certain cases. It could be a great inspiration.
    I may be a complete noob at math, but I understood your explanation and may dive in deeper later, it was that good. I really appreciate your videos.
    And, for those who are still in doubt: a lot of PDEs, like the wave equation, and strange attractors (I may have the terms wrong, sorry) can be described and solved with hyperbolic differential geometry, so it is useful for life and, possibly, for everyday math xD
    Thank you for the video once more!

    • @TheAIEpiphany
      @TheAIEpiphany  2 years ago +3

      I feel you. Definitely fun to read and share non-mainstream papers.
      Oh for sure, hyperbolic spaces have a lot of use cases, and the tools developed for analyzing them are even more broadly applicable.

  • @TheAIEpiphany
    @TheAIEpiphany  2 years ago +9

    Continuing on with geometric deep learning, in this video I cover Hyperbolic Graph Convolutional Networks, introducing a class of GCNs operating in hyperbolic space!
    Exceptional results for the class of scale-free/hierarchical/tree-like graphs.

    • @keeperofthelight9681
      @keeperofthelight9681 2 years ago

      The Michael Bronstein course on geometric deep learning quickly becomes unintelligible after the start. Thanks for this!!

  • @hannesstark5024
    @hannesstark5024 2 years ago +3

    Great video, thanks!

  • @hnr651
    @hnr651 2 years ago +1

    The fact that you made this paper reasonably approachable to me in less than 45 minutes proves that magic is real, QED.

  • @BlissfulBasilisk
    @BlissfulBasilisk 2 years ago +1

    Loving the geometric works! So many good papers are coming out in ML, but the ones about using exotic spaces and manifolds are my favorite.

    • @TheAIEpiphany
      @TheAIEpiphany  2 years ago +2

      For me it's a delicate balance - it's not a good thing to do fancy math just for the sake of it (unless they explain it very well, that is, haha - then I enjoy it!)

  • @munozariasjm
    @munozariasjm 2 years ago

    Great content! Keep up the incredible work.

  • @hassaannaeem4374
    @hassaannaeem4374 2 years ago

    As always, great breakdown

  • @jsunrae
    @jsunrae 2 years ago

    What I don't understand is when you would want to consider doing this. What sort of data/models does it suit, or is there a way you could assess that?

  • @allehelgen
    @allehelgen 1 year ago

    The approach is interesting, but there is something that I don't get (or that is just wrong). When we talk about graphs being hyperbolic, we mean that the topology of the graph is tree-like. But here, it is not the topology itself that is transformed into hyperbolic space; it's the input features, which may come from various processes (bag of words, BERT, you name it) that have nothing to do with hyperbolicity. It's great that it works empirically, but the theoretical justification seems wrong, as it confuses hyperbolic-like topology with hyperbolic-like feature representations. Or I'm just wrong and didn't understand something in the paper, which is quite possible.
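    For context on the step being debated: the model lifts the Euclidean input features onto the hyperboloid via the exponential map at the origin before any graph convolution happens. A minimal NumPy sketch of that lifting step, assuming the Lorentz (hyperboloid) model with curvature -1/K; the function names here are my own, not the paper's:

    ```python
    import numpy as np

    def minkowski_inner(u, v):
        # Minkowski (Lorentz) inner product: -u0*v0 + u1*v1 + ... + ud*vd
        return -u[0] * v[0] + np.dot(u[1:], v[1:])

    def exp_map_origin(x, K=1.0):
        """Lift a Euclidean feature x in R^d onto the hyperboloid
        {p : <p,p>_L = -K} via the exponential map at the origin
        o = (sqrt(K), 0, ..., 0)."""
        sqrtK = np.sqrt(K)
        norm = np.linalg.norm(x)
        if norm == 0:
            return np.concatenate(([sqrtK], np.zeros_like(x)))
        # the tangent vector at o is (0, x); its Lorentz norm equals ||x||
        p0 = sqrtK * np.cosh(norm / sqrtK)
        rest = sqrtK * np.sinh(norm / sqrtK) * x / norm
        return np.concatenate(([p0], rest))

    feat = np.array([0.5, -1.2, 3.0])  # e.g. a bag-of-words/BERT feature vector
    p = exp_map_origin(feat, K=1.0)
    print(minkowski_inner(p, p))       # ≈ -1.0: p lies on the hyperboloid
    ```

    So the features are indeed Euclidean to begin with; the lift just places them on the manifold, which is why the hyperbolicity argument really concerns the graph structure rather than the raw features.
    
    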

  • @rodguinea
    @rodguinea 8 months ago

    Hi! Is there an implementation out there?

  • @oliveiracaio57
    @oliveiracaio57 2 years ago

    Everything fails in the hyperboloid model if we want to make it a vector space in the traditional way. Take H^{2,3}, i.e. the points x with Minkowski inner product ⟨x,x⟩ = -K for K = 3. The points (2,1,0) and (2,0,1) are both on the hyperboloid, because ⟨(2,1,0),(2,1,0)⟩ = -4+1+0 = -3 and ⟨(2,0,1),(2,0,1)⟩ = -4+0+1 = -3, but (2,1,0)+(2,0,1) = (4,1,1) and ⟨(4,1,1),(4,1,1)⟩ = -16+1+1 = -14 ≠ -3, so the sum leaves the hyperboloid. The same goes for K different from 3; just plug in the corresponding values. Scalar multiplication fails too, since for any real r and any point x on the hyperboloid we have ⟨rx,rx⟩ = r^2⟨x,x⟩ = -r^2 K, and this, of course, equals -K if, and only if, r = ±1.
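    The failure described above is easy to verify numerically. A quick check, assuming the Minkowski inner product -u0*v0 + u1*v1 + u2*v2 and K = 3:

    ```python
    import numpy as np

    def minkowski_inner(u, v):
        # Minkowski inner product on R^{1,2}: -u0*v0 + u1*v1 + u2*v2
        return -u[0] * v[0] + np.dot(u[1:], v[1:])

    x = np.array([2.0, 1.0, 0.0])
    y = np.array([2.0, 0.0, 1.0])
    print(minkowski_inner(x, x))          # -3.0: x is on the hyperboloid (K = 3)
    print(minkowski_inner(y, y))          # -3.0: y is on the hyperboloid

    s = x + y                             # (4, 1, 1)
    print(minkowski_inner(s, s))          # -14.0: the sum is NOT on the hyperboloid

    r = 2.0
    print(minkowski_inner(r * x, r * x))  # -12.0 = -r^2 * K, off the surface unless r = +/-1
    ```

    This is exactly why hyperbolic networks replace vector addition and scaling with exp/log maps and Möbius-style operations instead of the usual linear algebra.
    
    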