Graphormer - Do Transformers Really Perform Bad for Graph Representation? | Paper Explained

  • Published on 31 Jul 2024
  • ❤️ Become The AI Epiphany Patreon ❤️ ► / theaiepiphany
    ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
    Paper: Do Transformers Really Perform Bad for Graph Representation?
    In this video, I cover Graphormer, a new transformer model that achieved SOTA results on the OGB Large-Scale Challenge (OGB-LSC) benchmark.
    It achieved that by introducing 3 novel types of structural biases/encodings into the classic transformer encoder.
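    For readers who want a concrete picture before watching: below is a minimal, single-head sketch (in PyTorch; not the authors' implementation, and all names are illustrative) of how the three encodings enter a standard attention layer. Centrality encoding is added to the input node features, while the spatial (shortest-path distance) and edge encodings are added as biases to the attention scores before the softmax.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphormerAttentionSketch(nn.Module):
    """Illustrative single-head attention with Graphormer-style structural encodings."""
    def __init__(self, d_model=64, max_degree=64, max_dist=32):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # centrality encoding: a learnable embedding per node degree, added to node features
        self.degree_emb = nn.Embedding(max_degree, d_model)
        # spatial encoding: a learnable scalar bias per shortest-path distance
        self.dist_bias = nn.Embedding(max_dist, 1)
        self.scale = d_model ** -0.5

    def forward(self, node_feats, degrees, spd, edge_bias):
        # node_feats: [N, d], degrees: [N] ints, spd: [N, N] ints, edge_bias: [N, N]
        h = node_feats + self.degree_emb(degrees)           # centrality encoding at the input
        q, k, v = self.q(h), self.k(h), self.v(h)
        scores = (q @ k.t()) * self.scale                   # [N, N] attention scores
        # spatial + edge encodings enter as additive biases before the softmax
        scores = scores + self.dist_bias(spd).squeeze(-1) + edge_bias
        return F.softmax(scores, dim=-1) @ v                # [N, d] updated node embeddings

N, d = 5, 64
attn = GraphormerAttentionSketch(d_model=d)
out = attn(torch.randn(N, d),
           torch.randint(0, 4, (N,)),     # node degrees
           torch.randint(0, 3, (N, N)),   # shortest-path distances between node pairs
           torch.zeros(N, N))             # edge-encoding bias, precomputed elsewhere
print(out.shape)  # torch.Size([5, 64])
```

    In the paper, the spatial bias is a learnable scalar per shortest-path distance (per attention head), and the edge-encoding bias is computed from the edge features along the shortest path; here it is simply passed in as a precomputed matrix to keep the sketch short.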
    ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
    ✅ Paper: arxiv.org/abs/2106.05234
    ✅ What can GNNs learn blog: andreasloukas.blog/2019/12/27...
    ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
    ⌚️ Timetable:
    00:00 Key points of the paper
    02:15 Graph-level predictions and why we need structural information
    05:35 GNNs basics
    09:40 Centrality encoding explained in depth
    14:15 Spatial encoding explained in depth
    18:00 Edge encoding explained in depth
    22:30 Results on OGB LSC
    23:30 Graphormer handles over-smoothing
    25:10 Other results and ablation study
    28:20 Graphormer is more expressive than WL
    30:30 Mean aggregation as a special case of Graphormer
    35:00 Sum aggregation as a special case of Graphormer
    ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
    💰 BECOME A PATREON OF THE AI EPIPHANY ❤️
    If these videos, GitHub projects, and blogs help you,
    consider helping me out by supporting me on Patreon!
    The AI Epiphany ► / theaiepiphany
    One-time donation:
    www.paypal.com/paypalme/theai...
    Much love! ❤️
    Huge thank you to these AI Epiphany patrons:
    Petar Veličković
    Zvonimir Sabljic
    ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
    💡 The AI Epiphany is a channel dedicated to simplifying the field of AI using creative visualizations and, in general, a stronger focus on geometrical and visual intuition rather than algebraic and numerical "intuition".
    ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
    👋 CONNECT WITH ME ON SOCIAL
    LinkedIn ► / aleksagordic
    Twitter ► / gordic_aleksa
    Instagram ► / aiepiphany
    Facebook ► / aiepiphany
    👨‍👩‍👧‍👦 JOIN OUR DISCORD COMMUNITY:
    Discord ► / discord
    📢 SUBSCRIBE TO MY MONTHLY AI NEWSLETTER:
    Substack ► aiepiphany.substack.com/
    💻 FOLLOW ME ON GITHUB FOR COOL PROJECTS:
    GitHub ► github.com/gordicaleksa
    📚 FOLLOW ME ON MEDIUM:
    Medium ► / gordicaleksa
    ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
    #graphormer #graphs #transformers

Comments • 13

  • @chengxuanying710 · 3 years ago +10

    Thanks for sharing our paper!

  • @zixuwang2904 · 1 year ago +1

    Very detailed, and your drawing helped a lot! Thanks!

  • @user-iv7bg9qo5g · 1 year ago +3

    I would like to ask: when will there be a video explaining the Graphormer code, or do you have any recommended articles explaining the source code?

  • @user-co6pu8zv3v · 2 years ago

    Thank you!

  • @joehua2580 · 3 years ago +5

    Do you think Graphormer performs the best mainly because of the huge number of parameters? It looks like it has somewhere around 50-100 times more parameters than the other methods....

    • @user-qu3wu3yk6r · 2 years ago

      It could be one of the reasons, but the design should be the main factor.

  • @user-op6le6zs4n · 2 years ago

    Bro, that's awesome!

  • @siddhantdoshi860 · 1 year ago

    The output of the transformer will be node-level embeddings. Can you help with what readout function is used in this paper to get a graph-level embedding, which can then be used to classify the graphs?

    • @5_min_summary · 5 months ago

      average or sum of the node embeddings
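      A minimal sketch of such a readout, assuming PyTorch tensors and illustrative names, is below. Note that the paper itself adds a special virtual node connected to all other nodes and uses its final representation as the graph embedding; mean/sum pooling over node embeddings is the simpler alternative described in the reply.

```python
import torch

# node_embeddings: [num_nodes, d] transformer output for one graph (illustrative values)
node_embeddings = torch.randn(7, 64)

graph_emb_mean = node_embeddings.mean(dim=0)   # mean readout -> one [d]-dim graph embedding
graph_emb_sum = node_embeddings.sum(dim=0)     # sum readout, the other option in the reply

classifier = torch.nn.Linear(64, 2)            # e.g. a binary graph-classification head
logits = classifier(graph_emb_mean)
print(graph_emb_mean.shape, logits.shape)      # torch.Size([64]) torch.Size([2])
```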

  • @MrMIB983 · 3 years ago

    Still waiting for sparse transformer :(

    • @TheAIEpiphany · 3 years ago +2

      Which one? 😂 Link the paper

    • @MrMIB983 · 3 years ago +1

      @TheAIEpiphany Ahh, the only one. The core of DALL-E and GPT: arxiv.org/abs/1904.10509