Lecture 30 - The Graph Laplacian Matrix (Advanced) | Stanford University

  • Published Oct 6, 2024
  • 🔔 Stay Connected! Get the latest insights on Artificial Intelligence (AI) 🧠, Natural Language Processing (NLP) 📝, and Large Language Models (LLMs) 🤖. Follow ( / mtnayeem ) on Twitter 🐦 for real-time updates, news, and discussions in the field.
    Check out the following interesting papers. Happy learning!
    Paper Title: "On the Role of Reviewer Expertise in Temporal Review Helpfulness Prediction"
    Paper: aclanthology.o...
    Dataset: huggingface.co...
    Paper Title: "Abstractive Unsupervised Multi-Document Summarization using Paraphrastic Sentence Fusion"
    Paper: aclanthology.o...
    Paper Title: "Extract with Order for Coherent Multi-Document Summarization"
    Paper: aclanthology.o...
    Paper Title: "Paraphrastic Fusion for Abstractive Multi-Sentence Compression Generation"
    Paper: dl.acm.org/doi...
    Paper Title: "Neural Diverse Abstractive Sentence Compression Generation"
    Paper: link.springer....

Comments • 17

  • @bhargavvasist5919
@bhargavvasist5919 1 year ago +6

Quite possibly the best concise definition of a problem and solution I've seen
Give this man everything

  • @Miodiarioditrading200EMA
@Miodiarioditrading200EMA 4 years ago +18

This guy deserves to work at Stanford!! He is awesome!!

    • @Arpytanshu
@Arpytanshu 2 years ago +3

This guy is a prof at Stanford. :p

  • @marktensen2626
@marktensen2626 7 years ago +8

    Very clear so far! Thank you

  • @jayeshdalal7
@jayeshdalal7 4 years ago +3

Great explanation of the Laplacian matrix and graph concepts.

  • @Iamfafafel
@Iamfafafel 3 months ago

Such a great presentation with great insights.
So we can interpret a vector as a scalar function on the vertex set. Multiplication by the adjacency matrix can be seen as averaging out this function with respect to its neighbors, so in a way this is like the linear-algebraic version of the heat equation.

    • @Iamfafafel
@Iamfafafel 3 months ago

This smoothing interpretation makes the normalized Laplacian the undisputed discrete counterpart to the Riemannian Laplacian. In short, the C0 story of the Laplacian is that it measures the difference between the function's value and its average (an adaptation of harmonic functions satisfying the mean-value property). The discrete analog of this story is therefore to consider I - AD^{-1}, where I is the identity matrix, A is the adjacency matrix, and D is the diagonal degree matrix.
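      The smoothing story in the comment above can be checked numerically. A minimal sketch (not from the lecture; the 4-cycle toy graph is an assumed example) building I - AD^{-1} with numpy and verifying that constant functions lie in its kernel, i.e. they are "harmonic" in the mean-value sense:

      ```python
      import numpy as np

      # Toy example (assumed): undirected 4-cycle 0-1-2-3-0.
      A = np.array([[0, 1, 0, 1],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0]], dtype=float)

      D = np.diag(A.sum(axis=0))            # diagonal degree matrix
      L_rw = np.eye(4) - A @ np.linalg.inv(D)  # I - A D^{-1} from the comment

      # On this regular graph A D^{-1} = D^{-1} A, so applying L_rw to a
      # function f on the vertices gives f(v) minus the average of f over
      # v's neighbors. A constant function equals its neighbor average
      # everywhere, so it maps to zero.
      f_const = np.ones(4)
      print(np.allclose(L_rw @ f_const, 0))
      ```

      On non-regular graphs the two normalizations differ: D^{-1}A row-averages (the mean-value reading), while AD^{-1} as written in the comment is its transpose-side counterpart.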

  • @christyn7888
@christyn7888 4 years ago

    The clusters are so clear when you look at the adjacency matrix

  • @abdulhameedafridi9524
@abdulhameedafridi9524 6 years ago +5

    outstanding

  • @mariovela5669
@mariovela5669 2 years ago +2

    I understood the concepts but he never defined what the Graph Laplacian Matrix was....

  • @bintoanto1471
@bintoanto1471 2 months ago

If nodes A and B have two edges between them in an undirected graph, is the adjacency-matrix entry A_ij = 1 (there is an edge) or 2 (there are two edges)? Which one is correct? Also, if there is no edge A→B but there is an edge B→A (a direction), it should be counted for both, right?

  • @wenqizhang7719
@wenqizhang7719 4 years ago +1

Hi, I'd like to ask: if the undirected graph is weighted (not binary), does the process of spectral clustering change?

  • @loryruta2149
@loryruta2149 1 year ago

    can’t ignore how he pronounces “pieces”

  • @saeedsahoor
@saeedsahoor 6 years ago +2

    good

  • @fredxu9826
@fredxu9826 3 years ago +2

he sounds like a young Slavoj Žižek

  • @spongel1345
@spongel1345 3 years ago +3

I REALLY WANT TO LEARN, BUT I CAN'T ACCEPT THE PRONUNCIATION! Drives me crazy.