OUTLINE:
0:00 - Graph Neural Networks and Halicin - graphs are everywhere
0:53 - Introduction example
1:43 - What is a graph?
2:34 - Why Graph Neural Networks?
3:44 - Convolutional Neural Network example
4:33 - Message passing
6:17 - Introducing node embeddings
7:20 - Learning and loss functions
8:04 - Link prediction example
9:08 - Other graph learning tasks
9:49 - Message passing details
12:10 - 3 'flavors' of GNN layers
12:57 - Notation and linear algebra
14:05 - Final words
This is by far the best introduction to GNNs on YouTube today. I have seen many of them. Congratulations and thank you!
From the correct level of mathematical precision, to the pedagogy of the content and up to the voice of the speaker. It all fits like a charm. Chapeau!
Too kind! Thanks so much :)
7 months later, I can just agree with every point of that. Wonderful!
Really great job and great animations. What tools do you use to make all these animations? Must be very time-consuming.
Yes this is the best video on graph neural network BY FAR!!
Only one video on this channel? Come on. This is top-quality content. I would definitely watch anything that gets published there.
Amazing introduction to GNNs, summarizing all the important basics in a beginner-friendly fashion while providing very helpful visuals.
Thanks for the kind review! Glad you found this useful :)
Today, I understood Message Passing very well. Amazing interactive explanation. People like you make life easier. Thank you, Alex.
Thanks for the kind words! Glad this helped you :)
Best introduction tutorial on GNNs. Many tutorials throw statistics around as an explanation but very few provide the intuition behind it. Well done.
Amazing how you managed to include so much information in a relatively short video without compromising the depth of explanation. Subscribed and hoping for more content in future.
A crazy amount of work has been put into this video. The simplicity was the cherry on top. Thanks a ton. Gained a new sub.
Such lovely content, man! I was having trouble understanding GNNs from other sources, but your animation alone made it crystal clear in one go. Can't be thankful enough. Hope you keep making such wonderful explanatory videos on other topics in ML.
Your video deserves millions of views. SEO your video properly and you will get that. Best of luck.
indeed
Thanks for your kind words! Happy this was helpful :) any tips for SEO would be appreciated!
I'm writing a master's thesis where I'm going to use graph neural networks to calculate traffic flow, so grateful for this. Thanks Alex!!
Sounds awesome! Glad this was helpful, Richard!
I spent like a week reading all the papers and now I stumble upon this video. God, I wish I had watched this before.
Glad this was helpful! The paper reading will definitely pay off haha
Bro, you nailed it! This type of explanation is what we need. You are a legend
What an amazing explanation! Wondering if you are going to add more along this line; if so, looking forward to it. Big thanks!
Thanks so much! Glad you found it helpful :) yes I intend to try different topics and different video lengths soon!
What an amazing video, I’m subscribing for sure!! And definitely checking out the rest of your videos. I always struggle to learn from math to concept, but your approach is inverted in that regard, and it works so well for me!
The best introduction to GNNs I have seen so far. Please upload more videos on GNNs.
This is fabulously done: the low-level explanation of the CNN analogy and the layer's function expression, and then abstracting that to a generalised function expression as seen in research papers. Thank you so much!
Cleverly explained, beautifully animated! Great job!
This is amazing, I can't tell you how much I needed this to see exactly where my models are messing up. Thank you😭!
Wow, this is an amazing explanation of GNNs, hats off! Thank you so much!
Such a great explanation of GNNs. The examples are easy to understand, so I could clearly get the concepts right!! Thanks for the wonderful video!!
Best explanation I've found so far on this topic. We need more videos from you!
Thanks for the kind words!
Bro, this is the best introduction to GNNs I have ever watched.
Amazing content... probably the best one I have watched so far on GNNs.
Alex! Until I watched this video I felt like I was scrambling my brain trying to understand GNNs. Thank you for the gorgeous and clear explanation.
That's a very impressive way to explain graph NNs... Well done!
Cool infographics man!! And nice explanation.
What do you use to create such animations?
Thank you!! Glad you enjoyed this. I used After Effects for all the animations :)
Dude, the effort you put into this is amazing! Thank you.
Really great video and very clear explanation! If you don't mind me asking, may I know what software you used to create the animation? Thanks!
Thank you! Glad you found it clear :) I used After Effects for all the animations, and Illustrator for the objects
Great video! You saved my life; thousands of tons of love from Vietnam.
Love your pho
Your video is amazing. Well explained with beautiful visualizations. Thanks a lot.
You have a gift Alex. Thanks for this beautiful explanation. I hope you continue your amazing work.
So amazingly explained, there is so little confusion it almost becomes boring, haha. Amazing video!!!
The best high level explanation I've found, thanks!
Wow. This is one of the best videos I've seen. Thank you for putting the effort into the helpful animations too.
Just noticed that this is your first video. You have a gift! Please keep making more.
This is the most fun and informative video about GNNs.
Wow, what nice work! You must have put a lot of effort into this! Thanks for sharing.
I still struggle a bit to understand, but I will rewatch it
One of the best intros to GNNs I found on YouTube 👍👍
Great video! Keep up the good work!
What an honor! Thanks so much, you guys are an inspiration! :)
Keep them coming Alex. An amazing explainer. Hats off.
Thank you for your kind words!
That is the best and most intuitive video explaining GNNs I have seen; thank you a lot.
Hey Alex, I thought this was a really good video. May I ask how you made the animations and the slides?
Easily explained but still with good depth. The pinnacle of good edu videos.
A new star is born 🌟 good job Alex, with thanks.
Hahaha too kind, thank you! Glad this was helpful to you :)
Great introduction & the animation helps a lot! Thanks Alex!
It's a shame you didn't make more videos; this is like the 3Blue1Brown of NNs. Best video on GNNs I have ever seen.
Man, where are you... we need videos from you!
Amazing explanation; we would love a series on graph neural networks from you.
The best explanation on YouTube!!!
Thank you! Glad you found it useful :)
Nicely done! Well illustrated in a way that keeps the learning interesting 👍
Thanks Jia Hao!!
Very clear explanation of the high level concepts with a remarkably understandable example!
Really helpful visual stimulations! ✨
Thanks Serene!!!
Amazing way of explaining it, the animations helped a lot, GREAT JOB! Thanks a loooooooooot.
12:18 I do not understand: the weights in the convolutional flavor are fixed based on the structure of the graph...? What does that mean? So they are not learnable? To my understanding they are independent of the structure and just the same for every connection, but not fixed. Do you mean that they are *shared* over all positions in the graph?
13:51 Why is the weight matrix d×d and not more generally d×k for a hidden size k?
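For context, the standard convolutional (GCN-style) update from Kipf & Welling, which this flavor usually refers to, can be written as

$$h_i^{(k+1)} = \sigma\left( \sum_{j \in \mathcal{N}(i) \cup \{i\}} \frac{1}{\sqrt{d_i d_j}} \, W^{(k)} h_j^{(k)} \right)$$

where $d_i$ is the degree of node $i$. The per-edge coefficients $1/\sqrt{d_i d_j}$ are what is fixed by the graph structure (they come from node degrees and are not learned), while $W^{(k)}$ is a learnable matrix shared across all positions, much like a CNN filter. And $W^{(k)}$ can indeed be $d \times k$ in general; $d \times d$ is presumably just the special case where the embedding size is kept constant across layers.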
An appreciable introduction that makes the concept totally clear. Keep up the good work!!
Hey, your way of explaining it is very simple and beautiful 🤩 Please make more videos like this 🙏🏻 You are too good 🔥
You’re too kind, glad you enjoyed it!
Beautifully made and explained, MORE!!
Best video so far on the topic of graph neural networks.
That was the best presentation video that I've ever watched. Thanks for everything.
What a lucid explanation of a complex topic. Take a bow.
Great job w the animation! It makes the topic easier to understand 💯
Thanks filmmaker Elton!!
Magnificent, spectacular presentation, helped a lot in all aspects of my studies.
Please continue. Don't leave us hanging after this addictive introduction.
Thank you, sir. This has been the most helpful GNN video for me.
Wonderful! I use perceptrons / multilayer perceptrons and am finally getting into GNNs. You have a wonderful explanation! Thank you so much. I am very interested in GNN classification and anomaly detection.
Thank you! Glad you found this helpful :) ah yes anomaly detection is very appropriate for GNNs, do pursue it further!
Amazing explanation. We have high hopes for you, Alex. All the best.
Thanks so much for the kind words! :) Glad you found this helpful
Which software do you use for these clean and great visualizations?? Waiting for the next videos!
I use After Effects for these! :)
Brooo! I like the presentation. Which tool did you use to make it this educational? (It would help me with my schoolwork.) Thanks!
Ah I used After Effects for this :)
@@alexfoo_dw thanks for the answer!
Great video, but I have one question:
If the GNN was directed instead of undirected, how would a node's message be aggregated if the node was one at the edge of the graph (i.e. it doesn't have any incoming messages)? Or would the node's message just be a constant (i.e. unchanging)?
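One common way to handle that case, sketched below (a minimal illustration, not the video's method): give every node a self-contribution, so a node with no incoming edges still updates from its own previous state instead of receiving nothing.

```python
import numpy as np

def message_passing_step(h, edges, W_msg, W_self):
    """One step of directed message passing.
    h: (n, d) node embeddings; edges: list of (src, dst) directed edges."""
    agg = np.zeros_like(h)
    for src, dst in edges:
        agg[dst] += h[src] @ W_msg   # messages flow along the edge direction
    # Self-contribution: a node with no incoming edges still gets an
    # update from its own previous embedding rather than staying constant.
    return np.tanh(h @ W_self + agg)

rng = np.random.default_rng(0)
h = rng.normal(size=(4, 3))
W_msg, W_self = rng.normal(size=(3, 3)), rng.normal(size=(3, 3))
edges = [(0, 1), (1, 2), (1, 3)]     # node 0 receives no messages
print(message_passing_step(h, edges, W_msg, W_self))
```

Other common fixes are adding explicit self-loop edges to the edge list, or simply passing messages over both directions of each edge.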
Really good video; hope you are able to put it to good use and benefit from it, and also spread the knowledge of the things you have learnt!
This is the best video on GNNs for beginners.
Great video! I really liked the animation, what tool did you use?
You did an amazing job. I was wondering why you stopped making such videos. If possible, please continue doing so.
Really great and detailed explanation. Amazing graphics and text. Please do more!
Very good explanations; cleared all my doubts, thanks.
I studied Forney-style graphs and Bayesian message passing at uni. This seems extremely similar, but also cool.
Really great explanation. GNNs seemed hard 15 minutes ago... Thanks a lot!
Thanks Alex Foo, for such great content.
I am working on MultiVariate Time Series Anomaly Detection using GNNs, Transformers, and GANs, do you know of any resource where I can start?
I searched a lot but couldn't find anything other than papers, which are not that useful.
Thanks
Ah yes, multivariate time series anomaly detection is pretty specific, so there might only be papers currently. But you could check out this interesting GAT paper (arxiv.org/abs/2106.06947) with code (github.com/d-ailin/GDN) and this transformer-based method for forecasting (towardsdatascience.com/multivariate-time-series-forecasting-with-transformers-384dc6ce989b).
@@alexfoo_dw Thanks a lot.
Really good explanation. Keep making videos like this; your work is good.
Thanks so much! Glad that this was helpful
Very clear explanation. Perfect work!
Thanks Alex. I'm not a maths person, yet I enjoyed your video. Looking forward to learning the basics from you 🙏
Yeah, this is the 3Blue1Brown equivalent of Graph networks. Nothing else comes close. This is spectacular :)
This was a terrific video, thanks. I hope it gets more views.
Thank you for the great explanation, Alex. I’ve only just heard of GNNs in recent days and you have clarified things for me. They appear to be very efficient. 👍🙏
This is amazing!! I sincerely hope you make more!
Thank you! Glad you found this helpful :)
Could you explain again the process of extracting information within the context of the pixel’s neighbor regions?
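For anyone with the same question, here is a rough sketch (just an illustration, not the video's code) of the CNN step being referenced: each output pixel is computed as a weighted sum over that pixel's local 3x3 neighbourhood, which is exactly the "extract information from the neighbours" idea that message passing generalises to graphs.

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2D convolution: every output pixel is a weighted sum of the
    matching 3x3 neighbourhood in the input image."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)   # toy 5x5 "image"
kernel = np.full((3, 3), 1 / 9)                    # average over neighbours
print(conv2d(image, kernel))                       # 3x3 map of local means
```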
Great video, simple and easy to understand! Appreciate the effort taken!
It may be because I'm inexperienced with neural networks, but what does it mean for categorical data to be multiplied and added together? E.g., how is 0.25*doctor + 0.8*scientist actually represented in the network? Is it one-hot encoded or something else?
Great question! Usually it'll be much simpler if the input features are ordinal or numerical. Otherwise, we will try one-hot encodings, which might cause problems related to sparsity. A common alternative is to use integer (or label) encodings, where each category is given an integer. Another alternative is to just allow the network to learn the encodings itself through a learned encoding. See here: machinelearningmastery.com/how-to-prepare-categorical-data-for-deep-learning-in-python/
In practice we usually have to run experiments with these choices to determine which is optimal given the data distribution :)
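To make those options concrete, a minimal sketch (the `jobs` vocabulary is made up purely for illustration):

```python
import numpy as np

jobs = ["doctor", "scientist", "teacher"]
idx = {job: i for i, job in enumerate(jobs)}   # integer (label) encoding

def one_hot(job):
    v = np.zeros(len(jobs))
    v[idx[job]] = 1.0
    return v

# "0.25*doctor + 0.8*scientist" then becomes an ordinary weighted vector sum:
blend = 0.25 * one_hot("doctor") + 0.8 * one_hot("scientist")
print(blend)                                   # [0.25 0.8  0.  ]

# Learned encoding: a trainable embedding table, one row per category,
# updated by gradient descent along with the rest of the network.
embedding_table = np.random.default_rng(0).normal(size=(len(jobs), 4))
print(embedding_table[idx["doctor"]])          # the vector for "doctor"
```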
We need more videos like this!
Please upload more details about GNNs. Can you upload videos about applying GNNs to heterogeneous information networks?
Fantastic method of explanation.❤
This is the best GNN explainer I've seen. Thank you so much for this! Just one question: What did you use for the animations?
Absolutely amazing video! Subscribed.
Really a great job.
I was banging my head against it; now I understand it well in overview. If possible, could you please make a more elaborate video on message passing and KGCN (Knowledge Graph Convolutional Network)?
Still not entirely clear to me. The main questions I have are: 1) What constitutes a training sample? In a convolutional neural net that would be a particular image, and the training set typically contains millions of them. But here we somehow only have a single graph (you can't have multiple, because that would change the architecture itself). So are you using the same training sample over and over again? And 2) How do you know how many layers of message passing you have to do, and how do you know this process even converges?
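One common answer, sketched below under the assumption of transductive node classification (an assumption, since the video covers several task types): the single graph is reused every epoch, the individual labelled nodes play the role of training samples via a mask, and the number of message-passing layers k is a hyperparameter (k layers give each node a k-hop receptive field), not an iteration that has to converge. For graph-level tasks such as classifying molecules, each graph is a sample; the same weights apply to any graph structure, so the architecture does not change.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, c, k = 20, 4, 3, 2                  # nodes, feature dim, classes, layers
A = (rng.random((n, n)) < 0.2).astype(float)
A = np.minimum(A + A.T + np.eye(n), 1.0)  # symmetric adjacency with self-loops
A = A / A.sum(axis=1, keepdims=True)      # row-normalise: mean aggregation
h = rng.normal(size=(n, d))               # node features
labels = rng.integers(0, c, size=n)
train_mask = np.arange(n) < 12            # the 12 labelled nodes are the "samples"

W = [0.1 * rng.normal(size=(d, d)) for _ in range(k)]
W_out = 0.1 * rng.normal(size=(d, c))

def forward(h):
    for Wl in W:                          # k rounds of message passing:
        h = np.tanh(A @ h @ Wl)           # each node sees its k-hop neighbourhood
    return h @ W_out                      # per-node class logits

logits = forward(h)
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
loss = -np.mean(np.log(probs[train_mask, labels[train_mask]]))
print("cross-entropy on the labelled nodes:", loss)
```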
Hey, why didn't you create more content??? This is absolutely brilliant ❤
This video was incredibly helpful. Thank you.
Glad you found this helpful! Thanks for the positive feedback :)
Thank you so much for this video! Helped me a lot to understand GNNs for my report.
Overall a good video, thanks. It is excellent, but one weakness is the part that discusses how the embeddings are generated after the message passing is done. That point about the embeddings went by too fast for me, and some more details and explanation on that point would help. Thanks again.
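For anyone else who found that part fast, the generic form is: the layer-$(k+1)$ embedding of node $v$ is

$$h_v^{(k+1)} = \mathrm{UPDATE}\left( h_v^{(k)},\ \mathrm{AGGREGATE}\left( \{ h_u^{(k)} : u \in \mathcal{N}(v) \} \right) \right)$$

with $h_v^{(0)}$ initialised to the raw node features; after the final round of message passing, the resulting $h_v^{(K)}$ are themselves the node embeddings that get fed to the downstream task and its loss.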
This is the best GNN video I've seen.
I’ve been trying to give feedback on every SoME1 submission I watch, but I just don’t have much to say about yours since it’s mostly theory. It’s well made and everything’s reasonably clear, so overall it’s pretty good.
My only real suggestion is to add examples to some of the sections. In Message Passing, it was a little unclear how the messages were aggregating data, and since the envelopes just changed color, I didn’t get what was happening the first time I watched that section.
Agreed! I did consider adding in examples to parts like message passing to make things clearer, though I eventually decided not to as it seemed to distract from the introductory objective of this video. Might consider going further in depth for future videos :) thanks so much for taking the time to watch this so closely and for the thoughtful feedback!
@@alexfoo_dw We are waiting for your next video on GNNs; please upload as soon as possible!