I have so much respect for Mr. Zak - the only person who explained the GNN concept clearly enough that I could grasp it immediately. I follow his channel, so I was amazed to see him on the most important podcast on the Internet!
This channel is gold…I really appreciate how much prep work and effort goes into these. Really valuable content…thanks very much!
I appreciate the opportunities you give viewers to pause and "play catch up" on myriad concepts, here. This works better than making entirely separate "Crash course" videos IMHO. Good job.
Absolutely fantastic conversation! I think Zak is implying it at 6:33, but a reason to model the symmetries of a problem is not just parameter efficiency, it's generalization. Presumably if you accurately capture all the symmetries (structure) of a problem, then you get the most accurate possible approximation given whatever data you have.
Still watching, might come back and edit this comment as I get further. 🙃
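To make that parameter-efficiency point concrete (a hypothetical sketch, not something from the episode - the sizes and layer shapes here are made up for illustration): a message-passing layer that shares one weight matrix across all nodes has a parameter count independent of graph size, and its sum-aggregation is invariant to permuting the nodes, whereas a dense layer on the flattened node features scales with the number of nodes squared.

```python
import numpy as np

n_nodes, d = 100, 16  # hypothetical graph size and per-node feature dim

# Dense layer on the flattened graph: parameters scale with (n_nodes * d)^2
dense_params = (n_nodes * d) * (n_nodes * d)

# Shared-weight message-passing layer: one d x d matrix applied to every
# node, so the parameter count does not depend on n_nodes at all
gnn_params = d * d

print(dense_params, gnn_params)  # 2560000 vs 256

# The shared-weight layer with sum pooling is permutation-invariant:
# shuffling the node order leaves the pooled output unchanged
rng = np.random.default_rng(0)
W = rng.standard_normal((d, d))
X = rng.standard_normal((n_nodes, d))
perm = rng.permutation(n_nodes)
assert np.allclose((X @ W).sum(axis=0), (X[perm] @ W).sum(axis=0))
```

That baked-in invariance is the "capturing the symmetry" part: the model cannot even represent functions that depend on node ordering, so none of the data is spent learning to ignore it.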
That's a good call out.
Came for the GNNs, stayed for this guy's soothing voice.
As always - very interesting! A fantastic overview of GNNs, and the clear explanations were great. Thanks!
Always impressed with your studio setup, lighting, and voice quality (and also those dope long intro edits). And of course amazing content.
I have to make a bucket list of all the videos I'd like to watch; this one is definitely on it. Such a great channel for staying up to date with the ML world.
Glad to hear that Bronstein's work is hard for others to digest too. I did both wavelets and affine invariance with the Laplace-Beltrami operator - good times. Great explanations in this video.
Great discussion Tim and Zak. Really interesting. Thank you Tim for including all the links too. Awesome. Mike - The AI Finder
Amazing episode :) Thanks for diving more into geomDL
Do a short of this message passing explanation, straight 🔥🔥🔥
Amazing! Are you guys doing an episode about diffusion models or new generative models in general?
Why not 😄
Very interesting episode, thanks! Btw, what exactly is the paper Zak mentioned in the chapter "back to information diffusion", named "correcting smooth"? It sounded very interesting, but unfortunately I can't find anything by that name.
Thanks again.
Cheers,
Gerard
I love you guys- thank you for all of this
I wonder if it would significantly improve quality to add a vector representing the [distance, direction, width or mass, and spatial confusion] of the tag's source to the information being shared with each node of the graph, alongside that tag.
amazing and so interesting as always!
What prerequisites will I need to understand Geometric Deep Learning?
Suggest you check out this video series from Bronstein et al.: th-cam.com/video/PtA0lg_e5nA/w-d-xo.html