Notification squad, where are you?
Hahaha you're paving the way buddy!
The tech behind the recommender systems at Pinterest.
Exciting app for GNNs.
Me like it.
Great explanation as always! Thank you!
Your videos are really great. I find them very helpful. I wish you could explain all my papers lol. Keep up with the good work !!
Hahaha thank you Jonas!
flowers for algorithm... comment for youtube to recommend me your stuff.
Hahaha. I appreciate it
Thanks for making this video. I went over the paper multiple times, and this video definitely helped me understand it a little better.
However, one thing I'm still not sure about is what input features are being used for boards. I can understand that the pins have image and text embeddings, but I'm unsure how they generated embeddings for boards. One simple approach I can think of is some mean operation over all pin vectors belonging to the board node, but I'd like to know what exactly was used, as this info seems to be missing in the paper.
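For what it's worth, the mean-pooling idea mentioned above is easy to sketch. This is only a guess at how board features could be initialized (the paper doesn't specify it), assuming each pin already has a feature vector, e.g. its concatenated image and text embeddings:

```python
import numpy as np

def board_features(pin_vectors):
    """Hypothetical board initialization: average the feature
    vectors of all pins saved to a board. This is a guess, not
    the method confirmed by the paper."""
    return np.mean(pin_vectors, axis=0)

pins = np.array([[1.0, 2.0],   # pin 1 embedding (toy 2-d features)
                 [3.0, 4.0]])  # pin 2 embedding
print(board_features(pins))    # -> [2. 3.]
```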
Hello, great explanation. I'm implementing the algorithm and one thing is still not completely clear to me, even after this video - the negative sample distribution in the loss function.
If I understand correctly, to construct a batch they take a subset of positive examples (pairs of similar nodes).
Then they sample 500 "random negative items" ("easy" negatives) shared across the whole batch. I'm not sure exactly what this means, but let's say they take 500 random nodes out of all nodes not included in the batch of positive pairs.
Where do these probabilities (i.e., how many times each negative gets sampled) come from?
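The "shared negatives" part of the question can be sketched like this. This is only a rough reading of the batch construction, assuming uniform sampling of easy negatives and ignoring the paper's PageRank-ranked hard negatives:

```python
import numpy as np

# Hypothetical sketch: one set of 500 easy negatives is drawn per
# batch and reused for every positive pair in that batch. Uniform
# sampling is an assumption; the hard-negative curriculum from the
# paper is omitted here.
rng = np.random.default_rng(0)

num_nodes = 10_000
batch_pairs = [(1, 2), (3, 4), (5, 6)]  # positive (query, item) pairs
in_batch = {n for pair in batch_pairs for n in pair}
candidates = np.array([n for n in range(num_nodes) if n not in in_batch])

# One shared draw of 500 negatives for the whole batch.
shared_negatives = rng.choice(candidates, size=500, replace=False)

# Every positive pair is contrasted against the same 500 negatives,
# so per batch each sampled negative is "used" once per positive pair.
for query, positive in batch_pairs:
    contrast_pairs = [(query, positive)] + [(query, int(neg)) for neg in shared_negatives]
```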
life saver
Please add videos related to robustness in GNNs.
Thanks for your feedback I'll put it on the list 😅
@The AI Epiphany - How would you measure the strength of recommendations when comparing outputs across models? Say from all of the GNN models I got 50-odd recommendations each, and I want to see which model performs best without evaluating it in real time, given that the solution is being built for NLP-based recommendation.
Hello there! Wonderful video! Thanks a lot :-) just one question: are you aware of any good pytorch implementations?
try it with pytorch_geometric
How about TikTok's recommendation system? I'm curious.
I did read the YouTube paper from 2016! 😅 I know they are optimizing for maximum watch time (or at least they did back then).
TikTok is, from an engineering standpoint, marvelous. I should check it out - do you have some resources to recommend?
@@TheAIEpiphany Check out the Stratechery article "The TikTok War".