You guys deserve so much more views, thank you so much!
Great video, and thanks for sharing! In the future, if possible, it would be great if you could show some examples or a demo of constructing a KG.
This is a great job! Thanks for your efforts.
Nice video, thank you for this. Could you please clarify the embedding section? You mentioned two kinds of embeddings: graph embeddings and sentence embeddings. When working with a knowledge graph, do we have to use both?
In this paper, they used sentence embeddings for the graph nodes and triples. In general, though, there are also specially trained graph embeddings. But as stated in the paper, they went with sentence embeddings. :)
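To illustrate the idea of embedding triples as sentences: a minimal sketch, where triples are verbalized into plain text and then embedded. The paper would use a real sentence-embedding model; here a toy bag-of-words vector stands in so the snippet runs with no dependencies, and the `verbalize`/`embed` helpers are hypothetical names, not from the paper.

```python
# Sketch: treat KG triples as sentences and embed the text.
# A real setup would call a sentence-embedding model here;
# the bag-of-words "embedding" below is only a self-contained stand-in.
from collections import Counter
import math

def verbalize(triple):
    """Turn a (subject, relation, object) triple into a plain sentence."""
    s, r, o = triple
    return f"{s} {r.replace('_', ' ')} {o}"

def embed(text):
    """Toy stand-in for a sentence embedder: bag-of-words counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

t1 = ("Paris", "capital_of", "France")
t2 = ("Paris", "located_in", "France")
t3 = ("Tokyo", "located_in", "Japan")

e1, e2, e3 = (embed(verbalize(t)) for t in (t1, t2, t3))
# Triples sharing entities verbalize to similar sentences,
# so their embeddings land closer together.
print(cosine(e1, e2) > cosine(e1, e3))  # True
```

A trained graph embedding would instead learn vectors from the graph structure itself; the sentence-embedding route simply reuses an off-the-shelf text encoder on the verbalized triples.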
Thanks for the valuable info
Thank you for another insightful video.
This sounds great! I had thought about combining RAG and KGs before, but these results make me even more interested. If you find working code for this paper or something similar, it would be great to announce and share it. Thanks a lot.
Check the latest video!
The problem is the KG. The way it is constructed narrows what information is stored, so the benefit for the downstream LLM part is limited.
I think it's the opposite! The way you define and construct your KG is going to significantly help you find more relevant information later in downstream tasks, because you have explicitly defined what matters to you rather than simply dumping everything in a database and trying to find answers later. :)
Any good github project on that I could possibly check?
We will share it!
@TwoSetAI Please be so kind as to tell me when ;) Cheers!