It's my understanding that it's cosine similarity, which is a normalized dot product: (a1, a2, a3) . (b1, b2, b3) = a1 b1 + a2 b2 + a3 b3 .... if you divide this scalar by the lengths of both vectors, it is literally cos( angle(a,b) )
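A minimal sketch of that computation in plain Python (function and variable names are mine, just for illustration): the dot product of the two vectors, divided by the product of their Euclidean lengths, gives the cosine of the angle between them.

```python
import math

def cosine_similarity(a, b):
    # Dot product: a1*b1 + a2*b2 + a3*b3
    dot = sum(x * y for x, y in zip(a, b))
    # Euclidean lengths (norms) of each vector
    len_a = math.sqrt(sum(x * x for x in a))
    len_b = math.sqrt(sum(y * y for y in b))
    # Normalizing by both lengths yields cos(angle(a, b))
    return dot / (len_a * len_b)

# Parallel vectors point in the same direction, so the cosine is 1
print(cosine_similarity([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))
```

Two vectors that point the same way score near 1, orthogonal ones score 0, regardless of their magnitudes.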
This was fantastic. The questions you ask really get to the heart of it so quickly.
This is really helpful. Thank you ever so much 🎉😊
Wow! What an amazing interview! Really skilled questions. You will definitely have over 100k subs in 6 months if this is the normal level of quality that you deliver 💪🏻
Thanks so much! I hope your prediction comes true. 😁
This is such a great explanation of vector embeddings and vector databases for a nonmathematician audience!
What a first-class podcast!
Thanks for the talk, would be great to hear more on the vector databases from someone involved.
Well done thanks, very interesting
awesome interview, great information
Thanks for listening!
Very well done.
Awesome!
Thanks! 😊
clear explanation, thx
I need a whole podcast series about these topics
I am a new sub😂😂❤