I spent the past 2 days trying to study word2vec: visited all the favourite YouTube channels, read articles on Medium, even tried to read the paper on arXiv. This video covered and explained word2vec most beautifully. Amazing; it made me feel I actually learned word2vec with no point left out. The diagrammatic explanation of CBOW and skip-grams stole the show.
He's the best
I agree. All videos are really helpful.
Hey bro, can we connect on LinkedIn?
I just wanted to drop a quick note to say how much I appreciate your YouTube videos. Your teaching style is fantastic, and the content is incredibly helpful. Thank you for making learning so enjoyable and accessible.
I have never seen a video that goes through the minute details of word2vec except yours. Thanks a lot.
I don't know why, but this guy is so underrated! His content deserves to be promoted; can't believe I'm getting to see such content for free! Thank you, Sir.
@00:57 - word embeddings
@04:48 - what is word2vec?
@09:34 - how is w2v made?
@10:00 - w2v Demo
@21:04 - w2v intuition
@34:33 - Types of w2v architectures
@35:52 - CBoW
@50:28 - Skip-gram
@56:23 - Training own model - GoT data
@01:14:23 - assignment
World needs more people like you bro 🫡🫡🫡
@campusX you can add these timestamps/chapters to your video for better reach. Thanks for the awesome content, Sir.
Hi Utkarsh, I am new to this course. May I know why we get the final vector only by subtraction and addition? Are there other ways to do the same? Kindly explain.
Speak English. I'm not an Indian.
I never thought that one day, knowledge of "Game of Thrones" would help me understand an ML concept :D By the way, great explanation 🙌🏻🙌🏻 as always. Thank you so much :)
@Campusx Thank you so much sir.. I really love your videos. Please continue this playlist till BERT, ALBERT, DistilBERT and GPT as well.
Yes, much needed... but I guess his last video on this playlist was 1 year ago... sir, please continue... nobody can match your teaching skills.
While training, where have we chosen the architecture? CBOW or skip-gram?
Sir, thank you so much for your number 1, superb lectures. Please keep making lectures like these. After watching your lecture, I am very satisfied and happy, with complete understanding of the concept.
Never found such an easy-to-understand explanation on the whole internet.
Best explanation for w2v that I ever came across. I was struggling to find properly detailed info, but this laid the whole case to rest. Thank you very much for making such a detailed explanation.
The intuition was very well explained; now I can finally say I know word2vec.
Thank you so much for putting such effort.
This is insanely good.
I was studying embeddings from an Andrew Ng course. I understood stuff but wasn't sure.
Then went for Krish Naik. I understood more, but not fully.
Then a friend recommended this video.
Seems like I have a new 'go to' data science YouTuber.
Subscribed. Keep up the good work.
Finallyyy! After waiting for 2 weeks, it's here. Hope your holiday went well, sir. Will watch and then comment later if I have doubts.
Underrated channel... no one teaches like you... higher education is difficult, though.
Big Thank You Sir (Guru). This NLP series is very helpful for us.
Immensely helpful content, very well structured and explained. Thank you for sharing this on YouTube while people are charging hefty amounts for the same. I appreciate your intentions of social welfare.
I like how clearly the statements are addressed.. a lot to learn. Thanks.
A very in-depth and great explanation of word2vec... the best one by far..!! Thanks for the videos.. 👍👍
Very well explained. It's a blessing to see content from such a great and passionate teacher, that too free of cost.... I don't think even the paid courses can explain things with such great clarity.
Hi, I just wanted to know: have you tried running the code? While downloading that 1.5 GB file I'm getting an error. Please let me know if you were able to download it.
OMG... man, 100% you have followed the "Feynman technique" of learning. Awesome!!!
Sir.. you are a really amazing person.... Your way of explanation is great; it is easily understandable... one of the best channels, CampusX ☺
You teach very well 👍👍
Please continue, sir.
Very nicely explained. Many concepts got cleared in a single video.
Amazing explanation bro, superb. Clear, complete concepts about word2vec. Thanks for making such great content...
I really appreciate your work; I love to watch your videos. I found a small mistake at 54:15: it should be Small Dataset - Skip-gram and Large Dataset - CBOW.
Thanks
Sir, every new video of yours makes me feel: I have only one heart, how many times will you win it?
I can say your way of explaining concepts is toooo good and easy to understand. Your content is the best among most YouTube channels. Hope your videos reach a larger audience in the future 👍
I don't have words for this brilliant explanation.
Amazing, you wonderfully explained all aspects of the topic
53:45 small correction: skip-gram with smaller data and CBOW with larger data.
Nitish, your explanations are mind-boggling! Superb!!
I first like the video, then watch it, because I know, this will be the best Explanation.
Bro, did you find the Friends dataset?
Came here after watching Krish Naik's video; must say the explanation was just perfect.
In a simple word, this tutorial is easy, easy, and easy....♥♥
Best teacher in the domain of data science, ML, NLP, OpenCV & Gen AI!!
I am glad I found you on YouTube :) Lots of power to your work!!
Nice explanation. I've got clarity on word2vec and the other techniques now.
Waiting for this lecture 😍😍❤
Does anyone else feel like it's illegal to watch this content for free? No way this guy can explain in such a way that literally even a newborn child could understand easily if it learned from him.
This series is extremely helpful!
Please continue this course.
Please cover topics like named entity recognition, topic modelling, sentiment analysis, etc.
Thanks Nitish for the detailed explanation..! 😀
Thanks Nitish for this, loving the series!! 2 queries:
1. In the GOT example, did we use skip-gram or CBOW?
2. While selecting CBOW vs skip-gram, how can we decide how much data counts as large/small?
According to his explanation, it is CBOW.
Your explanation is tooo good 👍❤
Thank you. Watching in 2024, but still very informative, and your teaching style is very good.
At 12:38 in the video, the given URL is not working now. Please provide some source to get the Google News 300 dataset.
Excellent..! Please continue this playlist
That was fun. You are amazing...
Sir, please think about taking some online NLP classes where we can complete the course on a fast track. These videos are very useful.
Can you please confirm the writing pad that you use for the videos? Thanks in advance.
AOA brother, please make a video on GloVe, because you have great skill in explaining concepts.
Game of Thrones has been my favourite series since I was 13.
Hello Nitish sir, I am a fan of your machine learning series and I keep watching. My question is:
(1) model.wv.index_to_key and
(2) model.wv.get_normed_vectors()
are not working.
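A likely cause: those attribute names only exist in gensim 4.x, so on a 3.x install they raise AttributeError. A minimal sketch, assuming model is the Word2Vec model trained in the video:

import gensim
print(gensim.__version__)               # the names below need gensim >= 4.0

# gensim 4.x names (upgrade with: pip install -U gensim)
words = model.wv.index_to_key
normed = model.wv.get_normed_vectors()

# gensim 3.x equivalents of the same data:
# words = model.wv.index2word
# model.wv.init_sims(); normed = model.wv.vectors_norm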
Sir, please continue this playlist.. Please make videos on transformers, BERT, etc. 😵
Perfectly explained
Sir, please make a complete playlist on Computer Vision as well... eagerly waiting!
First of all, thanks for all these amazing videos, they really help so much. I just had one question: is this NLP playlist enough? I mean, does it cover all of NLP, or is there anything more I should explore by myself? Please sir, just clear this one doubt, it would help a lot.
th-cam.com/video/PKv_okm1H-k/w-d-xo.html
@@campusx-official Thanks a lot sir, I completed the entire playlist and am now watching this roadmap 😅
@@campusx-official Sir, data execution on the Jupyter notebook is taking more time 😞
Sir,
1. Why have we used sent_tokenize? Can't we just tokenize individual words directly?
2. Which way was the model created - CBOW or skip-gram?
3. Could you add links to your deep learning lectures that you kept referring to in the video?
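For question 2: gensim's Word2Vec selects the architecture via its sg flag, and sg=0 (CBOW) is the default, so unless the notebook passed sg=1 the model would be CBOW. A minimal sketch, assuming sentences is the tokenized GOT corpus from the video (parameter names are gensim 4.x; the other values are just illustrative):

from gensim.models import Word2Vec

# sg=0 -> CBOW (the gensim default); sg=1 -> skip-gram
model_cbow = Word2Vec(sentences, vector_size=100, window=10, min_count=2, sg=0)
model_skip = Word2Vec(sentences, vector_size=100, window=10, min_count=2, sg=1)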
In the CBOW section, I think the dimensions are wrong: there are 5 words in total and the weights received from the prior (hidden) layer are 3, so the dimension of the last layer should be 5x3.
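A minimal numpy shape check of that claim (toy numbers, not the video's code): with a vocabulary of 5 and an embedding size of 3, the hidden-to-output weight matrix does map 3 values to 5 scores, i.e. 5x3 or 3x5 depending on which side you multiply from:

import numpy as np

V, N = 5, 3                     # vocabulary size, embedding dimension
W_in = np.random.rand(V, N)     # input -> hidden weights, shape (5, 3)
W_out = np.random.rand(N, V)    # hidden -> output weights, shape (3, 5)

h = W_in[[0, 2]].mean(axis=0)   # CBOW averages the context word embeddings -> shape (3,)
scores = h @ W_out              # shape (5,): one score per vocabulary word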
Sir, you perhaps forgot to give the blog link at 29:00.
Great explanation 👍
Best lecture ❤️
Sir, please share the links for the basics of neural networks.
You have made my day. 🤩
According to Wikipedia, CBOW is much faster than skip-gram, so how can we use skip-gram for a large dataset?
O Beeeeeeeeeeen Stokes........ that was so much fun.
very well explained Sir!
Sir, your content and explanation are mind-blowing, so thank you sir. One request: sir, can you provide your OneNote notes file so that we can refer to it when needed?
sir please make a video on object detection model
Stopped following Krish Naik after I started watching your videos 😄 Great content! Referring many friends to your playlist 💪🏽
AttributeError: 'Word2VecKeyedVectors' object has no attribute 'get_normed_vectors'
Getting this error while executing model.wv.get_normed_vectors() - any suggestion, brother?
Problem solved (this works on gensim 3.x, where model.wv.vocab is a dict):
my_dict = {}
for key in model.wv.vocab:          # iterate over the vocabulary words
    my_dict[key] = model.wv[key]    # map each word to its vector
X = list(my_dict.values())          # the vectors
y = list(model.wv.vocab.keys())     # the words, in the same order
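On gensim 4.x, where wv.vocab was removed, the same X and y can be built like this (a sketch under that assumption):
# index_to_key lists the words in the same order as their vectors
y = model.wv.index_to_key
X = [model.wv[word] for word in y]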
True
Same with me bro.
Codebasics + Krish Naik are doing a CCP job, nothing else... just hype, only sound, no proper content or flow 😏😏😏
Also, make a video on self-supervised learning for computer vision applications.
Rarest knowledge ❤
Nitish, I wanted to know whether judiciously following the assignments will do the trick for us, or do we need to study the NLTK and spaCy libraries in detail? Just asking!
Which weights will be used for the vector representation in skip-gram?
The link to the Google data isn't working, right? At 12:39.
Kaggle
Sir, I have doubts.
1. Why are there differences between upper-case and lower-case words? If we look at the vectors of the 2 words "bag" and "Bag", the 2 vectors are different.
2. Why is there only one hidden layer?
And one more thing: Google has its own GCP, yet the file "GoogleNews-vectors-negative300.bin.gz" is stored on an AWS S3 bucket. Why?
1. Because each character has different binary values.
2. The creators must have tried different architectures.
3. It's not Google that hosted it; it's the creator of gensim, I guess.
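On point 1, more concretely: word2vec keys vectors by the exact token string, so "bag" and "Bag" are simply two different vocabulary entries unless the text is lower-cased during preprocessing. A small sketch using gensim's simple_preprocess, which lower-cases for you:

from gensim.utils import simple_preprocess

print(simple_preprocess("The Bag and the bag"))
# -> ['the', 'bag', 'and', 'the', 'bag']  (both spellings collapse to one token)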
I still don't understand what happens with the two context words after the multiplication in CBOW. Do we average them?
Sir, can you do one on BERT, please?
So when you say a 300-dimension vector is used in word2vec, does that mean it has 300 nodes in the second-to-last layer?
The same word can be present in different sentences, so do we calculate a word vector for each sentence and take the average?
very informative
Sir, when we import the file, this issue comes up: BadGzipFile: Not a gzipped file (b'
Why did you take 10 nodes in neural network input?
Because sir, we are learning from you, but for interviews, especially on how to explain a project (particularly an ML one), if you make a video on that it would be a big help, sir.
Now this GoogleNews link has only 94 entries; is there a different link?
Nicely explained, sir.
That Amazon AWS Google News vector file is giving a 404 Not Found error in the Colab notebook.
Did you find any solution ?
thank u so much for this video
🙏🙏👌👌😊😊
Thank you, Sir ji!
Hi, I was having issues getting the Google Colab notebook to work; I consistently got a 'not found' error when trying to fetch the Google pre-existing model.
brilliant explanation
That Amazon AWS Google News vector file is giving a 404 Not Found error in the Colab notebook. How can I solve this to see that file? Anybody, please help me.
Sir, is a linear activation function used in the hidden layer in CBOW and skip-gram?
Sir, please complete DBSCAN, the XGBoost algorithm, and anomaly detection in the ML playlist. Please 🙏 Sir
Will do it in January
Sir, complete the machine learning playlist and NLP.
Thank you so much, Sir, for this nicely explained tutorial. I have just one doubt, Sir: if a sentence has only 3 words, but while training the model on the GOT data at the end we used window size 10, what context words will it consider other than its two actual context words? (Sir, can we assume it considers the whole corpus as one single sentence containing all the words in the vocabulary?)
Good evening sir, kindly provide the updated link to the dataset. It's a humble request, so we can practice.
Sir, please make a video on how to explain an ML project in an interview.
Will make one in future
ERROR: get_normed_vectors
AttributeError: 'Word2VecKeyedVectors' object has no attribute 'get_normed_vectors'
Where can I find the link to the Google News file on AWS?
To rephrase: I loved your videos; please make a video on transformers too.