Word Embedding - Natural Language Processing | Deep Learning
- Published 8 Sep 2024
- A word embedding is a learned representation for text where words that have the same meaning have a similar representation. It is this approach to representing words and documents that may be considered one of the key breakthroughs of deep learning on challenging natural language processing problems.
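Concretely, an embedding assigns each word a dense vector, and "same meaning, similar representation" can be checked with cosine similarity. Below is a minimal sketch with hand-made 3-dimensional vectors (the axes "gender / royalty / age" are illustrative only; real embeddings are learned from data and typically have 50-300 dimensions):

```python
import math

# Toy hand-made 3-dimensional vectors (hypothetical axes: gender, royalty, age).
# Real embeddings are learned from data, not assigned by hand.
embeddings = {
    "king":  [-1.0, 0.95, 0.70],
    "queen": [ 1.0, 0.96, 0.70],
    "man":   [-1.0, 0.01, 0.50],
    "woman": [ 1.0, 0.02, 0.50],
    "apple": [ 0.0, -0.90, 0.10],
}

def cosine(u, v):
    """Cosine similarity: 1.0 for identical directions, near 0 or negative for unrelated ones."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Related words end up closer together than unrelated ones ...
print(cosine(embeddings["king"], embeddings["man"]))    # high
print(cosine(embeddings["king"], embeddings["apple"]))  # low (negative here)

# ... and directions carry meaning: king - man + woman lands nearest to queen.
analogy = [k - m + w for k, m, w in zip(embeddings["king"],
                                        embeddings["man"],
                                        embeddings["woman"])]
closest = max(embeddings, key=lambda w: cosine(analogy, embeddings[w]))
print(closest)  # queen
```

The famous "king - man + woman ≈ queen" analogy works here only because the toy vectors were built that way; in a trained model the same structure emerges from the data.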
Please join as a member in my channel to get additional benefits like Data Science materials, members-only live streams, and much more
/ @krishnaik06
Please subscribe to my other channel too
/ @krishnaikhindi
If you want to make a donation to support my channel, below is the GPay ID
GPay: krishnaik06@okicici
Connect with me here:
Twitter: / krishnaik06
Facebook: / krishnaik06
Instagram: / krishnaik06
I'm literally binge watching your videos like Netflix!
This is amazing! You literally made the topic interesting, I never got bored throughout the entire video!
I'll definitely stick to your channel in preparing for my thesis!
This is the first subscription I have made on YouTube... that's only because of your teaching skills... you make any topic very interesting and provide excellent information on research and its techniques... Please upload videos on autoencoders.
Same here! 😊😊
you are 200% better than my professor in explaining Word Embedding
I thought I would never learn Deep Learning because it is too complicated, but you have explained it in such an effective way. Within a week, LSTM will be on my resume!
Thank you for sharing this wealth of knowledge with us. TBH I've been having trouble grasping how word embeddings work. It's clear that they are not as straightforward as basic vectorization approaches like BoW and TF-IDF. Nevertheless, I gained something from this video.
Try showing the embedding projector. It's an interesting way to visualise embeddings and it sparks interest.
His channel and StatQuest are 2 of the best resources for ML and Data Science on YouTube.
Can't wait for the next video.....
Thanks A Lot .....
Awesome video!! You are an excellent teacher; I wish I had a teacher like you in my Master's program right now!
Keep posting great content. It's worth sharing your content with everyone! Thank you!!!
I don't usually subscribe to YouTube channels... but this first video I watched from you got me.
Amazing way of teaching, sir!! Great work. Thanks a lot.
The hottest topic in NLP; I learnt it after putting in a lot of time.
I am sharing what I learnt.
For embedding we use:
- One-hot encoding technique
- Word2Vec technique
- The embedding layer of Keras (after pad sequencing)
You can use it for recurrent neural networks.
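Assuming the steps in the comment above refer to the usual Keras workflow (integer-encode words, pad with `pad_sequences`, then feed an `Embedding` layer into an RNN), the same pipeline can be sketched in plain Python without Keras; every name below is illustrative:

```python
import random

# 1) Integer-encode each word (stand-in for a Keras tokenizer / one_hot).
sentences = [["the", "glass", "of", "milk"],
             ["the", "cup", "of", "tea"],
             ["understanding", "deep", "learning"]]
vocab = {w: i + 1 for i, w in enumerate(sorted({w for s in sentences for w in s}))}
# Index 0 is reserved for padding.

# 2) Pad every sentence to the same length (stand-in for pad_sequences).
def encode_and_pad(sentence, max_length=5):
    ids = [vocab[w] for w in sentence]
    return ids + [0] * (max_length - len(ids))  # post-padding with 0

# 3) Look each index up in an embedding matrix (stand-in for the Embedding layer;
#    in Keras these weights are trained, here they are just random).
embedding_dim = 4
random.seed(0)
embedding_matrix = [[random.uniform(-1, 1) for _ in range(embedding_dim)]
                    for _ in range(len(vocab) + 1)]

padded = [encode_and_pad(s) for s in sentences]
vectors = [[embedding_matrix[i] for i in seq] for seq in padded]
# vectors now has shape (3 sentences, 5 timesteps, 4 dims) - the input an RNN expects.
```

The key point the list captures: one-hot / integer encoding and padding only prepare indices; the embedding layer is what turns each index into a dense trainable vector.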
I think I found a gem! Thank you Krish! You really make it so easy to understand.
Word Embedding is such a masterpiece!
Sir your explanations are fantastic
Thanks krish
Super great! Thanks very much, Krish. It is a very good learning video; instructor Krish is great, passionate, and exciting. The lesson is very interesting.
These videos are adding to my knowledge. Thank you, @krish naik sir!
Nice, clean description. Well done, Krish!!!
Super, sir. I enjoyed it before the exam.
@Krish, how did you get these features (like Gender, Royal, Age, etc.)? Are these features mentioned anywhere in word embedding?
You are simply amazing sir....hats off to you💯💯
Subscribed !
Thanks a lot. Hopefully your videos will be helpful for my thesis project!
Thank you, sir. Please make a video explaining how a sentence is translated with a neural system, with an example. Thank you again, you are amazing.
These days I'm watching at least 5-6 videos every day.
amazing brother thank you
no words for you sir.. 👌👌
Would you please make a video on the gradient boosting and XGBoost ML algorithms with all the maths...
Will upload soon
Thanks for the tutorial on word embedding models. I wonder how features are selected in these models? I think in some particular cases, having control over customizing these features might improve the chance of getting more similar words than just using the pretrained ones.
You have done a great job. There are students like me who really need such an explanation to get a rough image... at least. Thanks a lot.
Rough image means: I got an idea, and now I can study more on it and sharpen the image created by your wonderful explanation.
Krish, you save my life every time
Excellent! Love from USA
Good explanation krish sir
What an awesome explanation. Thank you @Krish
Very well explained
Sir that's amazing 😍😊
Perfect explanation... thanks a lot
Sir, kindly guide: how can I use pretrained word embedding models for local languages (or languages written in Roman script) that are not available/trained in the pretrained model? Do I have to use a (non-pretrained) embedding layer to create embedding matrices for a local language? How can I benefit from pretrained models for a local language?
thanks this is awesome
Nicely explained. Thanks for making things clear.
thank you so much sir
You are amazing❤️
good explanation sir
Please make a video on dimensionality reduction techniques for reducing many dimensions down to 2 (with coding).
you are a gem man , love your style
Sir, can you please tell me how to choose the number of embedding dimensions?
E.g., vocab_size = 10000, max_length = 120, and embedding_dim = ??
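There is no single correct value for `embedding_dim`; it is a hyperparameter. One rough heuristic sometimes used (not a rule) is the fourth root of the vocabulary size, while pretrained embeddings commonly come in 50/100/200/300 dimensions. A sketch using the numbers from the question:

```python
vocab_size = 10000
max_length = 120  # padding length; it does not constrain embedding_dim

# Rough heuristic: fourth root of the vocabulary size.
embedding_dim = round(vocab_size ** 0.25)
print(embedding_dim)  # 10

# In practice, treat embedding_dim as a hyperparameter to tune:
# small custom corpora often use 32-100; pretrained vectors are usually 50-300.
```

Note that `max_length` only fixes the sequence (time-step) dimension; the embedding dimension is independent of it.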
Your explanations are great, thank you
Your speciality is teaching from scratch sir
Thank you sir
Many thanks
Good content, keep doing videos like this!
applauding for you !! thank you again!
An LSTM practical session is also required, sir.
excellent
Hi Krish, First of all thanks for making this content.
I have some doubts:
1. In predicting an analogy (King -> Queen), if we take some other feature instead of gender, then the result may not necessarily be true (or does it select the best feature to give results?)
Nice explanation, sir; always waiting for new videos. We are looking forward to your book publication.
Excellent tutorials .
Thanks!
Wonderful Video Krish
Sir please make a video on sentiment analysis using VADER.
Amazing explanation thank you!
Good day, may I ask how to define specific feature dimensions (for example, I want to extract linguistic features such as part-of-speech tags, word density, and word frequency) that are going to be vectorized?
You haven't made any video on gradient boosting yet... Actually, the boosting series is incomplete.
Please make a video on the GB technique.
It was really helpful. Can you make videos on grammar correction using rule-based methods, language models & classifiers?
It's really hard to understand otherwise.
Thanks, Krish, for the video.
Waiting for your facenet embedding... and clustering process
Great explanation
very good, thank you sir !
Superb... Is the word embedding fixed, or generated anew for every dataset given for training?
Sir, you left out the most important part, Word2Vec; please do cover it.
And please also cover the idea of a maximum of 300 dimensions; it's really difficult for me to get.
How do we decide the features (gender, age, ...) in this technique?
Can you please make a video on a 'Hybrid Deep Learning Model for Sentiment Classification', that is, an implementation of CNN and LSTM together for sentiment classification?
I would like to know how those coefficients in the vertical columns can be obtained automatically, and what scientific assumptions and premises should be used for doing so. As I see it now, it can be done manually by logical consideration. Thanks!
Sir, kindly make a video on how we can embed source code into vectors and use them for training a DL model.
What kind of mathematics and statistics is required for a data science career? Sir, please make a video on this topic.
Basic level
Sir, please make a video on whether a degree is required for a data scientist career, or whether a certification is enough for a job.
This is really, really amazing. Thank you for your efforts. Can you make a video on sentence embeddings as well?
There is no such thing as sentence embedding as per my knowledge
👏👏
Simply awesome.
Can you explain the last video in this "Introduction to Word Embeddings" series? "Embeddings matrix"
Pytorch or Tensorflow 2.0 which one is better for beginners?
Keras
Hi Krish,
What is the parameter update equation in SVM and logistic regression?
@Krish Naik at 7:08, could you share some material on the technique we use to relate features such as gender to words like boy and girl, but not to apple and mango? I have doubts about what technique makes it possible for a machine to learn the relation between features and words. Thanks
Sir, how will these parameters be decided, as you did, like gender, royal, age, food, etc.?
Can you make a video on balancing an imbalanced text dataset?
Sir, kindly make some basic videos on Pandas.
Check my complete ML playlist
Hello, suppose we need to add more features to our X which are not text. I.e., suppose we get a sparse matrix after CountVectorizer, and now we have one more feature, length, and we want both features. How do we combine both?
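One common answer to this kind of question, sketched here in plain Python (with scikit-learn you would typically achieve the same thing by stacking the extra column onto the CountVectorizer output, e.g. via `scipy.sparse.hstack`): append the extra numeric feature as one more column of each row. All names below are illustrative.

```python
from collections import Counter

docs = ["short text", "a somewhat longer example text here"]

# Vocabulary over all documents (what CountVectorizer builds internally).
vocab = sorted({w for d in docs for w in d.split()})

def featurize(doc):
    tokens = doc.split()
    counts = Counter(tokens)
    row = [counts[w] for w in vocab]  # count-vectorizer style columns
    row.append(len(tokens))           # extra non-text feature: document length
    return row

X = [featurize(d) for d in docs]
# Each row now has len(vocab) count columns plus one length column,
# so both kinds of features feed the model together.
```

If the count matrix is large and sparse, keep it sparse and stack the dense extra column with `scipy.sparse.hstack` rather than converting everything to dense lists as done here; also consider scaling the length feature so it is comparable to the counts.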
Hey, in word embedding, how are features defined? Are they extracted from the document itself, or is there a predefined feature set available for the related domain or the whole language?
What is the difference between a word embedding layer and the Word2Vec class?
Sir, when you said that we relate gender to boy, is this done by the machine, or is it predefined? I mean, on what basis does it come about that gender is related to boy and girl, and royal is related to queen and king? Thank you, sir.
Is word embedding similar to a pandas pivot table, except that we provide the features here?
Are you able to use NLP to build a Siri-type app in your own foreign language? Thanks for the tutorial.
Hey, can you please explain on what basis you are deciding these vector values like 0.01, 0.03, etc.?
Refer to the Word2Vec video on the ritvikmath YouTube channel. Thank me later 😇😆
I don't really get word embeddings. Do they work outside mainstream English? For example, medical language is different: if I am studying medical literature, a lot of my main vocabulary consists of medical words. What is your opinion on this?
MAKING NO ONE ANY WISER.
I am facing an issue trying to install nltk and spacy: it's asking me to downgrade TensorFlow from version 2 to version 1.x. What can be done to install them without downgrading TF?
I have one doubt, can you clarify it: how are the values assigned for opposite genders, like -1 for boy and 1 for girl?
Can someone please explain what vocabulary and vocabulary size are? I'm confused by these terms.
An example of a vocabulary could be [This, is, an, example] -> a collection of unique words, right?
And what is the vocabulary size? The answer is 4:
This = 1, is = 2, an = 3, example = 4
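That reading matches the usual definition; a tiny sketch in plain Python, mirroring what a tokenizer does:

```python
words = "This is an example".split()

# Vocabulary: the unique words, kept here in order of first appearance.
vocabulary = list(dict.fromkeys(words))
vocab_size = len(vocabulary)  # 4

# A tokenizer then maps each word to an integer index
# (index 0 is often reserved for padding, which is why indices start at 1).
word_index = {w: i + 1 for i, w in enumerate(vocabulary)}
print(word_index)  # {'This': 1, 'is': 2, 'an': 3, 'example': 4}
```

In Keras-style pipelines, `vocab_size` (plus 1 for the padding index) is the first argument you pass to the embedding layer, since it sets how many rows the embedding matrix needs.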
Can anyone please tell me: does word embedding result in a 3D array that can only be used in deep learning networks, or can it be converted into a 2D array?