The best explanation that I've ever seen. You answer all the questions I've had. kudos to you
This is what I search for days, thank you so much for these very clear explanations on Hugging Face basics !
This guide is pure gold man! Thank you so much!
Wish you'd upload more videos, this is truly amazing. 😁
Thanks so much! : )
Fantastic Tutorial mate you are a natural teacher. I was clicking around Huggingface like a noob for hours until I found your video. Thanks so much for this.
Thanks Steve! Appreciated
Eagerly waiting for more videos from this channel!
Thank you really really much for explaining fundamental terms.
You're the reason I can now understand how to code such things easily. PLEASE KEEP DOING WHAT YOU'RE DOING, COZ NO ONE EXPLAINS THESE THINGS LIKE U DO
It was really the most simplified and to the point video I watched on this topic. Great work!!
Great work! Would love to see more videos from you.
Well done to this young man. Well structured and nicely explained to cater to non-experts
Thanks Minty!
Awesome video mate! Really appreciate how clear you explain everything
Glad it was helpful! :)
Thanks for your awesome walkthrough! Looking forward to your next video as well 😃
Nicely explained 👌. Looking forward to more such videos, thanks for sharing.
Thanks a lot!
This is one of the best videos ive ever come across on youtube ngl, GG
Nice intro to hf library. Really helpful 👌
Thanks :)
Thanks for detailed explanation.
You're welcome, thanks for watching :)
Very helpful 👍
Pretty helpful ! Keep going bro ! Thanks from France
Cheers dude! You're welcome from the UK !
Excellent video, It was well explained and helped me lot into hugging face models, thanks
Thank you Luis! Glad you found it helpful
Thanks! Really well explained
really helpful, thank you!
Thanks! :)
Useful video. Thank you.
Great explanation
Thanks
this video is worth the time!!
Thanks!!!
Very useful, thanks 👍
Thanks!
Hey man, just found your channel and your videos are very good!
Thanks a whole big ton!
very good explanation thank you very much
Thanks for watching Paul!
Outstanding explanations, thanks a ton!
Thank you so much!
I've been looking for something like this for weeks! Easy and Simple explanation! Thank you so much for this informative video! 🤍
Thank you very much April! Glad you benefited from it :)
This is great. Hope you'll continue with LLMs.
helpful thanks
🤩🤩
Great content mister
Thanks ! :)
This is fantastic! You should be a professor in University!
You're too kind! Thank you!
Hey, Nice content!! Can you please make a video explaining your microsoft challenge task using Roberta model?
Hey, glad you enjoyed it. Sure, are you after a video on using RoBERTa on 'The Microsoft Research Sentence Completion Challenge'? Happy to try to get that done soon as a next video :)
@rupert_ai yeah, that would be helpful. Thnx brother 😊
great video!
Bro, I'm trying to create a speech-to-text system where only about 30 words need to be recognized... from what I know a CNN can do the trick, but do I need to go for end-to-end models for speech recognition? Hope you can reply ASAP
hey bro, if I were you I'd look at: huggingface.co/docs/transformers/model_doc/speech_to_text
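A minimal sketch of that suggestion, assuming the `transformers` ASR pipeline (the `facebook/wav2vec2-base-960h` model name and `command.wav` path are just examples, not from the video). For a closed 30-word vocabulary you can also snap the open-vocabulary transcript onto your allowed words instead of training end to end:

```python
import difflib

VOCAB = ["yes", "no", "stop", "go", "up", "down"]  # extend to your 30 words

def transcribe(path):
    # The "automatic-speech-recognition" pipeline wraps pretrained CTC models
    # such as wav2vec2 (model name here is an example, not a recommendation).
    from transformers import pipeline
    asr = pipeline("automatic-speech-recognition",
                   model="facebook/wav2vec2-base-960h")
    return asr(path)["text"]

def snap_to_vocab(transcript):
    # Map each recognised word to the closest entry in the closed vocabulary,
    # so a general ASR model behaves like a small-vocabulary recogniser.
    return [difflib.get_close_matches(w.lower(), VOCAB, n=1, cutoff=0.0)[0]
            for w in transcript.split()]

if __name__ == "__main__":
    print(snap_to_vocab(transcribe("command.wav")))  # hypothetical audio file
```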
Hello @rupert ai, thank you for the amazing tutorial. It says "SE1E1" in the title. Does that mean there are more episodes coming out? Thanks!
Check out my channel for more hugging face and computer vision videos and tutorials! :)
Please create a model for automated long-form question and answer generation
Great video! Is it possible to apply either the manual or the pre-made pipeline approach to a dataframe/dataset, instead of just those small lists? I wanted to solve a binary classification task but have 2 different text columns. Thanks for the video, good stuff man!
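One common way to feed two text columns through a pipeline, as a rough sketch (the column contents below are made-up examples; in practice you would pull lists out of your dataframe with something like `df["title"].tolist()`, and note that tokenizers can also accept sentence pairs directly in the manual approach):

```python
# Stand-ins for two dataframe text columns, e.g.
#   titles = df["title"].tolist(); bodies = df["body"].tolist()
titles = ["Great product", "Terrible service"]
bodies = ["Works exactly as described.", "Never arrived, no refund."]

# Join the two fields into one string per row; a single-sequence
# classifier then sees both columns at once.
texts = [f"{t}. {b}" for t, b in zip(titles, bodies)]

if __name__ == "__main__":
    from transformers import pipeline
    clf = pipeline("sentiment-analysis")       # downloads a default model
    labels = [r["label"] for r in clf(texts)]  # one label per row
    print(labels)
```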
Very useful! Is there any chance you can fine-tune RoBERTa for sentence simplification? It would do me so much good
This guy really knows what he is talking about. Excellent walkthrough , Thank You !!
Please zoom in, I can't see it clearly 😢
Hi Rupert,
I liked your video. I have one question, sorry if this is something you already answered: I want to create my own BERT model from scratch using a specific dataset based on my use case. Could you please suggest any tutorials I can follow for that? Thank you so much in advance.
You just need to stack some transformer encoders and train them on your dataset with the two tasks specified in the original BERT paper, namely NSP (next sentence prediction) and MLM (masked language modelling).
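As a rough sketch of that setup with Hugging Face: `BertForPreTraining` gives you the stacked encoders with both the MLM and NSP heads attached, starting from random weights (the small config numbers below are arbitrary toy values, not the original BERT-base sizes):

```python
from transformers import BertConfig, BertForPreTraining

# An untrained BERT built from a config: stacked transformer encoder
# layers, randomly initialised, ready to pretrain on your own corpus.
config = BertConfig(
    vocab_size=30522,       # match your tokenizer's vocabulary
    hidden_size=256,        # toy size; BERT-base uses 768
    num_hidden_layers=4,    # toy size; BERT-base uses 12 encoder layers
    num_attention_heads=4,
    intermediate_size=1024,
)

# BertForPreTraining carries both pretraining heads:
# an MLM head (token predictions) and an NSP head (sentence-pair label).
model = BertForPreTraining(config)
n_params = sum(p.numel() for p in model.parameters())
print(f"untrained BERT with {n_params:,} parameters")
```

From here you would feed it masked token batches plus sentence-pair labels, e.g. via the library's data collators and `Trainer`.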
Hi Karan, sure - it sort of depends on what dataset you are using and what task you are trying to achieve. If you look at my other video on multi-label classification then you can get a good template for doing any task with Hugging Face with any dataset! Let me know if I can help any further :)
You don't need to build BERT from scratch if you're only looking to re-train it with your own dataset. You can add a machine learning layer on top of BERT, where BERT is used for word embeddings and your layer(s) are used for adapting it to the dataset you provide.
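A minimal PyTorch sketch of that idea, assuming a frozen BERT encoder with one trainable linear layer on top (the class name and its parameters are made up for illustration; only the head's weights get updated during training):

```python
import torch
from transformers import BertConfig, BertModel

class BertWithHead(torch.nn.Module):
    # Wrap an (optionally pretrained) BERT encoder, freeze it, and train
    # only a small linear head on your own labels.
    def __init__(self, bert, num_labels=2):
        super().__init__()
        self.bert = bert
        for p in self.bert.parameters():
            p.requires_grad = False          # freeze the encoder
        self.head = torch.nn.Linear(bert.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask=None):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]    # [CLS] embedding per sequence
        return self.head(cls)                # logits for your labels

if __name__ == "__main__":
    from transformers import AutoModel
    # With pretrained weights (downloads the checkpoint):
    model = BertWithHead(AutoModel.from_pretrained("bert-base-uncased"))
```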
Amazing tutorial
so simple and straightforward