Dropout Layers in ANN | Code Example | Regression | Classification
- Published on Sep 12, 2024
- In this video, we explore the power of Dropout Layers with code examples for both Regression and Classification tasks.
Digital Notes for Deep Learning: shorturl.at/NGtXg
Classification Code - colab.research...
Regression Code - drive.google.c...
Paper link - jmlr.org/paper...
============================
Do you want to learn from me?
Check my affordable mentorship program at : learnwith.camp...
============================
📱 Grow with us:
CampusX's LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
👍If you find this video helpful, consider giving it a thumbs up and subscribing for more educational videos on data science!
💭Share your thoughts, experiences, or questions in the comments below. I love hearing from you!
Wow! You put so much effort into making these videos... and the best part is that all the content is available for free. Thanks a lot.
You are a legend Nitish. The best teacher, ever.
The main reason you have fewer subscribers is that people want to learn everything quickly and get a job, but that's not how it works... People who follow your channel definitely become the best data scientists.
you are absolutely right
Yes, you are 100 percent right... I have gone through all the videos, so I know how valuable they are!
Absolutely 👍
Other channels are quick and shallow
@@pravinshende.DataScientist did you get a job?
I am a PhD student looking for videos that explain the theory along with the intuition behind it, and I found your channel to be a blessing. Thank you for starting this series.
Help me understand a little bit... I am a UG student currently following this playlist to learn deep learning. I want to get a job; what should I do? The thought of learning all this again during my master's makes me feel like I am taking the wrong approach and should focus more on implementation only. Please guide me if you are free.
Good video... sir, one day you will be popular.
Excellent and deep explanation. Thank you.
Love you sir, I was on my third revision of the entire playlist.
I don't know why I always love these videos.
Thank you so much for the content sir, the teaching is really helpful and easy to understand.
Very good tutorial. Thank you very much for such a good tutorial.
God level teacher ❤️🤌🏻
Wah Wah Sir
(13:49) ->
I think if the value of p is higher, the model will underfit, since many nodes are dropped and the model will not learn the pattern in the data due to the absence of those nodes.
Isn't it, guys?
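The intuition above can be sketched with a minimal NumPy implementation of inverted dropout (an illustration, not code from the video): with drop probability p close to 1, almost all units are zeroed on every step, so very little of the layer's capacity survives, which is exactly the underfitting scenario the comment describes.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(a, p, training=True):
    """Inverted dropout: zero each unit with probability p and
    scale the survivors by 1/(1-p) so the expected activation is unchanged."""
    if not training or p == 0:
        return a
    mask = rng.random(a.shape) >= p   # keep each unit with probability 1-p
    return a * mask / (1.0 - p)

a = np.ones(10_000)          # a layer of 10,000 unit activations
out = dropout(a, p=0.9)      # aggressive dropout, as discussed in the comment

print(f"fraction of units zeroed: {np.mean(out == 0):.2f}")   # ~0.90
print(f"mean activation kept at:  {out.mean():.2f}")          # ~1.00
```

The 1/(1-p) rescaling keeps the expected activation the same, but it cannot restore the information carried by the ~90% of units that were dropped on that step.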
great video
Nice 👍
Thanks 🙏
Thanks a lot sir
Thank You Sir.
Good Explanation, Thanks
Sir, what happens if we use the dropout rate p in decreasing order across layers? For example, p = 0.75 for the 1st layer, p = 0.6 for the 2nd layer, and so on.
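A quick NumPy sketch of the schedule in this question (the rates 0.75/0.6/0.5 are the commenter's hypothetical values, not a recommendation): with p = 0.75 on the first hidden layer, only about a quarter of its units survive each step, so most low-level features are destroyed before deeper layers can use them. The usual heuristic from the dropout paper goes the other way: around 0.5 for hidden layers and a much smaller rate (around 0.2) near the input.

```python
import numpy as np

rng = np.random.default_rng(1)

def dropout(a, p):
    """Inverted dropout: drop each unit with probability p,
    scale survivors by 1/(1-p)."""
    mask = rng.random(a.shape) >= p
    return a * mask / (1.0 - p)

# The decreasing rates proposed in the comment.
rates = [0.75, 0.6, 0.5]
survival = []
for i, p in enumerate(rates, 1):
    h = dropout(np.ones(10_000), p)   # fresh activations for each layer
    frac = np.mean(h > 0)
    survival.append(frac)
    print(f"layer {i}: p={p:.2f} -> about {frac:.0%} of units survive each step")
```

So the heaviest pruning lands on the earliest layer, which is generally the opposite of what you want.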
When I set dropout(0.5), the training loss spikes higher at some points and the validation loss is slightly less than the training loss most of the time; in simple words, the usual scenario is reversed 😃
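That reversal is actually expected with dropout: the mask noise is applied only in training mode, while validation runs on the full (clean) network, so the reported training loss carries extra variance that the validation loss does not. A small NumPy illustration of the two modes (an assumption-level sketch, not the video's code):

```python
import numpy as np

rng = np.random.default_rng(2)

def dropout(a, p, training):
    """Inverted dropout; a no-op at evaluation time."""
    if not training:
        return a                      # dropout is disabled for validation
    mask = rng.random(a.shape) >= p
    return a * mask / (1.0 - p)

x = rng.normal(size=(1000, 64))                 # a batch of activations
train_out = dropout(x, p=0.5, training=True)    # noisy: feeds the training loss
eval_out = dropout(x, p=0.5, training=False)    # clean: feeds the val loss

# The train-mode output is a noisy version of the eval-mode output, so
# per-example error (and hence the reported training loss) runs higher.
print(train_out.std() > eval_out.std())   # True: extra variance from the mask
```

So validation loss sitting slightly below training loss is a normal signature of dropout, not a bug.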
I tried a regression problem with dropout of 0.5. My only observation is that there is not much difference between the training and testing loss, but the curve is smoother. Am I missing anything major here?
I saw somewhat different behaviour in terms of convergence rate while working with dropout. I was fine-tuning a CNN model on 6 lakh (600,000) images and around 10,000 classes, I applied dropout of around 0.5, and training got faster because of it. Any particular reason for that?
I don't know what's happening with your channel. I saw another YouTube channel with less content and fewer videos but a subscriber base of 70k; how is this possible? I think one of the reasons is that you post unorganized videos.
How is it unorganized? If you don't know how to type in English, then don't type and make yourself look like an idiot.
Nice explanation sir
But why is the loss increasing as the number of epochs increases?
Sir, if possible, can you share the notes?
Can you turn on the subtitles? Thank you
Please put the code/dataset in a CSV file.
What does input_dim mean here?
The number of input columns in the dataset.
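To make that answer concrete, a small sketch with a hypothetical dataset (the shapes here are made up for illustration): in Keras, `input_dim` on the first `Dense` layer is the number of feature columns, and it fixes the first dimension of that layer's weight matrix so the matrix product is well-defined.

```python
import numpy as np

# Hypothetical dataset: 100 samples, 8 feature columns.
X = np.zeros((100, 8))
input_dim = X.shape[1]        # = 8, the number of input columns

# Dense(32, input_dim=8) in Keras allocates a kernel of shape
# (input_dim, units), so X @ W works out to (samples, units):
W = np.zeros((input_dim, 32))
out = X @ W                   # shape (100, 32)
print(input_dim, out.shape)   # 8 (100, 32)
```

So if your CSV has 8 feature columns (excluding the target), the first layer gets input_dim=8.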