This guy will become a brand in aiml
Thanks, sir ji, for such videos! Please start a BERT/Transformer series!
I think you nailed it with the random forest analogy. That's just brilliant.
Amazing video again ...
Thanks for this simple explanation.
very simple and precise explanation. Thanks
Very good tutorial. Thank you very much for such a good tutorial.
Amazing, you are the king of deep learning!
Amazing explanation, sir ji!
@Nitish Singh. I really loved all your videos and the way you explain the concepts. Looking forward to more videos of this nature in the future. I am learning AI through your videos. Thanks for the amazing work you are doing. Lots of support from Bhutan!
Thank you so much sir 🙏🙏🙏
Very well explained.
It's amazing. I was thinking the Dropout layer is like a Random Forest with regression logic; maybe the intuition behind everything is similar, with slight differences.
But amazing video, really impressed!!
Wonderful videos all
Amazing❤❤
Amazing video, thank you, I learned so much!
Thanks for providing the paper
I wish I had watched this video before my interview 😥. The interviewer watched it before me.
sir you are a genius
Great Video
Thank you so much, Sir...
Thank You Sir.
finished watching
best video.
Brilliantly explained. May I please know what pen and drawing pad you use to write the mathematics and explanations? Thanks, and keep uploading videos; I am sure you will soon have millions of subscribers.
In the last topic, on how prediction works, you said p = 0.25 means only one node will be ignored during training in every epoch. But I think if you want to drop only one node from a layer (where the layer is made up of 4 nodes), the p value should be 0.75,
and this is the formula:
p = neurons to keep / total neurons
If I am wrong, please correct me, and if we are both right, please let me know 🙂
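For reference: in Keras-style APIs, the `rate` argument of `Dropout` is the fraction of units dropped, not kept, so the keep probability is 1 - rate. A minimal NumPy sketch of that convention (the helper `dropout_mask` is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_mask(n_units, p_drop):
    # Each unit is kept with probability 1 - p_drop (Keras-style rate = p_drop).
    return (rng.random(n_units) >= p_drop).astype(float)

# With 4 units and p_drop = 0.25, on average 1 of the 4 is dropped per pass.
masks = np.array([dropout_mask(4, 0.25) for _ in range(10_000)])
print("average units kept per pass:", masks.sum(axis=1).mean())  # ~3.0
```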
If the weights are learned while the nodes are available in the ratio 0.75, then during testing we use those same weight values to predict the output. Why multiply by the availability ratio of 0.75 during testing?
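On that test-time multiplication: in the original dropout paper, weights are scaled by the keep probability (here 0.75) at test time so the expected pre-activation matches what the network saw during training, when only ~75% of inputs were active. Most libraries instead use "inverted dropout", scaling activations up during training so test time needs no change. A rough NumPy sketch of both conventions (function names are mine, for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
p_keep = 0.75  # keep probability when the drop probability p = 0.25

def train_forward_classic(x, w):
    mask = (rng.random(x.shape) < p_keep).astype(float)
    return (x * mask) @ w            # no scaling during training

def test_forward_classic(x, w):
    return x @ (w * p_keep)          # scale the weights by p_keep at test time

def train_forward_inverted(x, w):
    mask = (rng.random(x.shape) < p_keep).astype(float)
    return (x * mask / p_keep) @ w   # scale UP during training instead

def test_forward_inverted(x, w):
    return x @ w                     # nothing special at test time
```

Either way, the test-time output matches the expectation of the stochastic training-time outputs.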
Great
Awesome teaching...
hidden gem
good
Thanks awesome
I have a request. Can you please make a video about Monte Carlo (MC) Dropout too?
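Until such a video exists, the core idea is: keep dropout active at inference and average several stochastic forward passes, using their spread as an uncertainty estimate. A minimal Keras-style sketch (the model `model` and pass count `T` are assumptions for illustration):

```python
import numpy as np

def mc_dropout_predict(model, x, T=50):
    # Run T forward passes with dropout ACTIVE (training=True) and aggregate.
    preds = np.stack([model(x, training=True).numpy() for _ in range(T)])
    return preds.mean(axis=0), preds.std(axis=0)  # mean prediction + uncertainty
```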
Good
Sir, please share the OneNote link for the notes.
Sir, please send the notes link.
Hello Sir, you mentioned in the video that the probability of each weight being present will be 0.75, given that p is 0.25. But since the nodes to drop out are selected randomly, isn't it possible, in the worst case, that one node is present all the time, e.g., present for all 100 epochs?
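It is possible in principle but vanishingly unlikely, because the mask is drawn independently each time. A quick back-of-the-envelope check in Python:

```python
# Probability that one given node is kept in ALL of n independent passes,
# when the keep probability is 0.75 (drop probability p = 0.25).
p_keep = 0.75
for n in (10, 100):
    print(f"P(kept in all {n} passes) = {p_keep ** n:.2e}")
# P(kept in all 10 passes)  = 5.63e-02
# P(kept in all 100 passes) = 3.21e-13
```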
Sir, how do you make these amazing tutorials? What do you use to write on the screen? Just curious.
With a pen tablet.
addicted
What if some weight ends up appearing multiple times?
💚
Is it more accurate to say "for every forward and backward pass" instead of "for each epoch"? One epoch may consist of several mini-batches. Also, is it common to apply dropout to the input layer? Thanks...
I guess we usually don't apply dropout to the input layer.
Yeah, you are right.
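To make the first point concrete, here is a tiny sketch showing that a fresh dropout mask is sampled on every forward pass (i.e., every mini-batch), not once per epoch (the setup is illustrative, not from the video):

```python
import numpy as np

rng = np.random.default_rng(0)
p_drop = 0.25

for batch in range(3):  # three mini-batches within a single "epoch"
    mask = (rng.random(4) >= p_drop).astype(int)
    print(f"mini-batch {batch}: mask = {mask}")
# Each mini-batch typically prints a different 0/1 mask over the 4 units.
```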
Off-topic: do you have any connection with the Bengali language? I am Bengali, and for some reason I think you do. Just curious. :)
Revising concepts.
August 15, 2023😅
1st
finished watching
Now sleep, sweet dreams.
@umersiddiqui9580 with excitement