Minor correction: you said ReLU is discontinuous at x = 0. ReLU is in fact continuous at x = 0; it is just not differentiable there.
26:50 slight correction: ReLU is continuous at input zero, but not differentiable there.
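A quick numeric sketch of that point (plain Python, illustrative only): ReLU's values just left and right of 0 both approach ReLU(0), so it is continuous there, but the one-sided slopes disagree, so no derivative exists at exactly 0:

```python
def relu(x):
    # ReLU activation: max(0, x)
    return max(0.0, x)

h = 1e-8
# Continuity at 0: values just left and right of 0 approach relu(0) = 0.
print(relu(-h), relu(0.0), relu(h))   # ~0.0, 0.0, ~1e-8

# Differentiability at 0: the one-sided slopes disagree (0 from the left,
# 1 from the right), so the derivative at x = 0 does not exist.
print((relu(0.0) - relu(-h)) / h)     # 0.0
print((relu(h) - relu(0.0)) / h)      # 1.0
```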
Honestly, your explanation is the best among all. I truly salute you, sir, and your mother for raising such a talented, genius-minded son. ❤❤❤❤❤❤
Salaam from Pakistan
One of the best channels for data science
Thanks
You just nailed this topic too. Awesome!
Thanks, sir. Awesome explanation!
Superb, the best video series I have come across so far. Is it possible to also upload the OneNote files, sir? Thank you so much again!
Hi Nitish bro, please start an MLOps series and a stats playlist.
Learning a lot!
Thanks for the wonderful explanation!
Awesome explanation sir!!!
Nitish sir, we request you to make an interview series as soon as possible.
Placements for the upcoming batch would start in the first week of July.
Hey brother, did you get a job?
❤ Sir, you are amazing 😍
Sir, is there any way I can get the PDFs of all these lectures?
Thank You Sir.
Fantastic explanation!
Thank you, sir.
Best of the best ❤❤❤❤❤
Top class content
Thanks, sir.
Sir, please start a new playlist on time series analysis.
Sir, just one doubt: wouldn't ELU also lead to a vanishing gradient problem if many of the z values in my network are strongly negative? For z far below zero, the gradient α·e^z gets very close to zero.
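In case it helps, a minimal sketch (assuming α = 1, a typical default) of how the ELU gradient behaves for negative z: it does shrink toward zero for strongly negative inputs, but unlike ReLU's flat negative region it never becomes exactly zero, so neurons can still recover:

```python
import numpy as np

alpha = 1.0  # assumed value of alpha for this illustration

def elu_grad(z):
    # Derivative of ELU: 1 for z > 0, alpha * e^z for z <= 0.
    return np.where(z > 0, 1.0, alpha * np.exp(z))

z = np.array([-1.0, -5.0, -20.0])
print(elu_grad(z))  # [3.68e-01 6.74e-03 2.06e-09] -- small, but never exactly 0
```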
Do we really need a connection from b1 to b2? Couldn't both be the same? There seems to be some confusion here.
Finished watching.
Sir, you mentioned the remaining activation functions. Please share their details.
The full form of ReLU contains the word "linear", so why do we call it a non-linear function?
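One way to see it (a minimal sketch in plain Python): a linear function must satisfy f(a + b) = f(a) + f(b), and ReLU violates this. The "rectified" part makes it only piecewise linear, which is non-linear overall:

```python
def relu(x):
    # ReLU activation: max(0, x)
    return max(0.0, x)

a, b = 3.0, -5.0
print(relu(a + b))        # relu(-2.0) = 0.0
print(relu(a) + relu(b))  # 3.0 + 0.0 = 3.0 -> additivity fails, so not linear
```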
You look good with this haircut.
Awesome
Will you upload again?
@14:45, Nitish bhai, a negative input or negative initial weight values might also cause a negative output, right?
Yes, negative weight initialization, and obviously a negative input.
You missed the softmax function?
Thanks, sir ji!
Thanks!
best
Can anyone correct me? For Leaky ReLU, if z >= 0 the output is z. For example, if z becomes 0, the output is 0 again, so wouldn't Leaky ReLU also run into the dying ReLU problem?
It's extremely rare for z to be exactly 0; it will either be positive or slightly negative, and in 99.99% of cases it will not be exactly 0. Also, the dying ReLU problem comes from the gradient being zero across the entire negative region, whereas Leaky ReLU keeps a small slope there, so neurons can still recover.
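For reference, a minimal Leaky ReLU sketch (assuming the common slope α = 0.01): the gradient stays nonzero everywhere on the negative side, which is exactly what prevents the dying ReLU problem:

```python
def leaky_relu(z, alpha=0.01):
    # Leaky ReLU: z for z >= 0, alpha * z for z < 0.
    return z if z >= 0 else alpha * z

def leaky_relu_grad(z, alpha=0.01):
    # Slope is 1 on the positive side and alpha on the negative side.
    # (Frameworks differ on which value they report at exactly z = 0.)
    return 1.0 if z > 0 else alpha

print(leaky_relu(0.0))        # 0.0 -- same output as ReLU at this one point
print(leaky_relu_grad(-2.0))  # 0.01 -- nonzero, so the neuron keeps learning
```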
Nitish, could you please tell us which graphics tablet you use for note-making?
Galaxy Tab S7+
@campusx-official The iPad Air is a better option for 50k :)
One correction: at 27:07, for x < 0 the derivative of ELU is α·e^x, not ELU(x) + α.
It's the same thing.
@zarifhossain5650 It's not the same thing, man. What are you even talking about?
They are the same, bro. Just substitute the value of ELU(x) for x < 0: ELU(x) + α = α(e^x − 1) + α = α·e^x.
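A quick numeric check (assuming α = 1 for illustration) that the two forms agree for x < 0, since ELU(x) + α = α(e^x − 1) + α = α·e^x:

```python
import numpy as np

alpha = 1.0  # assumed alpha for illustration
x = np.linspace(-5.0, -0.01, 100)   # negative inputs only

elu = alpha * (np.exp(x) - 1)       # ELU(x) for x < 0
grad_a = alpha * np.exp(x)          # derivative written as alpha * e^x
grad_b = elu + alpha                # derivative written as ELU(x) + alpha

print(np.allclose(grad_a, grad_b))  # True -- the two expressions are identical
```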
20:00
9:35
Please put the language you speak in the title so we don't waste time.
Hindi
Your title is in English, so when somebody searches for a video in English, your video comes up. If you enabled translation, spoke in English, or said at the beginning that it is in Hindi, we wouldn't be wasting time. Otherwise we will think that you don't care about others.
Thanks, sir.