Cancelled my Netflix and Hotstar subscriptions. Binge-watching CampusX these days! A magnificent 100/10 rating!
Me too 😂 Sir is really an amazing teacher. He speaks with such clarity and in such an interesting way, it's no less than a movie 😁 May God bless him for his selfless work 🥰
Exactly 😊 @DebjaniMajumder-q5p
😅😅😅
Me too
Started with a small doubt about the hard margin problem in SVM and ended up watching the whole ML playlist at 2x in 2 days. It connected all the missing dots and missed concepts like a story.
I am pursuing an MS in data science, yet I never found such a good connection between deep learning and ML, or such a concise explanation. Hats off. 🙌🙌
Day 4, and I am in the middle of revising deep learning concepts with such a good explanation.
Bro, how did you complete the whole ML playlist at 2x in 2 days 🤔
I am very happy to see such a detailed derivation. At 41:25 it should be O12; by typo you wrote O21. Please have a look.
Sir, please complete the deep learning series. The way you teach is magnificent; people will find this treasure soon. Just keep doing the good work.
Yeah bro, but I am stuck on the NLP playlist. I have been following bhaiya's videos for the last year, and I don't understand other teachers' teaching; that's why I commented.
Sir, your 1-hour video feels like 10 minutes, and it's as satisfying as watching an interesting web series. Thanks a lot to the best teacher 🙏
I was not able to understand this from my professors at fancy universities in the UK. A big salute to you, Sir.
Wow, beta!
He's got money, bro.
Nitish Sir, can't thank you enough for creating this masterpiece of a playlist for free. It's the best resource. Will always be indebted to you.
He is the best teacher. His way of teaching is outstanding and better than all the universities of Canada.
I was so scared to learn backprop, but his video makes it so easy and so good. Excellent! 1000/10
Sir, what are you made of?? Inch by inch, point by point, you explained this concept just like baby footsteps... Sir, in my entire 30+ years of learning journey I have never seen such a teacher/mentor... I mean, YOU ARE GOD!! :-)
You must have grown old by now.
Niteshji... I am speechless; I can't appreciate your efforts for us enough...
I never thought that even in higher-level studies you could find such good teachers...
Your work is awesome... and unique...
Excellent Sir Excellent!
Currently I am pursuing my PhD and was here for a quick recap, and Sir you made my day. Keep on creating such wonderful content.
He is the best teacher. His way of teaching is outstanding and better than all the universities of Pakistan.
Best Data science teacher in the world. Period!!
There are no words for such a brilliant explanation... a lot of thanks, sir.
Superlative explanation. I have gone through many videos on backprop; this is the best explanation by far. Thanks, Nitish ji.
Big fan of this deep learning series and the way you explain. Please upload daily in this series so I can finally get an internship 😂😂😂
I have seen all the YouTube channels for backpropagation, but your explanation is the best.
From machine learning to deep learning,
great and fully detailed stuff.
You are the BEST teacher on YouTube.
See how things like derivatives and partial derivatives work in practice to solve real-world problems. This man shows us by explaining them so easily. A big like to you, sir. ❤
Never seen such a simple explanation for back propagation. You are an exceptionally gifted teacher. Thank you for your hard work and clear teaching Nitish Sir.
Lectures are just great, I mean just great... At 41:25 and 42:02, in place of O21 there should be O12. By the way, you can ignore this small thing; I was working it out on my own, so I found it and wanted to comment something, and finding even this little error is so hard because your teaching is just mind-blowing. Thank you for being there, and happy day! I will continue... Sayonara 💃💃💃💃
Yes, I also caught this.
I was searching for this comment.
The way you explained the whole thing is literally awesome! With this video I am sure every student will understand backpropagation clearly and in an insightful way.
Best explanation of backpropagation so far; a truly good way of teaching.
I found your video during my final-year project, sir. Since then I have been following and learning from you. There are still many things to learn.
Your lecture is not boring; it's like a story.. ❤❤❤❤❤
One of the most amazing videos! Touching the neurons of DL.
CampusX is an Addiction🙏😍.
The greatest explanation of backpropagation and the chain rule. Thanks a lot.
Very well explained. I am from a Java background and have understood the deep learning concepts very well so far.
Just one word, Fantastic! Thanks a lot!
Best and clearest compilation of all the important loss functions. Thank you for this video.
Sir, even after having studied all this elsewhere, learning it from you has its own joy. Thanks a lot for this great content.
Also, a request: please make some videos in the future on transformers using Hugging Face; I really need this. Thanks.
Mind-blowing. Now everything is clear in backpropagation.
Best video on backpropagation, sir. Thanks to you.
I like your teaching method... thank you, sir. You explain in the best way... you give an answer to every "why" question, and I like it. Thanks again!!
You are going to be famous soon, remember my words. The way you explain things takes so much effort ❤
You are a legend, sir. Seriously, you made things so easy to understand. Hats off, and thanks for all the knowledge.
Thank you so much, sir, for such a clear explanation. The way you explained backpropagation is truly awesome.
Great ,Hats off to you Sir ,your teaching style is really awesome 🙏🙂
Absolutely amazed by your teachings!!
This man takes you deep into every topic. Hats off 🎉🎉🎉
your explanation is amazing, sir 🖤
Wow... really, sir, hats off to you... what an explanation ❤❤❤❤❤❤❤
One of the best explanations, sir.
Amazing video and a nice way of explaining. Thanks a lot, sir, for this amazing video.
One of the best teachers🙏🙏
That's the reason why I am following nitish sir
Thanks for this much clarity ! Very very grateful to your efforts
Thanks, sir, for this excellent content. Now I understand backpropagation.
Bro. Just Incredible.😍
Thanks for the wonderful explanation!
You made my day, sir. My heart is absolutely delighted 😊
Now that's what you call a lecture from first principles.
Awesome explanation... I don't think anyone else can explain this so perfectly. Hats off.
Sir, nice explanation of backpropagation, but I request you to please upload videos regularly.
Bhaiya, two requests: the first is "Our Community" and the second is the "NLP Playlist".
My good luck! I finally found the gold mine ❤
YouTube's best video, even better than Andrew Ng's.
What an explanation.
Thanks a lot 😀
You are exceptional, Great teaching skills. Thanks sir
Niteshji, I'm heartily grateful and thankful to you for teaching us. While teaching the derivatives used to find the update values of the weights and biases, if you had used real-valued examples for the calculation, it would have been even more beneficial for our understanding.
Great explanation. Thank you so much for this. Keep up the great work.
Very well explained.
huge effort. amazing
Thanks, sir, for the simple and wonderful explanation.
Damn, that was too easy... you made it look so simple that anyone can understand. Again, thank you so much for your hard work.
@CampusX I have a question regarding the calculation at 43:16 in your video. When determining the derivative of O11 with respect to W1_11, you seem to be using the pre-activation value. However, shouldn't O11 be the post-activation output (i.e., after applying the activation function)? Thank you very much for your efforts.
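For anyone wondering the same thing: if O11 is the post-activation output, the activation's derivative enters through the chain rule. Here is a minimal numeric sketch with a sigmoid activation and hypothetical toy values (`x`, `w`, `b` are made up, not from the video), checked against a finite difference:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical toy values for one input, one weight, one bias
x, w, b = 0.5, 0.8, 0.1

z = w * x + b        # pre-activation value
o = sigmoid(z)       # post-activation output (the "O11" in question)

# Chain rule: dO/dw = sigmoid'(z) * dz/dw = o * (1 - o) * x
analytic = o * (1 - o) * x

# Numerical check via central finite difference
eps = 1e-6
numeric = (sigmoid((w + eps) * x + b) - sigmoid((w - eps) * x + b)) / (2 * eps)

print(abs(analytic - numeric) < 1e-8)
```

If the layer were linear (no activation), the `o * (1 - o)` factor would drop out and the derivative would just be `x`, which may be why the pre-activation form appears in some steps.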
Incredible!!!!!!!Hats Off to you...
Best explanation ever. Thanks a thousand times for your effort to explain such a complex thing to us in such an easy way :)
Well explained sir 🙏🏻
No one has ever explained backpropagation like you 🔥
Sir! you are great!!!
unparalleled explanation. Amazing
What an explanation!!!!!!!!!!!!!
amazing explanation for backprop , thanks sir 🙌🏻
Thank you sir 😇
Thank You Sir.
Very good tutorial. Thank you very much for such a good tutorial.
Thank you so much for explanation sir 🙏
There is a mistake at 42:02: it should be O12 instead of O21 in the y-hat equation.
25:55 What is the reason behind finding the gradient?
48:30 What is back propagation algorithm?
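In short: backpropagation applies the chain rule to compute the gradient of the loss with respect to each weight, and gradient descent then updates the weights in the opposite direction. A minimal sketch for a single sigmoid neuron on one training example (all values are hypothetical toy numbers, not the video's):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One sigmoid neuron trained on a single (x, y) pair -- toy values
x, y = 1.0, 0.0
w, b, lr = 0.5, 0.5, 1.0

losses = []
for _ in range(50):
    # Forward pass
    z = w * x + b
    y_hat = sigmoid(z)
    losses.append((y - y_hat) ** 2)

    # Backward pass: chain rule dL/dw = dL/dy_hat * dy_hat/dz * dz/dw
    dL_dyhat = -2 * (y - y_hat)
    dyhat_dz = y_hat * (1 - y_hat)
    dL_dw = dL_dyhat * dyhat_dz * x
    dL_db = dL_dyhat * dyhat_dz * 1

    # Gradient descent update
    w -= lr * dL_dw
    b -= lr * dL_db

print(losses[0], losses[-1])  # the loss shrinks over the iterations
```

The same idea extends to deeper networks: the chain rule just gains more factors, one per layer between the loss and the weight.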
Just awesome sir🎉❤
First teacher in my life , whose thinking exactly like me to understand any topics in terms of only what why and How.
Thanks for your way of teaching sir.
Thank you so much sir 😊❤
Wonderful explanation sir.
excellent sir
Thank you so much, sir, for delivering a great session.
Guys, please watch the gradient descent video first, because it is very important.
Whattttaa great explanation, thank you Sir! 🙏❤
Legend of Deep learning
Change the channel name to The Legend 🤯
Crystal clear concept
Thank you sir you explained in detail 🔥🔥🔥
@28:00 -- Chain Rule Of Differentiation (Computing Gradient of the Loss Function)
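That chain rule step can be sanity-checked numerically. A minimal sketch for a tiny 1-1-1 sigmoid network (hypothetical toy values and my own variable names, not the video's notation), comparing the hand-derived gradient against a finite difference:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical toy values: one input, one hidden unit, one output
x, y = 1.0, 0.5
w1, w2 = 0.4, 0.7

def forward(w1, w2):
    o1 = sigmoid(w1 * x)       # hidden activation
    y_hat = sigmoid(w2 * o1)   # network output
    return o1, y_hat, (y - y_hat) ** 2

o1, y_hat, loss = forward(w1, w2)

# Chain rule, link by link, from the loss back to w1:
# dL/dw1 = dL/dy_hat * dy_hat/dz2 * dz2/do1 * do1/dz1 * dz1/dw1
grad_w1 = (-2 * (y - y_hat)) * (y_hat * (1 - y_hat)) * w2 * (o1 * (1 - o1)) * x

# Central finite-difference check of the same gradient
eps = 1e-6
num = (forward(w1 + eps, w2)[2] - forward(w1 - eps, w2)[2]) / (2 * eps)
print(abs(grad_w1 - num) < 1e-8)
```

This kind of gradient check is also a standard way to debug a hand-written backprop implementation.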
such a clear explanation!!
Looks like a good series; any chance you can do some content in English?
Amazing explanation 🖤
And sir, you are my guru. JazakAllah for bringing me this far...
Excellent explanation !!!