Beautifully explained. I watched the video for a visual explanation of Stratified K-Fold. You nailed it in your explanation. Thank you.
Welcome
Do like share and subscribe
I watched the video for a visual demonstration of cross-validation and found it to be exceptionally well explained and easily comprehensible. You did an excellent job in your explanation. Thank you very much.
Welcome
Do like share and subscribe
Almost everyone in my college studies from your videos, thank you sir!
Welcome
Do like share and subscribe
In which college are you studying?
@@MaheshHuddar sir SRM
Very concise yet detailed. Thanks so much, you explained in 9 minutes what a 2-hour video can't. Thanks.
Welcome
Do like share and subscribe
Are K-fold and leave-one-out the same or not? 😊
Very informative video sir☺️
Amazing, thanks a lot for clearing up the concepts so well.
Welcome
Do like share and subscribe
Crisp. To the point. Awesome
Glad you liked it!
Do like share and subscribe
Sir, I could not differentiate between k-fold and leave-one-out; both seemed to be the same.
Let's assume you have 10 samples.
In K-fold, say we choose k = 3,
which means we create 3 folds and get 3 validation splits; the training and test samples are different in each of the 3 splits.
Leave-one-out works like a window of size 1 sliding over the data: each sample takes its turn as the test set.
Since we have 10 samples in total, leave-one-out creates 10 validation splits,
like:
the 1st split trains on 9 samples and tests on the 1 sample left out,
the 2nd split trains on a different set of 9 samples and tests on the next sample left out,
... and so on, until every sample has been used exactly once for testing.
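If you want to see the difference in code, here is a minimal sketch (assuming scikit-learn and NumPy are installed; the 10-sample array is just an illustration, not data from the video) that prints the train/test indices for 3-fold CV and for leave-one-out:

```python
import numpy as np
from sklearn.model_selection import KFold, LeaveOneOut

X = np.arange(10).reshape(-1, 1)  # 10 samples, one feature each

# K-fold with k = 3: three splits, fold sizes as equal as possible
print("3-fold cross-validation:")
for i, (train_idx, test_idx) in enumerate(KFold(n_splits=3).split(X), start=1):
    print(f"  split {i}: train={train_idx.tolist()} test={test_idx.tolist()}")

# Leave-one-out: 10 splits, each with 9 training samples and 1 test sample
print("Leave-one-out:")
for i, (train_idx, test_idx) in enumerate(LeaveOneOut().split(X), start=1):
    print(f"  split {i}: train size={len(train_idx)} test={test_idx.tolist()}")
```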
Thank you @@HarshPatel-iy5qe
My exam is in the morning, so I'm quickly studying from this ..... Best explanation
Thank You
Do like share and subscribe
In which university are you studying?
@@MaheshHuddar RGPV
Hahaha, deadliner, hope you made the best of the examination already.
Thank you so much sir, I want the notes, can you please send them to me? 😊
Thank You
Thank you 🎉
Awesome, you did a good job explaining the topic.
Thank you
Do like share and subscribe
Good explanation sir
Thanks and welcome
Do like share and subscribe
Thank you Sir ❤
Most welcome
Do like share and subscribe
Great video
Thank You
Do like share and subscribe
How to select the optimum value of K?
Do we train the same model in all folds?
Yes
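To make that concrete: the same type of model is used, but a fresh instance is fitted on each fold's training data and scored on that fold's held-out data. A minimal sketch (assuming scikit-learn is installed; the iris dataset, logistic regression, and 5 folds are illustrative choices, not taken from the video):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold

X, y = load_iris(return_X_y=True)

scores = []
for train_idx, test_idx in StratifiedKFold(n_splits=5).split(X, y):
    model = LogisticRegression(max_iter=1000)   # a fresh instance for every fold
    model.fit(X[train_idx], y[train_idx])       # trained only on this fold's training part
    scores.append(model.score(X[test_idx], y[test_idx]))  # accuracy on the held-out fold

print("Per-fold accuracy:", [round(s, 3) for s in scores])
```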
What did you mean by the term “example”? A data point/row?
Data point
Thanks
Welcome
Do like share and subscribe
Thank you!
Welcome
Do like share and subscribe
Good tutorial
Thank You
Do like share and subscribe
Wonderful sir
Thank You
Do like share and Subscribe
@@MaheshHuddar already subscribed
@@MaheshHuddar Sir, is stratified k-fold applied in genetic algorithms?
Sir, please give me the link to the PPT.
Where is the Python code, sir?