Machine Learning - Bias And Variance In Depth Intuition | Overfitting Underfitting

  • Published 30 Sep 2024

Comments • 455

  • @heartsbrohi9394
    @heartsbrohi9394 4 years ago +189

    Good tutorial. My thoughts below (hope it adds to someone's understanding):
    We perform cross-validation to make sure the model has a good accuracy rate and can be used for prediction on unseen/new (test) data. To do so, we split our dataset properly into train and test data, for example 80% for training and 20% for testing the model. This can be performed using train_test_split or K-fold (K-fold is mostly used to avoid under- and overfitting problems).
    A model is considered good when it gives high accuracy on training as well as testing data. Good accuracy on test data means the model will have good accuracy when making predictions on new or unseen data, for example data not included in the training set.
    Good accuracy also means that the value predicted by the model will be very close to the actual value.
    Bias will be low and variance will be high when the model performs well on the training data but poorly on the test data. High variance means the model cannot generalize to new or unseen data. (This is the case of overfitting.)
    If the model performs poorly (meaning it is less accurate and cannot generalize) on both training and test data, it has high bias and high variance. (This is the case of underfitting.)
    If the model performs well on both test and training data, meaning predictions are close to actual values for unseen data and accuracy is high, then bias will be low and variance will also be low.
    The best model must have low bias (low error rate on training data) and low variance (it can generalize and has a low error rate on new or test data).
    (This is the case of the best-fit model.) So always aim for low bias and low variance in your models.
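A minimal sketch of the split-and-validate workflow described in this comment, in pure Python (illustrative only; the fold logic mirrors what sklearn's train_test_split and KFold do, and the function name kfold_indices is invented here):

```python
def kfold_indices(n, k):
    """Yield (train_idx, test_idx) pairs: each of the k folds serves as the
    held-out test set exactly once; the rest is used for training."""
    base, extra = divmod(n, k)
    indices = list(range(n))
    start = 0
    for i in range(k):
        size = base + (1 if i < extra else 0)
        test = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, test
        start += size

# 5-fold split over 10 samples: every sample is tested exactly once,
# and each training set holds the remaining 80% of the data.
for train, test in kfold_indices(10, 5):
    print(train, test)
```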

    • @karthikparanthaman634
      @karthikparanthaman634 4 years ago +2

      Wonderful summary!

    • @farhathabduljabbar9879
      @farhathabduljabbar9879 3 years ago +8

      You should probably write articles because you are good at summarising concepts!
      If you have one, please do share!

    • @adegboyemoshood809
      @adegboyemoshood809 3 years ago

      Great

    • @roshanid6523
      @roshanid6523 3 years ago +1

      Very well written 👍🏻
      Thanks for sharing
      👍🏻 Consider writing blogs

    • @AnandKumar-to6ez
      @AnandKumar-to6ez 3 years ago +1

      Really very nice and well written. After watching the video, if we go through your summary, it's a stamp on our brains. Thanks to both of you for your efforts.

  • @sandipansarkar9211
    @sandipansarkar9211 3 years ago +225

    This video needs to be watched again and again. Machine learning is nothing but a proper understanding of overfitting and underfitting. Watching for the second time. Thanks Krish

    • @adipurnomo5683
      @adipurnomo5683 3 years ago +4

      Agreed!

    • @batman9937
      @batman9937 3 years ago +7

      This is what they asked me in the OLA interview, and the interviewer covered great depth on this topic only. It's pretty fundamental to ML. Sad to report they rejected me though.

    • @ashishbomble8547
      @ashishbomble8547 2 years ago

      @@batman9937 hi man, please help by sharing what other questions they asked.

    • @carti8778
      @carti8778 2 years ago +2

      @@ashishbomble8547 buy the book "Ace the Data Science Interview" by Kevin Huo and Nick Singh.

  • @emamulmursalin9181
    @emamulmursalin9181 3 years ago +32

    At 06:08 it is said that for underfitted data the model has high bias and high variability. To my understanding, that is not correct.
    Variance reflects the complexity of a model, i.e. its capacity to capture the internal distribution of the data points in the training set. When variance is high, the model will be fitted to most (even all) of the training data points. That results in high training accuracy and low test accuracy.
    So in summary:
    When the model is overfitted: low bias and high variance
    When the model is underfitted: high bias and low variance
    Bias: the INABILITY of the model to fit the training data
    Variance: the complexity of the model, which helps it fit the training data
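The distinction this comment draws can be demonstrated with a toy experiment (a pure-Python sketch, not from the video; the "predict the mean" and "memorize the training set" models are invented stand-ins for under- and overfitting):

```python
import random
import statistics

random.seed(0)

def make_data(n):
    # noisy samples of y = x^2
    return [(x, x * x + random.gauss(0, 0.05))
            for x in (random.uniform(0, 1) for _ in range(n))]

train, test = make_data(20), make_data(20)

# Underfit stand-in: always predict the training mean (high bias).
mean_y = statistics.mean(y for _, y in train)
def underfit(x):
    return mean_y

# Overfit stand-in: memorize the training set via 1-nearest-neighbour lookup
# (near-zero training error, but sensitive to which points were sampled).
def overfit(x):
    return min(train, key=lambda p: abs(p[0] - x))[1]

def mse(model, data):
    return statistics.mean((model(x) - y) ** 2 for x, y in data)

print(f"underfit: train={mse(underfit, train):.4f} test={mse(underfit, test):.4f}")
print(f"overfit:  train={mse(overfit, train):.4f} test={mse(overfit, test):.4f}")
```

The overfit model's training error is far below its test error (the train/test gap is the "high variance"), while the underfit model is roughly equally bad on both.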

    • @tejanarora1221
      @tejanarora1221 2 years ago +2

      yes bro, you are correct

    • @rohitkumarchoudhary
      @rohitkumarchoudhary 2 years ago

      I also have the same doubt. @Krish Naik sir, please have a look at it.

    • @swapnilparle1391
      @swapnilparle1391 2 years ago +1

      But underfitting is supposed to have low accuracy on training data, no? Confusing!!

    • @prachinainawa3055
      @prachinainawa3055 2 years ago +1

      Have I learned the wrong definitions of bias and variance from Krish sir's explanation? Now I am confused 😑

    • @swapnilparle1391
      @swapnilparle1391 2 years ago

      @prachi... not at all, the concept is the same in the end

  • @rafibasha1840
    @rafibasha1840 3 years ago +37

    Hi Krish, thanks for the explanation. At 6:02 it should be high bias and low variance in the case of underfitting

    • @mohitzen
      @mohitzen 2 years ago +4

      Yes, exactly, I was looking for this comment

    • @singhvianuj
      @singhvianuj 2 years ago

      Amazing video by Krish. Thanks for pointing this out. @Krish Naik please make a note of this

    • @shailzasharma5619
      @shailzasharma5619 2 years ago

      yess!!!

    • @rohitkumar-gi8bo
      @rohitkumar-gi8bo 1 year ago

      yess

    • @ramyabc
      @ramyabc 1 year ago

      Exactly! I searched for this comment :)

  • @westrahman
    @westrahman 4 years ago +22

    XGBoost: the answer can't be simple, but when dealing with high bias you do better feature engineering and decrease regularization, so in XGBoost we increase the depth of each tree and use other techniques to minimize the loss... so you can conclude that if proper parameters are defined (including regularization etc.) it will yield low bias and low variance

  • @jiviteshvarshney3644
    @jiviteshvarshney3644 9 months ago +2

    6:00 Small correction in your video.
    Underfitting - High Bias & Low Variance
    Overfitting - Low Bias & High Variance

  • @sauravb007
    @sauravb007 4 years ago +40

    XGBoost should have low bias & low variance!

    • @brahmihassane8499
      @brahmihassane8499 3 years ago

      Not really, it will depend on how you tune the hyperparameters of the model. For this reason it is important to tune a model in order to find a compromise that ensures low bias (the capacity of the model to fit a theoretical function) and low variance (the capacity of the model to generalize)

  • @DS_AIML
    @DS_AIML 4 years ago +49

    Underfitting : High Bias and Low Variance
    OverFitting : Low Bias and High Variance
    and Generalized Model : Low Bias & Low Variance.
    Bias : Error from Training Data
    Variance : Error from Testing Data
    @Krish Please confirm

    • @videoinfluencers3415
      @videoinfluencers3415 4 years ago +1

      I am confused...
      Does it mean that an underfitted model has high accuracy on testing data?

    • @akashhajare223
      @akashhajare223 4 years ago +8

      Underfitting : High Bias and HIGH Variance

    • @kiran082
      @kiran082 4 years ago +4

      @@videoinfluencers3415 I mean an underfitting model has low accuracy on both testing and training data, and the difference between training accuracy and test accuracy is very small; that's why we get low variance and high bias in underfitting models.

    • @hiteshchandra156
      @hiteshchandra156 4 years ago +1

      You are correct bro, I checked on Wikipedia and in some other sources too.
      @Krish Please Confirm.

    • @sindhuorigins
      @sindhuorigins 4 years ago +2

      If it makes it any clearer for other learners, here's my explanation...
      BIAS is the set of simplifying assumptions made by a model to make the target function (the underlying function that the ML model is trying to learn) easier to learn.
      VARIANCE refers to the changes in the estimate of the target function that occur if the dataset is changed when implementing the model.
      Considering the linear model in the example, it makes the assumption that the input and output are related linearly, causing the target function to underfit and hence giving HIGH BIAS ERROR.
      But the same model, when used with similar test data, will give quite similar results, hence giving LOW VARIANCE ERROR.
      I hope this clears the doubt.
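The "changes to the estimate if the dataset is changed" idea in this reply can be made concrete (a pure-Python sketch, not from the video; the data-generating process y = 2x + noise and the helper names are invented for illustration):

```python
import random

random.seed(1)

def sample_dataset(n=30):
    # a fresh draw from the same underlying process: y = 2x + noise
    xs = [random.uniform(-1, 1) for _ in range(n)]
    ys = [2 * x + random.gauss(0, 0.1) for x in xs]
    return xs, ys

def fit_line(xs, ys):
    # ordinary least squares for y = a*x + b
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Refit on many independently drawn datasets and look at the spread of the
# prediction at a fixed point x0: that spread is the "variance" described above.
x0 = 0.5
preds = [a * x0 + b for a, b in (fit_line(*sample_dataset()) for _ in range(200))]
mean_pred = sum(preds) / len(preds)
spread = (sum((p - mean_pred) ** 2 for p in preds) / len(preds)) ** 0.5

print(f"mean prediction at x0: {mean_pred:.3f} (true value is 1.0)")
print(f"spread across datasets: {spread:.3f}")
```

Because a straight line is the right model family here, refitting on new data barely moves the prediction: low variance, and (since the mean prediction sits near the true value) low bias.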

  • @vatsal_gamit
    @vatsal_gamit 4 years ago +18

    at 6:10 you made it all clear to me in just 2 lines!! Thank you for this video :)

  • @adharshga9806
    @adharshga9806 3 years ago +2

    Hi sir, I have a doubt. You said that in underfitting the model will have high bias and high variance @ 6:00. But many of the articles that I have read say that underfitting is when the model has high bias and low variance. Which is correct? Please clear my doubt

    • @snrmedia8965
      @snrmedia8965 2 years ago

      Right, high bias and low variance

  • @ravibhat2849
    @ravibhat2849 4 years ago +21

    Beautifully explained.
    But in underfitting, the model shows high bias and low variance, not high variance.

    • @krishnaik06
      @krishnaik06  4 years ago +17

      Yes, you are right... I made a minor mistake

    • @namansinghal3685
      @namansinghal3685 4 years ago

      @@krishnaik06 But then sir, you said bias is error, and in underfitting the training data error is low... so should it be low bias?

    • @ravibhat2849
      @ravibhat2849 4 years ago

      @@namansinghal3685 when the model has high bias, it misses out on certain observations, so the model will underfit.

    • @jitenderthakur697
      @jitenderthakur697 4 years ago

      @@namansinghal3685 in the case of underfitting, training error is high, not low

    • @gourav.barkle
      @gourav.barkle 3 years ago +1

      @@krishnaik06 You should pin this comment

  • @premprasad3511
    @premprasad3511 3 months ago +1

    How do you know so much? You talk about machine learning as if you were born with all that knowledge! God bless you with more knowledge and intelligence so that you can share it with more people!

  • @shashankverma4044
    @shashankverma4044 4 years ago +15

    This was my biggest doubt and you clarified it in so easy terms. Thank you so much Krish.

  • @MANISHKUMAR-c2d3c
    @MANISHKUMAR-c2d3c 1 year ago +1

    For underfitting the condition should be high bias and low variance, which is stated as high bias and high variance in this video

  • @nidhimehta9278
    @nidhimehta9278 3 years ago +4

    Shouldn't the underfitting condition have low variance?

    • @smit9803
      @smit9803 3 years ago +2

      No. You can think of low variance as how well your model generalizes to real-world data, i.e. how well it performs on test data. When there is underfitting, your model didn't get trained properly on the training data, and as the model has not trained well it won't be able to perform well on test data either.

  • @ellentuane4068
    @ellentuane4068 3 years ago +8

    Can't express my gratitude enough ! Thank you for explaining it so well

  • @devasheeshvaid9057
    @devasheeshvaid9057 4 years ago +4

    Hi @Krish
    I read the following in a resource:
    "Bias refers to the gap between the value predicted by your model and the actual value of the data. In the case of high bias, your predictions are likely to be skewed in a particular direction away from the actual values. Variance describes how scattered your predicted values are in relation to each other."
    This doesn't imply bias is the training data error and variance is the test data error. Am I missing a point here? Please elaborate.

    • @sandygaddam
      @sandygaddam 4 years ago +1

      Hi Devasheesh,
      Variance occurs when the model performs well on the trained dataset but does not do well on a dataset it is not trained on, like a test or validation dataset. Variance tells us how scattered the predicted values are from the actual values. For easier understanding of the concept, we can take it as the test or validation data error.
      Bias is how far the predicted values are from the actual values. If the average predicted values are far off from the actual values, then the bias is high.

  • @anujvyas9493
    @anujvyas9493 4 years ago +5

    XGBoost - Low Bias and Low Variance

  • @ashisranjanlahiri
    @ashisranjanlahiri 3 years ago +4

    Hi... your topic explanation is awesome. Just curious: how can you say bias means training error and variance means test error? Is there any intuitive explanation or mathematical derivation for that?

  • @nayanisateeshreddy5124
    @nayanisateeshreddy5124 3 years ago +1

    Do you mean to say that both underfitting and overfitting have high variance? Please check once.
    I think for underfitting it is high bias and low variance, whereas for overfitting it is low bias and high variance. Am I thinking wrong?

    • @kalpthakkar5018
      @kalpthakkar5018 3 years ago

      I think so too; you are probably correct. This video gives the wrong explanation.

  • @Hitesh-Salgotra
    @Hitesh-Salgotra 3 years ago +4

    Krish sir, I hope God blesses you wholeheartedly. You are doing a great job, and thanks for INEURON, it made my life easy.

  • @sain5275
    @sain5275 2 years ago +1

    As far as I know, underfitting is low variance and high bias... not high and high. Please correct me if my understanding is incorrect

    • @siddhawan5190
      @siddhawan5190 2 years ago +1

      Yes, you are right. I was searching the comments and no one had pointed this out

  • @vinayakbachal8134
    @vinayakbachal8134 3 years ago +2

    Brother, you are spot on. What I couldn't easily understand even after paying 2.80 lakhs in fees, you explained in 16 minutes. Kudos, amazing work dear, all the very best

  • @The_Pavanputra
    @The_Pavanputra 2 years ago

    "Bias is in the training data set and variance is in the testing dataset" - this line cost me a LinkedIn machine learning job

  • @ParnikaTutorials
    @ParnikaTutorials 3 years ago

    @6:13 Underfitting will be high bias and low variance.

  • @arjun8647
    @arjun8647 1 year ago +1

    Bruh, underfitting has high bias and LOW variance... am I right??

  • @freshersadda8176
    @freshersadda8176 2 years ago +4

    2:30 - underfitting and overfitting
    6:10 - Bias variance

  • @Neuraldata
    @Neuraldata 4 years ago +2

    Can you please rectify the issue, Krish, as it will cause confusion to many... we highly regard you ❣️

    • @krishnaik06
      @krishnaik06  4 years ago +2

      Already updated the comment and pinned it

    • @Neuraldata
      @Neuraldata 4 years ago +3

      @@krishnaik06 Is it possible for you to add a caption in the video... so that people watching will notice at that point itself (like us, many may not see your comment; just a suggestion 🙂)

  • @ritikkumar876
    @ritikkumar876 1 year ago +1

    very good explanation

  • @vipindube5439
    @vipindube5439 4 years ago +4

    For XGBoost: low bias, high variance at the start; in the end it has low variance and low bias. (Extreme Gradient Boosting)

    • @Prajwal_KV
      @Prajwal_KV 4 years ago

      Then what is the difference between Random Forest and XGBoost? What is the need for XGBoost when we can solve the problem using Random Forest?

    • @HARDYBOY290988
      @HARDYBOY290988 3 years ago +1

      @@Prajwal_KV Regularization is there in XGBoost

  • @garath
    @garath 3 years ago +8

    Very thorough and good explanation! Thank you.
    Side note: I would like to point out that at 2:12 the degree of the polynomial is still 2 (it's still a quadratic function).

  • @kajalkush1210
    @kajalkush1210 2 years ago +1

    The way of explanation is wow.

  • @raushan3292
    @raushan3292 3 years ago +2

    Let's simplify:
    Bias: Error on Train Data
    Variance: Error on Test Data.
    Thanks ❤️

    • @adarshraj1467
      @adarshraj1467 2 years ago

      What he said is wrong; underfitting doesn't have high variance.

  • @Mary-gl4lz
    @Mary-gl4lz 1 year ago

    Hello Sir! For the bias_variance_decomp method, if we are not specifying loss, what value will the loss parameter take by default? loss, bias, var = bias_variance_decomp(model, X_train.values, y_trainnp, X_test.values, y_testnp) - and what is the measurement unit for bias, variance and loss?

  • @aayush4056
    @aayush4056 4 years ago +1

    Sir, please check your Insta DM 👈

  • @subhajitsaha345
    @subhajitsaha345 3 years ago +1

    Nice explanation...

  • @sandipansarkar9211
    @sandipansarkar9211 3 years ago +1

    Watched it once again for better clarity

  • @aashishk4252
    @aashishk4252 1 year ago

    Isn't it high bias, low variance for underfitting? @KrishNayak
    Or is it both HB, HV and HB, LV?

  • @husseinfadin3354
    @husseinfadin3354 3 years ago

    I thought underfitting was high bias and low variance, but when both are high doesn't it just mean your model is terrible? What I mean is: not doing well on the training set and not very good on the test set either.
    So I did some research; most articles refer to high bias and low variance for underfitting, but I guess you can call both high bias/high variance underfitting too.
    In summary:
    **Overfitting: good performance on the training data, poor generalization to other data. (Low Bias, High Variance)**
    **Underfitting: poor performance on the training data and poor generalization to other data. (High Bias, High Variance) or (High Bias, Low Variance)**

  • @VishwaTharunChalla
    @VishwaTharunChalla 1 year ago

    Bagging algorithms have low bias and high variance, so the model needs to reduce variance there, while boosting algorithms have high bias and low variance, so the model needs to reduce bias there.
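The bagging half of this claim can be sketched with a small simulation (pure Python, not from the video; the 1-nearest-neighbour base learner and the resampling loop are illustrative choices, not the only way to bag):

```python
import random
import statistics

random.seed(2)

def make_data(n=30):
    # noisy samples of y = x^2
    xs = [random.uniform(0, 1) for _ in range(n)]
    return [(x, x * x + random.gauss(0, 0.1)) for x in xs]

train, test = make_data(), make_data()

def nn_predict(data, x):
    # 1-nearest-neighbour: a low-bias, high-variance base learner
    return min(data, key=lambda p: abs(p[0] - x))[1]

def bagged_predict(data, x, n_models=25):
    # bagging: average the predictions of models fit on bootstrap resamples
    preds = []
    for _ in range(n_models):
        boot = [random.choice(data) for _ in range(len(data))]
        preds.append(nn_predict(boot, x))
    return statistics.mean(preds)

def mse(predict, data):
    return statistics.mean((predict(train, x) - y) ** 2 for x, y in data)

print("single 1-NN test MSE:", mse(nn_predict, test))
print("bagged 1-NN test MSE:", mse(bagged_predict, test))
```

Averaging over bootstrap resamples smooths out the base learner's sensitivity to individual training points, which is the variance-reduction effect the comment refers to.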

  • @samirkhan6195
    @samirkhan6195 2 years ago

    At 1:45 you mention "R-squared error" - do you really think R-squared is an error term in ML lingo? You may want to use the phrase "squared residual error"; the two are used for different objectives. So I kindly request that before uploading a video you make sure your terminology is correct, because it really affects beginners' understanding and gives them a false-positive sense of ML.

  • @chimadivine7715
    @chimadivine7715 14 days ago

    Krish, your videos hit the nail on the head. You explained the meaning of bias and variance. Thanks a lot!

  • @basavarajag1901
    @basavarajag1901 2 years ago

    Excellent explanation, Krish. In the same video you gave the example of XGBoost, i.e. the model learns from the previous DT and implements the same subsequently.

  • @BMEMDSALMANSHAMS
    @BMEMDSALMANSHAMS 1 year ago

    Guys, underfitting means high bias and low variance (according to the U-shaped curve for generalization error), doesn't it?

  • @ManojKumar-wr3os
    @ManojKumar-wr3os 9 months ago

    XGBoost will reduce the bias as well as the variance by training subsequent models and by splitting the data. It helps us reduce underfitting.

  • @LogicorCode
    @LogicorCode 3 years ago

    en.wikipedia.org/wiki/Overfitting - as per this: "Underfitting occurs if the model or algorithm shows low variance but high bias (to contrast the opposite, overfitting from high variance and low bias)." But in the video, underfitting is high bias and high variance... please explain

  • @Aetherix-b7r
    @Aetherix-b7r 10 months ago

    XGBoost is an optimized method of gradient boosting, which means XGBoost has low bias and high variance - meaning it can lead to overfitting. To avoid overfitting, post-pruning can be used, which is built into sklearn

  • @suchitramohanty2091
    @suchitramohanty2091 2 years ago

    Sir, how do I apply a classification method on my dataset... can you please help me? I am just learning these things and want to develop a project.

  • @VibhorMalik-gb6xs
    @VibhorMalik-gb6xs 1 year ago

    Please make a correction in your video: underfitted models have high bias and low variance. Low variance because a slight change in the training data will not make any impact on the coefficients, because it's underfitted. It will have high test error but the variance will remain low.

  • @aniruddhadeshmukh3571
    @aniruddhadeshmukh3571 1 year ago

    In underfitting it is HIGH BIAS AND LOW VARIANCE, please correct it

  • @adityavipradas3252
    @adityavipradas3252 1 year ago

    Quick question. Shouldn't underfitting have high bias and low variance?

  • @lahiru954
    @lahiru954 3 years ago

    I have a simple problem, might be a stupid one. How do we measure the test accuracy? Because we just predict the labels for the test set; we don't know the output labels. So how do we define a test accuracy?

  • @bauyrjanjyenis3045
    @bauyrjanjyenis3045 2 years ago +1

    Very succinct explanation of the very fundamental ML concept. Thank you for the video!

  • @himanshukarki
    @himanshukarki 2 years ago

    Hi Krish, from a model perspective the explanation is fine, but I couldn't figure out the actual meaning of bias and variance... Are they terms related to the model or to the data distribution?

  • @pramodb9260
    @pramodb9260 1 year ago

    overfitting: low bias and high variance
    underfitting: high bias and low variance

  • @dharmamaharjan1354
    @dharmamaharjan1354 4 years ago +1

    XGBoost uses LASSO and Ridge regularization to prevent overfitting (low bias and high variance)

  • @prachinainawa3055
    @prachinainawa3055 2 years ago

    If you have made a mistake in the video, can't you just correct it in a comment and pin the comment?
    I was preparing for an interview, watched the video twice and learnt the wrong stuff; now I am very confused.
    Please pin the comment with the correction

  • @ben6
    @ben6 4 years ago +1

    If this is true, then why are both bias/variance AND underfitting/overfitting used, if one can be derived from the other?

  • @GaneshSah-q8k
    @GaneshSah-q8k 1 year ago

    I think in the underfitting scenario there should be high bias and low variance. Please correct me if I am wrong.

  • @himalayakilaru
    @himalayakilaru 2 years ago

    Great video, but from my understanding underfitting is high bias and low variance...

  • @vinne1234
    @vinne1234 3 years ago

    Cricket - India all out for 36 against Australia is overfitting... low bias, high variance

  • @cheeku5568
    @cheeku5568 4 years ago +2

    One video, all clear content... thanks bro, it was really a nice session. You really belong to the low-bias and low-variance class of humans. Keep posting such clear ML videos.

  • @radhikawadhawan4235
    @radhikawadhawan4235 1 year ago

    XGBoost or any other boosting technique is used for models that show underfitting; hence a boosting technique addresses high bias and low variance.

  • @vishnukv6537
    @vishnukv6537 3 years ago

    Do you have a data-science-related doubt? Krish has the solution :) ............... very good video sir

  • @rishipatel2221
    @rishipatel2221 3 years ago

    There is a mistake in the underfitting condition here. It should be high bias and low variance. If I am incorrect, please correct me.

  • @RAHUDAS
    @RAHUDAS 2 years ago

    Why is no one mentioning the definitions of bias and variance? I saw multiple tutorials and everyone derives the definitions of bias and variance from overfitting and underfitting. Does any definition exist independently??

  • @zrmsraggot
    @zrmsraggot 3 years ago +1

    I was a bit confused at 10:25 when you underlined the CV error in red, since the red curve is the training error

  • @052_priti2
    @052_priti2 1 year ago

    Sir, there should be high bias and low variance for the underfitting condition of a model; please confirm it once

  • @rutdvajrawal7933
    @rutdvajrawal7933 2 years ago

    Hi Krish. Does bootstrap aggregation work for time-series data? Because in time-series data every event is correlated, and it is not possible to send random events to each and every module.

  • @SAN-te3rp
    @SAN-te3rp 2 years ago

    I can tell you, when it comes to maths explanations no Indian YouTuber can match Krish, and he is also giving a full course at ineuron for ML/DL at an affordable cost

  • @madhurjoshi2543
    @madhurjoshi2543 1 year ago

    Please tell me: in the video you say underfitting is high bias and high variance, but on GFG it says low variance and high bias. Which is correct? Please tell

  • @kumarssss5
    @kumarssss5 4 years ago +2

    Excellent teaching

  • @osho_magic
    @osho_magic 2 years ago

    So a random forest is like taking the audience poll in KBC, and a decision tree is like the phone-a-friend lifeline in KBC.
    So the variance will automatically become low due to multiple cross-checks

  • @kaushalpatwardhan
    @kaushalpatwardhan 2 years ago +2

    I have been trying to understand this concept for so long... but never knew it was this simple 😀 Thank you Krish for this amazingly simple explanation.

  • @ganeshbhadrike452
    @ganeshbhadrike452 3 years ago

    Bro, underfitting gives high bias and low variance, but you have said high bias and high variance; please check it

  • @shekharpandey9776
    @shekharpandey9776 4 years ago +3

    Please make a video on some mathematical terminology like gradient descent etc. You are really doing a great job.

  • @arpitadas6221
    @arpitadas6221 1 year ago

    Best... sir, please make more videos like this, meaning on the board... it's easier to understand this way

  • @shravanshukla5352
    @shravanshukla5352 1 year ago

    Underfitting is always high bias and low variance, not high bias and high variance

  • @nasreenbegum3569
    @nasreenbegum3569 1 year ago

    XGBoost has the property of low bias and low variance.

  • @loganwalker454
    @loganwalker454 3 years ago

    After watching this video, you will subscribe to the channel.

  • @yourtube92
    @yourtube92 1 year ago

    Very good video, the easiest video for understanding the logic of bias & variance.

  • @narmadaa2106
    @narmadaa2106 1 year ago

    Sir,
    I think variance is not the error on test data, and
    in the case of underfitting, bias is high but variance is low

  • @tagoreji2143
    @tagoreji2143 2 years ago

    Thanks so much Sir. Very valuable information

  • @milanbhor2327
    @milanbhor2327 4 months ago

    The most clear and precise information 🎉 thank you sir❤

  • @slowhanduchiha
    @slowhanduchiha 4 years ago +1

    Sir, in the underfit model I suppose the variance will be low, because of the small difference between the train-set and test-set errors

  • @Araj763
    @Araj763 2 years ago

    Important concept to remember: bias is the error on training data; variance is the error on testing data.

  • @anime_on_data7594
    @anime_on_data7594 3 years ago

    Krish sir, in the ensemble learning method, does it have high bias and low variance?

  • @mahikhan5716
    @mahikhan5716 3 years ago

    I reckon the answer should be low bias and low variance, since XGBoost is a classifier built from trees

  • @pranjalgupta9427
    @pranjalgupta9427 3 years ago +1

    Thanks 🙏

  • @chandrachalla3466
    @chandrachalla3466 2 years ago +1

    This is an awesome video. I was fully confused earlier; this video made it all clear!! Thanks a lot sir!!

  • @sufyankhan4800
    @sufyankhan4800 2 years ago

    Brother, why did that blue line in the graph suddenly go up like that?

  • @delwarhossain43
    @delwarhossain43 4 years ago +1

    A very important discussion of important terms in ML. Thanks. An easy explanation of hard concepts.

  • @RAHULPANCHALMUSIC
    @RAHULPANCHALMUSIC 2 years ago

    Sir, superb explanation 🙏🙏

  • @rajatsahu7024
    @rajatsahu7024 3 years ago

    Krish, I am a big fan of yours, seriously.
    I want to meet you someday.

  • @teamoak9427
    @teamoak9427 4 years ago +3

    UNDERFITTING - high bias, low variance
    OVERFITTING - low bias, high variance
    General model - low bias, low variance

    • @heartsbrohi9394
      @heartsbrohi9394 4 years ago +1

      You are incorrect, mate:
      Underfitting: high bias, high variance
      Overfitting: low bias, high variance
      Best fit: low bias, low variance

  • @pieropilcoreynoso3700
    @pieropilcoreynoso3700 1 year ago

    bias=error of the training data
    variance= error of the test data

  • @deepthi5970
    @deepthi5970 1 year ago

    Confusion regarding underfitting and overfitting:
    In many articles it is mentioned that
    Underfitting: High Bias, Low Variance and Overfitting: Low Bias, High Variance.
    But in your video you mention Underfitting: High Bias, High Variance. Which one is correct?
    Also, in the graph the variance curve seems to be drawn wrongly. A little bit confusing. Please clarify this

    • @ImBharathK
      @ImBharathK 1 year ago

      Yeah, same doubt
      Underfitting: High Bias, Low Variance
      Overfitting: Low Bias, High Variance

  • @saadiqbal6928
    @saadiqbal6928 3 years ago

    What factors cause overfitting?

  • @kanhataak1269
    @kanhataak1269 4 years ago +1

    After watching this video my doubt is clear; it really helped. Thanks for giving your precious time...

  • @aseemjain007
    @aseemjain007 3 months ago

    Brilliantly explained !! Thank you !!