(ML 15.1) Newton's method (for optimization) - intuition

  • Published on 16 Dec 2024

Comments • 40

  • @walete
    @walete 2 years ago

    The way you explain this is so helpful - love the comparison to the linear approximation. Thank you!

  • @amirreza08
    @amirreza08 1 year ago

    It was one of the best explanations, so informative and helpful. Thank you!

  • @evilby
    @evilby 2 years ago +1

    man, perfect explanation. clear and intuitive!

  • @AjaySharma-pg9cp
    @AjaySharma-pg9cp 6 years ago

    Wonderful video for clearing up the optimization version of Newton's method for finding the minima of a function in machine learning

  • @MrPaulrael
    @MrPaulrael 12 years ago +5

    I am a PhD student and I will be using optimization methods in my research.

  • @johnjunhyukjung
    @johnjunhyukjung 2 years ago

    This was exactly what I needed, thank you!
    After learning Newton's method for finding the x-intercept, I was confused at first about how it was being used for minimization problems.
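
    A one-line way to resolve that confusion (a standard identity, not the video's notation): Newton's method for optimization is root-finding Newton applied to the derivative. In LaTeX:

        x_{t+1} = x_t - \frac{g(x_t)}{g'(x_t)}, \qquad g = f' \quad\Longrightarrow\quad x_{t+1} = x_t - \frac{f'(x_t)}{f''(x_t)}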

  • @rounaksinghbuttar9083
    @rounaksinghbuttar9083 3 years ago

    Sir your way of explaining is really good.

  • @moazzammalik1410
    @moazzammalik1410 7 years ago +1

    Can you please make a video on the Levenberg method? There is no lecture available on this topic.

  • @AJ-et3vf
    @AJ-et3vf 1 year ago

    Great video. Thank you

  • @max2buzz
    @max2buzz 11 years ago +1

    So here is the thing... my function is y = (guess^2 - x).
    Now I want to minimize y by approximating guess,
    so I use the first-order update guess = guess - (guess^2 - x)/(2*guess),
    which is x_{t+1} = x_t - f(x)/f'(x).
    But if I take one more derivative,
    then x_{t+1} = x_t - f'(x)/f''(x), which gives guess = guess - (2*guess)/2 = guess - guess = 0.
    What should I do?
    (The function is for finding a square root by Newton's method.)
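
    A minimal sketch of what is going on here (Python; x = 2.0 and the starting guess are illustrative): the update guess - (guess^2 - x)/(2*guess) is root-finding Newton applied to f(g) = g^2 - x, which drives f to zero. The optimization form x_{t+1} = x_t - f'(x)/f''(x) instead asks where f'(g) = 2g is zero, and the answer really is g = 0, which is why the update collapses. To get sqrt(x) from the optimization form, apply it to an objective h whose derivative is g^2 - x, for example h(g) = g^3/3 - x*g; then both forms produce exactly the same iterates.

        # Root-finding Newton for f(g) = g**2 - x: drives f(g) to zero.
        x = 2.0            # number whose square root we want (illustrative)
        g = 1.0            # starting guess (illustrative)
        for _ in range(6):
            g = g - (g**2 - x) / (2 * g)
        print(g)           # ~1.41421356

        # Optimization Newton for h(g) = g**3/3 - x*g: drives h'(g) = g**2 - x
        # to zero. Since h'(g)/h''(g) = (g**2 - x)/(2*g), the update is identical.
        g = 1.0
        for _ in range(6):
            g = g - (g**2 - x) / (2 * g)
        print(g)           # same iterates as the root-finding form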

  • @nikpapan
    @nikpapan 9 years ago +1

    Thanks for posting these videos. They are quite helpful. So, to ensure that we minimize and not maximize, is it sufficient to ensure that the Newton step has the same sign (goes in the same direction) as the gradient? Is it OK to just change the sign of the step if that's not the case? (My experiments seem to indicate it's not, but what should be done then?)
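
    A sketch of the usual safeguard (general practice, not something stated in the video; the function name is illustrative): rather than flipping the sign of the step, check whether the second derivative (in several dimensions, the Hessian) is positive definite. If it is, the Newton step points toward the minimum of the local quadratic model; if not, that model has no minimum, and a common fallback is a plain negative-gradient step, often combined with a line search.

        import numpy as np

        def descent_direction(grad, hess):
            """Newton direction when the Hessian is positive definite,
            steepest descent otherwise (illustrative sketch)."""
            try:
                # Cholesky succeeds only for positive-definite matrices,
                # in which case the Newton step is a descent direction.
                np.linalg.cholesky(hess)
                return -np.linalg.solve(hess, grad)
            except np.linalg.LinAlgError:
                # Zero or negative curvature: the Newton step may point
                # toward a maximum or saddle, so use the negative gradient.
                return -np.asarray(grad)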

  • @alexn2566
    @alexn2566 4 years ago +1

    Soooo, if 2nd order is faster than 1st order, why not try 3rd order too?

    • @rushipatel5241
      @rushipatel5241 3 years ago

      Hi @Alex N, as far as I know these methods are used for machine learning, where gradient descent is a classical algorithm for finding the minimum of a function (not always a zero). If you know the basics of ML you will be familiar with the loss function; we have to minimize that function, which means we need its derivative to be zero. To find that point we use the gradient as the direction in which the function changes fastest. Now we have the direction but not the magnitude, so first-order methods use a constant learning rate. A second-order method supplies the magnitude as well, so the point where the derivative of the function is 0 can be reached in fewer iterations. A third-order method would ultimately find the minimum of the derivative of the loss function, but we need the minimum of the loss function itself, so it would be useless. Hope this was helpful!
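
      A minimal sketch of the first-order vs. second-order contrast described above, on a one-dimensional quadratic (all values illustrative): gradient descent needs a hand-picked learning rate and many iterations, while the Newton update divides by the curvature and minimizes a quadratic exactly in one step.

          # Minimize f(w) = (w - 3)**2, so f'(w) = 2*(w - 3) and f''(w) = 2.
          w = 0.0
          lr = 0.1                      # hand-tuned learning rate (assumed)
          for _ in range(50):           # first order: many small steps
              w = w - lr * 2 * (w - 3)
          print(w)                      # ~3.0 after 50 iterations

          w = 0.0
          w = w - (2 * (w - 3)) / 2     # second order: one exact step
          print(w)                      # 3.0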

  • @AK-vb2dp
    @AK-vb2dp 6 years ago +13

    Video makes sense up until the point where the "pictorial" representation of the 2nd order method comes in. That to me makes absolutely no sense whatsoever; the "pictorial" should not be the function itself but rather the 1st derivative of the function, with Newton's method applied to that.

    • @Dupet
      @Dupet 4 years ago +1

      I think the visualization makes sense if we think about approximating the function f(x) by its second-order Taylor expansion around x_t. Taking the derivative of the second-order Taylor expansion and setting it equal to zero leads to the formula for Newton's method for optimization. This operation is the same as minimizing the second-order approximation of the function at x_t, as depicted in the video.
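
      Spelled out (a standard derivation, with notation assumed rather than taken from the video), the second-order Taylor model around x_t and its minimizer are:

          f(x) \approx f(x_t) + f'(x_t)\,(x - x_t) + \tfrac{1}{2}\, f''(x_t)\,(x - x_t)^2

          0 = f'(x_t) + f''(x_t)\,(x - x_t)
          \quad\Longrightarrow\quad
          x_{t+1} = x_t - \frac{f'(x_t)}{f''(x_t)}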

  • @TheCoolcat0
    @TheCoolcat0 7 years ago

    Illuminating! Thank you

  • @abhinavarora6574
    @abhinavarora6574 9 years ago

    Your videos are awesome!

  • @minivergur
    @minivergur 11 years ago +2

    This was actually quite helpful :)

  • @lordcasper3357
    @lordcasper3357 8 months ago

    this is so good man

  • @kevin-fs5ue
    @kevin-fs5ue 5 years ago

    really appreciate your work :)

  • @ericashivers5489
    @ericashivers5489 2 years ago

    Amazing! Thanks

  • @aviraj017
    @aviraj017 9 years ago

    Thanks, very informative

  • @KKyrou
    @KKyrou 4 years ago

    very good. thank you

    • @dellpi3911
      @dellpi3911 3 years ago

      th-cam.com/video/kxftUHk7NDk/w-d-xo.html

  • @dariocline
    @dariocline months ago

    thanks boss

  • @mfurkanatac
    @mfurkanatac 1 year ago

    THANK YOU

  • @danielseita5552
    @danielseita5552 9 years ago

    Thank you for the video!

  • @anhthangyeu
    @anhthangyeu 13 years ago

    Thanks so much for posting!!

  • @akulsinator7680
    @akulsinator7680 3 years ago

    Thank you, you god among men

  • @MolotovWithLux
    @MolotovWithLux 5 years ago

    #IntuitiveAlgorithm for finding the zero of a function

  • @fireboltthegod
    @fireboltthegod 11 years ago +1

    Shreyas Rane, now I see where you study from.

    • @max2buzz
      @max2buzz 11 years ago +1

      Busted

  • @vijayd15
    @vijayd15 4 years ago

    damn good!

  • @rafaellima8146
    @rafaellima8146 10 years ago

    cool! ;D

  • @albertyao6181
    @albertyao6181 5 years ago +2

    Compared to Andrew Ng's explanation, this one is hard to understand.

  • @chonssdw
    @chonssdw 12 years ago

    Thanks for the video. Could you please check your inbox? I have some further questions, thanks!!!