Multivariate Newton's Method and Optimization - Math Modelling | Lecture 8

  • Published Feb 10, 2025
  • In this lecture we introduce Newton's method for root-finding of multivariate functions, extending the single-variable root-finding discussion of Lecture 4. Once the method is introduced, we apply it to an optimization problem in which we set the gradient of a function equal to zero and solve for the critical points. We demonstrate that Newton's method is a powerful tool for solving optimization problems.
    This course is taught by Jason Bramburger for Concordia University.
    More information on the instructor: hybrid.concord...
    Follow @jbramburger7 on Twitter for updates.
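The description above outlines the method: iterate Newton steps on a vector-valued function, and apply it to optimization by taking that function to be a gradient. A minimal NumPy sketch (my own illustration, not the lecture's MATLAB code; the Rosenbrock example is an assumption chosen for its known minimum at (1, 1)):

```python
import numpy as np

def newton_system(F, J, x0, tol=1e-10, max_iter=50):
    """Newton's method for F(x) = 0, given the Jacobian J(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Solve J(x) dx = -F(x); never form the inverse explicitly
        dx = np.linalg.solve(J(x), -F(x))
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Optimization use: find where the gradient of the Rosenbrock function
# f(x, y) = (1 - x)^2 + 100 (y - x^2)^2 vanishes.
grad = lambda v: np.array([
    -2*(1 - v[0]) - 400*v[0]*(v[1] - v[0]**2),
    200*(v[1] - v[0]**2),
])
hess = lambda v: np.array([
    [2 - 400*v[1] + 1200*v[0]**2, -400*v[0]],
    [-400*v[0],                    200.0   ],
])

root = newton_system(grad, hess, [0.0, 0.0])  # converges to the minimizer (1, 1)
```

When F is a gradient, J is the Hessian, so Newton's root-finder doubles as an optimizer, as in the lecture.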

Comments • 6

  • @JosephRivera517 • 1 year ago +3

    This is very helpful. I have been looking for a resource to help me with my optimization problem, and this one is gold.

  • @derekcobo1453 • 1 year ago +1

    Great video. Was very intimidating when I first saw it.

  • @nikhilraj4317 • 1 year ago +1

    This was very helpful! Thank you!

  • @drioko • 9 months ago +1

    How did this take you less than a minute to run? I programmed this algorithm in MATLAB for 10 iterations and it hasn't produced the final result; it's been 10 minutes. Do you have a super powerful computer?

    • @jasonbramburger • 9 months ago

      I suspect you're implementing it wrong. Check your Jacobian matrix, as this is the main bottleneck.
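      A quick way to check a hand-coded Jacobian, as the reply suggests, is to compare it against a finite-difference approximation. A small sketch (my own; the example F and the step size h are hypothetical choices, not from the lecture):

```python
import numpy as np

def fd_jacobian(F, x, h=1e-7):
    """Forward-difference approximation of the Jacobian of F at x."""
    x = np.asarray(x, dtype=float)
    F0 = F(x)
    J = np.zeros((F0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h  # perturb one coordinate at a time
        J[:, j] = (F(xp) - F0) / h
    return J

# Hypothetical test system and its analytic Jacobian
F = lambda v: np.array([v[0]**2 + v[1], v[0]*v[1]])
J_analytic = lambda v: np.array([[2*v[0], 1.0], [v[1], v[0]]])

x = np.array([1.3, -0.7])
err = np.abs(fd_jacobian(F, x) - J_analytic(x)).max()
# err should be on the order of h; a large value flags a coding mistake
```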

  • @abursuk • 1 year ago

    Hey there, thanks for the video!
    I have a question though: you mentioned that the J matrix should be invertible (meaning square, i.e., the number of variables equals the number of functions). But what if I want to find the intersection points of 3 (or more) functions (say hyperbolic curves or something similar)? There are only 2 variables, x and y, but 3 (or more) functions, which means the J matrix will not be square and therefore not invertible.
    Thanks in advance!
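    One common workaround for the question above (my own sketch, not addressed in the lecture): with more equations than unknowns the system is overdetermined, and a Gauss-Newton step replaces the inverse with a least-squares solve of J dx ≈ -F. The three example curves below are hypothetical, chosen so that they all pass through (1, 1):

```python
import numpy as np

# Three curves in two unknowns (x, y): an overdetermined system
def F(v):
    x, y = v
    return np.array([
        x*y - 1.0,     # hyperbola xy = 1
        x**2 - y,      # parabola y = x^2
        x + y - 2.0,   # line x + y = 2
    ])

def J(v):
    x, y = v
    return np.array([
        [y,     x  ],
        [2*x,  -1.0],
        [1.0,   1.0],
    ])

x = np.array([0.5, 0.5])
for _ in range(50):
    # lstsq minimizes ||J dx + F||_2, standing in for the missing inverse
    dx, *_ = np.linalg.lstsq(J(x), -F(x), rcond=None)
    x = x + dx
    if np.linalg.norm(dx) < 1e-12:
        break
# x converges to the common intersection point (1, 1)
```

    When the curves have no exact common intersection, the same iteration converges to a least-squares "near-intersection" rather than a true root.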