Levenberg-Marquardt Algorithm

  • Published on 24 Sep 2024
  • Details of the Levenberg-Marquardt Algorithm, and a comparison between this method and the Gradient Descent and Newton-Raphson methods, are provided in this video.
    Gradient Descent Problems: 1:50
    Newton-Raphson for finding a function's extrema: 5:58
    Hessian Matrix: 8:59
    Newton-Raphson Problems: 18:31
    Levenberg-Marquardt Algorithm: 25:00
    MATLAB demo of applying all 3 algorithms to 2 multi-dimensional functions: 43:05
    The code for this video can be downloaded from here:
    drive.google.c...
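As a companion to the timestamps above, here is a minimal sketch of the blended GD/Newton-Raphson update the video builds up to, written in Python rather than the video's MATLAB. The test function, the factor-of-10 lambda schedule, and the iteration count are illustrative assumptions, not taken from the video.

```python
import numpy as np

# Minimal sketch of the damped-Newton update at the heart of the video:
# x_new = x - (H + lambda*I)^(-1) * grad. Large lambda behaves like
# gradient descent; small lambda behaves like Newton-Raphson.

def f(v):
    x, y = v
    return (x - 1.0) ** 2 + 10.0 * (y + 2.0) ** 2   # minimum at (1, -2)

def grad(v):
    x, y = v
    return np.array([2.0 * (x - 1.0), 20.0 * (y + 2.0)])

def hess(v):
    return np.array([[2.0, 0.0], [0.0, 20.0]])      # constant for this f

def levenberg_marquardt(x0, lam=1e-2, iters=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        step = np.linalg.solve(hess(x) + lam * np.eye(len(x)), grad(x))
        x_new = x - step
        if f(x_new) < f(x):
            x, lam = x_new, lam / 10.0   # success: trust the quadratic model more
        else:
            lam *= 10.0                  # failure: lean back toward gradient descent
    return x

print(levenberg_marquardt([5.0, 5.0]))   # converges to approximately [1, -2]
```

Moving lambda up or down by an order of magnitude on failure or success is the adaptation rule that also comes up in the comments below.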

Comments • 51

  • @janplechaty1702
    @janplechaty1702 4 days ago +1

    I usually don't watch videos longer than 30 minutes, but WOW... I watched the whole thing and it was amazing. Many thanks to you!

  • @ut971
    @ut971 2 years ago +5

    Thank you so much for uploading this. It means A LOT to every engineering student in different parts of the world who is struggling to understand this algorithm.

    • @mehran1384
      @mehran1384  2 years ago +2

      You are welcome. Happy that you like the video. Please share this Channel with your friends.

  • @capsbr2100
    @capsbr2100 1 year ago +3

    Fantastic. You made a complex subject seem easier to understand by your way of explaining it in a clear, intuitive, illustrative and easy language. Thank you very much.

  • @gabrielperez1369
    @gabrielperez1369 2 years ago +2

    Excellent explanation! Your English is very good and easy to understand! Thank you very much!

  • @thedanebear
    @thedanebear 9 months ago

    Incredibly intuitive and helpful. Easily the best way out there to spend an hour to better understand this topic

  • @smchiew7708
    @smchiew7708 1 year ago +1

    Very clear explanation for the Levenberg-Marquardt algorithm. Thank you so much!

  • @neoneo1503
    @neoneo1503 2 years ago +2

    Thanks for your explanation!! The Levenberg-Marquardt method balances convergence speed (Newton's method) with convergence robustness (GD).

    • @mehran1384
      @mehran1384  2 years ago

      You are welcome. Happy to hear that you found the video useful. Please share this channel with your friends.

    • @neoneo1503
      @neoneo1503 2 years ago

      @@mehran1384 Yeah I will 😊, thanks!

  • @pedrohenriquesiscato9768
    @pedrohenriquesiscato9768 2 months ago +1

    Thank you for that video. Excellent explanation!

  • @justman7656
    @justman7656 11 months ago +1

    Great and very clear explanation! Thank you so much for your work

  • @martvald
    @martvald 7 months ago +1

    Thanks for the explanation. I will add that this is not LM though; this is a trust-region method using GD and NR, while LM is a trust-region-based method using GD and Gauss-Newton (GN). They look similar, but you would end up with x_(n+1) = x_n - (J^T*J + kI)^(-1)*J^T*E_n, where k is lambda, J is the Jacobian matrix, and E_n is the error vector (see GN). But other than that, the explanation of how the weights etc. are used is very descriptive.

    • @mauriciogonzalez1998
      @mauriciogonzalez1998 6 months ago

      Hi, where could I find an explanation this clear of the real LM method?

    • @eaglezhou1243
      @eaglezhou1243 3 months ago

      You are right. Strictly speaking, the LM method is a trust-region-based method that solves the nonlinear least-squares problem, in which the Hessian is approximated by J^T*J instead of the conventional second-order derivative, and the gradient is computed from the Jacobian and the error vector.
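      For readers who want to see the J^T*J form described in this thread, here is a small self-contained sketch in Python; the exponential model, the synthetic data, and the damping schedule are my own illustrative choices, not from the video.

```python
import numpy as np

# Least-squares LM as stated above: p_new = p - (J^T J + k I)^(-1) J^T r,
# where r is the residual vector and J its Jacobian with respect to the
# parameters. Here we fit y = a * exp(b * t) to noiseless synthetic data.

t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(1.5 * t)                      # generated with a = 2, b = 1.5

def residuals(p):
    a, b = p
    return a * np.exp(b * t) - y

def jacobian(p):
    a, b = p
    e = np.exp(b * t)
    return np.column_stack([e, a * t * e])     # columns: dr/da, dr/db

def lm_least_squares(p0, k=1e-2, iters=100):
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        r, J = residuals(p), jacobian(p)
        step = np.linalg.solve(J.T @ J + k * np.eye(len(p)), J.T @ r)
        p_new = p - step
        if np.sum(residuals(p_new) ** 2) < np.sum(r ** 2):
            p, k = p_new, k / 10.0             # accept step, reduce damping
        else:
            k *= 10.0                          # reject step, increase damping
    return p

print(lm_least_squares([1.0, 1.0]))            # recovers approximately [2.0, 1.5]
```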

  • @skymanaditya
    @skymanaditya 2 years ago +1

    Great video. Explained with utmost clarity!

    • @mehran1384
      @mehran1384  2 years ago +1

      Thanks. Happy you liked it.

  • @mokhaladhasan6937
    @mokhaladhasan6937 1 year ago

    Many thanks to you; it was a very clear and simple explanation from a professional. My understanding of this algorithm was stuck at some points (such as GD 😊😊) until this video.

  • @vlado.erdman
    @vlado.erdman 2 years ago +1

    Great, easy to understand explanation. Thank you.

    • @mehran1384
      @mehran1384  2 years ago

      Happy that you found the video easy to follow. Please share this channel with your friends.

  • @polinba
    @polinba 1 year ago +1

    Thank you for the amazing video! It helped me a lot!

  • @shafqatjabeen1104
    @shafqatjabeen1104 1 year ago

    Thank you so much for this video. Very clear information

  • @workaccount6597
    @workaccount6597 2 years ago +1

    I have been binge-watching your videos about non-linear equations and their solvers and optimizers. With every video I am getting more clarity. Your background in teaching students at different levels really helps you explain things very clearly. One question though: do you think we (as in viewers) can get the material from your videos?

    • @mehran1384
      @mehran1384  2 years ago

      Thanks. I am not sure if I understood your question about getting the material? Could you elaborate?

    • @workaccount6597
      @workaccount6597 2 years ago

      @@mehran1384 The OneNote notes are what I meant.

  • @zheka47
    @zheka47 2 years ago +1

    Amazing explanations!

  • @priyachimurkar6058
    @priyachimurkar6058 2 years ago +1

    Nice Videos with excellent demonstration

    • @mehran1384
      @mehran1384  2 years ago

      Happy to hear that you liked the video. Please share this channel with your friends.

  • @Chadwikj
    @Chadwikj 7 months ago

    Fantastic. Thank you!

  • @RAJIBLOCHANDAS
    @RAJIBLOCHANDAS 2 years ago +1

    Excellent video

  • @kleanthiskaramvasis9512
    @kleanthiskaramvasis9512 2 years ago +1

    Excellent presentation :) :)

    • @mehran1384
      @mehran1384  2 years ago

      Thank you. Please share this channel with your friends.

  • @minute_machine_learning5362
    @minute_machine_learning5362 7 months ago

    Great talk and highly informative.
    Can you provide the sheet that you are presenting?

  • @kihoon2217
    @kihoon2217 2 years ago +1

    Great lecture

    • @mehran1384
      @mehran1384  2 years ago

      Thank you. Please share this channel with your friends.

  • @tshipmatic
    @tshipmatic 2 years ago +1

    Awesome video! Easy to follow along. One question: is there a way to choose the initial value of lambda, or would any value work?

    • @mehran1384
      @mehran1384  2 years ago

      Sorry for the late response. Since lambda changes by an order of magnitude each time, its initial value is not so critical. An imperfect lambda just slows down the entire convergence by a few iterations.
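      To illustrate the reply above with a toy experiment (the quadratic test function, starting point, and convergence tolerance are my own assumptions): because lambda is multiplied or divided by 10 every iteration, even an initial lambda that is off by several orders of magnitude only costs a handful of extra iterations.

```python
import numpy as np

# Count iterations of a damped-Newton loop on a simple quadratic, starting
# from very different initial lambdas; the gap should be just a few iterations.

H = np.diag([2.0, 20.0])                     # Hessian of f(x, y) = x^2 + 10*y^2

def f(v):
    return v[0] ** 2 + 10.0 * v[1] ** 2

def iterations_to_converge(lam, tol=1e-8):
    x = np.array([5.0, 5.0])
    for i in range(1, 200):
        g = H @ x                            # gradient of the quadratic
        x_new = x - np.linalg.solve(H + lam * np.eye(2), g)
        if f(x_new) < f(x):
            x, lam = x_new, lam / 10.0       # accept step, reduce damping
        else:
            lam *= 10.0                      # reject step, increase damping
        if np.linalg.norm(H @ x) < tol:      # gradient small enough: converged
            return i
    return 200

# An initial lambda a million times larger costs only a few extra iterations.
print(iterations_to_converge(1e-3), iterations_to_converge(1e3))
```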

  • @ИльяЧугунов-д1с
    @ИльяЧугунов-д1с 1 year ago

    That's great!

  • @tsalex1992
    @tsalex1992 1 year ago

    Thanks for the video! From my understanding, the most common heuristic for lambda is to have the increase factor be smaller than the decrease factor. However, I'm not sure that I understand the rationale, since we expect the algorithm to have more decreasing steps. At some point lambda will reach zero, or at least zero in the numerical sense - can you elaborate a bit more on this point?

  • @wwefan9391
    @wwefan9391 2 years ago +1

    Thank you for this great video, but I'm just wondering: in the MATLAB code for the gradient descent method, why did you divide by norm(temp)? What's the purpose of it?

    • @mehran1384
      @mehran1384  2 years ago

      You are welcome. Dividing by the norm gives a unit vector (direction only) of the motion, and its magnitude is determined by alpha.

    • @wwefan9391
      @wwefan9391 2 years ago

      @@mehran1384 I'm a bit weak in linear algebra, so I'm not sure what alpha is. Also, norm(temp) is taking the norm of a 2×2 matrix, correct? Does dividing by the norm of a matrix also give us a unit vector, like when dividing by the norm of a vector? Because I thought taking the norm of a matrix gives us info about how big its elements are.
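      A tiny numerical illustration of the point in this thread, assuming temp holds the gradient vector (the function and alpha value below are made up for the example): dividing by the norm leaves only the direction, so every step has length exactly alpha.

```python
import numpy as np

def grad(v):
    x, y = v
    return np.array([2.0 * x, 200.0 * y])    # gradient of f(x, y) = x^2 + 100*y^2

alpha = 0.1                                  # fixed step length
x = np.array([3.0, 4.0])

g = grad(x)
step = alpha * g / np.linalg.norm(g)         # direction from g, length from alpha
print(np.linalg.norm(step))                  # ~0.1: the length is set by alpha alone

x = x - step
```

      If temp were instead a matrix, np.linalg.norm would return the Frobenius norm, which does measure overall element size as the commenter suspected; the unit-vector interpretation only applies when temp is the gradient vector.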

  • @danielhelmanlee5126
    @danielhelmanlee5126 2 years ago +1

    Is this the least-squares Levenberg-Marquardt algorithm? I see things like the Jacobian matrix in other resources...

    • @mehran1384
      @mehran1384  2 years ago

      This is the standard LM algorithm. It has least squares as a part of it.

  • @DongIncheonExpress
    @DongIncheonExpress 2 years ago

    Great work! Thank you for the good explanation. Can I get the OneNote lecture notes that you showed us in this lecture?

  • @gianmarcoalarcon6185
    @gianmarcoalarcon6185 2 years ago +1

    Nice Video!!!

  • @mohammadsheikhpour6612
    @mohammadsheikhpour6612 2 years ago

    Thank you so much.

  • @sephgeodynamics9246
    @sephgeodynamics9246 2 years ago +1

    Thank you.

    • @mehran1384
      @mehran1384  2 years ago

      You are welcome. Please share this channel with your friends.