3.1 Intro to Gradient and Subgradient Descent

  • Published Jan 5, 2025

Comments • 8

  • @ayuanyang3092 • 3 years ago +4

    excellent course

  • @itachi4alltime • 3 years ago +1

    At 23:15, do we choose the subgradient g_x = [10m, -1], where m in [-1, 0)?
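
    A minimal numerical check (a sketch, not the lecture's own code): assuming
    the function is f(x1, x2) = |10*x1| + |x2| and the point has x1 = 0 and
    x2 < 0 (the specific point below is hypothetical), the subdifferential of
    |10*x1| at x1 = 0 is the whole interval [-10, 10], so g = (10m, -1) is a
    valid subgradient for any m in [-1, 1]:

        import numpy as np

        def f(x):
            # f(x1, x2) = |10*x1| + |x2|
            return abs(10 * x[0]) + abs(x[1])

        x = np.array([0.0, -3.0])   # hypothetical point with x1 = 0, x2 < 0
        rng = np.random.default_rng(0)

        for m in (-1.0, -0.5, 0.0, 0.5, 1.0):
            g = np.array([10 * m, -1.0])
            # Subgradient inequality: f(y) >= f(x) + g.(y - x) for all y
            ok = all(f(y) >= f(x) + g @ (y - x) - 1e-12
                     for y in rng.normal(size=(1000, 2)) * 5)
            print(f"m = {m:+.1f}: inequality holds on all samples: {ok}")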

  • @atchutram9894 • 3 years ago

    The way the course is being built up, it is a great series. Thanks to Prof. Constantine Caramanis.
    One comment on this particular video: instead of just stating that solving the optimization problem directly is difficult, an example would be very helpful for appreciating the usefulness of approximating the functions. Some motivation for using iterative methods, rather than solving the gradient equation directly, would also help.

  • @huuhieuphan4708 • 3 years ago +2

    I'm from Vietnam. Thanks for the course!

  • @petere9383 • 1 year ago

    If I may ask, the graph you have of |10x1| + |x2| doesn't seem correct to me. For example, at (x1, x2) = (0, -3) we should have f(x1, x2) = 3, so the function takes no negative values. Or am I missing something?
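
    For what it's worth, the evaluation itself is easy to check. A minimal
    sketch, assuming f(x1, x2) = |10*x1| + |x2| as written:

        def f(x1, x2):
            # A sum of absolute values, so f(x1, x2) >= 0 everywhere
            return abs(10 * x1) + abs(x2)

        print(f(0, -3))  # prints 3: the surface never dips below zero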

  • @sanjayksau • 3 years ago

    How is the convergence estimated as exp(-ct) at 17:30?

    • @tarunkrishna1564 • 3 years ago

      It's not exp(); it's a constant raised to the power t, if I understood it correctly.

    • @atchutram9894 • 3 years ago +1

      It is equivalent: let K = (1 - 6*eta) and C = -ln(K) = -ln(1 - 6*eta).
      Then K = exp(-C), and so (1 - 6*eta)^t = K^t = exp(-C*t).
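
      A quick numerical check of this identity (a minimal sketch; eta = 0.1
      below is a hypothetical step size, and any value with 0 < 1 - 6*eta < 1
      works the same way):

          import math

          eta = 0.1                # hypothetical step size, not from the video
          K = 1 - 6 * eta          # per-step contraction factor
          C = -math.log(K)         # chosen so that K = exp(-C)

          for t in (1, 5, 20):
              # (1 - 6*eta)^t and exp(-C*t) agree up to floating-point error
              print(t, K ** t, math.exp(-C * t))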