Lecture 06 - SIFT - 2014

  • Published on 30 Jan 2025

Comments • 29

  • @joshmiller5703 · 6 years ago · +7

    Take a shot every time he tries to watch Naruto during the lecture.

  • @DIYGUY999 · 6 years ago · +1

    I am a bit confused. In the butterfly example you said to select the scale that gives the maximum response, and now in the DoG we look at the 26 neighbours and select extrema. How do these two things connect?
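
    For later readers, the connection the question asks about: the butterfly example picks the characteristic scale as the one where the response is maximal, and the 26-neighbour test in the DoG stack is the discrete version of the same idea. A sample is kept only if it is an extremum over its 3x3 spatial neighbourhood at its own scale and at the two adjacent scales, i.e. a maximum (or minimum) in both position and scale. A minimal sketch of that test, assuming dog is a list of 2-D NumPy arrays (one per DoG level) and (s, y, x) is an interior sample:

        import numpy as np

        def is_scale_space_extremum(dog, s, y, x):
            """Check whether the sample at (level s, row y, col x) is a local
            extremum among its 26 neighbours: 8 in the same DoG level plus
            9 in the level below and 9 in the level above."""
            value = dog[s][y, x]
            # 3x3x3 neighbourhood across the three adjacent DoG levels
            cube = np.stack([dog[s + ds][y - 1:y + 2, x - 1:x + 2]
                             for ds in (-1, 0, 1)])
            neighbours = np.delete(cube.ravel(), 13)  # drop the centre sample itself
            return value > neighbours.max() or value < neighbours.min()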

  • @ner30 · 8 years ago · +3

    I don't get the usage of the Taylor series expansion. Why is this done?
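
    For later readers: the Taylor expansion (as in Lowe's 2004 paper) is used to interpolate each detected extremum to sub-pixel and sub-scale accuracy, and the interpolated value is then used to reject low-contrast points. A short worked version of that step, writing the offset from the sampled point as x = (x, y, sigma)^T:

        % Second-order Taylor expansion of the DoG function D around the sample:
        D(\mathbf{x}) \approx D
            + \frac{\partial D}{\partial \mathbf{x}}^{T}\mathbf{x}
            + \tfrac{1}{2}\,\mathbf{x}^{T}\frac{\partial^{2} D}{\partial \mathbf{x}^{2}}\,\mathbf{x}

        % Setting the derivative to zero gives the sub-pixel/sub-scale offset,
        % and substituting it back gives the value used for the low-contrast test:
        \hat{\mathbf{x}} = -\left(\frac{\partial^{2} D}{\partial \mathbf{x}^{2}}\right)^{-1}
                            \frac{\partial D}{\partial \mathbf{x}},
        \qquad
        D(\hat{\mathbf{x}}) = D + \tfrac{1}{2}\,
            \frac{\partial D}{\partial \mathbf{x}}^{T}\hat{\mathbf{x}}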

  • @soumalyasahoo2664 · 4 years ago · +1

    If you are struggling to understand this video, go through his 2012 lecture on SIFT and come back.

  • @stoka43 · 7 years ago

    In the extraction of local image descriptors, it is mentioned to "store numbers in a vector". Which numbers are referred to here?

  • @nioncao · 5 years ago · +3

    Best SIFT introduction on the internet. If you do not agree, show me a better one.

  • @MonkkSoori · 4 years ago

    How is the orientation of the keypoint used? He explained building the 128-dimensional vector as a descriptor for keypoint matching, and he said previously that the orientation of the keypoint is for rotation invariance, but is the direction of the keypoint added to the 128-D vector? Where is it stored, etc.?

    • @MonkkSoori · 4 years ago · +1

      It's still not super clear but I found the answer in the original paper: "In order to achieve orientation invariance, the coordinates of the descriptor and the gradient orientations are rotated relative to the keypoint orientation. For efficiency, the gradients are precomputed for all levels of the pyramid as described in Section 5."
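
      In other words, the keypoint orientation is not itself appended to the 128-D vector; every gradient orientation inside the patch is measured relative to it before binning, which is what makes the descriptor rotation-invariant. A minimal sketch of that idea for one 4x4 sub-cell (a hypothetical helper, assuming gradient magnitudes and orientations for the patch are already computed; the Gaussian weighting, coordinate rotation and trilinear interpolation of the real descriptor are omitted):

        import numpy as np

        def relative_orientation_histogram(grad_mag, grad_ori, keypoint_ori, n_bins=8):
            """8-bin orientation histogram of one sub-cell, with all gradient
            orientations taken relative to the keypoint orientation (radians)."""
            rel = (grad_ori - keypoint_ori) % (2 * np.pi)              # rotate orientations
            bins = (rel / (2 * np.pi / n_bins)).astype(int) % n_bins   # assign to 8 bins
            hist = np.zeros(n_bins)
            np.add.at(hist, bins.ravel(), grad_mag.ravel())            # magnitude-weighted votes
            return hist   # 16 sub-cells x 8 bins gives the 128-D descriptor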

  • @jibran6635 · 4 years ago

    His lecture makes far more sense when students start asking questions persistently. Otherwise, it's easy to get lost in the details.

  • @SunilMeena-do7xn · 4 years ago · +1

    Great explanation

  • @TheBirdBrothers · 9 years ago · +2

    Amazing series, thank you for sharing them.

  • @coffle1 · 8 years ago

    How can we match interest-point gradient histograms if, when we encode them into the 128-element vector, we could end up with the same values but with the x_i's permuted within the vector?
    So essentially what I'm asking is: if an interest point in one image gives a gradient vector of [1;0], and the same interest point in an identical but rotated image puts the 0 first so that the vector is [0;1], how can we find a way to match these up?

    • @thanaphontangchoopong1449 · 5 years ago

      It's late, but (this might be useful for others) the key is that it is a "relative histogram" with respect to the highest-magnitude orientation of the keypoint under consideration.

  • @ayushdayani7157 · 3 years ago

    He was vague on the initial and further outlier rejection and on scale-space peak detection. After watching the video for the 3rd time, I am still not able to understand it.

  • @hamedalsufyani2628 · 9 years ago · +2

    That is an amazing lecturer. I like the clarity of your speech, and it is a really useful presentation. Thank you very much, prof.

  • @soumalyasahoo2664 · 4 years ago

    Can somebody explain why k^2*sigma in the 2nd level is higher than in the 1st level?

    • @zhengqianyu7913 · 4 years ago

      I have a similar question; if you have solved yours, please leave me a comment.
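
      For later readers, a short worked version of the scale-space construction behind this question (notation as in Lowe's paper): within one octave the blur levels form a geometric progression, so with k > 1 the 2nd level's scale k^2*sigma is simply the next step up from the 1st level's k*sigma.

          % Scales of the blurred images within one octave (s intervals per octave):
          \sigma_i = k^{i}\,\sigma_0, \qquad k = 2^{1/s}, \qquad i = 0, 1, \dots, s+2

          % Since k > 1, each level is blurred more than the previous one:
          \sigma_0 < k\,\sigma_0 < k^{2}\,\sigma_0 < \dots
          % e.g. with s = 2, k = \sqrt{2}: \sigma_0, \; 1.41\,\sigma_0, \; 2\,\sigma_0, \dots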

  • @roar363 · 8 years ago

    Why is the content of this lecture exactly the same as the last one (lecture 5)?

    • @hoanghieu6389 · 8 years ago · +2

      I don't think so; lecture 5 is about pyramids.

    • @mouadmorabit3618 · 8 years ago · +2

      Pay attention to the year: there is a playlist for 2012 and one for 2014
      (Chapter 5 - SIFT is in the 2012 version).

    • @roar363 · 8 years ago · +1

      Mouad Morabit
      Yeah, I realized that later.

  • @feyilon · 6 years ago

    Thanks a lot, very well explained... it helps a lot.

  • @nkhullar1 · 4 years ago

    Thank you * infinity for this lecture.

  • @awais_latif · 4 years ago · +1

    He is not clarifying anything, just reading the slides.

    • @nkhullar1 · 4 years ago

      What the hell? He is the best teacher... please point me to a better lecture than this.

  • @MW-yz8rm · 9 years ago

    I like this version better.

  • @debangshasarkar5137 · 5 years ago

    The presentation is horrible

    • @nkhullar1 · 4 years ago

      Really? How? Please point me to a lecture better than this.