One of the best lectures I have seen. Very clear explanation of all the technical steps.
This professor has a talent for explaining things clearly and concisely.
or maybe good slides
Or maybe both
true
I think this guy is great. This is the first time I have bothered to write something about anything on the internet apart from facebook.
As a French student, I understand most of the video and find it way clearer than the original paper from Lowe. I think it is due to the quality of the presentation and the fluency of the teacher. Moreover, you can feel that the teacher knows what he is talking about! :) Great video!
Thanks to Dr Shah and the uploader. I just did this in class at my university, but it wasn't half as clear as this one. Very helpful. Amazing that an 8-year-old recorded lecture is more relevant than a current live one.
The part where he explained how the Laplacian of Gaussian works as a specific size of blob detector to achieve scale invariance at 18:19 was really helpful for me. My CV professor just skipped straight to difference of Gaussians and I didn't get why we used them or the benefit of it until now.
A polished lecture given by a nice guy. Dr. Mubarak describes SIFT in a straightforward way.
Does the scale refer to the sigma of the Gaussian within an octave or to the downsampled image size?
Where are we using the DoGs calculated on the downsampled image?
BEST VIDEO ON SIFT! Explains the algorithm really well. Thank you so much.
Even though I am new to CV, he clearly made me understand SIFT. Thanks, professor! :)
Thank you so much for these videos, very detailed and helpful :) Please don't stop posting these lectures.
36:00 Difference between edges and interest points in terms of Laplacian of Gaussian.
The best SIFT explanation I ever found. Thanks
This lecture really helped me acquire a better understanding of the SIFT algorithm. Thank you very much.
After playing it almost 5 times over a month, everything is clear now.
Thanks! I like how at 54:14 he says "and that's it. you can describe this in one slide".
And, in the figure, two probability density functions (PDFs) are shown. The first bell-shaped curve is the PDF for all correct cases. You can see that it's a normal distribution, or Gaussian distribution; in most cases, the ratio (horizontal axis) has a value less than 0.8. The 2nd PDF is for the wrong cases, meaning we found a close but wrong match; it looks like a parabola, and most cases have a ratio larger than 0.8. So 0.8 is a good value to use.
That is a great demonstration on the SIFT algorithm. Thanks much!
This is the definition of greatness
Very clear explanation! I was very interested in this topic because of the way it was delivered.
53:50
SIFT in a nutshell ..... why SIFT is 128 dimensions and how it's extracted from actual image.
- Thanks for uploading the video :)
This is for the purpose of robustness. For a descriptor in image 1, we may find more than one pretty close match in image 2; the closeness of these matches is measured by their Euclidean distance from the descriptor in image 1. The smaller the distance, the better. The ratio is the ratio between the distance of the best match and the distance of the 2nd-best match.
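If seeing it in code helps, here is a minimal sketch of that ratio test (not from the lecture), assuming OpenCV 4.4+ where SIFT is exposed as cv2.SIFT_create(); the image filenames are placeholders:

```python
# A minimal sketch of the ratio test described above (not from the
# lecture). Assumes OpenCV 4.4+, where SIFT is in the main module.
# "img1.jpg" / "img2.jpg" are placeholder filenames.
import cv2

img1 = cv2.imread("img1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("img2.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)   # keypoints + 128-D descriptors
kp2, des2 = sift.detectAndCompute(img2, None)

# For each descriptor in image 1, find the two closest descriptors in
# image 2 by Euclidean (L2) distance.
bf = cv2.BFMatcher(cv2.NORM_L2)
matches = bf.knnMatch(des1, des2, k=2)

# Keep a match only if the best distance is clearly smaller than the
# second-best distance (ratio threshold 0.8, as discussed above).
good = [m for m, n in matches if m.distance < 0.8 * n.distance]
print(len(good), "matches survive the ratio test")
```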
28:55: isn't sigma supposed to be 1.6 at the start (and not 0.707)?
Thank you for your contribution, it's much easier for me than reading the paper myself.
This lecture is so good. I loved the way of explaining it by Dr. Shah
Clear and concise explanation. Smart way
Amazing, very clear explanation of each step involved. Good job, sir.
Amazing Lecture! A comprehensive explanation
Can we get the presentation slides? The link provided is showing an error.
Thanks! That's a very clear explanation of SIFT. Much better than my professor's...
I have to say, this point and explanation is much better than mine.
It's really helpful for getting the gist of SIFT. Thank you so much!
I don't understand one thing at 26:55. After half-sampling the image at the k^2*sigma scale, are we going to apply a Gaussian with k^2*sigma scale to the image that we obtained from sampling? I would be glad if somebody could explain that.
By the way. Very good lecture, thanks a lot for publishing this. God bless you all.
Extremely good and clear explanation, thank you for this video
The professor looks like Mohammad Reza Pahlavi :))
Excellent lecture btw
At 26:30, what does it mean by taking every other row and every other column in the downsampling process?
A way to downscale an image is by taking one row/col and skipping the next.
Assuming your img is 4x4: you take rows/cols 1 and 3 while discarding rows/cols 2 and 4. Your new img is 2x2, which is a downscaled version of the original.
There are other methods though, e.g. averaging each 2x2 window into a pixel.
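If a concrete example helps, here is a tiny NumPy sketch of the same idea (not from the lecture; the 4x4 array is just a stand-in for an image):

```python
# Row/column-skipping downsampling, plus the 2x2-averaging alternative.
import numpy as np

img = np.arange(16).reshape(4, 4)   # placeholder 4x4 "image"

# Keep every other row and column (rows/cols 1 and 3 in 1-based terms).
half = img[::2, ::2]
print(half.shape)   # (2, 2)

# Alternative mentioned above: average each 2x2 window into one pixel.
avg = img.reshape(2, 2, 2, 2).mean(axis=(1, 3))
print(avg.shape)    # (2, 2)
```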
+Hans Hardmeier thank you a lot
I've been looking for an explanation of how sigma values are computed to lead to those results for a while now. Thank you.
Thanks a lot Dr. Mubarak Shah.
Sorry, maybe it's a stupid question. I wonder if all the computation in the videos, where they show how they track an object like a card, is done in "real time"?
I mean, it's computationally expensive to compute the coefficients for e.g. an affine transformation, right?
Can somebody explain the figure at 01:04:46? What is the "ratio of distance from....", and what does the figure say? Why do we choose 0.8?
Thank you sir for giving this lecture. It helps me a lot.
Very informative. Best explanation about SIFT
By using the SIFT algorithm, can we identify the color of an image?
[01:08:34] «You have to write good papers which can be cited» - Dr. Mubarak Shah
Is there any lecture for SURF, similar to this one on SIFT?
Excellent video! Thanks! Keep them coming please.
Good explanation professor.
Can anyone explain the zero crossings vs. scale space graph?
Thank you Sir for explaining it so clearly and in great detail.
I think that when he tried to explain the keypoint matching at 57:50 he didn't see the words "minimum Euclidean Distance". That would have helped him a lot. It happens sometimes.
I am just wondering why the Harris keypoint detector + SIFT descriptor is a popular approach? SIFT keypoints are scale-invariant whereas Harris keypoints are not...
+Sai Manoj Prakhya Because they combine the qualities of the detector and the descriptor.
Lots of thanks, great and simple explanation.
Hairless points? What is he saying?
Watch the previous lecture. He covered corner detection (Harris points) there.
yeh those points have to be well groomed you know
A ratio near 0.8 means the first- and 2nd-best matches are too close to each other. The actual match could be the 2nd-best one, but due to some noise we get it as 2nd instead of 1st; in other words, the 1st-best can be wrong, so we are taking a chance. The graph shows experimental results: for ratios from 0.1 to 0.8 the first-best match is usually correct, with only a very small number of wrong matches, but above 0.8 the first-best match tends to be incorrect. The threshold is chosen from experimental results, not from some specific theory.
Very good lecture, helped me a lot. Thank you!
Sir, very nice. Great lecture.
very well explained
very good lecture,thumbs up!
Thank you sir, for your nice explanation and information.
Thanks a lot! Very good and detailed explanation!
great series!
very grateful :)
ALL professors explain SIFT as a literature review; no one can explain it practically. We need David Lowe himself to explain his theory!
very nice lecture!
wonderful lecture ...Thanks a lot
Does anyone have the MATLAB code for SIFT?
clear explanation .....thanks
Life-saver, thank you so much!
Time is NECESSARILY possible/potential AND actual IN BALANCE, AS E=MC2 IS F=ma; AS ELECTROMAGNETISM/energy is gravity ON BALANCE.
Great !!!!
By Frank DiMeglio
Great talk. Can anyone tell me how to find or create datasets for detecting sexually explicit images?
great lecture thanks!!
Demo Software: SIFT Keypoint Detector
David Lowe
www.cs.ubc.ca/~lowe/keypoints/
Thank you. It is an interesting video.
this is a UG course??
Yes, it is taught in the 3rd year, but in some places it is taught in the 2nd year.
thank you... :) this is awesome...
well explained..
Super clear, saved my ass.
thanks a lot sir!
ONE WORD
AMAZING
Thank you!
Nice!
thanks so much
Thank you, sir
thanks a lot very clear
Nice, though I didn't understand a bit of it!!
hahaha
Dr. Shah is more concerned about citations than SIFT. I wish he had described SIFT in as much detail as the importance of citations (in his universe). As it is, there are still unexplained things.
Ok
I am doing the same project for my PG.
+Imam Vali hello
I'm working on the same algorithm, but I have a problem classifying the results.
Can you help me?
thanks a lot
Dear Imam Vali
At 26:30, what does it mean by taking every other row and every other column in the downsampling process?
Explain
cool :)
Thanks for the bonus lecture in the end on How Google Search Works?
Damn it, Windows 7, get out of the way...
SIFT sucks, SURF rules!!!!
surf?
Dan Frederiksen google?
good explanation, thanks!
thank you for this