This is one of the most valuable channels on TH-cam for me. Thank you for your work and your contribution.
Thanks
Same here. It's literally saving my career. Thank you :)
Crystal clear lecture, you saved my day! Many thanks!
The best videos on mobile robotics: concise and crisp
Thank you so much, sir. It's very hard to find such a detailed explanation of this topic on the internet. It really helped a lot.
Just discovered your channel while looking for explanations related to localisation and mapping for my Robotics BEng and could not be more appreciative of the videos! great content!
Thank you!
I think it's not possible to find a better explanation, thank you!
Thanks
This is the best explanation of bayes filter. Thank you!
Thank you for the premium content.
Great presentation ❤️👌
Nicely explained. Thank you, Cyrill Stachniss.
Very good explanation! Man, I wish every professor were like this :)
excellent
So great a video! Thank you so much, Prof. Cyrill.
Excellent 👌
Very good content
Great video
Would it be possible to get access to the PDF files of the slides?
You are a gem.
thank you professor
@4:33 Hmm, why does the distribution move when the robot moves 1 m?
Because the belief about where you are has moved, and that movement has also introduced noise, so the distribution has a less sharp peak.
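In case it helps, here is a minimal sketch of that prediction step: the motion command shifts the belief, and the motion noise smears it out. The grid size, initial peak position, and motion probabilities below are made-up values for illustration, not taken from the lecture.

import numpy as np

# Belief over a 1D grid of positions 0..9 (in meters), initially peaked at x = 2
belief = np.zeros(10)
belief[2] = 1.0

# Hypothetical motion model for a commanded 1 m move:
# the robot usually moves 1 cell, but may undershoot (0 cells) or overshoot (2 cells)
p_move = {0: 0.1, 1: 0.8, 2: 0.1}

# Prediction step of the Bayes filter (discrete version):
# bel_bar(x_t) = sum over x_{t-1} of p(x_t | u_t, x_{t-1}) * bel(x_{t-1})
predicted = np.zeros_like(belief)
for x_prev, b in enumerate(belief):
    for step, p in p_move.items():
        x_new = x_prev + step
        if x_new < len(predicted):
            predicted[x_new] += p * b

print(np.round(predicted, 2))
# The peak has shifted from index 2 to index 3 and is now spread over indices 2..4,
# i.e. the belief moved with the robot and became flatter because of the noise.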
Cool! After attending Burgard's course, I came here. Also nice.
Can anyone help me out? While watching this video I couldn't understand why we are learning this concept, and the notation (e.g. z, x, u) was hard to figure out until the professor defined it verbally.
Is there any relationship between the Viterbi algorithm used in HMMs and this filter?
Hi Cyrill, which previous lecture are you referring to here? 7:28 "And one way to simplify it is to apply Bayes' rule. Remember, the thing that we did just a few minutes ago in the previous lecture on probability theory?" What is the title, or how can I find that one? I would like to watch it before this one.
th-cam.com/video/JS5ndD8ans4/w-d-xo.html
@@CyrillStachniss, thank you - that was indeed very helpful, especially the derivation of Bayes' rule and the note about background knowledge (additional givens), which had been the part really confusing me previously. I am still a little unclear on why it is permissible to reduce the "evidence" denominator to a "normalization constant", but I think I can find out more about that by searching around.
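In case it helps anyone with the same question about the normalization constant: for a fixed measurement z, the denominator p(z) does not depend on the state x, so it is just a number that rescales the numerator so that the posterior integrates to one. A short sketch of the argument, using x for the state and z for the observation:

p(x \mid z) = \frac{p(z \mid x)\, p(x)}{p(z)}
            = \frac{p(z \mid x)\, p(x)}{\int p(z \mid x')\, p(x')\, \mathrm{d}x'}
            = \eta \, p(z \mid x)\, p(x),
\qquad \eta = \left( \int p(z \mid x')\, p(x')\, \mathrm{d}x' \right)^{-1}

Since the integral in the denominator sums over all states, x is integrated out there, so in practice you can compute the unnormalized product p(z | x) p(x) for every state and obtain eta at the end simply by normalizing.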
Thank you so much
You're welcome!