Please do more from PRML! A whole generation of us need it :) Thank you!
Do you mean beyond Bayesian Regression and Inference? … I am trying to base most of the tutorials in this series on examples and plots from PRML, provided it makes sense. For example, PRML's coverage of the latest topic (Markov Chains) is not to my satisfaction.
@@KapilSachdeva So here is how I see it: a lot of advanced ML courses abroad require prerequisites in the form of PRML or the MML book. And if someone wants to understand research papers, a lot of them assume students know chapters from PRML like Kernel Methods and Approximate Inference (which become important in generative models and VAEs). Your videos are super nice, so if you work on a series along these lines, I believe undergraduates can start understanding the subject in more depth and also start contributing and trying their hand at research. Of course, if you find other books make more sense for a particular topic, that is the way forward; it's just that the confidence students get from knowing they have covered a prerequisite book with PRML's reputation feels good.
All fair points. Thanks, your suggestion makes a lot of sense and I see the value in it. Let me ponder how to execute on this great suggestion! 🙏
@@KapilSachdeva Thank you! Really looking forward!
🙏
This channel is a gold mine, seriously. Thank you, Sir!
🙏
This is an amazing series; please continue doing videos for the rest of the "Pattern Recognition and Machine Learning" book.
Will do.
Feeling guilty now :( …. Apologies for the delay.
Yeah… this is really helpful. I was struggling to make sense of these equations, but this video has explained the concepts very systematically!!!
@@KaranYadav-hw9yo thanks 🙏… glad you found it helpful!
This was a wonderful tutorial. Your graphics, style, clarity and humour are very helpful. Looking forward to the rest of the series. Specifically, I'm interested in using Bayesian prediction on my own data.
🙏
Really a treasure of knowledge expressed in a very simple way… a great help for learning the Bayesian treatment.
Please complete the series; more videos are eagerly awaited.
🙏
This video is so helpful for me after finishing my regression class. I didn't understand the overfitting concept before. Now I do! Thank you!
🙏
I just found your channel today and I am glad I did! Great stuff
Such a gem of a playlist. Every topic is explained clearly. Please do more videos from the same book ❤❤❤
🙏
Very well explained. Please keep up the good work!
🙏
Very good video! I just have a question about 12:07: once we know that the derivatives are x^i, what allows us to just write x^i there in the formula? i is only the exponent of x_n, and I can't see any sum iterating over "i" or anything clarifying which i I need to use. I find that part of the formula quite difficult to understand during those steps. Only after everything is transformed into vectors and matrices does "i" disappear and the formula become readable again.
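For anyone stuck on the same step, here is a short worked derivation, assuming the video follows the standard PRML Chapter 1 setup (polynomial model with sum-of-squares error; the symbols below are that book's, not necessarily the video's):

```latex
% Assumed setup (PRML Ch. 1): polynomial model and sum-of-squares error.
%   y(x_n, \mathbf{w}) = \sum_{j=0}^{M} w_j x_n^j
%   E(\mathbf{w}) = \tfrac{1}{2} \sum_{n=1}^{N} \left( y(x_n, \mathbf{w}) - t_n \right)^2
% Differentiate E with respect to ONE fixed coefficient w_i (chain rule):
\frac{\partial E}{\partial w_i}
  = \sum_{n=1}^{N} \left( y(x_n, \mathbf{w}) - t_n \right)
    \frac{\partial y(x_n, \mathbf{w})}{\partial w_i}
  = \sum_{n=1}^{N} \left( y(x_n, \mathbf{w}) - t_n \right) x_n^{\,i}
```

There is no sum over i because i is held fixed: only the term w_i x_n^i of y depends on w_i, so its derivative is x_n^i and every other term drops out. You get one such equation per coefficient, i = 0, …, M, and stacking those M+1 equations is exactly what later becomes the single vector/matrix equation.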
This is amazing, I hope you make more. Thank you.
🙏… there are 10 more videos in this playlist called “Towards Bayesian Regression”
Very helpful videos, thank you very much!
🙏
Amazing lectures, thank you so much! I was wondering about 12:07: shouldn't the partial derivatives w.r.t. w_i (in the right panel) be partial derivatives of the ys and not of the Es?
No, it is E, the error function; our goal is to minimize the error. The derivatives of y do appear, but only inside ∂E/∂w_i through the chain rule, which is where the x_n^i factor comes from (see the worked derivation above).
That is super useful! Fantastic work :)
🙏
Thank you. Very helpful! But I think there is a mistake in saying the weights vector is a row vector. I believe it should be a column vector for t = Xw
🙏 thanks for pointing it out. You are correct. I will add a note in the description.
Can you please guide me on whether the weight vector is a column vector or a row vector? It is creating confusion in the multiplication. Thanks in advance for the great series.
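For what it's worth, here is a minimal numpy sketch of the shapes involved, assuming the usual design-matrix convention (N data points, polynomial degree M; the variable names are illustrative, not from the video):

```python
import numpy as np

# Hypothetical example: N = 5 data points, polynomial degree M = 3.
N, M = 5, 3
x = np.linspace(0.0, 1.0, N)

# Design matrix: row n holds the features [1, x_n, x_n^2, x_n^3],
# so X has shape (N, M + 1).
X = np.vander(x, M + 1, increasing=True)

# The weights form a COLUMN vector of shape (M + 1, 1); a row vector
# of shape (1, M + 1) would make X @ w undefined.
w = np.ones((M + 1, 1))

t = X @ w                          # shape (N, 1): one prediction per data point
print(X.shape, w.shape, t.shape)   # (5, 4) (4, 1) (5, 1)
```

With a row vector of shape (1, M + 1), X @ w would raise a shape error, which is why the correction above says the weights should be a column vector for t = Xw.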
Great video, thanks !
🙏
How do you create this type of video? Just curious; the black background is spot on, and the animation is enough to serve the purpose. Any lead is appreciated 👍. Thanks for the video and the content.
🙏 PowerPoint with a black theme, plus morph transitions and animations. Not much more than that.
@@hyperadapted Sometimes in LaTeX, but mostly in PowerPoint :)
fantastic video. thank you
Thanks for the excellent lecture.
🙏