Prof. Deisenroth, thank you for sharing your excellent lecture. At the very beginning, you say, "...yesterday we looked at Bayesian Linear Regression towards the end of the class...". Is that previous lecture's video available?
At 58:10, I believe f1, ..., fk and x1, ..., xk should carry stars (f*1, ..., f*k and x*1, ..., x*k). At 1:00:59, what is cov(f(x), f(x*))? Maybe it is a (k+N) x (k+N) joint covariance matrix there, defined via kernels.
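For what it's worth, here is a minimal sketch of how that (k+N) x (k+N) joint covariance can be assembled from a kernel, in the spirit of the question above. It is a toy illustration with made-up inputs and an assumed RBF kernel, not code from the lecture:

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel between two sets of 1-D inputs."""
    sq_dists = (A[:, None] - B[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

# Toy inputs (my own, not from the lecture): N training points, k test points
X = np.array([0.0, 0.5, 1.0])    # N = 3 training inputs
X_star = np.array([0.25, 0.75])  # k = 2 test inputs

# The four blocks of the joint covariance of [f(X); f(X*)]
K = rbf_kernel(X, X)               # N x N  : cov(f(X), f(X))
K_s = rbf_kernel(X, X_star)        # N x k  : cov(f(X), f(X*))
K_ss = rbf_kernel(X_star, X_star)  # k x k  : cov(f(X*), f(X*))

# Stack the blocks into the full (N+k) x (N+k) covariance matrix
joint = np.block([[K, K_s],
                  [K_s.T, K_ss]])
print(joint.shape)  # (5, 5)
```

So the matrix at 1:00:59 would indeed be (k+N) x (k+N): every entry is the kernel evaluated on a pair of inputs drawn from the combined training and test sets.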
The best lecture of GP ever!!!!
Very clear 🎉 Thanks a lot!
Incredible lecture; thank you
This lecture is very concise and the explanations are quite convincing. Thank you!
The last 5 minutes, on the process of adding observations, are so insightful and helped me a lot. Thanks, professor!
Amazing!!
Excellent explanation. Thank you so much for making this video.
Very helpful !
Sir, may I ask why the posterior on slide 40 shows several different functions, whereas the earlier slides show only one function?