I just had an exam on GPs by Carl Rasmussen himself, such a cool concept!
He wrote the book on the topic :) Hope the exam went well!
Have you thought about opening a Discord server? I think it would be really helpful.
I'm personally starting to read the book today, but these videos you're uploading are gold.
Thanks.
That's a great idea, I'll do it! I haven't done that before, so will probably need some fine-tuning from you guys :)
So far, this discussion assumes a prior mean of zero. I think this is quite restrictive; often in the real world we have a strong intuition about what the mean of our features should look like.
Just wondering if all the discussion and formulas (e.g. those for the equivalent kernel) would still apply if we stray away from the zero-mean and isotropic-variance assumptions?
Thanks for the question! The equivalent kernel has the same form, just with S_0 baked into S_N. Interpolation with the equivalent kernel is adjusted by adding a constant offset to the training targets. See my derivations here: sinatootoonian.com/index.php/2024/12/20/the-equivalent-kernel-for-non-zero-prior-mean/. As for the rest of the discussion, it's an exercise for the reader :)
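For reference, here is a minimal sketch of where the prior mean enters, written in Bishop-style notation (this is just the standard Bayesian linear regression posterior, not the full derivation of the offset, which is in the linked post). With a Gaussian prior $p(\mathbf{w}) = \mathcal{N}(\mathbf{w} \mid \mathbf{m}_0, \mathbf{S}_0)$ and noise precision $\beta$, the posterior is $\mathcal{N}(\mathbf{w} \mid \mathbf{m}_N, \mathbf{S}_N)$ with

\[
\mathbf{S}_N^{-1} = \mathbf{S}_0^{-1} + \beta \boldsymbol{\Phi}^\top \boldsymbol{\Phi},
\qquad
\mathbf{m}_N = \mathbf{S}_N\!\left(\mathbf{S}_0^{-1}\mathbf{m}_0 + \beta \boldsymbol{\Phi}^\top \mathbf{t}\right),
\]

so the predictive mean splits into a prior-mean term plus the usual kernel smoother:

\[
y(\mathbf{x}) = \boldsymbol{\phi}(\mathbf{x})^\top \mathbf{m}_N
= \boldsymbol{\phi}(\mathbf{x})^\top \mathbf{S}_N \mathbf{S}_0^{-1}\mathbf{m}_0
+ \sum_{n=1}^{N} \underbrace{\beta\, \boldsymbol{\phi}(\mathbf{x})^\top \mathbf{S}_N \boldsymbol{\phi}(\mathbf{x}_n)}_{k(\mathbf{x},\, \mathbf{x}_n)} \, t_n .
\]

The sum is the same equivalent-kernel interpolation as in the zero-mean case, with S_0 entering only through S_N; the first term is the extra contribution from the prior mean, which the post shows how to absorb as an offset to the training targets.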