Gaussian Processes

  • Published 5 Feb 2025

Comments •

  • @sisilmehta · 2 years ago +130

    Literally the best explanation on the internet for GP Regression Models. He's not trying to be cool, but genuinely trying to explain the concepts

    • @Mutual_Information · 2 years ago +18

      Thank you my man. And yes, my risk of being cool is zero lol

    • @markzuckerbread1865 · 1 year ago

      Agreed, I've been trying to understand GPs for a task at work and this is the easiest to understand explanation I've found, liked and subbed!

    • @proudirani · 9 months ago +1

      Absolutely excellent, but I'd like to suggest a minor correction at 1:54. The linear regression DOES account for the uncertainty of the line; the linear regression prediction interval would produce intervals that widen as you move away from the data, just like the Bayesian ones. (Many textbooks give an approximate formula for prediction intervals that don't widen, but the exact formula does give widening bands.)

    • @RK-bj7fi · 7 months ago

      *of mean zero with a standard deviation of 4.20

  • @ScottCastledine · 10 months ago +19

    My postgrad supervisor literally told me to watch this a few times just so I can explain it clearly to Human Sciences people in my research proposal. Thanks for all your effort making it!

  • @LuddeWessen · 3 years ago +38

    Great balance between technical depth and intuition for me right now. I love how you say that multiplication "is like", but still is not. This gives intuition, but provides a warning for the day when we have come further in our understanding. 🤗

    • @Mutual_Information · 3 years ago +7

      Ha yea glad these details aren’t unnoticed. It’s a careful game making sure I never say anything *technically* wrong.

    • @Ethan_here230 · 11 months ago

      Yes sir, sometimes to make a point you need to recontextualize the matter in specific terms to make things easier to understand @@Mutual_Information

  • @AnandKumarAgrawal-r5v · 11 months ago +2

    I know how hard it is to explain this topic, so simply and comprehensively. I am extremely thankful for your efforts.

  • @user-wr4yl7tx3w · 2 years ago +2

    I learn something each time I rewatch the video. So much better than sitting in lectures where you only listen once.

  • @tommclean9208 · 3 years ago +11

    The first time I watched this, a month or so ago, I had no idea what was happening. However, I recently needed to use a GP, and after a lot of reading up on them and coming back to this video, I can appreciate it a lot more with some understanding :)

    • @Mutual_Information · 3 years ago +4

      Yea my topics require some prerequisite 😅 but with a little getting used to on the notation and basics of probability/stats, I think it should be fairly digestible. Glad you got something out of it.

  • @jb_kc__ · 7 months ago +2

    Really helpful. GPs finally clicked watching and working through this video. Great balance of accessibility + technical details. Thanks bro

  • @xbailleau · 1 year ago +2

    I will have to watch your video several times to understand everything (if I can), but undoubtedly your video is professional and very, very well done!! Congratulations!

  • @caseyalanjones · 1 year ago +4

    It looks like you have optimized the hyperparameters of making an awesome video. So concise, but still a sprinkle of humor here and there. Awesome visualizations, so appreciated.

    • @Mutual_Information · 1 year ago +1

      haha I thought that was gonna be a nitpick, pleasantly surprised - thank you!

  • @Boringpenguin · 2 years ago +1

    I read the distill article and came back to watch the whole video again for the second time. Now it's crystal clear! Thanks so much!!

    • @Mutual_Information · 2 years ago

      Distill is an epic educational source :)

    • @Boringpenguin · 2 years ago

      @@Mutual_Information It's sad that they've been on hiatus since last year :(
      Hopefully they'll come back some day

  • @zubintrivedi7105 · 6 months ago +1

    Thanks

  • @utkarshsinghal5 · 5 months ago

    Awesome tutorial! Where could I find more information on GPs for graphs and varying length strings?

  • @springnuance7048 · 1 year ago +1

    holy sh*t, you have unlocked the secret of GPs and Bayesian stuff... I struggled so hard to understand what a GP even is, since it's so abstract. Thank you so much for your great work!

  • @gergerger53 · 2 years ago +3

    I just discovered your videos yesterday and now they're popping up on my YT home screen, and I feel a bit like a little boy in a toyshop. How have these fantastic, high-quality tutorials evaded me for so long, when I spend so much time looking at technical content on YouTube? Seriously impressive! I'll definitely share your videos when the opportunity arises.

    • @Mutual_Information · 2 years ago +1

      Much appreciated! I got some really cool stuff coming in May. If you like this stuff, you'll *love* what's coming. Thanks again!

  • @mCoding · 3 years ago +87

    Great reference video, I'm sure I will come back to it again and again. The level of detail in all the simulations you do is just incredible. Do you make all your animations in manim?

    • @Mutual_Information · 3 years ago +38

      Thanks brother! And I don't use manim actually. I like representing data with Altair, which is like a better version of matplotlib. So I have a small library which turns Altair pics into vids.

  • @abubakryagob · 2 years ago +1

    At min 3:00 a smile came across my face - that's how happy I was listening to you!
    This is masterpiece work! Really, thank you :)

    • @Mutual_Information · 2 years ago +1

      Thank you very much - glad it's getting some love :)

  • @SinaMiri-m7j · 5 months ago

    I guess this is the first time I am commenting on YouTube! This tutorial is one of the best I've ever seen on GPs, or even math. The visual morphs of the GP corresponding to different hyperparameters are fascinating. Wish you the best!

  • @user-or7ji5hv8y · 3 years ago +2

    The way you motivate the problem really adds insights for understanding.

  • @yuanheli8566 · 4 months ago

    Literally the GOAT. So clear and concise, and the pace is perfect! A humble suggestion: a quick explanation of what a GP does would be ideal (also, what these prior samples are, etc.). Dimension-wise, it would make more sense to me.

  • @wiggwigg2010 · 3 years ago +6

    Great video. I've seen GPs mentioned a few times in papers and always glossed over it. Thanks for the great explanation!

  • @sietseschroder3444 · 2 years ago +2

    Truly amazing how you turned such a complex topic into an accessible explanation, thanks a lot!

  • @cosapocha · 2 years ago +5

    The production value of this is insane

  • @ApiolJoe · 3 years ago +1

    Absolutely perfect! I heard of GPs and was wondering what they were exactly, wanted a bit of intuition of how and why they work, how to use them, just as a quick intro or motivation before learning them later on.
    This video answered all of this in a duration that is absolutely perfect: not too long so that it can be watched "leisurely", and not too short so that you still give enough information that I don't have the impression that I learnt nothing.
    Didn't know your channel, will definitely check the rest out!

    • @Mutual_Information · 3 years ago +1

      Thanks a lot! That's exactly what I'm going for. Relatively short and dense with useful info. Glad it worked for you.

  • @swindler1570 · 2 years ago +4

    Phenomenal video. I genuinely can't thank you enough for how accessible this was. I'm sure I'll come back and reference it, or your other work, as I continue preparing for my upcoming internship working on physics-informed neural networks.

  • @massisenergy · 2 years ago +2

    Perfect! - research, delivery, production, duration, pictorial intuitiveness, mathematical rigor, naïve friendly 👏🏽

  • @aiart3453 · 3 years ago +3

    The best explanation of Kernel so far!

  • @manuelstrondl1797 · 2 years ago +1

    Great video! Just dived into GPs by learning about their application in system identification techniques. In fact, I'm studying for my exam right now and looked for a video that nicely sums up this topic and gives some intuition. This video matches my needs 100%, thank you very much.

  • @KaydenCraig-z8p · 1 year ago +1

    Such a clear and intuitive explanation of GPs! Great work! Excellent video on this topic - brief and elegant explanations!

  • @matveyshishov · 2 years ago +1

    Man, you have some beautiful explanations, and the way you explain the details is somehow very simple to understand, thank you so much!

  • @majdkahouli8644 · 11 months ago

    What a smooth way to explain such complex math , thank you

  • @gabrielbelouze7765 · 3 years ago +1

    What the hell that's a great channel I'm so glad I found you. Production quality is spot on, thank you for taking such care !

  • @heyna88 · 2 years ago +1

    As I wrote you on LinkedIn, this is probably the best video on GPs out there! I know it takes a long time to put together something of such high quality, but I hope I will see more of your videos in the future! 😊

    • @Mutual_Information · 2 years ago +1

      thanks, means a lot - and it's coming. This one has been taking a while, but it'll be out soon :)

  • @pilurussu20 · 1 year ago +1

    You are the best, man! Thank you for your videos; you're helping a lot of students, because your explanations are so clear and intuitive.
    I hope the best for you.

  • @antonzamyatin7586 · 5 months ago

    Really grateful for this video. Got the gist of it, but will have to pull out my old friends pen and paper and work through the math of the GP assumption. Thanks for the neat definitions :)

  • @andreyshulga12 · 1 year ago +1

    Thank you! I've been researching a paper dedicated to the Gaussian approach to time series prediction (as a task in a lab), and I really struggled with it. But after your video, everything sorted itself out in my head, and I finally understand it!

    • @Mutual_Information · 1 year ago +1

      Exactly what I've hoped to do - happy it helped!

  • @ronitganguly3318 · 3 years ago +2

    Dude has named his channel mutual information so when we look for the concept of mutual information, all his videos will pop up 🤣 genius!

  • @maulberto3 · 2 years ago +1

    Props for explaining such a complex model in a friendly way

  • @anttiautere3663 · 1 year ago

    A great video! Thanks. I used GPs at work many years ago and enjoyed the framework a lot.

  • @alexandrebonatto6438 · 9 months ago

    By far this is the best video I have seen on this subject! Thank you very much!

  • @Charliemmag · 22 days ago

    Damn, Charlie Cox really has the best yt channel for learning ML out there!

  • @MohdDarishKhan · 1 year ago +1

    A really good explanation! Though I wasn't able to understand everything, I'll keep coming back to this video until I do. ;D

  • @minhlong1920 · 1 year ago +1

    Such a clear and intuitive explanation of GPs! Great work!

  • @mightymonke2527 · 2 years ago +2

    Thanks a lot for this vid man it literally saved my life, you're really one hell of a teacher

    • @Mutual_Information · 2 years ago +1

      Thank you - I'm getting a little better over time, but it's a work in progress.
      If you love what I'm doing, one thing that would be *huge* for me, is if you tell anyone you think might be interested. This channel is pretty small and it'll be easier to work on it if it gets a little more attention : )

    • @mightymonke2527 · 2 years ago

      @Mutual Information best of luck man 🫡

  • @-mwolf · 21 days ago

    Such a clear and intuitive explanation!

  • @oldPrince22 · 2 years ago +2

    Excellent video on this topic. Brief and elegant explanations!

  • @popa42 · 1 year ago

    I understood just a few things, but I still watched this video till the very end - the production value is insane! And maybe I'll need GPs in the future? :D
    You definitely deserve much more subscribers, your videos are great!

  • @mengliu6720 · 3 years ago +1

    Best video for GP I have seen! Thank you so much!

  • @graham8316 · 1 year ago +1

    I would be really helped by putting variable definitions on screen while they're in use! I find myself forgetting what f and f* are for example as I mull it over and watch the explanation. Amazing video! I'm a fan.

    • @Mutual_Information · 1 year ago +1

      Thanks Graham! It's always a balance thinking about what does/doesn't go on screen. More recently, I'm biasing towards *less* on screen, b/c I've gotten feedback that what's on screen can be overwhelming.
      But, if you have some question about what may be confusing, ask here and I may be able to help

    • @graham8316 · 1 year ago

      @@Mutual_Information I think what was hard for me is that everything was defined first and then used? The viewer needs to remember what each thing means before they can give it context, and context lets us combine things and save on short-term memory?

  • @taotaotan5671 · 3 years ago +1

    a really really hard-core video... thanks D.J

  • @youngzproduction7498 · 2 years ago +1

    Now you make me love math again. Thanks.

  • @Laétudiante · 2 years ago +1

    This is truly a great explanation that helps me to connect all the dots together!! Thanks a lot!!!!

    • @Mutual_Information · 2 years ago

      I'm glad it helped. When I was studying GPs, these are the ideas that floated in my head - happy to share them.

    • @FrederikFalk21 · 1 year ago

      Quite literally
      Badum tssch

  • @Birdsneverfly · 3 years ago +2

    The visualizations are the catch. Just excellent 😊

  • @DanieleO. · 2 years ago +1

    Super quality content! Thank you so much: I subscribed and I hope your number of subscribers increases more and more to motivate you to keep going!

  • @jonathanmarshall2518 · 3 years ago +1

    Love the level you've pitched this video at.

  • @Kopakabana001 · 3 years ago +2

    Another great video! Love seeing each one come up

  • @Paulawurn · 2 years ago +1

    What an excellent video! Just thinking about the amount of effort that must have gone into this gives me anxiety

  • @MauroRincon · 2 years ago +1

    Brilliant video! loved the graphics.

  • @goelnikhils · 2 years ago +1

    Amazing video on GPs.

  • @karmpatel6832 · 1 year ago +1

    What an explanation! Became a fan in seconds.

  • @TomuszButAvailable · 8 months ago +1

    You are a legend.

  • @waylonbarrett3456 · 1 year ago +1

    I built a model years ago that I never realized is perhaps a GP model; I only learned about GP models a few weeks ago. It doesn't use any real-valued data, only binary vectors, and the similarity kernel is Hamming distance. Other than that, it's basically what he described here.

  • @tirimula · 2 years ago +1

    Awesome Explanation. Thank you.

  • @alfrednewman2234 · 1 year ago +1

    So far beyond my abilities. Like Frankenstein's monster, I am soothed by its music.

  • @jobiquirobi123 · 3 years ago +1

    Nice visualizations man. Just discovered your channel.

  • @TeoChiaburu · 3 years ago +1

    Excellent explanations and visualizations. Helped me a lot, thank you!

  • @nicksiska3231 · 2 years ago +1

    Straightforward and explained well, thank you

  • @eggrute · 3 years ago +1

    This video is really nice. Thank you so much for creating this content material.

  • @Donmegamuffin · 2 years ago +1

    A truly fantastic explanation of them! The visuals were instructive and well presented, thank you for making this!

  • @SamuelLiJ · 1 year ago

    Your observation about the product of two normally distributed variables is true for the following reason: given independent scalar random variables X, Y, we have Var(XY) = Var(X)Var(Y) + Var(X)(E(Y))^2 + Var(Y)(E(X))^2. Given two multivariate random normals U, V with mean zero, we may choose to work in a basis (possibly different for the two distributions) where the covariance matrices are diagonal. Then all components of each vector are independent, and so Cov(U)Cov(V) = Cov(UV) by working element-wise. Since this is true in one basis, it must be true in every basis.
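
The scalar identity quoted in this comment can be sanity-checked numerically. A Monte Carlo sketch (the parameter values are arbitrary, chosen only for illustration):

```python
# Check: for independent X, Y,
#   Var(XY) = Var(X)Var(Y) + Var(X)E[Y]^2 + Var(Y)E[X]^2
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000
mu_x, sd_x = 1.5, 2.0   # illustrative parameters
mu_y, sd_y = -0.5, 3.0

x = rng.normal(mu_x, sd_x, n)
y = rng.normal(mu_y, sd_y, n)

empirical = np.var(x * y)
predicted = sd_x**2 * sd_y**2 + sd_x**2 * mu_y**2 + sd_y**2 * mu_x**2
print(empirical, predicted)  # the two agree to Monte Carlo accuracy
```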

  • @abdjahdoiahdoai · 2 years ago +1

    you might want to look into probabilistic numerics - cheers, great video you made there!

  • @sashaaldrick · 3 years ago +1

    Really good explanation, the animations help so much. Thank you, I really appreciate it.

    • @Mutual_Information · 3 years ago

      You're very welcome - Glad to hear it's landing as intended!

  • @lucasvanderhauwaert419 · 2 years ago +1

    Super fricking impressed! Bravo

  • @patrickl5290 · 3 years ago +4

    Love these vids. Can you do a video about normalizing flows in the future?

    • @Mutual_Information · 3 years ago +3

      I plan on making one. It's a very interesting idea. In the meantime, there is already an excellent explanation: th-cam.com/video/i7LjDvsLWCg/w-d-xo.html

  • @rkstager · 2 years ago +1

    Excellent video, thanks. I have a question at the 1:20 mark. The distribution that you show, ‘p(y|x)’, does not look Gaussian. Am I missing something? Can a GP predict a non-Gaussian distribution?

    • @Mutual_Information · 2 years ago

      Oh I see the confusion. That's not a GP model - that's just some non-parametric model to show the type of thing we're going for. It's there to draw a contrast when we start making assumptions. We assume the normal distribution at some point.. and that's what gives you the Gaussian p(y|x).. but in the general case, that's not necessarily true. But this isn't very clear in the vid - sorry about the confusion :/

    • @rkstager · 2 years ago

      @@Mutual_Information got it. Thanks

  • @besugui1969 · 1 year ago +1

    Awesome video! Only one question, at minute 09:20: a linear kernel does not imply that the realizations of the random process must be linear, does it? Thanks!!

    • @Mutual_Information · 1 year ago

      Thanks Jesus. And regarding your Q: in the broader model, no, a linear kernel doesn't imply the realizations need to be linear, since there is a noise component in the overall kernel. That allows points along a sample to differ in a nonlinear way.

  • @Gggggggggg1545.7 · 3 years ago +1

    Another great video. Keep up the good work!

  • @anatoliizagorodnii2563 · 3 years ago +1

    Wooow! Excellent quality video!

  • @taekyookim2576 · 4 months ago

    Great video. Thank you 😊

  • @chamithdilshan3547 · 9 months ago +1

    Great video ! ❤

  • @HaolaiChe · 1 year ago +1

    Missed a lot of math, will get back later!

  • @ryandaniels3258 · 2 years ago

    What a great video! Very helpful, thanks!

  • @Pabloparsil · 3 years ago +1

    Keep this up! It really helps

  • @TheRaspberryPiGuy · 3 years ago +1

    Great video - I subbed!

  • @user-or7ji5hv8y · 3 years ago +3

    Not sure if others are also interested, but I think a coding example with GPyTorch could be interesting.

    • @Mutual_Information · 3 years ago +6

      This is something I'm working on! I'd like to make code samples available alongside my videos. They aren't currently available b/c the modeling code is intertwined with the animation code, so it would make for terribly difficult-to-decipher code if released as-is. My plan is.. once my video production workflow is a little more streamlined, I'll pair these videos with code snippets.
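
Until official snippets appear, the core of exact GP regression fits in a few lines of NumPy. A self-contained sketch (not the video's code; the kernel parameters and data here are illustrative):

```python
# Exact GP regression from scratch: RBF kernel, standard posterior equations.
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """k(x, x') = variance * exp(-(x - x')^2 / (2 * lengthscale^2))"""
    sq = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    # Posterior mean and covariance of f(x_test) given noisy observations.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    mean = K_s.T @ np.linalg.solve(K, y_train)
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mean, cov

x_train = np.array([-2.0, -1.0, 0.5, 2.0])
y_train = np.sin(x_train)
x_test = np.linspace(-3.0, 3.0, 7)

mean, cov = gp_posterior(x_train, y_train, x_test)
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
print(mean.round(2))
print(std.round(2))  # uncertainty grows away from the training points
```

Libraries like GPyTorch wrap exactly these equations, plus hyperparameter learning and scalable solvers.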

  • @I_am_who_I_am_who_I_am · 2 months ago

    This is excellent!

  • @Clouduis · 6 months ago

    Great video!! I have a question about the graphs at 9:25. Shouldn't the heat map for x vs x' be highest at the origin (0,0) and fade moving toward the other corners, instead of what's shown? I might still not fully understand it. Thanks!

    • @Mutual_Information · 6 months ago +1

      The heatmap just shows v*x*x'. I think the constant v here is .5. At the origin, it's .5*0*0 = 0. In the top right, it's .5*10*10 = 50.
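
The reply's arithmetic is easy to reproduce. A small sketch (v = 0.5 and the [0, 10] grid follow the reply; everything else is illustrative):

```python
# Linear-kernel Gram matrix K(x, x') = v * x * x' over a grid.
import numpy as np

v = 0.5
x = np.linspace(0.0, 10.0, 11)
K = v * np.outer(x, x)  # the heatmap's values

print(K[0, 0])    # origin: 0.5 * 0 * 0 = 0.0
print(K[-1, -1])  # far corner: 0.5 * 10 * 10 = 50.0
```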

  • @MatteoRossi-h2y · 1 year ago +1

    The best video about GPs I have ever seen! Thank you for sharing. I would like to reproduce the graphs you created in a script, but unfortunately I cannot see any code on your GitHub page! Is it possible to access those scripts, with the examples you produced?

    • @Mutual_Information · 1 year ago +2

      Thanks Matteo - I appreciate it!
      Unfortunately, the code for this one was heavily intertwined with the animation code, so I didn't make it public. But I wasn't doing anything you can't learn from reading the GPyTorch docs.

  • @Friemelkubus · 1 year ago +1

    Damn. Just Damn. This is great! Like: really really really great.

  • @brettbyrnes577 · 1 year ago +1

    Nice video - love it

  • @Alexander-pk1tu · 1 year ago +1

    Hey, great video! In my machine learning class we did both GPR and GPC, and in practice I found it very difficult to scale to more than 10k samples. It seems that despite its advantages, it is not useful for a lot of practical problems. Could you maybe make a video on how to invert a matrix with less than O(n^3) complexity, and which software someone could use for GPR/GPC on larger data?

    • @Mutual_Information · 1 year ago

      Yea, so that's a big component of GP research: getting the cost down. A dominant approach is inducing point methods, where you try to summarize a large dataset with a smaller set of "inducing points". It's a popular approach, but it introduces another source of uncertainty.
      In my experience, I tend to use GPs with smaller datasets.

  • @jigonro · 8 months ago

    7:01 shouldn’t it be the other way around? If most ys are deemed different from an x, then the GP would sample closer ys and therefore the functions shouldn’t wiggle too much. Or am I missing something?

  • @RHCPhooligan · 1 year ago +1

    Hey, love the videos. What software do you use to create your visuals?

    • @Mutual_Information · 1 year ago +1

      I use a very dope, though static, python plotting library called Altair. And then I have a personal library that turns many of them into videos.

  • @knightofhyrulelink7531 · 1 year ago +1

    thank you for your understandable video!
    I still wonder what the point of "similar y for similar x" is - is it to make sure the function is smooth, or does it have another use?
    looking forward to your reply!

    • @Mutual_Information · 1 year ago

      The goal of the problem is to predict y for a given incoming x.. and we can learn to do this by observing many pairs of (x_i, y_i)'s. So we make an assumption: "If x1 is similar to x2 (that is, K(x1, x2) is large/positive), then we expect y1 and y2 to be close." With that assumption, we can form a prediction for y when given an x.. basically by determining "which y value would best fit our similar-y's-for-similar-x's assumption, given the x's and y's observed?" The GP does all this hard work for you and allows for noise and whatnot.

  • @hektor787 · 2 years ago +1

    13:08 isn't X a d×n matrix, meaning that each row represents a feature and each column is a datapoint x?

    • @Mutual_Information · 2 years ago

      Not in this case. Here, one row provides all the features for one example.

  • @snp27182 · 3 years ago +1

    If you were using a chi distribution as a kernel, could you combine kernel-a and kernel-b multiplicatively? If I recall, Gaussian distributions are linear, i.e. their sum is Gaussian, but their product is not. Chi-distributed variables, on the other hand, can be combined into an F-distribution, which is tractable.
    Really cool video! You definitely have an INSANE amount of material to make more videos on! Definitely subscribing!

    • @Mutual_Information · 3 years ago +1

      Very interesting idea.. maybe there is a very special choice of kernel such that the multiplication kernel-and-sampled-function-distributions holds exactly, just like a sum does. I really don't know!
      If I had to guess, I'd say there is no such kernel. The problem arises from the multivariate normal, which is always operating in a GP, regardless of the kernel. And that problem is.. if you sample two vectors from a multivariate normal.. and multiply them together element wise.. the distribution of that thing is NOT some other multivariate normal. The kernel can only change the covariance of those two vectors, but that problem doesn't depend on those covariances.
      And thanks for the subscription!

    • @snp27182 · 3 years ago +1

      @@Mutual_Information Ah, that makes sense; it's called a *Gaussian process*, not an *insert-random-pdf* process, after all!
      Thanks for the reply!

  • @MikeOxmol_ · 3 years ago +3

    Hey, that's a good video, I enjoyed it a lot and you earned a sub :) However, wouldn't we get something very similar to GPs if we allowed for different basis functions in the Bayesian regression example? These functions don't have to be straight lines, so if I choose some polynomials, sines, or exponentials as my basis functions and perform Bayesian linear regression, wouldn't I get basically the same things that GPs offer?

    • @Mutual_Information · 3 years ago +1

      Yes you would! Bayesian linear regression with basis functions gives you a GP. But a GP is more general. You can input any similarity via the kernel. Given a kernel, it can be hard to determine the basis functions you’d need to use to recreate it using Bayesian linear regression.
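
The equivalence described in this reply can be checked directly: Bayesian linear regression with features phi(x) and weight prior w ~ N(0, S) induces a GP with kernel k(x, x') = phi(x)^T S phi(x'). A sketch with an invented polynomial basis (all names here are illustrative):

```python
import numpy as np

def phi(x):
    # illustrative basis functions: [1, x, x^2]
    return np.stack([np.ones_like(x), x, x**2], axis=-1)

rng = np.random.default_rng(0)
S = np.diag([1.0, 0.5, 0.25])   # prior covariance of the weights
x = np.array([-1.0, 0.0, 2.0])
Phi = phi(x)                     # (n inputs) x (3 basis functions)

K = Phi @ S @ Phi.T              # kernel implied by the weight prior

# Compare with the empirical covariance of f = Phi @ w over prior draws:
w = rng.multivariate_normal(np.zeros(3), S, size=200_000)
K_emp = np.cov(w @ Phi.T, rowvar=False)
print(np.abs(K - K_emp).max())   # small: the two covariances match
```

Going the other direction, from an arbitrary kernel back to explicit basis functions, is the hard part the reply mentions.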

  • @This.handle.is.taken.. · 3 years ago +1

    Great video.
    Great contents.
    You are a good-looking man, that's good.
    You move a lot and occupy half of the screen; that's bad.
    Great work 😅

    • @Mutual_Information · 3 years ago +1

      Thank you! I appreciate the honest feedback - I need it! I've been thinking a bit about my positioning on the screen and the hand motions. Planning on practicing (it's a bit of a habit) chilling out the hand gestures. Me taking up half the screen... that's a bit to keep things dynamic when the math side isn't doing anything. I'm not at 3Blue1Brown level, so when I show *just* the math/animations, things are more boring/move less. And it's rather expensive time-wise to make them so dynamic that that's not an issue.

    • @This.handle.is.taken.. · 3 years ago +1

      @@Mutual_Information Ok.
      Here is another piece of honest feedback. If I were to give you 10/10, I'd like to see the implementation of a GP model submitted to some of Kaggle's competitions.
      I know it is time-consuming, and maybe the result won't shine, but your video would become the best GP introduction on YouTube.
      Keep moving forward 🤗

    • @Mutual_Information · 3 years ago

      That's an interesting idea. I'd like to do some collabs with one of those crusher Kagglers - I'm not one of them! I'll keep an eye out for Kaggle-related topics.

  • @christiankentorasmussen7492 · 3 years ago +1

    Really well made explanation :)

  • @patrickadjei9676 · 2 years ago +3

    This is cool stuff! There is something I want to understand from the similarity heat map of the linear kernel. If the function samples grow dissimilar as they get further apart (according to the lines), shouldn't the heat map be brighter at (0,0) and fade as it approaches (10,10)? I am trying to get the picture in my head.

    • @Mutual_Information · 2 years ago

      Thanks! I think you're approaching it from a difficult angle. It's not function *samples* that are similar/dissimilar, it's specific *inputs* across samples. So, for the linear kernel, for inputs that are very similar (like input=0, where the heatmap is high, meaning similar), the outputs land in virtually the same spot. For inputs closer to 10, the inputs are dissimilar and the outputs are far apart. Since all function samples are lines, this manifests as two lines which intersect at input=0 but are far apart at 10. Make sense?

    • @patrickadjei9676 · 2 years ago

      @@Mutual_Information I do not totally understand. It is not your explanation that is bad. I simply need to get to know GP better. Thank you for trying to explain!

  • @willliamape6297 · 7 months ago +1

    Thanks for the video!!!