Linear Regression

  • Published Nov 15, 2024

Comments • 33

  • @Ash-bc8vw · 2 years ago · +4

    I don't think anybody else is teaching linear regression with respect to the SVD on YouTube right now, hence this video is all the more informative! Loved it; immediately subscribed.

  • @nikosips · 4 years ago · +24

    These videos are the best resource for someone who wants to understand data-driven models! Thank you very much for your work, from an engineering student!!

  • @vahegizhlaryan5052 · 1 year ago

    I am honestly surprised (I just accidentally discovered this channel) that such an excellent resource is not more popular with the YouTube algorithm.

  • @patrickxu8795 · 2 years ago · +2

    The lecture is so clear and well-organized! IT IS IMPRESSIVE!!!!

  • @SoroushRabiei · 4 years ago · +5

    Dear professor, you're a great teacher!
    Thank you so much for these videos.

  • @appliedmathness8397 · 4 years ago · +8

    I love these videos!
    But in this one you point out the "squared projection error" while showing the segment going from the fitted line to the outlier, i.e. an orthogonal projection (as in PCA); in linear regression the residuals should instead be vertical segments.
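
A minimal sketch of the distinction raised above, with made-up data: ordinary least squares minimizes the vertical residuals, while a PCA / total-least-squares fit minimizes the orthogonal distances to the line. Variable names and numbers are purely illustrative.

```python
import numpy as np

# Made-up 1-D data with one outlier (purely illustrative).
a = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
b = np.array([0.1, 1.1, 1.9, 3.2, 7.0])

# Ordinary least squares: minimize the sum of squared VERTICAL residuals.
slope_ols, intercept_ols = np.polyfit(a, b, 1)

# PCA / total least squares: minimize squared ORTHOGONAL distances.
# Fit a line through the centroid along the leading principal direction.
X = np.column_stack([a - a.mean(), b - b.mean()])
_, _, Vt = np.linalg.svd(X, full_matrices=False)
dx, dy = Vt[0]                          # leading principal direction
slope_pca = dy / dx
intercept_pca = b.mean() - slope_pca * a.mean()

print(slope_ols, slope_pca)             # the two slopes generally differ
```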

  • @dmitrystikheev3384 · 3 years ago · +6

    I was looking for copper, but found gold! Boss, excellent as always. Love your way of conveying the material. I hope you will continue presenting more topics on statistics, because in the multivariate case it can become really intimidating. Best regards from Russia!

  • @ParthaPratimBose · 3 years ago · +4

    Hi Steve, I am a pharmaceutical data analyst, but you're just outstanding

  • @Chloe-ty9mn · 5 months ago

    I've been watching all the videos in this chapter, and this is the one that got me to cave and purchase the book!! I was so surprised to see that it was so affordable.
    Thank you and your team so, so, so much for the high-quality, accessible information.

  • @shakibyazdani9276 · 3 years ago · +9

    Absolutely awesome series, I will finish the whole series today:)

    • @Eigensteve · 3 years ago · +4

      Hope you enjoy it!

  • @linnbjorkholm9237 · 1 year ago · +3

    Wow! Great video! I really liked your shirt, where is it from?

    • @Eigensteve · 1 year ago · +1

      It’s a Patagonia Capilene. My favorite shirt; I almost only wear them.

  • @saitaro · 4 years ago · +19

    This is gold, professor!

    • @motbus3 · 4 years ago

      Besides the very awesome explanation, the book is awesome, and he writes mirrored as if it were nothing 😄

  • @udriss1 · 2 years ago

    Hello.
    In your book DATA DRIVEN SCIENCE & ENGINEERING, page 24, relation (1.26), you express the matrix B. The relation should read B = X - X bar, not B = X - B bar as one reads there, where X bar is the matrix of means.
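
For reference, a minimal sketch of the mean-centering the comment points to, with an illustrative matrix (variable names are not from the book; rows are taken to be samples here):

```python
import numpy as np

# Illustrative data matrix; assume each row is a sample, each column a variable.
X = np.arange(12.0).reshape(4, 3)

# Matrix of means: the column-wise mean repeated down every row.
X_bar = np.tile(X.mean(axis=0), (X.shape[0], 1))

# Mean-centered data, i.e. the corrected relation B = X - X_bar.
B = X - X_bar
print(B.mean(axis=0))   # each column now averages to zero
```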

  • @PaulVirtual · 1 year ago

    Interesting. In the first lecture of this series, individual faces (i.e. people) were in the columns, but a face was really a column of many pixels. In this lecture, people are in the rows. So each use of SVD is different. And each setup of a data matrix is different.

  • @afreeseaotter622 · 4 years ago · +2

    Thank you sir,
    your courses are awesome!

  • @a.danielhernandez2839 · 4 years ago · +1

    Excellent explanation! What happens with the y-intercept of the line? Is it b?
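
One common way to handle the intercept (a generic sketch, not necessarily how the video sets it up): augment the inputs with a column of ones, so the fit returns both a slope and an intercept. Names and numbers below are illustrative.

```python
import numpy as np

# Hypothetical 1-D data (names are illustrative, not from the lecture).
a = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
b = np.array([1.1, 2.9, 5.2, 7.1, 8.8])

# Model b ≈ slope * a + intercept: augment a with a column of ones.
A = np.column_stack([a, np.ones_like(a)])

slope, intercept = np.linalg.lstsq(A, b, rcond=None)[0]
print(slope, intercept)
```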

  • @Martin-iw1ll · 1 year ago

    In mechanics, an overdetermined system is called statically indeterminate.

  • @engr.israrkhan · 4 years ago · +2

    Sir, what a great teacher you are.

  • @sachavanweeren9578 · 3 years ago

    Very nice series ... though it has been a while and I might be a bit rusty on my math. But if I recall correctly, there is nowhere an explicit link made between the SVD and least squares. It is explained that there is an SVD, and there is a theorem that it gives the best approximation in some norm, but I have not seen an explicit link with OLS. It would be nice if that were made more explicit in the video series...
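
A minimal sketch of the link the comment asks about, using generic names: for an overdetermined system A x ≈ b, the least-squares solution is x = V Σ⁻¹ Uᵀ b built from the economy SVD, i.e. the SVD assembles the pseudoinverse. The data below are illustrative.

```python
import numpy as np

# Hypothetical overdetermined system A x ≈ b (names and data are illustrative).
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 3))
b = rng.normal(size=20)

# Least squares via the SVD: x = V @ diag(1/s) @ U.T @ b (the pseudoinverse).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_svd = Vt.T @ ((U.T @ b) / s)

# The library least-squares routine returns the same solution.
x_ols = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.allclose(x_svd, x_ols))        # True
```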

  • @kindleflow · 4 years ago · +2

    Thanks

  • @anilsenturk408 · 4 years ago · +1

    How's it going?

  • @philrincon · 2 years ago

    Is he writing in reverse?

  • @spidertube1000 · 4 years ago · +1

    Good vid bruh

  • @ralfschmidt3831 · 4 years ago · +1

    I am slightly confused: the orthogonal projection of b onto a should minimize the distance between b and its projection, which is ORTHOGONAL to the span of a. If I remember correctly, least squares, however, should minimize the VERTICAL distance between the projected and the original point. I am sure there is something wrong with my assumptions, but maybe someone can point me in the right direction.
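
A short note on this standard point of confusion (general linear-algebra fact, not a quote from the video): both pictures describe the same minimization. Stacking the n data points into vectors a, b in R^n and fitting b ≈ x a,

```latex
\min_{x}\ \sum_{i=1}^{n} (b_i - x\,a_i)^2 \;=\; \min_{x}\ \lVert b - x\,a \rVert_2^2 ,
\qquad
a^{\mathsf T}(b - x^{\ast} a) = 0 \;\Longrightarrow\; x^{\ast} = \frac{a^{\mathsf T} b}{a^{\mathsf T} a}.
```

Each term b_i - x a_i is a vertical distance in the 2-D scatter plot, while the residual vector b - x* a, taken as a whole, is orthogonal to span(a) in R^n; the orthogonality lives in the n-dimensional data space, not in the plot.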

  • @moshuchitu203 · 6 months ago

    By cross-referencing th-cam.com/video/ualmyZiPs9w/w-d-xo.html, one can clearly see that the slope derived at the end is nothing but covariance(a, b) / variance(a).
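
A quick check of that identity (standard algebra, assuming a and b have already been mean-centered so that their means are zero and the 1/n factors cancel):

```latex
x^{\ast} \;=\; \frac{a^{\mathsf T} b}{a^{\mathsf T} a}
\;=\; \frac{\tfrac{1}{n}\sum_{i}(a_i - \bar a)(b_i - \bar b)}
           {\tfrac{1}{n}\sum_{i}(a_i - \bar a)^{2}}
\;=\; \frac{\operatorname{cov}(a,b)}{\operatorname{var}(a)} .
```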

  • @robsenponte3308 · 3 years ago

    Cool

  • @SkyaTura · 1 year ago

    Besides the undeniable quality of the video overall, isn't it awesome that he writes backwards in the air just to explain his points? 🤔

  • @uzferry5524 · 3 years ago

    based

  • @clickle23 · 3 years ago

    Can you explain why in the example at the end, U = a/|a|, is it because U has the only one eigen vector of matrix AA(transpose), which is just itself?