CUDA Explained - Why Deep Learning uses GPUs

  • Published Jan 6, 2025

Comments • 137

  • @deeplizard
    @deeplizard  6 years ago +17

    Check out the corresponding blog and other resources for this video at:
    deeplizard.com/learn/video/6stDhEA0wFQ

    • @kudoamv
      @kudoamv 6 years ago

      So I can't use computer vision programs that require a GPU because I'm an AMD card user?

    • @mae1000
      @mae1000 4 years ago

      @@kudoamv Actually, you can, I think. AMD invested in that field too, google "gpuopen".

  • @debajyotisg
    @debajyotisg 5 years ago +86

    Great job speeding Jensen Huang up. xD

  • @larryteslaspacexboringlawr739
    @larryteslaspacexboringlawr739 5 years ago +7

    5:13 Ballmer ambush, panic-clicking to skip (thank you for the awesome video)

    • @deeplizard
      @deeplizard  5 years ago +1

      lol. You are welcome!

  • @Xiler6969
    @Xiler6969 5 years ago +55

    Congratulations, you've impressed me. Very professional series. Right to the good stuff, clear and sharp voice, broad yet specific explanations.

  • @majeedhussain3276
    @majeedhussain3276 6 years ago +9

    This channel seriously deserves a million subs. I've been watching many series from this channel. Great work!!!!! Keep going; I'm sure this channel is going to be flooded with subscribers someday.

    • @deeplizard
      @deeplizard  6 years ago +1

      Thank you, Majeed! We're glad to hear you've been enjoying multiple series here, and we're happy to have you as an engaged member of the community! Always appreciate seeing your comments :)

  • @ajwadakil6020
    @ajwadakil6020 4 years ago +3

    This channel should have more subscribers, seriously

  • @gabriellugmayr2871
    @gabriellugmayr2871 5 years ago +10

    Wow, I saw the first 4 videos of the PyTorch series and am impressed by how much time & effort you put into these tutorials. Thanks a lot.
    Also, you have developed enormously (although the older tutorials were already very good)

  • @Sikuq
    @Sikuq 4 years ago +5

    Beautifully done, Chris. Wow. Thanks. I learned a lot.

  • @engkamyabi
    @engkamyabi 2 years ago +2

    Amazing video and loved the short clips! Thank you!

  • @Aweheid
    @Aweheid 4 years ago +2

    Rich, informative video!! No explanation is better than yours!!

  • @shivangitomar5557
    @shivangitomar5557 2 years ago +2

    The BEST VIDEO on this topic!

  • @MarcelloNesca
    @MarcelloNesca 3 years ago +4

    Thank you very much. I'm currently learning deep learning, and this was perfect for explaining why I need a good GPU

    • @chinonsoalumona6734
      @chinonsoalumona6734 2 years ago

      Hi Marcello, how's your deep learning experience going?

  • @IgorAherne
    @IgorAherne 6 years ago +17

    Good overview. Also, having 8 cores won't necessarily speed up computation by exactly 8x; perhaps by 7x in practice.
    I just wish you would mention that processors use SSE2, AVX2, and similar instruction sets that allow each core to do 8 summations/multiplications/shifts/etc. at a time, rather than one by one. This lets a CPU's registers process arrays in chunks of 8. Many C/C++ programmers don't know about those, and build programs that are by default doomed to underperform.
    So I feel everyone is always unfair towards the CPU. Everybody points at the cores, but each core can (and should) use intrinsics, doing parallel work.
    This matters especially with various RNNs, where we only win by moving the entire algorithm to the GPU if that avoids data-transfer bottlenecks and the RNN is decently wide in each layer.
    Also, the CPU is really flexible when it comes to 'if/else' or while loops, reacting quickly and nimbly when a branch occurs.
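
    A quick way to see this vectorization point from Python: a minimal sketch, assuming NumPy is installed. NumPy's compiled reduction loop uses SIMD where the CPU supports it; exact timings vary by machine.

    ```python
    # Naive one-element-at-a-time loop vs. NumPy's vectorized sum.
    # The compiled NumPy loop processes the array in SIMD-width chunks
    # (SSE2/AVX2 on most x86 CPUs); the Python loop cannot.
    import time
    import numpy as np

    x = np.random.rand(10_000_000).astype(np.float32)

    start = time.perf_counter()
    total = 0.0
    for v in x:              # one element per iteration, no vectorization
        total += v
    loop_s = time.perf_counter() - start

    start = time.perf_counter()
    total_vec = float(x.sum())   # compiled, SIMD-accelerated reduction
    vec_s = time.perf_counter() - start

    print(f"loop: {loop_s:.2f}s   vectorized: {vec_s:.4f}s")
    ```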

    • @deeplizard
      @deeplizard  6 years ago +2

      Hey Igor - Thanks for adding these details. Great stuff. Much appreciated! 🙏

  • @bazzinga204
    @bazzinga204 3 years ago +2

    Beautifully explained.

  • @mayur9876
    @mayur9876 6 years ago +4

    Thanks for putting in all the effort.

    • @deeplizard
      @deeplizard  6 years ago +2

      You are welcome!

  • @taj-ulislam6902
    @taj-ulislam6902 9 months ago +1

    Very professional video. Good information.

  • @Ammothief41
    @Ammothief41 5 years ago +8

    Any idea how the RTX graphics cards and their tensor core stuff compare to the standard GTX GPUs? Is that something TensorFlow or PyTorch take advantage of?

    • @deeplizard
      @deeplizard  5 years ago

      Haven't seen the comparisons. And yes: www.geforce.com/hardware/technology/cuda/supported-gpus

  • @jamesferry8484
    @jamesferry8484 6 years ago +5

    Thank you for sharing. Very helpful.

    • @deeplizard
      @deeplizard  6 years ago +1

      Hey James - You are welcome!

  • @estebansevilla2229
    @estebansevilla2229 1 year ago +10

    I ended up here because my daughter is learning "AI" at high school, and now I need to understand how this all works to build her a PC.

    • @deeplizard
      @deeplizard  1 year ago +1

      😅😅

    • @SDRIFTERAbdlmounaim
      @SDRIFTERAbdlmounaim 4 months ago +1

      Did you build that PC just fine, or did you need further help?

  • @AdrianDucao
    @AdrianDucao 5 years ago +145

    My girlfriend and I have been doing a lot of deep learning lately

    • @osumanaaa9982
      @osumanaaa9982 4 years ago +73

      You sure it's not just shallow computations? :p

    • @q3d385
      @q3d385 4 years ago +5

      @@osumanaaa9982 🤣

    • @Aditya_Kumar_12_pass
      @Aditya_Kumar_12_pass 4 years ago +31

      I hope you are not hoping for any output.

    • @anamitrasingha6362
      @anamitrasingha6362 4 years ago +3

      @@Aditya_Kumar_12_pass YouTube should have a Haha react xD

    • @vibesnovibes6320
      @vibesnovibes6320 4 years ago +11

      How many layers for protection? Are you clear on how backpropagation is supposed to work? 😂😂😂

  • @durafshanjawad5250
    @durafshanjawad5250 4 years ago +1

    Very Helpful Series

  • @SW-ud1wt
    @SW-ud1wt 9 months ago

    I have an HP Envy Core i7 laptop with a GeForce RTX 2050 card. Can it be used for machine learning tasks?

  • @Not0rious7
    @Not0rious7 4 years ago +1

    Nice video, I like all the graphics you used. Where do you find them?

  • @kudoamv
    @kudoamv 6 years ago +3

    PLEASE HELP. Is it something like:
    "you have to download PyTorch with CUDA if you want to use the GPU, or else you will only be able to use the CPU"?
    I am an AMD user. (See the sketch at the end of this thread.)

    • @JasperHatilima
      @JasperHatilima 5 years ago +3

      CUDA is NVIDIA's platform, so it only supports NVIDIA cards like the GTX GPUs. For AMD, the framework for parallel programming is OpenCL... which unfortunately does not have a development community as big as the CUDA community.

    • @ankitaharwal5886
      @ankitaharwal5886 5 years ago +1

      I can feel your pain 😔 😔
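
    A minimal sketch of the pattern in question, assuming PyTorch is installed. The CUDA build of PyTorch uses an NVIDIA GPU when one is visible; on an AMD-only machine, torch.cuda.is_available() simply returns False and everything runs on the CPU.

    ```python
    # Pick the GPU when a CUDA-enabled PyTorch build sees an NVIDIA card,
    # otherwise fall back to the CPU.
    import torch

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    print(f"using: {device}")

    t = torch.rand(3, 3).to(device)  # the tensor lands on whichever device was picked
    print(t.device)
    ```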

  • @henson2k
    @henson2k 4 years ago

    When multiple apps are using CUDA, how is that managed by the GPU? Can the GPU execute different kernels at the same time?

  • @e4r281
    @e4r281 6 years ago +75

    Maybe if we start telling people the brain is an app they will start using it.

    • @clbl8706
      @clbl8706 4 years ago +2

      That's some cringeworthy joke my grandma would share on FB.

    • @e4r281
      @e4r281 4 years ago

      @@clbl8706 actually, it's not a joke.

    • @nikhilmathur3351
      @nikhilmathur3351 4 years ago +1

      @@e4r281 Unlike you

    • @hailongvan8285
      @hailongvan8285 4 years ago

      ok commie

  • @nozaxi0327
    @nozaxi0327 6 years ago +4

    Thank you. This series is so helpful for me

    • @deeplizard
      @deeplizard  6 years ago +1

      Glad to hear that, Jesse! You're welcome!

  • @mexzarkashiev2435
    @mexzarkashiev2435 5 years ago

    So which graphics card should I buy for deep learning?

  • @rifatmasud
    @rifatmasud 5 years ago +3

    Does anyone have that video link regarding "Python is slow"?

    • @deeplizard
      @deeplizard  5 years ago +2

      Added it to the description. Here you go: th-cam.com/video/DBVLcgq2Eg0/w-d-xo.html

    • @arkamitra4345
      @arkamitra4345 5 years ago +1

      GIL is evil 😓

  • @ajaykrishnan4277
    @ajaykrishnan4277 6 years ago +4

    you just nailed it

  • @True38
    @True38 5 years ago

    Did they remove all those stats/functions in a newer version of Cudo? Because I just recently downloaded it, and the only things I can see on the screen are CPU, XMRig, and h/s on the left and Payout Coin on the right. That's it! I'm using the CPU but want to use the GPU, and I can't see any option. Please help if you can.

  • @faizahmed8034
    @faizahmed8034 4 years ago

    Aren't GPUs used for image processing (i.e., converting binary data into graphics pixels)? If so, how can we use them for mathematical computations?

  • @jackt9535
    @jackt9535 2 years ago

    I'd like to know whether you could use a dedicated graphics card for deep learning when your CPU has no iGPU.
    This would help me a lot with my screen cable management issue (I'm new to this)!
    Thanks!

  • @adithi729
    @adithi729 4 years ago

    Sir, is there any way to do CUDA programming online? I mean, is there any online compiler available now? My system doesn't support CUDA... please help

  • @Henry..s
    @Henry..s 4 years ago +6

    What I learned from this video is that NVIDIA GPUs got their speed from the CEO

  • @RohanSawant-s3q
    @RohanSawant-s3q 8 months ago

    😄 Very nice video

  • @sqliu9489
    @sqliu9489 4 years ago

    brilliant explanation

  • @zeppybrawlstars3906
    @zeppybrawlstars3906 5 years ago +1

    So when I play games with advanced AI, I should make sure my GPU is ready

  • @jordan5253
    @jordan5253 5 years ago +3

    Jesus @2:41 I spit out my water lol

  • @anujsaboo7081
    @anujsaboo7081 5 years ago +1

    You have a mistake in the quiz section:
    Q. Different PyTorch components are written in different programming languages. PyTorch is written in all of the following programming languages except?
    Ans. Java (on the blog, the correct answer shows as Python)

    • @deeplizard
      @deeplizard  5 years ago +1

      Hey Anuj - Thank you so much for pointing this out! I've fixed it. You may need to clear your cache to see the change.
      Chris

  • @vibesnovibes6320
    @vibesnovibes6320 4 years ago +1

    Great video

  • @ezevictor4448
    @ezevictor4448 5 years ago

    Is a GTX 960M or 1050M worth using?

  • @krishna_o15
    @krishna_o15 4 years ago

    I agree, since Soumith said it.

  • @henson2k
    @henson2k 4 years ago

    At the research stage, I can see how Python is an acceptable choice. However, for production systems, Python is too large and not fast enough!

  • @qusayhamad7243
    @qusayhamad7243 4 years ago +1

    thank you

  • @MMphego
    @MMphego 5 years ago +1

    Subscribed.............

  • @robertatvitalitystar2444
    @robertatvitalitystar2444 6 years ago +4

    Ty! Peeling the plastic off a brand-new GPU is a good day, lol.

  • @benjaminhansen4808
    @benjaminhansen4808 4 months ago +3

    You explained this whole AI trend 5 YEARS AGO

  • @Priya_dancelover
    @Priya_dancelover 2 years ago +1

    excellent

  • @vky771
    @vky771 6 months ago +1

    Everyone wishes they saw your video 5 years back 😅

  • @magelauditore333
    @magelauditore333 4 years ago

    5:25 I was like wtf. Can anyone share the link to the whole video? 😂😂 Man, I got excited and started shouting at home

    • @deeplizard
      @deeplizard  4 years ago

      It's a popular one. Google and you will find 😂

  • @lancelotxavier9084
    @lancelotxavier9084 5 years ago +4

    NVIDIA is holding back processing power in order to make selling their products sustainable.

    • @deeplizard
      @deeplizard  5 years ago +5

      This is important. A company's incentive to make a profit can be a double-edged sword. Consider the same problem in healthcare or biotech.

    • @henson2k
      @henson2k 4 years ago

      @@deeplizard conflict of interests indeed, every software developer knows that LOL

  • @TheTariqibnziyad
    @TheTariqibnziyad 5 years ago +11

    Finally these nerds got the guts to give something a funny name: "embarrassingly parallel" xD

    • @deeplizard
      @deeplizard  5 years ago +3

      🤓

    • @zackrider3708
      @zackrider3708 3 years ago +1

      @@deeplizard Can the new Xe server GPU from Intel handle AI or deep learning workloads like an NVIDIA GPU?

  • @isbestlizard
    @isbestlizard 5 years ago +2

    OH MY GOD WOW are you a lizard too? I love AI and stuff as well :D

  • @kanui3618
    @kanui3618 5 years ago

    Can I combine an NVIDIA GTX 1070 or higher with an AMD Ryzen 5?

    • @deeplizard
      @deeplizard  5 years ago

      No AMD at the moment.

  • @sajolsajol8393
    @sajolsajol8393 1 year ago +1

    wow!

  • @richarda1630
    @richarda1630 3 years ago

    Exciting, and if you read the WBW blog, thrilling times

  • @alkeryn1700
    @alkeryn1700 5 years ago

    Guys, don't use CUDA; use HIP so it runs everywhere.
    Or use OpenCL or SYCL, but don't keep your software stuck on proprietary, platform-specific hardware and software.

  • @hedonismbot1508
    @hedonismbot1508 1 year ago

    Never in my lifetime would I have ever imagined "Embarrassingly [something]" would be an actual technical term.

  • @A-Predator
    @A-Predator 4 years ago +1

    I just learned the name of deep learning
    😵

  • @pscheuerling
    @pscheuerling 4 years ago

    I came here to learn how to utilize deep learning cores for training my own AI... I still don't know why I should buy cores I can't use.

    • @A-Predator
      @A-Predator 4 years ago +1

      😵

    • @lakeguy65616
      @lakeguy65616 2 years ago

      The forward pass relies on matrix math, which can be run through CUDA (a software layer) and executed on an NVIDIA GPU. The more GPU cores, the faster the process: a GPU with 100 cores will perform this step roughly 10x faster than a GPU with 10 cores (in general...).
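
      A minimal sketch of that idea, assuming a PyTorch install with CUDA. (In practice the speedup is not exactly proportional to core count; memory bandwidth and CPU-GPU transfer costs matter too.)

      ```python
      # Each element of a matrix product is an independent dot product,
      # so the whole operation maps cleanly onto thousands of GPU cores.
      import torch

      a = torch.rand(4096, 4096)
      b = torch.rand(4096, 4096)

      c_cpu = a @ b  # runs on the CPU

      if torch.cuda.is_available():
          a_gpu, b_gpu = a.cuda(), b.cuda()  # copy operands to GPU memory
          c_gpu = a_gpu @ b_gpu              # launches a CUDA matmul kernel
          torch.cuda.synchronize()           # kernels run async; wait for completion
          # Same math, same result, up to float rounding:
          print((c_cpu - c_gpu.cpu()).abs().max())
      ```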

  • @beomheelee1249
    @beomheelee1249 3 years ago +1

    You're such a genius!! Thanks

  • @SuperMIKevin
    @SuperMIKevin 1 year ago +1

    Lmao, I can't believe they actually named it embarrassingly parallel.

  • @avdhutjadhav5657
    @avdhutjadhav5657 2 years ago

    Does that mean a GPU with more CUDA cores is better for deep learning?

    • @lakeguy65616
      @lakeguy65616 2 years ago

      Yes (and more GPU RAM...)

  • @louerleseigneur4532
    @louerleseigneur4532 4 years ago +1

    Thank you, thank you

  • @Infinityxx3
    @Infinityxx3 6 years ago +1

    You need some music in the vid... attracts views... cool vid

    • @deeplizard
      @deeplizard  6 years ago

      What kind of music do you like?

    • @Infinityxx3
      @Infinityxx3 6 years ago +1

      @@deeplizard PewDiePie's "Bitch Lasagna"? Jk... any relaxing music when you talk... when you're changing camera shots, etc. ;)

    • @deeplizard
      @deeplizard  6 years ago

      Haha. That sent us on a tangent. Hadn't seen that before. 🤣

  • @DinoFancellu
    @DinoFancellu 4 years ago

    It's a pity that AMD doesn't seem to support CUDA. Their new Big Navi cards look really nice apart from that.

  • @alebecker12
    @alebecker12 10 months ago

    I have a question: does NVIDIA have a monopoly on such hardware? If not, does CUDA only work on NVIDIA hardware?

    • @deeplizard
      @deeplizard  10 months ago +1

      Yes. Nvidia built CUDA. It only works with their hardware.

  • @Joco5012
    @Joco5012 3 years ago

    Wow, was it this OK to do so much coke back in the day? 5:12 Damn dude, take it down a notch

  • @robertsmith512
    @robertsmith512 5 years ago +1

    SUBBED!
    EVERYBODY SUB THIS CHAN!
    THIS ONE KNOWS HIS STUFF!
    GO LOOK AT THE PLAYLIST LIB!

    • @deeplizard
      @deeplizard  5 years ago

      Thanks Robert! Note that there are two of us here. 🦎🦎

  • @invest8198
    @invest8198 3 months ago

    Anyone who bought $NVDA in 2018?

  • @MilesBellas
    @MilesBellas 1 year ago +3

    Jensen is better at 2x

  • @cubul32
    @cubul32 4 years ago

    Former CEO - and we can see why.

    • @deeplizard
      @deeplizard  4 years ago

      Although, he became a billionaire from his tenure at Microsoft 😄

  • @りょりょりょ-b6s
    @りょりょりょ-b6s 4 years ago

    5:57 Seeing Tokyo Tech (東工大) show up is hilarious lol

  • @MeowsyDancer
    @MeowsyDancer 3 years ago

    I will now send this to anyone who asks why I bought a 3090! RIP wallet tho

  • @iLevelTechnology
    @iLevelTechnology 7 months ago

    He doesn't explain tensors very well, but overall good job

  • @ErrorRaffyline0
    @ErrorRaffyline0 4 years ago

    I really dislike CUDA because it's not open source and AMD is not able to use it; it makes development for both AMD and NVIDIA much harder.

  • @_SupremeKing
    @_SupremeKing 1 year ago +1

    Although I'm still confused, I just picked up some new knowledge as a layman

  • @Drtsaga
    @Drtsaga 4 years ago

    "Deep learning" should not be in the title, in my opinion.

  • @jacobcorr337
    @jacobcorr337 3 years ago +2

    CUDA not explained at all

    • @lakeguy65616
      @lakeguy65616 2 years ago +1

      CUDA is a software layer that interfaces with NVIDIA GPUs, allowing certain problems (think: the forward pass) to be ported to the GPU and done in parallel. (Your PC has an NVIDIA GPU; with software like PyTorch, you tell the PC that CUDA is available and send certain processes to the GPU for parallel processing.) Vastly oversimplified.
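
      A minimal sketch of that flow, assuming PyTorch with CUDA; the layer sizes are arbitrary, for illustration only.

      ```python
      # "Sending work to the GPU" in PyTorch terms: move the weights once,
      # create the inputs on the same device, and the forward pass then
      # runs as CUDA kernels.
      import torch
      import torch.nn as nn

      device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

      model = nn.Linear(1024, 10).to(device)       # parameters now live on `device`
      batch = torch.rand(64, 1024, device=device)  # input created on the same device

      out = model(batch)                           # matmul + bias, in parallel on the GPU
      print(out.shape, out.device)
      ```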

  • @ronjon7942
    @ronjon7942 7 days ago

    You never, ever, ever have to show Ballmer making an ass of himself. Please.

  • @escapefelicity2913
    @escapefelicity2913 3 years ago

    I'm glad I don't need to listen to Jensen Huang.

  • @mmm-ie5ws
    @mmm-ie5ws 8 months ago

    You did a really bad job of explaining why GPUs are better for parallel computing than CPUs.