Looking at NVIDIA H100 with High End Exxact System

  • Published 28 Nov 2024

Comments • 23

  • @plumberski8854
    @plumberski8854 1 year ago +5

    Amazing. That is what I will be using as a common GPU when I come back in my 3rd generation. It was running Torch, from a display I saw.

  • @nicksterba
    @nicksterba 1 year ago +2

    Hey Jeff, really enjoying your channel! I realize that with your channel, you're typically talking about higher-end components and how different AI programs interact with each other and relevant hardware. I'm hoping to get myself an RTX 3060 before the end of the year, and I want to get into using AI for multiple purposes: generating subtitles in English & other languages, upscaling videos, restoring older videos, creating deepfakes, etc.
    Would you ever consider making videos on specific AI tools that one can use for these types of workloads? Maybe tutorials, like how to install and use these AI tools? I understand that that's not exactly what your channel is always about, but it'd be very cool if you ever made videos like this!
    Keep up the great work! Your channel is awesome and informative! 🖥😎

    • @HeatonResearch
      @HeatonResearch 1 year ago +3

      Thanks! The 3060 (especially the 12 GB one) is really nice. I added your suggestions to my idea list. Right now I am completely buried with the transition of the course to PyTorch, but hope to be caught up there in a few months.

    • @nicksterba
      @nicksterba 1 year ago

      @@HeatonResearch Understandable! In the meantime I'll keep watching your channel, again keep up the great work!

  • @TheReferrer72
    @TheReferrer72 1 year ago +3

    Can't wait till Nvidia has real competition, so we prosumers can run these models at home.
    I'm starting to think the market is being starved on purpose to sell high-margin products.

    • @HeatonResearch
      @HeatonResearch 1 year ago

      I would love to see this power at the edge in a few years. Price points are crazy.

    • @Biedropegaz
      @Biedropegaz 1 year ago

      I believe the electrical outlet in the wall won't support such power-hungry GPUs.

    • @TheReferrer72
      @TheReferrer72 1 year ago

      @@Biedropegaz Maybe in America, using outlets that have not been upgraded. We should be able to run 3 H100s no problem.
      Anyway, I will build a local Petals network after I have taken a course on how to prepare data for LoRA and fine-tuning.

  • @Biedropegaz
    @Biedropegaz 1 year ago +3

    Can you show some free LLMs running on it?

  • @dholzric1
    @dholzric1 1 year ago

    It seems like SDXL with the refiner should make it easier to use 2 GPUs, since you could have the refiner run on a second GPU. Maybe just set it up to monitor a directory for files and then automatically process them.
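    The directory-watch idea above can be sketched as a minimal polling loop. This is only an illustration: the `refine()` callback is a hypothetical stand-in for whatever wraps the SDXL refiner pipeline pinned to the second GPU, and all names here are made up for the sketch.

    ```python
    import time
    from pathlib import Path

    def watch_and_refine(watch_dir, out_dir, refine, poll_secs=1.0, max_polls=None):
        """Poll watch_dir for .png files and hand each new one to refine() exactly once.

        refine(src_path, dst_path) is expected to load src_path, run the refiner
        (e.g. on a pipeline moved to cuda:1), and write the result to dst_path.
        max_polls bounds the loop for testing; None means run forever.
        """
        seen = set()
        polls = 0
        while max_polls is None or polls < max_polls:
            for path in sorted(Path(watch_dir).glob("*.png")):
                if path not in seen:
                    seen.add(path)                      # remember so we process once
                    refine(path, Path(out_dir) / path.name)
            polls += 1
            time.sleep(poll_secs)                       # avoid busy-waiting
        return seen
    ```

    In practice the base model would write finished images into `watch_dir` from the first GPU while this loop feeds the refiner on the second, which is roughly the split the comment describes.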

  • @Nimitz_oceo
    @Nimitz_oceo 8 months ago

    I understand these GPUs cost an arm and a leg; however, the Tesla K80 with 24 GB of RAM now costs less than $100. If I decided to go crazy, I could easily build a cluster with 5 Teslas and a water-cooling system. What I don't understand is how fast these data center GPUs depreciate in value; it's kind of unfair.

    • @amjadpanhwar3320
      @amjadpanhwar3320 8 months ago

      It is because of speed... no one wants to use a slow service. Data centers care about many things: speed, time, money, electric bills, efficiency, processing of data, ease of use, ease of replacement, and compatibility between old and new systems until the old ones phase out. Hope you get the picture. It's not about the hardware, it's about money and how much they can save. An old machine may be useless to them, but elsewhere it's still in use; poor countries still use these things for a reason, so not all of it goes to waste.

    • @Sickoloko24
      @Sickoloko24 several months ago

      The H100 96 GB is 346 times faster than the K80 in AI workloads, so matching one would take 346 K80s; that's 100,000+ watts compared to the H100's 350 watts. The K80s would cost about $150k in electricity alone over a year, while the H100 would cost only about $500, so at the end of the day the H100 is cheaper. It makes sense why it's priced that way.
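      As a rough sanity check of the commenter's arithmetic, here is the electricity cost worked out in a few lines. The per-K80 draw (~300 W) and the $0.15/kWh rate are my assumptions, not figures from the thread; the 346x speedup and 350 W H100 figure are the commenter's.

      ```python
      HOURS_PER_YEAR = 24 * 365  # 8,760 hours

      def annual_power_cost(watts, usd_per_kwh=0.15):
          """Cost of running a load 24/7 for one year at a flat electricity rate."""
          kwh = watts / 1000 * HOURS_PER_YEAR
          return kwh * usd_per_kwh

      # Commenter's scenario: 346 K80s (assumed ~300 W each) vs one 350 W H100.
      k80_cluster_cost = annual_power_cost(346 * 300)  # roughly $136k per year
      h100_cost = annual_power_cost(350)               # roughly $460 per year
      ```

      At those assumed numbers the cluster lands in the ballpark of the comment's "$150k vs $500" claim, so the order-of-magnitude argument holds.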

  • @samvarnell7405
    @samvarnell7405 1 year ago

    Amazing video! I love watching your content and learning about how the future will be made.
    Do you have any thoughts about the new 4060 Ti 16 GB for deep learning and AI workloads?

    • @plumberski8854
      @plumberski8854 1 year ago +1

      The 4060 Ti 16 GB was what I wanted to get. Got a 4070 instead for my AI programming hobby and photo/video editing. So far very happy with my Gigabyte Eagle 4070.

  • @carterjohnson25
    @carterjohnson25 10 months ago +1

    How many FPS in Counter-Strike?

  • @ScienceIsGreat-laboratory
    @ScienceIsGreat-laboratory 9 months ago

    ur personal datacenter

  • @PeteDunes
    @PeteDunes 1 year ago

    Ok, but can it play Crysis?

  • @Carlin2810
    @Carlin2810 1 year ago

    But can it run Crysis?

  • @rtz549
    @rtz549 2 months ago

    Put some DOS Doom on it.

  • @parthwagh3607
    @parthwagh3607 1 day ago

    I tried the same Stable Diffusion settings. My 4090 is way faster than this.

  • @CharlesM236
    @CharlesM236 1 year ago

    Until we fix the legal, spiritual and immoral foundations of society, no technical intervention, no matter how useful or clever, will save us.

  • @acryptoq6868
    @acryptoq6868 7 months ago

    This won't touch my 3070 lmao😂😂😂