The A6000 is amazing. The best most of us can hope for is a pair of 3090s and even that is silly expensive.
So happy for you. Your excitement is contagious 😂
Thanks a lot, really appreciate it! :-)
Bro you literally just got a 10K Nvidia GPU and called it a video lol
🎯 Key Takeaways for quick navigation:
00:00 🖥️ *Building a $10,000 deep learning workstation involves careful planning and research.*
01:36 🛠️ *PCPartPicker is a valuable tool for organizing components, prices, and compatibility when building a deep learning workstation.*
02:17 🎮 *GPU selection is crucial; consider factors like tensor cores, memory bandwidth, and GPU memory for optimal deep learning performance.*
05:14 💻 *Collaboration with Nvidia can provide high-end GPUs with ample VRAM for deep learning tasks.*
08:02 🧠 *CPU choice should consider having at least four CPU cores per GPU and sufficient PCI lanes for GPU connections.*
09:12 💾 *RAM should match the largest GPU's VRAM and can be added later if needed.*
09:54 🖥️ *Ensure motherboard compatibility with CPU, GPU, RAM, and other components.*
11:14 🧰 *Consider using NVMe SSDs for OS and data storage, while HDDs or SSDs can be used for data sets.*
13:19 ⚡ *Ensure your PSU has adequate wattage based on your components, and choose a case with sufficient space and good airflow.*
16:05 🔄 *Double-check if any CPU cooler components need to be installed on the back of the motherboard before mounting it in the case.*
19:45 🚀 *Use Lambda Stack to easily install all necessary GPU drivers and dependencies for AI model training.*
21:54 🤖 *Verify GPU and PyTorch installation to ensure your deep learning workstation is ready for AI tasks.*
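The last takeaway (verifying the GPU and PyTorch installation) can be sketched like this. `describe_device` and `check_gpu` are hypothetical helper names of my own, and the sketch assumes a CUDA-enabled PyTorch build is installed (e.g. via Lambda Stack):

```python
def describe_device(cuda_available, device_name=None):
    """Summarize GPU visibility as a one-line status string."""
    if cuda_available and device_name:
        return f"CUDA ready: {device_name}"
    return "CUDA not available: check your driver and PyTorch build"

def check_gpu():
    """Run this on the workstation after installing drivers."""
    import torch  # assumes a CUDA-enabled PyTorch build
    available = torch.cuda.is_available()
    name = torch.cuda.get_device_name(0) if available else None
    print(describe_device(available, name))
```

On a correctly set-up machine, calling `check_gpu()` should report the GPU by name.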
Love that! :-) What tool did you use?
Excellent video Martin, as always and it's great to see you back. :)
Glad you enjoyed it! Yes, definitely feels good to publish a new video after a long time :-)
Cool! That's interesting. I plan to build one like this to learn fine-tuning of LLM models for my own project.
Happy to see how excited you are ;)
Found a glitch today. I learned that at around $1,148.66 you can buy TWO RTX 4060 Ti OCs and get 32GB of VRAM for less than half of what a 4090 costs. Now that I've seen this, I can't help but think that at $1,750 to $2,500, Nvidia has been ripping off people building deep learning/AI servers. Oh, and the 4060s only need ONE 8-pin power cable each, so running both of them on a smaller power supply should be no problem if you're just upgrading your current system.
Just ordered two A6000 Ada cards for my upcoming Ubuntu deep learning workstation for my startup, pairing them with 128GB of Corsair Vengeance DDR5 6000MHz, an AMD Ryzen 9 9950X Zen 5 CPU, and 4x 8TB Sabrent NVMe drives, which I'll set up in RAID 10 so I get around 15TB of storage with every drive mirrored for redundancy. Can't wait to start building it soon! What's your opinion on the Noctua A12x25? I got 6 of them to put inside the NZXT H5 Flow case, and I've never owned Noctua fans before, just simple Corsair ones.
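For reference, the RAID 10 capacity math the comment above describes can be sketched like this (a rough estimate of my own; real usable space ends up a bit lower after filesystem overhead, which is roughly where the "~15TB" figure comes from):

```python
def raid10_usable_tb(num_drives, drive_tb):
    """RAID 10 stripes across mirrored pairs, so usable capacity
    is half the raw total. Requires an even drive count of at least 4."""
    if num_drives < 4 or num_drives % 2 != 0:
        raise ValueError("RAID 10 needs an even number of drives, at least 4")
    return num_drives * drive_tb / 2

# 4 x 8TB drives -> 16TB raw usable before filesystem overhead.
print(raid10_usable_tb(4, 8))
```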
Super Cool!! Congratulations!!!
Thanks!!
You probably could have gotten everything but the GPU for less than $1k, already assembled, if you had chosen a second-hand Lenovo, Dell, or HP workstation. A two-year-old refurbished ThinkStation 940 can be had with 2 CPUs, 512GB of RAM, and 3 x16 PCIe slots, and these machines are already configured to support high-end GPUs. Consumer hardware is fine if that's all you can get, but a proper workstation can't be beat on price or performance. A $10k budget can buy a lot more GPU, and you'll be putting it in a motherboard that won't lose its solder under the heat of a few GPUs.
Great video, I'm jealous of that setup! :) By the way, please don't forget to remove the blue protective sticker from the GPU; it's meant to be removed.
In terms of GPUs for making art only, any RTX card is usable, because Nvidia finally updated the driver so the process can be offloaded from the GPU to system RAM (Windows only, as I remember; Stable Diffusion in AUTOMATIC1111).
Anyway, ordinary people can still run many of the modern open models released for consumers every day on CPU only via the GGUF format. Better to have any multi-core CPU, like cheap Xeons from AliExpress, and 128GB of RAM (less RAM = less choice; if I remember correctly, Llama 2 70B GGUF at 8-bit uses about 80GB of RAM).
Also, instead of Ubuntu I would use Zorin OS, which is like Ubuntu but with Flatpaks.
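The ~80GB figure mentioned above is consistent with a common back-of-the-envelope estimate (my own sketch, not an official formula): RAM needed is roughly parameter count times bytes per weight, padded for the KV cache and runtime buffers:

```python
def estimate_gguf_ram_gb(params_billions, bits_per_weight, overhead=1.15):
    """Rough RAM estimate for running a GGUF model on CPU:
    weights take params * bits/8 GB, padded ~15% for the KV cache
    and runtime buffers. A crude heuristic, not a guarantee."""
    weight_gb = params_billions * bits_per_weight / 8
    return weight_gb * overhead

# Llama 2 70B at 8-bit: ~70GB of weights, ~80GB with overhead.
print(round(estimate_gguf_ram_gb(70, 8)))
```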
When it comes to the motherboards for those Xeons, the chipsets are used H61 chipsets taken from old Dell desktops, so not the best quality. Also, their software is basically made by the Chinese and Russian gaming communities who want cheap platforms for gaming. You are better off with an entry-level Ryzen CPU and motherboard that have a warranty and software made by billion-dollar corporations.
@@emanuelsavage-op5mm That is really strange argumentation, with some notes of racism even. There's a huge choice of Xeon boards on the market; if you have the money, just get Supermicro boards. Offering AMD, which can't do CUDA, is like offering salt instead of sugar.
@@fontenbleau Your racism card backfired, because I'm half Russian myself. All I said is true: those Chinese boards are made with materials from used cheap Dell desktop motherboards, with software mostly supported by the community. The latter part is absolute nonsense, and AMD is not as good as Intel when it comes to software support on Linux, but if we are on a budget we have to make sacrifices; I would rather use an AMD Ryzen than some old server architecture on a reclaimed Dell motherboard.
How do you offload to system RAM? Which frameworks support it: PyTorch, TF, etc.?
@@angelg3986 On Windows it's supported globally by the driver (in the Nvidia Control Panel it's called System Memory Fallback); on Linux it depends entirely on the software, which in my case is oobabooga and similar.
Hope to build my own DL workstation one day as well.
I have to get my hands on this UNO Flip
Dude, how did you convince Nvidia to send you this gem? 😅 I need one too 😂
Hey, how do you start a collaboration with Nvidia, and why did Nvidia give you an RTX 6000 GPU? Could you please tell me the process?
For the performance per dollar, it would be good to see your price reference for each GPU. Was it the original MSRP or the second-hand price at the time of the article? I am wondering whether to get another second-hand RTX 3090 or an RTX 4070.
Instead of a new station, how do you feel about an existing system (laptop) with an external video card? That way it works with the laptop easily and the GPU is upgradable.
Bro, how do you start a collaboration with Nvidia, and why did Nvidia give you an RTX 6000 GPU? If you make a video on this, I think it will add value to your channel.
Great video! I am really confused about which package and model of GPU and CPU to buy. I have seen this recommended package:
AMD Ryzen 7 7800X3D Processor

Deepcool AK400 DIGITAL - AFTERSHOCK Edition

Gigabyte B650M Gaming Wifi

Gigabyte RTX 4090 Windforce V2 - 24GB

32GB Team T-Force Delta RGB 6000MHz (2x16GB)

2TB Lexar NQ710 Gen4 SSD

850W Deepcool 80+ Gold ATX3.0 (ZC850D)

AX Wireless + Bluetooth Included
Is this package powerful and fast enough to run the majority of AI applications and video rendering? Does the number of cores and threads in the CPU affect the performance and speed of AI workloads? Should I buy an Intel or AMD processor?
I would like to hear your recommendation 😍
This is a 10-month-old video. Just in case you didn't realize it during the last 10 months, there is a design flaw in your assembly: the hot air that comes out of the CPU fan will be sucked into your GPU. Please verify this.
You can get a much better machine for less than $10k. Basically, get 192GB of DDR5 RAM and 2x 4090 or 2x 3090. You'll even get better performance with two of these GPUs than with a single RTX 6000, despite both setups totaling 48GB of VRAM.
Don't forget they need to be in a single node if you mean to do any serious work with them.
Could you make a post about what you wrote here so that others can refer to it?
@@sokhibtukhtaev9693 Anything you have questions about, on what they said?
Is there some cheaper variant? $10k just for deep learning is way too much 🙂 Also, I realized that if I buy 2-3 GPUs, the power supply should be 1000+ watts 😢 BTW, what about AMD graphics cards, can't they be used?
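The 1000+ watt estimate above matches a common rule of thumb (my own sketch, not an official sizing guide): sum the component TDPs and add roughly 40% headroom for transient power spikes.

```python
def recommended_psu_watts(gpu_tdps, cpu_tdp, other=150, headroom=1.4):
    """Rough PSU sizing: GPU TDPs + CPU TDP + ~150W for the rest
    of the system, padded with ~40% headroom for transients.
    A heuristic only; check the GPU vendor's PSU recommendation."""
    total = sum(gpu_tdps) + cpu_tdp + other
    return round(total * headroom)

# Two ~350W GPUs plus a 170W CPU lands well above 1000W.
print(recommended_psu_watts([350, 350], 170))
```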
I was using a cheap motherboard and was getting errors; training wouldn't finish. You'd think I would learn, but it took me two builds. AI needs a high-end motherboard.
Just recreate AlphaGo and MuZero and train those on your PC.
Anyone know how he got Nvidia to collab?
I guess $10k is one month's payment from your YouTube monetization.
Haha I wish my friend
How to build $1000 Deep Learning Workstation?
TLDR: have lots of money, buy the best cards, and build a PC
WAT 😮
Here is my personal laptop... $10,000... Are you Korean or Ukrainian?