I'm only a minute in and had to comment something; very clever doing this in the kitchen so that you can use the waste-heat to do some cooking after it's built! Haha, I jest of course. Really excited to see how this turns out with the two GPUs installed!
Edit: made it to the end. Total success, congrats! Very surprised at thermals on such a compact build (especially a dual 3090 build). Really amazed.
Thanks for watching to the end haha! I was surprised by how cool the 3090s ran too lol considering they pull 400W each overclocked. I will have more videos on this build and I already made some new changes to it to improve the cooling.
@@OTechnology I am really looking forward to that. I just got into building a deep learning rig myself, so your video really hit home. Yours is considerably better than mine already, haha, with twice the RAM, and the dual GPUs. Mine's 128 GB system RAM, and half the VRAM (it's a single 4090).
Anyway, again, great job. Really impressed and amazed by your build's compactness and impressive thermals. It really surprised me.
NVLink significantly improves performance for AI training
And for text generation?
100%
Will you do a 4070 Super, 4070 Ti Super, 4080 Super buying guide? Please
Buying guide for the 4070: Don't buy the 4070... It's e-waste.
Step 1: have lots of money to buy the best GPU with tons of VRAM.
Sir, buy the 4070 Ti Super if you're after a 4070!
The cameraman is Tony Soprano.
Just found your channel. Excellent Content. Another sub for you sir!
Awesome, awesome, awesome! Thank you for this valuable video!!! Please, if you can, do the NVLink RTX Titan build as well and compare the two in t/s on 70B LLMs, or in training them. It'll be awesome.
The ROG RAMPAGE V EDITION 10 only supports 128GB of RAM and has only PCIe 3.0.
This will only be semi-portable. The problem with a normal layout like this is that either the PCIe slot or the GPU can crack if it's moved around too much. That's why an ITX build is better for portability: most of those cases mount the GPU in a separate compartment connected with a riser.
It needs careful hand-carrying.
Very nice build, and you articulate it very well.
However, this is the wrong platform for leveraging those RTX 3090s. I would have gone for the ASUS WS C422 PRO/SE motherboard with onboard TPM (for Win 11), hosting Socket 2066 with a Xeon W-2200 series CPU. For AI and machine learning it is helpful to have the AVX-512 instruction set, which is not available on the X99 platform. As well, to run Windows 11 with stability you need to purchase a TPM module, with no guarantee that an aftermarket module will work.
Prices for Socket 2066 motherboards and a W-2200 series CPU are only a few hundred dollars more than the X99 platform. AMD's Threadripper is also an alternative, but the AVX-512 instruction set was not available until the TR 7000 series processors, and those are prohibitively expensive for the enthusiast consumer. Even the 5000-series Threadrippers are still overpriced, although both those AMD platforms offer PCIe 4.0 slots for your 3090s. Choices and trade-offs; oh, we live in the best of times. Cheers.
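For anyone unsure whether the CPU they already own has AVX-512 before spending on a new platform, here's a quick sketch that reads the feature flags from /proc/cpuinfo (Linux-only; the helper name is made up for illustration):

```python
def has_avx512(cpuinfo_path="/proc/cpuinfo"):
    """Return True if any AVX-512 feature flag is present (Linux only).

    AVX-512 is reported as a family of flags (avx512f, avx512bw, ...),
    so we match any flag beginning with 'avx512'.
    """
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    return any(flag.startswith("avx512") for flag in line.split())
    except OSError:
        pass  # not Linux, or file unreadable
    return False

print("AVX-512 available:", has_avx512())
```

On Windows, a library like `py-cpuinfo` or the CPUID instruction would be the equivalent check.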
100% 👏🏻🔥🤝🏻
So many months have passed, dude, where were you? I actually searched for your channel just now, at 12:55 AM in India.
Hot air from the GPUs will be sucked back into the case, onto the CPU and PSU. The PSU pulls hot air from inside the case, so it should be installed upside down. Not to mention that with the two GPUs that close together, the top one is already ingesting hot air from the bottom one. At least keep the case open, but people don't do that at home :) The system will work, but it will throttle heavily.
Great assembly for the moment, but please change the direction of the CPU fan. The CPU fan should pull air in across the cooler, not blow toward the motherboard's rear I/O devices.
I wish you linked the parts you used
Why did you not connect your 3090s with NVLink again?
Also, out of curiosity, what are the use cases of a "portable" ML PC? Why not just leave your server back home and SSH into it from wherever you are?!
16:35
It's been a long time since I last stopped by here. Looks like all the content is fully in English now 😅
Very Nice! This build is awesome and looks incredible!
Hey, I'm trying to build a similar setup from scratch. If you could provide the models of the parts, especially the case, that would be a great help to me. I'm a beginner when it comes to building PCs, so you know.
5:00 If I could get a 3090 FE for $750, even used but in that good of condition, heck yeah I would buy it!
29:58 I hope you're not powering that through that multi-plug 😂
Hi, cool video… Can you please send me the exact specification, or a link, for the 256 GB of DDR4 B-die RAM you chose for your build? 😃 Thank you!!!
Are you using non-ECC or ECC RAM? As far as I can read on the webpage, the motherboard is a non-ECC board.
@@detwog1677 Most, if not all, brand-name X99 motherboards can use both ECC and non-ECC memory when paired with a Xeon E5-1600 v3/v4 or E5-2600 v3/v4 series CPU.
Would be great to see some fine-tuning on this setup!
28:47 I'd been WAITING to see if you were going to mention that you switched out the PSU.
Hi, in the final assembly I didn't see you use the 4-slot NVLink? So what do you think: is it faster using NVLink or not?
Great stuff 🎉
Are 2x 3090s faster than one single 4090? How about 4x 3090s? What case would you use to run 4x 3090s?
Can you post a new video showing us how you use this server? How to install the software, and what the performance is, including running Llama 3.1. Thanks. I am planning to build a similar PC, but the motherboard and CPU can only be bought second-hand.
love your build
I have an i9-12950HX in my workstation laptop, and I am not currently doing anything huge with it. My daily tasks are 3-4 Chrome tabs like YouTube, online courses, and Netflix.
I have my maximum CPU performance set to 30%. WILL THIS AFFECT MY LAPTOP IN THE FUTURE?????
no
Good video! No comment on cables 😊
Thanks Anthony! Haha no cable management space in there...
@@OTechnology Yeah, don't blame you :) Hope you are well.
Can this CPU/motherboard combo run both 3090s at PCIe 4.0 x16? Thanks
I want to build a system with two 3090/3090 Tis in NVLink mode on an AM4-socket, X570-chipset, PCIe 4.0 board 😁
Could you please put the build & parts list in the description?
You didn't install any storage, so how do you boot it up?
What OS are you using?
He installed an NVMe drive for storage
No liquid cooling? Does the motherboard have a built-in BMC?
Hi, can you please do a 3060 and 3060 Ti buying guide? I don't know which version of the cards to buy. Thx
The 3060 is better than the 3060 Ti for AI; VRAM is important.
Instead of LLMs, did you ever try large image models (LIMs)?
Great video. I am in the middle of my first HEDT build, with Gigabyte's AERO TRX50 and a 7980X, and trying to watch as many videos as I can, hoping to find one on connecting the two GPUs with an NVLink bridge so it will have 48 GB of VRAM.
It won't be combined into a 48 GB pool. NVLink just makes memory transfers between the cards faster.
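To put rough numbers on that: two 24 GB cards let you split a model's weights, but each card still only addresses its own 24 GB; there is no shared pool. A back-of-the-envelope sketch in Python (illustrative numbers, not measurements):

```python
def fits(model_gb, cards_vram_gb):
    """Check whether an even split of model weights fits card by card.

    Each shard must fit in that card's own VRAM -- NVLink does not
    merge the cards into one big address space.
    """
    shard = model_gb / len(cards_vram_gb)
    return all(shard <= vram for vram in cards_vram_gb)

two_3090s = [24, 24]        # GB of VRAM per card
print(fits(40, two_3090s))  # True: each 20 GB shard fits in 24 GB
print(fits(30, [24]))       # False: one card alone can't hold 30 GB
print(fits(60, two_3090s))  # False: 30 GB shards exceed 24 GB per card
```

So the dual-card setup helps exactly when the model can be split into shards that each fit per card, which is what frameworks' model-parallel / "device map" features do.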
Yeah, even when running two or three cards in an SLI configuration, for gaming or anything else for that matter, each GPU still needs its own VRAM; it doesn't really combine them the way you're describing, like OTech explained. I didn't even realize it made memory transfers faster, but then again that totally makes sense. I kind of wish they made a way to daisy-chain GPUs that was effective and efficient, so that you'd be doubling your GPU performance, but unfortunately it doesn't work that way from my understanding. If it did, though, it would be awesome: imagine getting two super cheap 1080 Tis and getting almost 3090-like performance from them. Unfortunately I don't think it's possible unless someone developed a new way of using SLI, and even then I don't know if it would be possible.
@@SpreadingKnowledgeVlogs Yeah, that is what I was shooting for: two 3090s to perform like an A6000 and save a bunch of money. I just finished the PC this morning, for the most part, and it's going to dual-boot Windows 11 and Linux (Mint or Ubuntu... haven't decided yet). A friend of mine is into computer vision and machine learning, and I thought I would make a machine geared toward that and deep learning. Bit of a learning curve vs. just building gaming PCs for customers.
@garyc5245 I don't think you will see much benefit from the two GPUs vs. just one; I've watched many videos on it and it doesn't work that well. What would be interesting is if someone made a program where one GPU does all the rendering while the other GPU acts as an accelerator or upscaler of some sort, to make the final image better quality and smoother running. Problem is, you would likely have to do all the R&D yourself, and it's likely not easy work. If you're using this computer mostly for gaming, I would urge you to look into SteamOS; it's basically Linux with Steam's skin on it, and it runs games way better than Windows is what I hear, or I should say it runs games more efficiently, which results in better just about everything.
@@SpreadingKnowledgeVlogs Thanks for the food for thought. It will not be a gaming PC... this is just a very expensive experiment, and it would be great if this were the foundation PC for someone in that line of work. However, I am signed up for some classes to get a little smarter on the topic.
Damn, you really spent it all on this machine
Where'd you go
Either the camera person has asthma or smokes cigarettes because I can hear them breathing even when the builder is speaking.
How's this running? This seems like the perfect sweet spot for Llama 3 70B.
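Rough sizing on why 2x 24 GB is indeed a sweet spot for a 70B model: a quick weights-only estimate (KV cache and framework overhead are extra, so treat these as optimistic ballpark figures):

```python
def weights_gb(params_billion, bits):
    """Approximate weight memory: 1B params at 8 bits is roughly 1 GB."""
    return params_billion * bits / 8

TOTAL_VRAM_GB = 48  # two 24 GB RTX 3090s
for bits in (16, 8, 4):
    need = weights_gb(70, bits)
    verdict = "fits" if need <= TOTAL_VRAM_GB else "does not fit"
    print(f"70B @ {bits}-bit: ~{need:.0f} GB of weights -> {verdict}")
```

So a 70B model only fits at roughly 4-bit quantization (about 35 GB of weights), which leaves some headroom for the KV cache; 8-bit and FP16 are out of reach on 48 GB.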
16:00 The next thing I'm noting is: did you work out how you are going to route your cables, especially to the GPUs, in particular the 2nd one?
Q: You used the EVGA 1600 PSU for most of the video, then changed to a Corsair 1200. Any specific reason?
Apart from the motherboard you used in the video, what other motherboards on eBay do you recommend? Regards
What type of CPU do you have? 🙄
Pretty decent $2500 LLM workstation!
I think everyone is wondering why you pulled it out of the oven lol
I'm considering an 8x 4090 build for AI on the GENOAD8X-2T/BCM motherboard, with PCIe extenders, inside a Thermaltake W200. Please tell me it's a bad idea before I make this mistake 🤣
Do it
Keep it up, bro!
Has having the extra 3090 come in handy yet?
Why would portability be a consideration? Just use a VPN if you need to do anything from the top of a mountain.
Which motherboard did you use? I didn’t get the full name
You should run 3DMark with the 3090s in SLI when you get the bridge 😂😂 I bet they are faster than a 4090.
Hmm, a single 4090 would outperform it in AI/ML workloads (unless NVLink scales with the VRAM) while being vastly more energy-efficient and lower-latency, with all the compute units on one die.
I need more VRAM than anything, which is why it's 2x 3090 vs. a single 4090.
Lol yea I will do a 3DMark run for sure. Although the CPU single core will really bottleneck it.
@@OTechnology but 3090s aren't AI gpus.
@@CaptainScorpio24 NOPE! It's made for games and has been kneecapped accordingly.
In fact, a Pascal P40 can come close to its inference speed. The P40 is eight years old, but the difference is that it was BUILT for early AI training. Buy an older server GPU that's built for AI, like an H100 or an A100... These 30- and 40-series gaming-class cards aren't the best silicon, first of all; they're overworked and impure dies. Good for proof of concept or some playing around, or just driving into the ground playing games, knowing you will replace it in a year or two (disposable). But they aren't REAL AI gear.
Are those spots for 60mm fans ?
Out of curiosity: without the NVLink, does Windows see the GPUs as having 48 GB of VRAM in Task Manager, or is it similar to SLI, where regardless of data throughput it only sees one GPU's worth of VRAM (24 GB)? Thanks for the video!
NVLink isn't supported by Win 10 😢
@KirillPodcast thank you
Hey, what were the memory temperatures while training?
Key point
Can we have our own AI, and feed it info to help us live life?
😂😂
AI will feed humans and hunt humans like deer 🦌 because it can 😂
@@sithounetsith9877 Or, it will feed deer, and hunt humans ....
Nice, do a blender test render
How much noise does this setup make at full power usage?
Let's have morning breakfast 😂.. the curry will be a high-end CPU build; the ingredients are an RTX 3090, motherboard, RAM, etc., etc. 😅
Where's the nvlink?
Buying guide for the 4070super line? :D
How come you put a picture in the thumbnail with the two GPUs connected via an SLI or NVLink bridge, but in the video you didn't go over that aspect at all? That was the whole reason I clicked on the video. I was hoping you were testing SLI and gaming on a newer card, since not many people have tested it, and those that have say it's sort of useless.
It's completely useless for gaming; there are no games that can run with multi-GPU anymore. I am using it for training AI models, where it needs to communicate between the GPUs. I only got the NVLink after making the video, so unfortunately it's not in the video lol.
@OTechnology That's not true: all the old games that supported SLI still support it, as well as a small handful of more current titles, although it's not well optimized, nor does it help much, since each GPU has to render individually; it makes very little sense performance-wise, but in some cases it does work well. It's just that there are only 3 or 4 games where it works well, and I've only seen it done with GTX GPUs; that's why I'd love to see an RTX try it. How does the NVLink help with AI calculations? Sorry, but I know very little about AI, though it fascinates me. Wouldn't an AI-specific GPU be a better option here, due to the cores being more centered around AI tasks vs. the RTX, which has a mixture of cores for different tasks?
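On how NVLink helps with AI: in data-parallel training, each GPU computes gradients on its own slice of the batch, then all GPUs average them every step (an "all-reduce"), and NVLink accelerates exactly that inter-GPU traffic. A toy pure-Python sketch of the averaging step, with plain lists standing in for per-GPU gradient tensors:

```python
def all_reduce_mean(grads_per_gpu):
    """Average gradients elementwise across 'GPUs' (here, plain lists).

    Real frameworks do this over NVLink/PCIe, e.g. via NCCL's AllReduce.
    """
    n = len(grads_per_gpu)
    return [sum(vals) / n for vals in zip(*grads_per_gpu)]

gpu0 = [0.2, 0.4, 0.6]   # gradients from GPU 0's half of the batch
gpu1 = [0.4, 0.2, 0.0]   # gradients from GPU 1's half
avg = all_reduce_mean([gpu0, gpu1])
print(avg)  # every GPU then applies the same averaged update
```

Because this exchange happens every training step over the whole gradient (which can be gigabytes for large models), a faster GPU-to-GPU link directly shortens each step.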
You can donate the parts you remove to an intern who is willing to work with LLMs (like me 😂😂🙌🙌💖)
Is there any way to make them work together on the same project?
Like Stable Diffusion, enabling them both for more VRAM?
Yep
Which RTX 4060 should I buy now, pls?
what case?
And what AI/ML software and models exactly are you running? I see a lot of these builds where the workload is not worth running (bad and outdated models), and what the person who built them "learns" about AI teaches them almost nothing useful.
Will it run Crysis?
Why so much RAM? Are you using the CPU for some layers?
please do a 4070 ti super buying guide :)
Will you do a 4070 super buying guide? 🥵
11:32 You keep praising this case, but man, IT IS SO WEIRD TO ME. haha
Will it game?
How much does it weigh ?
It's very heavy 😅
Where could you possibly get an RTX 3090 for under $800? Oh wait, could it be they fell off the back of a truck? 🤣
Pls make a 4070 super buying guide
Waiting for your 4070 Super, 4070 Ti Super, 4080 Super buying guides, bossman :)
Hello! Are you making a tier list of 4070 super?
15:13 So you did note all of that prior to getting the parts, and STILL went ahead and tried to install the GPUs first. SMH.
NEED A 7800 XT BUY-OR-AVOID VIDEO
14:34 SMH, you don't seem like a person who has done a lot of builds like you've said. Why in the world would you install both GPUs first when you knew how many NVMe drives you were going to use, and should have taken note of where the slot(s) you were going to use are? Your level of excitement has either affected your judgment or exposed your lack of experience, or both.
19:08 WHAT ARE YOU DOING, SHOWING PEOPLE AND ACTING LIKE IT'S OK TO BE PULLING AIR FROM INSIDE THE CASE TO COOL THE PSU!!! You should be banned from YouTube for this!!! And it looks like cables may be an issue with those fans, though they are set to exhaust, BUT YOU NEED TO BE PUSHING HOT AIR OUT WITHOUT ISSUES!! 19:15 THEN you proceed to tell people it is fine because the PSU fan barely runs!! IT DOESN'T MATTER!!! If the PSU needs its fan to operate for any reason, it should be able to do so at an optimal level!!
I clicked this because I knew this was just a ridiculous build. Most games are unstable with dual GPUs. I sure hope this guy is using this to CREATE video games rather than playing them with this build.
EDIT: ok cool its his work station. no longer shaking my head in disappointment.
What an idiot