He probably has a lot of monitors and needs lots of HDMI outputs. I'd say bitcoin.
Yup
You'd have a whopping total of 2 HDMI outputs. Professionals and gamers who actually want good monitor performance use DisplayPort
@@WCGwkf if you are doing stock trading or Bitcoin, monitor performance doesn't matter, as you only need to display information. No 144Hz needed. Normally you also get fewer DisplayPorts than HDMI ports.
@VRGYF00TBALL is that right? What vast list of cards have more hdmi than displayport? I'll wait.
@@WCGwkf sorry my bad, but you still don't need good monitor performance for Bitcoin.
Bro is about to render the meaning of life _💀💀💀_
Why are your skulls italicized
@@GangityBirdity Use _ before and after anything you write and you can type _like this_. It might not work if there are spaces
They're super fancy. @@GangityBirdity
I wanna know too 💀💀💀@@GangityBirdity
Bros gonna render my dad
Bro being ready for gta 6
Repent and believe in the gospel of Jesus Christ and you will be saved 😄
Either rendering or virtualization. He's running two gaming PCs off of one desktop, passing one GPU to each virtual desktop. You don't need that many CPU cores for gaming, so this is a viable use case for running multiple gaming PCs off of one desktop.
Bro turned into a goblin 💀
😂😅😂😅😂😅😂😅
Fanum tax in Ohio 😂
1. Crypto mining
2. Streaming
3. Running the same game in 2 separate windows to 2 separate monitors to play with kids without everyone getting a full individual PC
Also video editing
That's what I want, so me and my wife can play. She always wins
3d programs such as Blender.
that last one actually sounds really plausible.
since he asked for 2 builds, they're probably gonna be used as crypto mines
bro is building an AI harem.
Now it is a rtx 7120💀
*6120
he is probably just a 3d artist because blender supports multiple gpus
Yes but a 3060? It's better to get a single better card. Most likely a stock trader or crypto trader, they need many monitors to track all their stocks
@@SSukram_ 12G of VRAM makes the 3060 ideal for Blender, since cards with more than that are a lot more expensive. Also the OptiX API is much faster than AMD's APIs
You can buy a used 3090 for 600 bucks. Worth more than two new 3060s, and it has 24GB of VRAM...
@@Pixel.Dystopia and the risk of it having been used for mining resulting in less performance
@@not_smart_and_not_a_toy It was from an architecture studio. But anyway, that's not true. Linus did a whole video testing GPUs that were used for mining 24/7 for 3-4 years, and none of the cards had performance drops beyond 2-3%. Maybe you'll have to change the thermal paste, but then you're good to go for years. And most importantly, you spent 600€ instead of 1600€ 😊 When the 5090 comes out I'll buy a used 4090 and be happy for the next few years, like I currently am on my used 3090.
Nvidia be like ... introducing the 4090 ti while this guy... introducing the all new 7020
Edit: it's not a mathematical mistake, I'm just sticking with the Nvidia gen naming
Anyway if you guys want I can correct it 😊
6120*
9180*
@potato7095 he's talking about both of the 3060s
@@potato7523_ u stupid?
@@potato7523_bro doesn’t know how to do math
Seems like running his own LLM that's an ez 24GB of vram
editing style makes me feel like it's 5am on a saturday after an all night bender.
Lmfao, spot on!
He could be rendering. Rendering can still take advantage of 2 cards
This is what I was thinking, I’m pretty sure blender can use 2 cards to their fullest
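Blender's Cycles can indeed use every CUDA device at once, but for batch work the simpler win is one card per job. A minimal sketch of that split, with made-up frame numbers (this is an illustration, not Blender's actual API):

```python
def assign_frames(frames, n_gpus=2):
    """Round-robin split so each card renders an independent batch of frames."""
    buckets = [[] for _ in range(n_gpus)]
    for i, frame in enumerate(frames):
        buckets[i % n_gpus].append(frame)
    return buckets

# Six frames across two cards: GPU 0 renders 1/3/5, GPU 1 renders 2/4/6.
print(assign_frames([1, 2, 3, 4, 5, 6]))  # [[1, 3, 5], [2, 4, 6]]
```

Because the two renders never share state, this scales almost perfectly, unlike SLI.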
Running ai models. The power of the card is less important than the amount of vram. Cheaper to have 2 3060s for 24gb, than one 4090
@@tossedsaladman2184 vram doesn't stack in sli
@@GamingwithPortals it sure can, but it is better to just get a 4070 Ti in that case. Roughly the same performance and price, but with all the benefits of having one card instead of two.
Dude probably just needs a ton of displayports.
@@Volksjager_one 4070 ti is not better than two 3060s
He might be doing artificial intelligence. That amount of storage is required for large models, especially if he's training models. He could technically use hundreds of thousands of GPUs for AI.
Exactly
Pretty sure the Ventus 2X cards are LHR. My understanding of hashrates and machine-learning compute is limited, but I'm pretty sure these are the wrong cards for that. Much better suited to rendering of some kind
LHR isn't a thing anymore; Nvidia got rid of it driver-wise after crypto mining died. Any LHR card is just a regular card now @@Just1n2802
@@Just1n2802 AI models will still take a stupid amount of time, so it's mostly about fitting all the data into memory; disk cache is insanely slow. It's a common solution to buy more than one low-end card with big VRAM
I am also using my 3060 for that
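The VRAM math is the whole argument for two 12GB cards. A rough back-of-the-envelope sketch, assuming fp16 weights and ignoring activations and KV cache (the 13B figure is illustrative):

```python
def model_vram_gib(n_params, bytes_per_param):
    """Approximate weight-only memory footprint in GiB."""
    return n_params * bytes_per_param / 2**30

# A 13B-parameter model at fp16 (2 bytes/param) needs ~24 GiB just for
# weights: too big for one 12 GiB 3060, hence splitting layers across two.
print(round(model_vram_gib(13e9, 2), 1))  # 24.2
```

That's also why model splitting across cards works for AI even though VRAM never "stacked" for SLI gaming: each card holds its own slice of the layers.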
“WHAT IN THE HYEEAALEE” made me roll😂
😐
That goblin like “what the hell” sounds like Dave Mustaine from Megadeth
The electric bills do not approve 🗣️ 🔥
we're going in poverty 🗣️ 🔥
does*
@@donthackme2 electric bills DO not (plural on bills so it’s do, not does)
electric bill DOES not (singular on bill so it’s does, not do)
👍
Still less power than a 4090
Fr
Nobody's talking about the fact that 3060s have great performance for smaller AI workloads as they're fairly cheap, have a decent amount of VRAM, and are pretty quick.
Sounds accurate to me
i was looking for this comment. everybody is like "he probably has a lot of monitors"
Used 3090 would be better for AI. Splitting models between multiple GPUs reduces the performance by a lot.
8gb is not a decent amount of vram
@@formerlostcause3573 the 3060 (non Ti) has 12GB, not 8
Man must have his own power plant.
He's playing Ultima 9 in 1024x768
It's like when twin turbo setups were the thing and later, it was all about a bigger single turbo.
Smaller twin turbos still apply, 2 smaller turbos=less lag than 1 massive turbo
Forreal
Big single turbos are literally the least efficient system unless you're only on the drag strip.
There is no fricking space in modern cars for 2 turbos.
People forget how great of a car the buick grand national was. Came twin turbo right from the factory.
My assumption is that he’s running more than four monitors, or he’s running fewer than that at super high resolutions and needs the additional VRAM
VRAM doesn't stack; if each card has 12GB, the system has 12GB, not 24
@@Mynamesisjeff6999 could he not run 2 separate processes using 2 different GPUs, like virtualization or 2 separate heavy processes?
@@antoniomatthews4319 nope. Nearly every program in existence that supports two GPUs does not have the capability to queue on just one and let another process queue on the other
Speed of the gpu is important too though
@@idontknowanygoodnames1498 ahh, I must be mistaken. I know that in some programs you can manually choose the GPU, which would let you independently schedule processes
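You're not mistaken: any CUDA app can be pinned to one card per process via the `CUDA_VISIBLE_DEVICES` environment variable (the variable is real Nvidia behavior; the helper below is just an illustrative sketch):

```python
import os

def pin_to_gpu(index):
    """Restrict this process to a single CUDA device.

    Must run before any CUDA-using library (PyTorch, TensorFlow, Cycles)
    initializes; that library then sees only the chosen card, as device 0.
    """
    os.environ["CUDA_VISIBLE_DEVICES"] = str(index)
    return os.environ["CUDA_VISIBLE_DEVICES"]

pin_to_gpu(0)   # this process gets the first 3060
# a second process calling pin_to_gpu(1) would get the other card
```

So two independent workloads, each pinned to its own 3060, never fight over one GPU's queue.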
Definitely using it to mine or as a 3D workstation
3D, multiple monitors, streaming? There are a few possibilities
I had a dual gpu setup, but it wasn't sli or even matching cards. I needed more monitor connections so I used a 3060 and a 1070.
you could have gotten an RX 550 and sold the 1070.
@@user-yz8do8vu1s probably, but I already had both of the cards on hand.
@@user-yz8do8vu1s why would you do that? It’s just a card to run other monitors on. Doesn’t have to be any good.
@@caboose6411 The RX 550 runs off the slot's power alone and is a lot more efficient than a GTX 1070
That’s just jumping through hoops you don’t even need to, so you can support some agenda.
This person presumably already has these two cards and already came to a conclusion, as it’s already working. You want this person to break down their already-working system for a significantly weaker card from a significantly worse brand, sell their working card, and play with fire because AMD is cool 😎
Imo, no, definitely not, and hell no. Enjoy your Nvidia cards; if they break down, get another set of Nvidia cards, or get AMD if you want to, but never feel pressured into it. In fact, if you feel like it, go to Intel Arc and break the duopoly. Because what matters isn’t the brand or performance, it’s whatever you want to spend your money on.
Probs stock shares or ai programming and running
If dude works with AI, they can probably afford better than two 3060s
@@fayenotfaye But they still chose 2 3060s cuz it means 24GB total VRAM, which is important for AI.
Probably rendering
If you're doing AI, then the A series or Quadro series will do, or 2 3090s would do the trick. I'd say this guy either needs a lot of monitors, or he's an idiot
Stocks don't need a GPU. That can be CPU- and server-intensive, though
He's probably using the two graphics cards to crypto mine
It’s not always about gaming, multiple gpus are essential for training multiple deep learning algorithms in parallel
SLI/Crossfire was fantastic tech as it let you eke out more performance on a smaller budget.
The issue was that the performance was entirely dependent on how well it was implemented by the developer.
I have a 4080 ti and a 4090 in my rig.
The 4090 runs my main display, a dual-QHD ultrawide. The 4080 ti runs the 4K projector behind me and a pair of QHD 32in monitors above my main display.
My green screen is actually a neutral gray screen and a 3800lum Epson 4k projector.
This lets me have a clean green/blue or whatever works for what I wear in the moment with chromakey; it also lets me use an image or video as a backdrop
i have an 4070 with an i7😭😭
@@user-pc6jo2cy9l my CPU is an AMD Threadripper Pro 7985WX with 512GB of ECC DDR5
What do you use it for?
We did this at work; we needed one GPU to power Varjo VR glasses and the other to show the visualization on a TV and stream it to Teams. Only way it was stable.
Because it's not 1 PC, he ordered 6, and if what you say is true then they definitely need it for work
I’m guessing it’s THE blender guru
Bro, I know he wants to play Minecraft at 5000 fps
Based on the systems being flashy, I can't imagine this is mining or rendering. My guess is a LAN center or internet cafe, where two instances of the same game are running on different GPUs through magic software wizardry
would make sense if it weren't for the fact that modern anti-cheats give a bunch of errors when run virtualized (the wizardry)
@@kiam9941 Everything except Vanguard anticheat will run in a VM, and even that you can circumvent by running patched kernels and hypervisors.
@@kiam9941 some games do, but it is still possible if you avoid games with the most invasive anti-cheats
As someone who bought the Xidax limited edition Nebula workstation (back when Zack was still there) that has 2 GPU's in SLi... I took this short personally lol.
He probably has a lot of monitors and needs a lot of HDMI ports. I think he’s a stock trader
More than 4 monitors
No wtf dude, one gpu can easily do that
@@FurganManafov except most GPUs only have 3-5 display outputs.
@@accurian148 Except that you can daisy-chain some DisplayPort monitors.
@@FurganManafov Nvidia GPUs max out at 4 displays
Can enable integrated gpu in bios also....if cpu has it. Usually adds 2 more monitors. Sucks we can't run 5 monitors in surround, only 4. I think quadro cards are different though, not sure.
I'd say probably 3d modeling in blender
Lol, that was personal. Want your PC to crash? Play Blender
Yeah same
It was actually overshadowed by the complete lack of developer support.
Either he’s cracking hashes or he’s mining/trading.
SLI didn't die because graphics cards got better. SLI died because Nvidia wants you to buy their new GPUs instead of chaining together old ones.
not true, the fact of the matter is that SLI/NVLink were used by like 1% of gamers at the time. And for SLI to operate efficiently, both Nvidia and game publishers had to accommodate SLI setups at a driver and application level. Even if SLI was at a decent spot with drivers (which it never was), developers didn't deem the time and money spent trying to get it to work for their games a priority or worth it. It was a supply-and-demand type situation. Even dual GPUs on the same PCB were failing at a driver level and not offering double the performance despite having 2 chips, which is why we never saw them again after the GTX 690.
@@JimboJamboYT On top of that, as GPUs got better, the benefit of SLI got smaller. In the end, it was actually less performance.
Bitcoin?
for the price you're better off with an ASIC (look it up)
markiplier has 2 4090s in his pc
I've been running a multi-monitor setup since the early aughts when video cards only had a single output! So either he's running massive multi-monitor setups, or he's using them as 3D workstations, since most 3D software supports GPU rendering.
Stock trading
Why the hell would you need two 3060s for stock trading? You can do that on your phone
@@tiobiovr If you want to drive more than 4 monitors you're gonna need some power and outputs
Trading probably🤔
Some civil engineering software packages use multiple GPUs without SLI. Metashape is one of those applications.
We mining a whole bitcoin with this one 🗣️🗣️🔥🔥🔊🔊
Yeah stocks, ai, or mining.
Back in the day people used to do SLI not only for the flex but because it was a great value. You could get two mid range gpus and get better or same performance for less money than high end single card solutions.
if not for mining, it would be for more monitors or close to doubling your speeds in Blender scene rendering, etc. METAPCs
SLI would be perfect for VR. I can imagine it would result in almost perfect scaling: one GPU takes over for each eye, and there won't be any swapping between GPUs, which was the problem back then. It just makes sense, but probably won't happen :/
He's gotta be mining w that
Main one for general tasks & gaming, second one for rendering/streaming purposes.
Gotta catch them mini frames when the corn stars are riding😅😅
I definitely remember something about 'ghetto sli', where one GPU was used to render the game and you offloaded the PhysX and other GameWorks effects to the second GPU. Not sure if you can still do that, not to mention very few games still use GameWorks. Maybe a dedicated GPU for RTX?
for those of you wondering, he's using the RTX 3060s for crypto mining; basically, instead of using the entire CPU, it puts most of the stress on the GPUs
Bro was obviously gonna play split screen
Any game using DirectX 12 can utilize multiple GPUs, software SLI basically. But the developer has to program it in, but they don’t bother cause almost no one has multiple GPUs anymore
video editing, set one card for rendering, other for work, OR streamer, one for encoding and other for games, there are LOTS of things that can select one or other card
and multitask
He's running one of those cyber gaming hotels.
He was probably a graphic designer
20 years ago a new GPU generation came out every 6 months, alternating between Nvidia and ATI Radeon.
Different render tasks on one system; maybe not big enough that more than a 3060 makes sense money-wise, but sensible enough to use more than one of them to give each render task its own GPU.
GPUs passed through to VMs, making 2 gamers, 1 PC.
I miss my SLI and crossfire rigs. A pair of 290X cards was amazing as all hell in 2014. Battlefield at 140fps maxed-out settings was amazing. 1080 Ti SLI was great too
One is for gaming, one is for everything else, I've done this myself
You know what, SLI would make total sense in VR: each eye could be rendered by its own GPU
Bro probably wants to play gta6
Those are probably gonna be rendering machines, specifically designed for rendering CGI and/or videos.
Each card can render a separate scene, so they can render 4 different projects at the same time
It would be awesome if sli actually doubled your graphics processing power in games
Video editing for maximum VRAM or maybe dedicated encoding for streaming and gaming
It's a 1000D case, so the two graphics cards are probably because it's literally two separate computers in that case.
If memory serves, the 3090 Ti was the last card to support SLI. Which was no slouch. Nowhere near as powerful as the 4090, but still, no slouch.
It also added very little advantage over one card. Caused tons of heating problems...
Excellent case choice.
He is probably using some program with cuda procesing
2 mid range cards? My guess is bro is using them for virtual machines so he can split his computer between him and his significant other.
It would be doable on a 14700K: split 4 P-cores and 6 E-cores to each machine, one GPU each, and this is 4 gaming rigs. I'm not sure what kind of performance losses you would see, but it may not be important.
He can use one GPU for certain tasks and the other for different tasks. Basically multitasking in its final form. So being able to play video games while video editing or 3D rendering at the same time
Bro sounded offended 💀
bro bouta get rtx 6120 graphics
One is for display the other is for GPU acceleration.
Maybe he’s just running a virtual machine and wants a dedicated graphics for it
He's probably running neural networks
Probably video encoding on the GPU orrrrrrr 3D rendering for multiple outputs
Probably VM with gpu passthrough so he can game in the VM too. But I don’t think that works with Nvidia
One is for recording/streaming live.
Probably for all of his security cams
VR, VR can render both screens with 1 GPU each in an SLI system
Bro is about to start a new simulation in that beast
Servers..... that is all.
The problem with SLI had to do with unstable bandwidth fluctuations, first from the north bridge and later the CPU interconnect, plus the overall data bandwidth needed per power level of the card. SLI is fixed but won't be used: none of the companies will use the fix, because they wouldn't be able to control performance output, and the artificial product-tier stacking would be completely destroyed, since it's handled at the hardware level and needs no software interaction beyond PCIe 5.0 support.
I had a laptop with 755M SLI. When it worked it was glorious. But sometimes it was so bad it had worse performance than with 1 GPU.
He might be doing high-end particle animation
Should’ve joined a metal band
Bro got the leaked rtx 7020
Probably for high-end particle simulation
I think this is for LAN gaming since you could run multiple virtual machines on one pc to share it between a lot of people
Honestly with the amount of vram some of the cards have we kinda need it.
He's probably an investor; that requires a lot of screens, around 6, and 1 GPU wouldn't be enough