"sicko mode or mo bamba ? why not both ?" ahh video
fr
Fr
FR
@@lecctron edging rn
crazy reference
Ryzen 4070 mentioned 🗣️🗣️🔥🔥
RAHHH🔥🗣️🗣️🔥🗣️🗣️🗣️🔥🦅🦅🦅🦅🦅🔥🔥🔥🔥🗣️🗣️🗣️🗣️🔥🔥🔥🔥🔥🗣️🔥🗣️🗣️🔥🗣️🔥🔥🔥🦅🦅🦅🦅🦅
ZTT FOR LIFE🗣🔥🦅
RYZEN 4070
the leaks said ryzen 4090 is releasing November 23rd 2025
@@latestcueleak said ryzen 5090 will release in February 30, 2026
Guess you're buying an $1800 MB because you can't leave us on a cliffhanger!
I'm gonna pull a Spider-Verse and drop the sequel years later 😂
@@lecctron we’ll be waiting
@@lecctronbro that last spiderverse movie was a whole cliffhanger
@@lecctron lol it would probably be cheaper to change your cpu + mb
waiting
long story short: adding a second GPU won't help gaming performance much. This is because graphics cards are already so fast that any extra distance between components can actually slow it down. That's why the memory modules on a graphics card always surround the main chip.
Also, you *can* use an AMD and Nvidia card at the same time in the same PC, for example running a game on the AMD card and using the Nvidia one for OBS; they can both do separate tasks at the same time, but not the same task at the same time. But because having two cards installed complicates things, it's honestly better not to worry about running two at once and instead try overclocking, or replace your card entirely if it isn't performing to your needs.
@@nepp- no. Combine them to make them run a single game together. Time to Fuzion.
@@arenzricodexd4409 an SLI/CrossFire-compatible GPU paired with another GPU of the same model and vendor would work, but AMD and Nvidia cards were never designed to work together, sadly
The closest you can get is a second GPU dedicated to frame generation (using Lossless Scaling) while the first one does the rendering. Depending on the model you can turn 60 fps into 120, 180 or even 240 fps. I love it, but it's not really "combined GPU performance", rather two GPUs doing different things that benefit the guy playing the game.
How about for recording, as in using the NVENC encoder while I game on the radeon cards, does that work or am I missing something here?
@@hideakikovacs2859 yes and no.
Yes, you can do it, but it can lower performance due to transferring the image between GPUs. Most GPUs have encoder chips baked onto them, so it's faster to encode with the GPU you're using to game.
sucks when someone with a good idea cant carry it out because of money problems. love you bro
happens all the time on this channel lol we got so close 💔
@@lecctron send me the banking details, ill sponsor a dollar.
Easy fix, have more money
Fun fact - back in 2010 there was a short-lived solution to combine AMD and Nvidia: some motherboards, like the ASUS Crosshair IV Extreme, had a "Hydra" chip on them
i thought it was coming with either directX 11/12 ?
@@stuwilldoo7035 DX12 (and Vulkan) support a thing called explicit multi-adapter that kind of worked as a hardware-agnostic SLI/CrossFire solution, including letting that Intel iGPU in the processor get in on the action.
The only issue is that only like 5 games supported it, because it's 100% on the game's developers to implement this feature, and it requires really low-level GPU programming to do.
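A toy sketch of why that burden lands on developers: with explicit multi-adapter, the engine itself must decide which GPU handles which frame and shuttle the results around. The scheduling part, stripped to its simplest form (nothing here is real D3D12/Vulkan API, just an illustration):

```python
from itertools import cycle

# Alternate-frame rendering (AFR) style scheduling: with explicit
# multi-adapter, the *game* assigns frames to adapters; the driver
# does none of this for you.
def schedule_frames(frame_count, adapters):
    """Round-robin frames across adapters, as an engine might."""
    assignment = {a: [] for a in adapters}
    for frame, adapter in zip(range(frame_count), cycle(adapters)):
        assignment[adapter].append(frame)
    return assignment

# e.g. a discrete card plus the CPU's integrated GPU
print(schedule_frames(6, ["dGPU", "iGPU"]))
# {'dGPU': [0, 2, 4], 'iGPU': [1, 3, 5]}
```

The hard part the sketch skips is everything else: per-frame resource copies between adapters, synchronization, and presenting from one card, which is exactly the low-level work almost no studio signed up for.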
Fun fact: The current CEOs of AMD and NVIDIA are cousins once removed.
And AMD also provided NVIDIA their AMD Epyc CPUs in DGX A100 workstations.
@@PhonkMachine220YT I remember back in the day people pairing nvidia with radeon for physx
@@FuentesDj in some games the PhysX version is not updated and the CPU PhysX path is extremely unoptimized (Mafia 2 and Mirror's Edge), so having the dual setup still improves that experience. It's probably only a few games though. BL2's PhysX runs about the same on a modern CPU as on Nvidia GPUs, so it's definitely the version of PhysX in the game. I know this because, IIRC, you can swap Mirror's Edge's PhysX files with updated ones and it fixes the CPU PhysX issues, but this is not possible with Mafia 2.
Random dude here. Two things you're looking for.
First, a driver from before Nvidia decided to add driver blocks to hybrid GPU setups.
Second, a motherboard that supports bifurcation, allowing the two PCIe slots shown in the mobo diagram in the manual to split the PCIe lanes x8/x8. Most GPUs don't saturate an x16 slot, so nearly nothing is lost performance-wise.
Good thing is your current motherboard may already support it. Bad thing is finding a driver without the locks in place. Do an old search for "hybrid GPU" and you may find what you're looking for.
Yes, we did used to do this without issue. GL
yea that's why some GPUs have an NVMe slot these days :D since GPUs don't use the full x16
wow suddenly i'm excited to try this again, thanks so much !
The forbidden combo ahh video
@@6XVK ahhhhhhh
fun fact: Nvidia's founder Jensen Huang and AMD's CEO Lisa Su are related, not joking
@@RTWT990 oh that’s sick
🎉
😂
7:25 triggered ptsd
@@Marius04020 literally
nooooo
I literally stopped the video to check if it was really there or just in my head
@@Marius04020 Bruh 😭
@@Marius04020 ddlc reference
The reason your second GPU is losing performance is not due to a chipset bottleneck on your motherboard, the chipset only manages the last PCIe and M.2 slot.
The real issue is that your GPU is plugged into a PCIe slot with only 1 lane, giving you just 1 GB/s of bandwidth.
Also, you don’t need a $1,700 motherboard. First of all, SLI only works with two NVIDIA GPUs, and not in the way you're thinking. If you want to connect a second GPU to the PCIe lanes of the CPU, use your top M.2 slot with a riser. It has 4 PCIe lanes that are directly connected to the CPU.
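For anyone who wants to sanity-check the bandwidth claim, here's a rough sketch. The per-lane figures are the commonly quoted approximations after encoding overhead, not exact spec values:

```python
# Approximate usable PCIe bandwidth, one direction, in GB/s per lane,
# by PCIe generation (commonly quoted figures after encoding overhead).
GB_PER_LANE = {1: 0.25, 2: 0.5, 3: 0.985, 4: 1.969, 5: 3.938}

def pcie_bandwidth(gen, lanes):
    """Approximate usable bandwidth in GB/s for a PCIe link."""
    return GB_PER_LANE[gen] * lanes

# A GPU wired to a single Gen 3 lane gets roughly 1 GB/s,
# versus ~16 GB/s in a full Gen 3 x16 slot.
print(round(pcie_bandwidth(3, 1), 1))   # 1.0
print(round(pcie_bandwidth(3, 16), 1))  # 15.8
```

By these numbers, a CPU-connected M.2 slot running Gen 4 x4 gives the card about eight times the bandwidth of the Gen 3 x1 link it was stuck on.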
SLI-capable means the board gives full CPU-connected lanes (x8/x8 or x16/x16) to both GPUs. So yes, it would help in this case even without actually using an SLI bridge.
that depends on the motherboard, there are cheaper motherboards in the $300 range that can do x8/x8, but the guy who uploaded this video doesn't actually know what he is doing or what he is talking about.
The vid we all been waiting for
yoo korvie
Yo its that stone guy
Hi
4:32 It's completely fine to use a GPU in a slower slot.
I use dual GPUs for GPU passthrough virtualization; both the GTX 1060 6GB and the RTX 3060 run at their full rated speeds despite the motherboard cutting down the PCIe link widths to accommodate both cards.
Awesome Vid man. Loving the Croft Manor background music from TB Legend btw.
2:46 THE NETTSPEND BACKGROUND IM DEAD (i need it pause)
MSI 870A and P55 Lucid Hydra motherboards supported multi-GPU with AMD and Nvidia 14 years ago. While it worked, it was hit and miss whether it was better than SLI or CrossFire. The benefit was you could pair any GPUs from the same brand (GTX xxx or HD xxxx), or just mix them up. It was killed off due to lack of support, just like SLI and CrossFire themselves.
Godzilla had a stroke trying to understand why you've done this and fragging died.
Bro been here since like 250- subs luv to see you’ve grown so much❤
@@Cxrved I’m here since almost 1k
every gamer's dream is combining AMD and Nvidia GPUs together
Bro mentioned Lunchly, which was crazy 😂
Bro made team yellow 😐
Know it’s a good day when leccteon post
I have a dual GPU setup for exactly the reason you mentioned!
Radeon RX 6800 in the PCIe 4.0 x16 slot for gaming.
RTX 4060 ti 16gb in the PCIe 3.0 x4 slot for CUDA, tensor stuff, etc.
im a fridge
Me to
I'm a apple
Brrrrr.
@@Ateszk000 I’m a freezer😏
Cool, bro.
Dammmm That AMD GPU is _gasping_ for air
Best ending: amd and nvdia unite
Because of this video, I subscribed to your channel. Smart and interesting content about hardware, presented in a funny way. Keep it going :)
look ya'll, it's the DSP of computer hardware.
My dude said "2010s" like it was the 1980s.
Day two of asking for this video idea... "Using lossless scalings frame generation on the GT 710"
that will be part of the next video stay tuned bro
I use my 3060 Ti as the main render GPU and my 5600 XT as the GPU for Lossless Scaling. I won't spoil much, but it varies. The main bottleneck is mostly my 5600 XT and its 6GB VRAM, and partially the x4 slot it's in.
Hmm, I'm not sure you fully understand how games work with GPUs...
@@10Sambo01 you don't know
@Manemyonem I've been an IT professional for almost 30 years. I know.
@@10Sambo01 yup I agree
@ArnoId-Schwarzenegger yes, he does. I wish YouTube would boot this misinformation off the platform. Also, 1500 fps in Fortnite isn't impossible; in fact I'd be more impressed with getting 900 fps in CS2.
I've had my GTX 1080 Ti for a long time and it still holds up in even the most graphics-intensive games in 2024, I love it so much
blud completed mission impossible 😭🙏
it always gets better when you drop a vid XD
-nice content
I literally can't believe LTT hasn't made a video like this before
Holy crap, the legendary ryzen 4070 🔥 I knew it existed!!
I was thinking to combine my old gpus with hopes and dreams and then this vid came into my search
You can actually use an RTX 30/40-series GPU as the main rendering GPU for games/applications that support DLSS, and an RX 6000/7000-series GPU for AFMF. That's what people actually did to get 200~300 fps on max settings in Cyberpunk when it first released.
No way you got almost 50k subs. I was there from the beginning🎉
Guys. Lets all donate money to see the finished computer. We can do it.
physx be like "woah so much room for activities"
This is why I bought an open form factor case. Convective cooling with no case fans needed
2 GPUs actually make sense for content creators and streamers.
One card can be used for gaming, the other can be dedicated to recording/encoding in OBS. This way you can still stream/record without losing performance and with no delay whatsoever.
The card dedicated to recording doesn't need to be a powerhouse; a simple 1050 Ti can record 1080p 60fps flawlessly if there's another card handling the game itself.
Now that is a original idea
@@frenaedits Its been done before, and I thought about it years ago- just no money to execute the experiment.
Is it? There were official products trying to do this about 15 years ago.
My favorite youtube back with another awesome video
Let's help him get $1,800 for the motherboard 🔥🔥
great job next video should be intel+nvidia+amd gpus all in one pc
the moment he mentioned the 1080 Ti I wrote this and then closed the video. lol
I was two minutes in wondering when the loop was coming 😂
this guy: "i combined two rival gpus to create a hollow gpu. imaginary gpu: amdvidia
title got me rolling its so charged up😂
This reminds me of when PhysX came out on Nvidia.
I had an HD 5770 GPU and a GTX 460 in the same PC; I used the 460 (I think it was, anyway) to compute the PhysX calculations and the AMD card for raster.
Worked really well, but not in every game, because it required manually editing config files to tell it which GPU was for what. Lots of hassle, but it felt worth it at the time.
There's so much distance between computer components, I wish they'd all come together to be nice and speedy.
XFX GPUs are especially stable and reliable, they are probably the best vendor for AMD cards, and I'm happy you chose that particular AMD card, putting team red's best foot forward.
The title of this video had my brows raised like Red Forman
Test for you.
1- AMD GPU as primary.
2- Ngreedia GPU as secondary.
3- Install Arkham Knight.
4- test if PhysX works using the Ngreedia GPU just for that
PhysX crashed all the time with one GPU, I doubt it'd be any stabler with 2
@@pacomatic9833 If you remember, PhysX launched with the Ageia accelerator.
Ngreedia bought them and locked the tech behind their hardware.
Back then, it was possible to use an AMD GPU as primary and the Ngreedia GPU just for PhysX.
might actually work pretty sure nvidia already made support for having a second gpu dedicated just to physx
@@PupperTiggle it always worked, but Ngreedia went out of their way to make sure that it didn't.
They spent a decade on that anti-consumer crap.
We went through hell back in the day just to be able to use our GPUs.
Bro, this is not how dual GPUs work. You cannot make an AMD card work with an Nvidia card. Look at your task manager while in game and you'll see only one card is doing anything at all. You don't just plug in dual GPUs and magically get a bigger resource pool for gaming 🤦
The host is also claiming in 2024 that an RX 6900 XT is the best AMD card, which is BS; a 7900 XTX would smoke both cards. On top of that, the 4090 is actually the fastest consumer-grade GPU on the market. This video is full of so many holes I can't even use this cheese on a grilled cheese sandwich.
@@lockinhinddanger934 you foaming out the mouth I can’t even lie
@@lockinhinddanger934 You know this video was done for fun right? You really need to go outside and find something real to do.
I use a RTX 3080 for gaming and the APU of my 5600G to encode my Livestreams. It works flawlessly. No problems, no colored screens whatsoever.
the "What The" at the end got me ahahha
I dare you to put both amd and intel cpus in a pc.
I was not expecting the video to end after the price reveal lmao. SLI was dropped so fast that any board with it is an absurd price. I was interested in it for a time too.. sad to see it disappear
Should've done the full trifecta and added the intel arc gpu as well
AMD and NVIDIA should collab now
Fire video I can’t wait till your creative map….
I want the 9 min of my life back
Actually, you could combine AMD (ATI at the time) with an Nvidia GPU using a technology called "Lucid Hydra"; it was a feature on my Crosshair IV AMD board. I believe I had a 6870 (primary) paired with a GTX 275 (secondary). Funny enough, as far as I can remember, it didn't really cause any major glitches in games, but many games didn't see much of an improvement, while others saw massive gains. I remember Red Faction: Guerrilla getting a major 20-30 fps boost. DX & OpenGL could show different results. Great vid btw!
Now break the minecraft fps wr! 🤭
For the mobo, I think you can also use the ASUS ProArt B650-Creator. It has two PCIe slots connected to the CPU and costs around 400 dollars.
Ain't no way homie didn't already know this 🤣
leccy really said "the most well respected gpu of all time" when he was telling me not to buy it
this is insane, cool vid G!
The 1080-Ti has ALWAYS been a solid graphics card.
best unexpected ending of a video ever
If i could, i would but that for you. Hope to see this work out in the future
Guys support this man so we can get a part 2!!!!!!!
Croft's Mansion ahh soundtrack at the beggining of the video 🗣️🗣️🔥🔥
Well, if you get an M.2 to PCIe adapter, you can use the x4 lanes from the top M.2 slot for the GPU.
90% of new ATX motherboards only wire PCIe x1 to the physical x16 connector on the secondary slots.
mATX motherboards have x4 and x8 slots from the chipset because of the size limit.
Yes, you are correct. This dude put the 1080 Ti into a Gen 3 PCIe x16 (x1) slot. The "(x1)" means it only has a single lane of bandwidth lmao. Using the M.2 to x16 adapter would have been better. Even so, he didn't play a single game that could utilize his setup. The only games that can use an AMD and an Nvidia GPU together are the rare titles with DirectX 12 explicit multi-GPU support.
Edit: I was incorrect. Rise of the Tomb Raider has explicit multi-GPU support, but I doubt he had anything configured correctly.
This unironically feels like an "I'm gaming on an AMD GPU, but I just have to use Nvidium in Minecraft" kind of thing. There are some gains (when they don't bite each other and make it worse), but it feels so unneeded.
Wait how does that even work? You can use shaders with Nvidium anyways.
So did you just plug both GPUs in and they worked automatically? Or did you have to install new drivers? Or do you plug your monitor into the GPU you want to 'host' everything? What's the process? You made it seem like you just added a second GPU into your motherboard and they both worked together flawlessly.
had to install drivers, other than that it worked pretty much flawlessly across the 2 monitors
@@lecctron I see. What drivers?
@@agoogleuser2507 Also, an app can select which GPU it will use, and if that's different from the one the monitor cable is plugged into, that card works in pass-through mode when rendering something to be displayed, aka games (which can SEVERELY limit output if you run something like a CMP 90HX in pass-through mode through your iGPU).
@@alexturnbackthearmy1907 What app?
@@agoogleuser2507 Games, rendering software, pretty much anything. And if it doesn't support it natively (in the settings menu), you can just temporarily disable one GPU, launch the thing you need on the working one, and then re-enable it from Device Manager.
I used to use a 3060 as main gpu and a 2060 super as secondary gpu, 3060 for gaming and 2060 super for streaming and recording
might actually be slower, because then the data must be copied over to the second card before it can encode it. That also hits system RAM again, which OBS can otherwise bypass with its current game-capture hook and NVENC implementation.
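Back-of-the-envelope on that copy cost (a sketch with assumed numbers: uncompressed RGBA frames and ~0.985 GB/s per PCIe 3.0 lane; real capture paths use NV12 and other tricks, so the true cost is lower):

```python
def frame_bytes(width, height, bytes_per_pixel=4):
    """Size of one uncompressed RGBA frame in bytes."""
    return width * height * bytes_per_pixel

def link_utilization(width, height, fps, lanes, gbps_per_lane=0.985):
    """Fraction of a PCIe link consumed just by copying frames to the
    second GPU before it can encode them."""
    needed = frame_bytes(width, height) * fps   # bytes per second
    available = lanes * gbps_per_lane * 1e9     # bytes per second
    return needed / available

# Shipping 4K60 frames over a Gen 3 x4 slot eats roughly half the link
print(f"{link_utilization(3840, 2160, 60, lanes=4):.0%}")  # 51%
```

That traffic competes with whatever else crosses the bus, which is why encoding on the same GPU you game on often comes out ahead.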
@@De-M-oN you are wrong i gained almost 80-90 fps by doing so
That steam survey is conducted on 100k-500k people, not 132 million.
Even then the averages should still be similar across the 132 million users
I'm pretty sure the 132m figure is the overall number of laptop/PC users @@atriedonisme4174
Steam doesn't need you to fill out a survey to know what hardware you're using.
What's your editing software? You're too good ❤
Outside of software, the main reason dual GPUs aren't performant is the lack of practically fast communication between the two GPUs.
That dude at 0:40 installing the second card gave me a brain aneurysm...
"the power of friendship" ahh pc 😭🙏🙏
“Two cookies” aahhh monitors 😂
Tbh they would make so much more money if they both made their stuff work together.
Somebody sponsor this man!
w vid
If it goes up in any game, it's probably tricking the drivers to enable some Nvidia tech. Not the gpu doing any work.
NVIDIA for cuda, AMD for gaming
You're doing what I dreamt
one request...Try two GPUs on Isle Evrima 😂
I did this a few years back. I had an AMD Fury and added a 980 I came across for cheap and was using for mining. Since I had them both, I went ahead and tried them out. Lo and behold, they could work together in certain instances. I tried mGPU in a couple of things, but most of the time I just left the 980 mining while I played on the Fury. The fun trick was after Nvidia updated their drivers and stopped blocking access to PhysX when an AMD card was installed: a few games instantly started running better by enabling PhysX on the 980 instead of having to do it on the CPU, while still rendering on the Fury.
I like this guy I subscribed after watching like 4 vids
YOOOOOOO i love u man u make my day with these vids fr :)
dang a heck of an ending there. 🤣
I'm gonna have to make a nuclear battery just to power this PC for 7 hours
The graphics are so hot it will melt your face off.
Both companies would like to know your location 😅
I used to run SLI/CrossFire cards. I kind of wish they would bring that back, fully supported. It was sick to buy two cards at a $300 price point that had nearly exactly the same performance as the $1000 card.