Bro I mean no hate but.... Hear me out
When you're recording a video on your phone, like this one, just long press the centre of the screen and drag the sun slider to wherever you think it sits best.
This locks the exposure and focus for the video. And so, the camera won't be adjusting the exposure every millisecond.
Where is the hate? You delivered pro tips for free, kind good sir. May your HDMI cables lock tight like DisplayPort, may the capacity of your dialectic outperform the current norms, may your visions of a brighter world seeded by your good deeds bear fruit as a 4090++ x2. Keep on as you are, and do not drink when you text.
@@peters3885 bro, I mentioned "no hate" because people nowadays get offended very easily, sometimes even when you provide pro tips.
Thank you for your kind words my man.
And sir, I'm 17. I don't drink, I've never drunk alcohol, and I have no plans to start.😂
@@le_nirnoy
Nice tip. I didn't know that, as I'm not into video shooting.
This tip is like gold and you're just 17
@@PuppetMasterdaath144 Thank you🫠
I was just looking this up for clarity. 😮
This has been around forever. Like SLI stopped on the 10 series and everyone just forgot it existed?
SLI didn't stop at the 10.
SLI was supported for the 3090.
I kind of know what you mean though: the last good SLI card was the 2080 Ti, but that was definitely not a 10-series card
@@ClappOnUpp you can run two RTX 30- or 40-series cards, but it's not supported for gaming. Sorry, maybe you can game with SLI on the RTX 2000 series. But yeah, I think support pretty much ended with the GTX series as far as 50 to 100% performance boosts go.
You would need a 4090 to saturate PCIe 3.0 x8, so 4.0 is not a problem at all
This makes a great homework machine
This ASUS ProArt stuff looks very nice. No LEDs, only straight lines and a decent colour theme. The liquid cooler looks good too.
A much needed clarification
That black heatsink looks sexy asf tho
I see potential in this man 🙏🏻
Me with 12 GPUs on one motherboard and 8 on another and 6 on another and 3 in my main system:👀👀👀
This is true. However, why fill two Gen 5 slots with Gen 3 settings? Yes, the video cards will probably take only a 5-10 percent hit on performance, but it's a waste of Gen 5. What I'd love to see instead is mobo makers, video card makers, and chip makers all on the same page: let us stick our Gen 4 cards into the Gen 5 slots, run 16 lanes at Gen 4 for each slot, and then throw the rest of the Gen 5 lanes (because we're not using them!) at storage or elsewhere. The current implementation doesn't make sense. One way to do this is for card makers to make Gen 5 x8 cards.
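As a rough sketch of the arithmetic behind the Gen 5 x8 idea (per-lane rates rounded from the PCIe spec; the two-NVMe split is just an illustration):

```python
# Lane-budget sketch for 16 CPU-attached Gen 5 lanes. Approximate
# effective GB/s per lane; each PCIe generation doubles the previous.
LANE_GBPS = {3: 0.985, 4: 1.969, 5: 3.938}

gen4_x16_gpu = LANE_GBPS[4] * 16   # today's card in a full x16 slot
gen5_x8_gpu = LANE_GBPS[5] * 8     # hypothetical Gen 5 x8 card
freed_lanes = 16 - 8               # left over for storage

print(f"Gen 4 x16 GPU: {gen4_x16_gpu:.1f} GB/s")
print(f"Gen 5 x8 GPU:  {gen5_x8_gpu:.1f} GB/s (same bandwidth, half the lanes)")
print(f"Freed lanes:   {freed_lanes} (enough for two Gen 5 x4 NVMe drives)")
```

A Gen 5 x8 card would carry the same bandwidth as a Gen 4 x16 card while leaving eight lanes free, which is exactly the trade being asked for.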
They don't do it because the GPU makers want to sell the cards to people on older systems, say PCIe Gen 3, and then it would run at PCIe Gen 3 x8 and that would start to become an issue.... :)
@@theTechNotice True. What about hybrid cards that run Gen 5 x8 and Gen 4 x16 (or Gen 4 x8), so we could choose? I simply dislike the idea that we are wasting Gen 5.
Limited number of lanes and possibilities based on processor, motherboard, chipset, spacing... Ultimately, bifurcation is set by the aforementioned limitations and the manufacturers. It isn't cut and dried. Cooling, spacing, and the cards' dimensions all play a part. I upgraded from an Asus Z590-E to a Z590 Dark with NVLinked 3090 XC3 Ultras, for example, because of cooling and slot spacing. I'm upgrading to a Z690 Dark as soon as my 13900K arrives, reusing the rest of the installation except the RAM. Check out bifurcation and read the motherboard and CPU/chipset manuals on this topic. Changes are usually automatic or require toggling BIOS options.
Is this a physical bottleneck? Seems like the design revolves around capital rather than physical parameters. Why are mobos equipped with PCIe 5 if it's not being utilized?
What's a good analogy for this? Like a new iPhone coming out with the bump-out camera, but the glass will be included in next year's version.
@@peters3885 It's all about the lanes a processor is built to have and its ability to switch lanes.
For most people who build gaming rigs the status quo is just fine. For me, I use Redshift, so two or more video cards is a big benefit, and so I need the lanes more than the average gamer does. Threadripper fits me better, but I see these boards and I'm just like, man, it's so close. And the jump to Threadripper is expensive.
Just received my ASUS ProArt mobo today... I didn't realize they make ProArt hardware other than the mobo... looks sweet.
I built the exact same machine as in the video, but when rendering, my second GPU is always at 0%. Is there a trick?!
I have a 4090, 3090 Ti, 3080 Ti, 1080 Ti and a 1080, all running at x8. The loss in performance is not noticeable, maybe a couple percent. That's like margin of error, or the difference a slight overclock makes. All my benchmarks are roughly the same at x8 or x16.
Can you link the video of this build please? I'm looking for other motherboards that offer similar specs, including Thunderbolt 4.
You are correct. I can run two GPUs, an RTX 4070 with an RTX 2080 Super.
PS: not really usable in gaming, but in other things it has its perks.
SLI has always dropped down to two x8 slots, be that on PCIe 3 or 4. :)
That's wrong in two cases: on Intel enthusiast platforms for gen 1-4 Core i CPUs, for example an i7-3970X on an X79 motherboard, you had more than enough PCIe lanes for multiple x16 cards.
The other case is a multi-GPU card like the GTX 690, because both GPUs are on the same PCB.
@@PhilippJanusch
But it does not matter anyway. SLI was only possible through the chipset, not the PCIe lanes of the CPU. These days the only way to use multiple GPUs is to give them different tasks, not to make games run with more fps.
If you want up to 124 lanes from the CPU you'll have to buy a Xeon, or better yet a Threadripper or EPYC processor.
@@daveb9402 Witcher 3 next-gen gets around +90% fps with two GPUs versus a single one. In my case with Crossfire in DX11.
@@daveb9402 those GPUs are not in SLI, they just give 2x more power for rendering or any software that benefits from multi-GPU.. gaming is another situation, requiring links
@@jeffdrawbot7323 You are correct. Also don't forget mining xD
Wait. Intel mobos don't have PCIe Gen 5 yet?
you didn't use Crossfire in 2009?
plx switch has entered the chat
Ever heard of a PLX chip? They were a thing for a while back when SLI was around.
SLI is dead and it doesn't get any more updates. 😂 Good luck running new games that don't support dual GPUs 😂
That's not what they are using them for. One is encoding the HD recording while the other runs everything else; it's a setting in OBS to use your second card for capture and encoding. I set up a workstation for my niece with two 1070 Ti SC cards to play games and stream without working one card to death. The second card only draws about 35 watts to do it, too.
It is possible... You've just been living under a rock
Your pfp is cursed
It is possible; Nvidia knows how to use their own processors, it won't bottleneck the 14900K lol. Looks awesome though.
What's the use case for these multiple-GPU configurations?
watch the full related video :)
@@theTechNotice link to said video?
CAD, 3D render engines (Blender, LuxCore, etc.), miners, particle simulation, AI
@@theTechNotice
Mining😅
holy shit I didn't know that
specs and performance matters
Wrong info. When your slot runs at PCIe 5.0 x8 it means you use 8 of the lanes on the PCIe slot. However, those GPUs are PCIe 4.0. This means that both cards are effectively running at PCIe 4.0 x8, because only half of the lanes are physically connected.
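The negotiation rule being described is easy to sketch: a PCIe link trains to the lower generation and the narrower width of the two endpoints (a minimal illustration, not a model of real link training):

```python
def negotiated_link(slot_gen: int, slot_lanes: int,
                    card_gen: int, card_lanes: int) -> tuple[int, int]:
    """A PCIe link settles on the lower generation and narrower
    width supported by the two endpoints."""
    return min(slot_gen, card_gen), min(slot_lanes, card_lanes)

# Gen 5 slot running x8 electrically, with a Gen 4 x16 GPU:
print(negotiated_link(5, 8, 4, 16))  # -> (4, 8), i.e. PCIe 4.0 x8
```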
Makes sense. The GPUs only use 8 lanes each.
Is it possible with a couple cheap intel Arc GPUs?
No reason why it shouldn’t be
I've spoken to Intel and they said Arc doesn't play well with multiple GPUs... :)
No. Nvidia, Ati and Intel have to engineer that for their cards + it needs to be supported by your motherboard + game + the game must implement it well for it to be worth while.
This is why there was an experimental GPU, from MSI I think, that had an NVMe slot on it.
ASUS*
@@theTechNotice Thank you, I can never remember which it was; that's why I said "I think". Wish it had made it to market.
Does this work for 2x rtx 4090 ?
Only with liquid-cooled cards, because of the thickness...
Does the software take advantage of this set up?
Have a look at the related video ;)
I did this test myself, x16/x8 versus x16/x16.
I found more lanes more powerful: not faster frames, but more smoothness, more flow, because the cards had a way of stretching their legs.
Those were my findings. Look at the back of your motherboard; it must be x16 and x16. Never go for x16/x8 SLI with PhysX on the second card.
Bandwidth-wise, PCIe 4.0 x8 is the same as PCIe 3.0 x16. If you run benchmarks you'll see no difference in performance. To this day there's no GPU that can benefit from the higher 4.0 bandwidth, let alone PCIe 5.0...
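The equivalence checks out from the raw link rates (8, 16, and 32 GT/s for Gen 3/4/5); a quick sketch:

```python
GT_PER_LANE = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0, 5: 32.0}  # giga-transfers/s

def bandwidth_gbs(gen: int, lanes: int) -> float:
    """Effective one-direction bandwidth in GB/s, including encoding
    overhead: 8b/10b for Gen 1-2, 128b/130b for Gen 3 and later."""
    efficiency = 8 / 10 if gen <= 2 else 128 / 130
    return GT_PER_LANE[gen] * efficiency / 8 * lanes  # bits -> bytes

print(f"PCIe 3.0 x16: {bandwidth_gbs(3, 16):.2f} GB/s")
print(f"PCIe 4.0 x8:  {bandwidth_gbs(4, 8):.2f} GB/s")   # identical
print(f"PCIe 5.0 x8:  {bandwidth_gbs(5, 8):.2f} GB/s")   # double
```

Each generation doubles the transfer rate, so halving the lanes while stepping up one generation is a wash, which is why x8 on a newer-gen link rarely shows up in benchmarks.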
Who still uses Intel in 2024 for gaming?
I read that AMD CPUs are better for gaming.
You still lose 3% of bandwidth running at x8.
I can live with that ;)
@@theTechNotice Also, don't forget that sli only works if the game supports it. 😉
sli but not dead
The last GPU that offered SLI was, for some weird reason, the 3090. It was really bad at utilizing two GPUs and ran at best just a little better, the same, or even worse compared to a single-GPU setup.
We can say without any regret that SLI is dead. Same goes for AMD's Crossfire.
It's easy, I've done this thousands of times! I do identify as a liar though
And you still end up with worse performance than just having one card run with the minimal hit it takes. Unless you're using this for some kind of workstation.
Which is what it's for ;)
Dude, ASUS is selling ProArt setups for creators, not gamers. Of course the two GPUs will be used for different tasks. Most professional software will take advantage of this kind of setup.
@@daveb9402 yeah, they both run at x8 and the software can utilize most of both GPUs. I feel like this is where a lot of high-end consumer GPUs end up, at least more than people realize.
Please can you give me a GPU that you're no longer using? I want to use it for 3D rendering.
Nah, first of all stop begging. Secondly, he's using both.
Ask your local vendor, they'd give you one. But for a price, of course.
nothing is free in the world
xD
I heard that you have a big house, 4 cars, 2 boats and a dog. The only thing you still beg for is a GPU, and trust me, a GPU is the only thing on this planet that nobody will give away for free.
Sli
ProArt Collection FTW