@@jouniosmala9921 Let me see if I got this right. The issue is not that RT cores aren't fast enough. RT cores trace rays more than fast enough for viable real time ray tracing. The issue is that doing something with those rays once they are traced requires traditional shaders, which have always been built to rely on predictable data usage. The shading performed when ray tracing requires a lot of random data access (meaning completely unpredictable), which neither software nor hardware is able to compensate for effectively. The performance hit from ray tracing is basically because GPU shaders spend a lot of time doing nothing while waiting on data. There is a lot of performance left over, but it is limited by latency. Simply put, the actual process of tracing rays is easy for RTX GPUs, but doing something useful with them causes a GPU to spend a lot of time idling while waiting on data since GPUs are built with a focus on high bandwidth and not low latency. This is something game developers and hardware manufacturers both need to figure out. I think AMD's RDNA architecture has a caching system designed to help solve this problem. Hopefully it won't be too long before we start seeing ray tracing capable GPUs from them, because I think they will be able to handle it very well. It certainly makes sense to me since next gen consoles are going to be ray tracing with Navi. Real time ray tracing is still in its infancy, and there is a lot of work left to be done to optimize it. Everybody seems to love hating on ray tracing, but it looks gorgeous and the performance is there. It just isn't easily accessible performance. Even as it is ray tracing isn't awful, but I do hope to see it improve. I have high hopes for the future of ray tracing in games, regardless of what people think right now.
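A toy sketch of that locality point (function and buffer names invented for illustration; this is not from any real engine):

```cpp
// Rasterization-style shading: neighbouring pixels read neighbouring
// texels, so the caches and the memory controller stay happy.
float shadeCoherent(const float* texels, int x, int y, int width)
{
    return texels[y * width + x];
}

// Ray-traced shading: each ray's hit can land anywhere in the scene, so
// the lookups are near-random. The RT cores already found the hit quickly;
// the shader core now stalls waiting on memory, which is the real cost.
float shadeRayHit(const float* texels, const int* hitIndices, int ray)
{
    return texels[hitIndices[ray]];
}
```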
1) Could something like frame interleaving solve this? In fact this is where SLI originally got its name, Scan-Line Interleave, before Nvidia bought 3DFX and changed the name to "Scalable Link Interface". I see the problem you describe as being a problem with the AFR scheme.
You can stream VR pretty well on a single high-end graphics card. Rendering is pretty CPU dependent, and DLSS seems to be Nvidia's route for good FPS with DXR.
Jason U My two 980s are a pain in the ass in summer. They get your room up to the 40s with ease. But in winter they are great, turning on the PC and starting any game warms the room up quite nicely! :D
Everyone has known for a long time that SLI and Crossfire don't work with modern games.. since they're all ported from consoles.. so how the hell would you SLI them? The last game that worked well with SLI was Crysis, and with the remaster they gave up on it... I don't know of any game that does SLI anymore...
Yep, same here. My dual watercooled 1080s still just shred through everything at 1440p ultrawide. I see no reason to upgrade any time soon, especially since RTX itself is mostly a gimmick for now.
My question would be, a year later, after the continued slow adoption of 4K: do you still see the potential for pixel count to exceed graphical processing power again? Unless you're building a literal home theater, 8K doesn't improve visual fidelity even at closer-than-standard viewing distance. For a monitor, despite sitting at point-blank range, you need something nearing the size of a living-room TV before most people notice pixelation. We can increase pixel counts nearly forever, but we're nearing the point of being beyond human perception. The same could be said for framerates, but there is still plenty of room for growth at the top end.
@@Tallnerdyguy Except when you are upgrading just the card. I got a 2nd GTX 1080 for $450 when the RTX cards were announced; I could have gotten it even cheaper second-hand. I get much better frames in most AAA titles than if I'd dropped almost twice that on an RTX. Personally, I think that's the real loss here.
I bought two GTX 580s way back in the day and struggled with them all the time. Driver support from game to game was a crapshoot, and micro-stuttering was usually an issue. The heat and power requirements shoot way up too. It would be more beneficial to save that extra dough for something like a G-Sync monitor that increases fluidity, imo.
There is a plus side to this. It could open up opportunities for motherboard manufacturers to remove the extra x16 slots and add different features. Maybe more M.2 or other accessories.
My first PC build was in 2011 with twin HD 6950s running triple 1080p. Later on I had triple 7970s with triple 1440p. I switched to triple water-cooled 660 Tis and found the driver support for multi-screen multi-GPU worth the performance hit. Now I have a single 1070, which struggles on a single 1440p ultrawide. The price of single top-tier cards is worse than ever too. While 1080p gaming is easier to get into, high-resolution gaming has gone up in performance requirements faster than GPU power. Now that it's getting more difficult to increase top-tier power, why are they simultaneously destroying the only other method of achieving it?
A single powerful GPU is the only real viable option to get more performance. Multiple weak GPUs are just not worth it in terms of price and power consumption.
@@raven4k998 the ship that got sunk was a bot. an actual murderous PVE bot. I'd keep an eye on him if I worked there, they're programmed to kill even if they're not very good at it
Siege, Battlefield, Witcher 3, and GTA 5 all did well with SLI, but having to disable it between games like Doom got really annoying. Might get a 2060 next week, or a nice 1660 Ti.
DX12 supports it but devs didn't implement it. However, if the next gold standard is 8K, they may start supporting it, as the way DX12 supports more than one GPU was massively better. Though SLI would raise frame rates rather well, it didn't really lower lag as much as just getting a better card. Why? Because each GPU starts rendering a frame before the GPU ahead of it has even posted its frame. That helps throughput, but nothing beats rendering after the previous GPU has posted its frame, all else being the same. So two 960s may have made you feel good because you could hit 60/144 fps, but a 980 doing it with one card was just better.
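A quick back-of-the-envelope check on that point (all numbers assumed for illustration): with alternate frame rendering, two cards double the frame rate, but each frame still takes one full GPU's render time, so lag stays roughly where a single card would put it.

```cpp
#include <cstdio>

int main()
{
    const double one_card_ms = 16.7;  // assumed: one mid-tier card per frame (~60 fps)
    const double afr_fps     = 2.0 * 1000.0 / one_card_ms;  // two cards alternating: ~120 fps
    const double afr_lag_ms  = one_card_ms;                 // each frame still took 16.7 ms
    const double big_card_ms = one_card_ms / 2.0;           // one card twice as fast

    std::printf("AFR pair:   %3.0f fps, %.1f ms render lag per frame\n",
                afr_fps, afr_lag_ms);
    std::printf("Big single: %3.0f fps, %.1f ms render lag per frame\n",
                1000.0 / big_card_ms, big_card_ms);
    return 0;
}
```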
So the only place it's worth running SLI would be in Free Build Mode in "PC Building Simulator" for the in-game 3DMark score. IRL you get no benefit, except for live streaming, like Jay said at the end.
With AMD using Infinity Fabric, I think it's only a matter of time before 2 GPUs work in tandem but show up as 1 GPU, just like they have 2 chiplets with Infinity Fabric showing as 1 CPU.
This is basically what explicit multi-GPU does, and it is a place AMD will be going. It would be ideal to simply present the multiple chip dies to the OS layer as a single GPU block and have the tech to allocate resources within the GPU, but the issue with GPUs is latency, more so than with CPUs, so I think a fair bit of work still needs to be done.
@@Sipheren For datacenters, where they will be using dozens of GPUs, sure, but for mainstream, probably not, for 3 reasons. 1. AMD (and all GPU manufacturers) wants you to buy their newest and greatest, which people won't do if they can just buy another card from the last generation. 2. AMD doesn't have the resources to implement the drivers and support. 3. People don't use it. Even when the 290X had like 85%+ scaling, less than 0.1% of people actually did it. And AMD isn't really known for catering to a select few.
I still remember when 3dfx first used SLI with their Voodoo2 cards; it was such an unreal thing at the time. Then when Nvidia bought the tech from them it went bonkers, but I kind of knew that it was dying. Single cards are just becoming so powerful there literally is no need anymore, unless you do medical imaging or something.
I feel VR could really be a lot more accepted all around, especially with casual players, if they could use two cheaper GPUs, like two 2060s or something... but a lot of people think you have to have a powerhouse PC to run VR well, so they stray away from it over the expense. A lot of people I talk to think it takes a lot of PC power to do VR.
I built a modest rig for my buddy in 2018 with a 1050 Ti, an i5 4570 (?), and some 1866MHz DDR3, lol. I come back two months later and this boy has a fucking Oculus Rift hooked up to it. I did him a favor and clocked the poor little bastard of a card as hard as it could go.. but it was actually a decent experience. Lower settings on most games ran really well, medium on some. I was impressed. Bear in mind I spent like $450-500 on this computer lol.
@@LiLBitsDK A guy who has to talk a lot to keep his channel going gets a slip of the tongue calling Hz MHz, and you find it extremely entertaining? I suppose a slip of the tongue never happened to you? How old are you?
Get Sideways Slip-ups are funny; if no one was hurt during or after a slip-up, then it's fine. The only person overreacting is you, being over-defensive about someone having a friendly laugh at a slip-up. Hell, even Jay laughed, because it was funny.
For games it doesn't make sense, but for rendering it can make a significant difference. You don't even have to join the cards; they can be two separate ones.
Well... I guess it's time I start looking for weird and different stuff I can put in my other PCIe slots then. Maybe a sound card and some extra Thunderbolt 3 port things...
I've had both two-way Crossfire 4850s and three-way SLI 980 Tis. The Crossfire worked great, but SLI barely made any difference. It's great for when one of the video cards gets old and fails, though, because then you have a backup available.
^ Still using two cards, happily. it's the only option when a single *Whatever Ti* isn't enough. Sometimes you just gotta drown your problems in cash. That said, I would like to see a return of 2 GPU's, 1 Cup cards. My GTX 690 lasted me a hideously long time, and I never regretted it. I just had to wear hearing protection.
Yeah, I thought someone had said that the 2080 Ti was scaling BETTER than the last few generations. I don't think Jay's point was the cards though; I think it's more that new games just aren't utilizing it.
NEVER! I use older cards to play older games. It's 2022 now and I get no better joy in this hobby than making it work. The tweaking, the endless hours trying to find the hack, then playing the game for a couple of minutes only to watch it crash... but then finally it all comes together, rendering perfectly with higher frame rates. Honestly, it's not about playing in SLI, it's about getting it to SLI. Any game can be SLI'ed. Don't give up, just make the impossible possible.
Most epic self done ad for a game I've ever seen, totally gonna play it. Was on the fence anyways cause it already looked awesome, Thanks Jay-z! lol :D
I just finished building an Intel 7700K, 1080 Ti SLI setup with 1000W of liquid cooling power and a dash of RGB. I know it's not the newest parts of today, but it's honest work. I built it over time, always wanting to achieve the dream of having a dual-GPU gaming PC, since it's like having a 2JZ-GTE with twin turbos! This video was really great. Also, just letting you know there are many of us out here still enjoying the nostalgia of an SLI'ed/Crossfired system along with you. I hope devs and GPU manufacturers keep the idea of a Twin Turbo Gaming PC alive in the future. Keep up the good work Jay!
Got 760s in SLI, they still do good work, and I can feel and see what happens when I turn one of them off. SLI saved me a lot of money back when the 700 series released. I guess that's why we can't get proper SLI/NVLink/Crossfire today... because the companies want you to buy their new GPUs every 2 years.
My previous build had a pair of 970s. They played everything at a decent framerate, and ran the Vive VR headset without giving me headaches. You'll know it's time to upgrade when you get a game that you can't play.
@@Rob-vl9ux Good point, Rob. I don't have much reason to upgrade other than wanting something newer... but then I see the prices and get turned off. Plus I have a 2-year-old that requires a lot of attention when she is not sleeping, so I don't get nearly as much game time as I did before she was born.
Manufacturers: Ok guys let's stop supporting dual gpu setups because we can't keep up with demand so people will only buy 1 gpu.. Also manufacturers: Let's keep pushing higher resolution screens and create really demanding video quality games...
I read research about how we are getting near the end of die shrinks because of the quantum tunneling that happens to electrons passing through the transistors, so in the end it may come back.
I don't know anyone who buys boards based on those criteria anymore. Most of the time, boards that support SLI or Crossfire are more expensive and usually just better, so some less financially bound customers might go for them. For the most part, though, I don't think anyone cares anymore.
In a world where SLI didn't die and was supported, the 3090s would actually be 8K capable. But I guess people didn't want to create support for that, and were more focused on support for RTX and various other game features.
For a second I thought that he was going to say the sad trombone "you lose" music from The Price is Right... they still use this right? th-cam.com/video/_asNhzXq72w/w-d-xo.html or the other sad trombone th-cam.com/video/yJxCdh1Ps48/w-d-xo.html
I love multi-GPU for the cool factor. If you can afford it, awesome, but for 99% of scenarios it's not worth it. I've got dual Radeon VII GPUs, and I don't regret it. Also, Jay, AMD has said that their newer GPUs will not support DX11 legacy CrossFire, and will only support explicit mGPU implementations in Vulkan and DX12, starting with the Radeon VII. So these newer AMD GPUs only support multi-GPU explicitly, using DX12 and Vulkan. With newer games that use multi-GPU, you should still be good.
@@samgakgimbap9683 So I'm a fanboy for stating what AMD has officially said on the subject? And did I not say that for 99% of scenarios, you should NOT get 2 GPUs?
@@samgakgimbap9683 lol I wasn't defending the company, so obviously not an AMD fanboy, moron hahahahaha you made me laugh though. It's not useless, because I have a bunch of titles that support it, regardless of whether newer titles don't.
Best sponsor ad ever. Now to watch the sad part, the rest of the vid about the death of gpu scaling :(
I saw the comment first, not ready or disappointed!
[ I L I K E B O A T S ]
Dead until Intel implements their own solution.
He nailed it bro
@@sorcierx2604 there were often times it did me way more good than harm but the titles these days, no dice.
Jay, you convinced me. Imma take the money I'm saving on the second video card and spending it on hookers.
I don't know how they will make your PC perform better, but you do you...
@@WolvenSpectre It'll make the load screens a non issue. That way you can save even more by using hard drives instead of SSD's. You also get more game time as you won't be watching as much porn.
@@WolvenSpectre they can distract you from how shitty your load times and frame rates are
Depends on which cards you are looking at, could be a lot of hookers
@@WolvenSpectre actually, he wouldn't do himself - hence the hookers.
JayzTwoCents: DONT BUY 2 GRAPHIC CARDS
Also JayzTwoCents: BUYS 7 GRAPHIC CARDS
I know it's a joke but he probably never paid for any of them. The big YouTubers have a lot of stuff donated to them and get samples from companies, as well as shipping stuff to other YouTubers once they're done playing with OCs and bench comparisons.
Maxed 8 7 is more
@Maxed 8 Good point lmao
he means dont use 2 video cards in 1 pc.... ofc he has 7 video cards he builds pcs what do u expect....
Remember the early 2000s when there were dual-GPU cards?
Hope you got compensated well for that ad, it was top tier.
goosenamedal X tier
I exhaled audibly through my nose.
I usually skip them, but this one was so good I watched it.
@@vroomzoom4206 I do too but i got a good chuckle!
So good, I could see that Ad being an actual commercial on TV in the 80s or 90s.
This video contains:
- a great ad made by the crew
- a monologue about the fall of multi-GPU solutions.
- no charts, no figures, no gameplay or synthetic visuals.
I like da charts though
Some smart people don't need charts or graphs to understand. Some kids need beans and drawings and Play-Doh to understand how to apply toothpaste.
*Ray Tracing* - extensive work by developers required to implement, supported by literally 3 games, *recommended by reviewers*
*Crossfire/SLI* - long established methods to implement, developers required to implement, *bashed by reviewers*
Makes Perfect Sense
@@LukeHimself Yeah, but no developers care about implementing it, do they?
@@Karthig1987 Lots do, but there's no incentive from any company to get it done. No monetary incentive, anyway.
Compared to Ray tracing
It's a catch 22. Devs don't make multi-gpu games because people don't have multi-gpu rigs. And people don't build multi-gpu rigs because devs don't support multi-gpu in games.
The problem is also that multi-GPU rigs have to be purposely designed, mostly by having a motherboard with twin PCIe slots, which your average home-PC motherboard doesn't have. So it's not an option to simply "upgrade" a normal PC to SLI; you'd have to build it yourself from the ground up. Who would go through that effort when you can just buy a single beefier graphics card for the same money as two lower-end ones, or less?
@@builder396 I think most good motherboards can handle it. The bigger issue for me is that the price of used GPUs doesn't drop as fast or as much as it used to. So there was never a point where it made sense to buy a second 970; by the time the used prices finally started to drop, I needed more VRAM anyway and had to buy a newer card with more VRAM.
For 3D VR you need two ocular references for binocular depth perception - say, for hand/eye coordination in an interactive sim. I wonder if two GPU cards might handle the L/R processing and combine their output into a single composite signal that bump-maps, texture-wraps, and contour-traces collision physics, or otherwise combines object surfaces from slightly different views for an out-of-plane, semi-holographic 3D render, rather than a paper-doll screen-plane object with an orthogonal (2D) render map and screen-vector rasters for adding the on-screen world's and its objects' movement... or something like that. Anyone?
Santy Clause 🙇🏻♂️
Exactly the same thing going on with programs for Linux.
What you need is an Anthony. He’ll definitely figure it out.
Haha
So true
anthony will sit in front of the computer and it will work instantly
@@luizalves8071
I never cough when the doctor is in the room. Everything runs smooth when the professional is there.
Lmfao so much this 🤣🤣🤣
I wish more game (or, more accurately, game-engine) devs took the "Ashes of the Singularity" approach to multi-GPU. I know it's a lot harder, but it allows near-perfect scaling even with mixed AMD/Nvidia systems.
That's actually the problem with multi-card. Ashes had functional multi-GPU support, but as a DX12 game it's really poorly built.
It's like building a clunky V12 engine to get 200 HP when a modern small engine can do it better. (For lack of a better metaphor.)
@@DevTheBigManUno Perfect analogy, Jay would approve!
@tk maou I know, the only other confirmed game with mixed-vendor multi-GPU support is the 2016 Hitman.
Ya, if Unity had SLI built in, then SLI would come back to life.
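For anyone curious what the "Ashes approach" means at the API level: DX12's unlinked explicit multi-adapter has the engine enumerate every adapter itself and drive each one with its own device, which is why mixed AMD/Nvidia works. A minimal sketch, assuming a Windows/D3D12 toolchain (error handling omitted):

```cpp
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

// One D3D12 device per hardware adapter; the engine splits work between
// them itself instead of relying on an SLI/Crossfire driver profile.
std::vector<ComPtr<ID3D12Device>> CreateDevicePerAdapter()
{
    ComPtr<IDXGIFactory6> factory;
    CreateDXGIFactory2(0, IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;  // skip WARP / software rasterizers

        ComPtr<ID3D12Device> device;
        // Any vendor's adapter qualifies -- an AMD card, an Nvidia card,
        // even the iGPU can be handed a post-processing pass.
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}
```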
Without multi GPU and a variety of expansion cards, PCs are pretty boring IMO. May as well get a console
I agree with you 100%. I learned that some time back in an earlier build, where I had two cards and got micro-stuttering. So I eventually gave that system away and bought a nice top-tier card, and I've been happy for the last 3 years at 1080p and 1440p.
I love the magic trick at 9:38 (look on the left side of the cards)
Similar trick at 8:28.
OMG! This must be the multi-gpu spirits sending us a message: “It just works”
Clearly RTX 2080 SLI off at 8:29 SLI on at 9:38
Jay probably moonlights as a magician and is testing his craft on us.
The truly cool thing about it was that it was mid sentence. I did not see the cut except for that.... maybe I would have, but the graphics card stole the show.
"We're gonna talk about something that's near and dear to my heart. And I don't mean cholesterol"
Bruh XD
Seriously, lmao
Heart
@@devonhart8989 Excuse me, it was early late at night
Going SLI was the biggest mistake of my life. And I'm married.
@Nuno Herdeiro missed the joke huh
@Nuno Herdeiro r/wooosh
@@ZHGAmingAllTheWay r/lookatmeiamanormie
@@ZHGAmingAllTheWay r/Ihavereddit
@@aritra2116 r/youplayedyourself
This is still my favorite pre-roll ad. Also really like the fractal design Christmas one too.
My GTX 980s will be my last SLI setup. End of an era, but I'm also looking forward to a lack of micro-stuttering.
Yeah, my 970s in SLI will be my last. Looking at possibly getting a 2070 Super next.
For now. I've got a feeling it's going to come back, nVidia didn't put NVLink on their cards for benchmarks. Now they can't sell to miners, they need another way to get people to buy multiple gpu's.
8K
Same here same cards too
Same, I have 980s in SLI and never again.
2:04 Did anyone else notice he was playing a video and not the game itself
Most of the reason to even use SLI / Crossfire in 2019 is so you can flex your dual RTX 2080 Supers on PCmasterrace
Until the next guy posts flexing his dual Rtx titans instead
Junior Nunez But then you can laugh at him for wasting money when two 2080s would’ve been fine.
TL;DW : Yes, they are pretty much dead for current and probably near future generation of cards
@Johnny Casserole I left SLI for what it was 2 years ago; I was sick of the constant misery after a driver update, the micro-stutter, and fewer and fewer titles supporting it (sometimes I even got lower frame rates than with a single GPU). Been building since the 80s, have seen many things rise and fall.
SLI for gaming is dead; for editing/rendering/streaming/multiple monitors it can be unbeatable.
Not this video, but your comment is bragging rubbish....
@Johnny Casserole You didn't watch it or pay attention then. Game devs are done with it.
@@SimonLeBonbonbon That's the problem with SLI and Crossfire: it's at the mercy of game developers. These game engines don't directly communicate with the card anyway; they utilize a standard API such as DirectX. There is a huge performance gain in forgoing the API, but game devs aren't going to heavily customize their engines for each kind of GPU out there. So since most of the heavy work is done by the API anyway, parallel-GPU optimization should really be mostly on the shoulders of the API.
The protocols for parallel-GPU gaming need to be developed from scratch, if they even bother to do that. In the next 10 years we'll start to see the lines between CPUs and GPUs blur, and what we know as a motherboard and its components will likely be far different by 2030. So R&D is likely not focused on GPU parallelization.
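The closest thing shipping today to "the API shouldering it" is probably Vulkan's device groups, where the driver exposes a linked pair as one logical device and the app addresses the GPUs through masks instead of writing two whole render paths. A minimal sketch, assuming Vulkan 1.1 and that queue family 0 of the first group does graphics (error handling omitted):

```cpp
#include <vulkan/vulkan.h>
#include <vector>

// Create one logical device spanning every physical GPU the driver has
// linked into the first device group (an NVLink/SLI pair reports
// physicalDeviceCount > 1; a lone card reports a group of one).
VkDevice CreateLinkedDevice(VkInstance instance)
{
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(
        groupCount,
        VkPhysicalDeviceGroupProperties{
            VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES });
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    VkDeviceGroupDeviceCreateInfo groupInfo = {
        VK_STRUCTURE_TYPE_DEVICE_GROUP_DEVICE_CREATE_INFO };
    groupInfo.physicalDeviceCount = groups[0].physicalDeviceCount;
    groupInfo.pPhysicalDevices    = groups[0].physicalDevices;

    float priority = 1.0f;
    VkDeviceQueueCreateInfo queueInfo = {
        VK_STRUCTURE_TYPE_DEVICE_QUEUE_CREATE_INFO };
    queueInfo.queueFamilyIndex = 0;  // assumption: family 0 does graphics
    queueInfo.queueCount       = 1;
    queueInfo.pQueuePriorities = &priority;

    VkDeviceCreateInfo deviceInfo = { VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO };
    deviceInfo.pNext                = &groupInfo;  // chain the group in
    deviceInfo.queueCreateInfoCount = 1;
    deviceInfo.pQueueCreateInfos    = &queueInfo;

    VkDevice device = VK_NULL_HANDLE;
    vkCreateDevice(groups[0].physicalDevices[0], &deviceInfo, nullptr, &device);
    return device;
}
```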
@@krozareq Totally agree. Unfortunately Johnny Casserole's comments are removed, so I will modify my comments a bit. I can understand support for SLI/CF dropping, since it has become a niche market with fewer adopters. But when you compare current GPUs with those of 10-15 years ago, there has been a massive performance gain. Who knows what we will experience 10 years from now (I will be retired by then). Whether graphics calculation will move to integrated instead of discrete will only be clear once we are there.
Perhaps (swappable) GPUs on motherboards, just like current CPUs are used.
That integrated graphics will gain more processing power is without a doubt, so perhaps discrete GPUs eventually become a niche market.
@@krozareq I wouldn't necessarily blame the motherboard. What about the graphics cards that use one PCIe slot with twin GPUs inside them, like the GeForce 9800 GX2?
7:01 he literally predicted the performance of the 30 series and RDNA 2!
I... normally skip in-vid ads.. but Jay... Jay my man.. this was pure genius. Actually made me watch the skit/ad. Oh, and the vid itself was amazing as always.
nah ads suck
@@psychozulu Yep I did xD
Jayz: "We don't need 1000 hertz panel?"
10 years later.............
1 kHz is actually becoming a thing just right now with micro LED technology, which will probably kill OLED and quantum dot, before it even became a thing.
1000 hertz is actually kinda important, actually. So hopefully he'll say this someday.
He said 100 000
@@stevieg6418 No, he said 100 Megahertz which is 100 000 000 Hertz.
I am waiting right here bro
RIP SLI / crossfire... F to pay respects.
Wow..2K likesss......
It will be back. The PS5 has an AMD dual-GPU patent.
F
@@qwerty6789x so what happened?
I love sli/xfire
I really honestly miss it... it's sad to see that it's dead.
A lot of my builds have dual gpus still
Titan RTX SLI is the only way to game in 8K for now.
yeah well "I LIKE BOATS!!"
I still run 2x 4GB GTX 760s in SLI on my X58 rig.
It's dead temporarily, in my opinion. Watch it come back in a different form in the coming years, from all manufacturers.
Silicon is reaching its limits in terms of speeds and die shrinks.
They will all move to MCM at some point. Instead of one large die, they will have smaller ones linked together: maybe 2 dies linked doing normal GPU work like the SLI/Crossfire of the past, another one dedicated to ray tracing, and one for AI.
There are only a few shrinks left, then they are stuffed. If they want to keep increasing performance on the same infrastructure, they'll make smaller, easier-to-manufacture linked chips with fewer bad yields, which also brings costs down.
@@allansh828 8k in gaming is a thing!?
Multi-GPU is not dead. It's been murdered.
Big difference.
That's one way to look at it. It is simply because not enough people were using it; developers don't like serving a tiny minority. Nvidia's choice to remove the possibility on lower-tier cards also caused this. The last "budget" card was the GTX 950; after that, Nvidia dropped support for everything below the GTX 1070.
I think it is just evolution.
Hey, in the old days you had a 2D card and a loop-through additional video card, the 3dfx Voodoo. After that, the Voodoo 2 came and you could use two of them, so that was basically the first SLI GPU setup for consumers....
@@johnyang799 SLI/Crossfire is unstable and poorly supported. Enjoy your micro stuttering.
@@ryrin6091 These tech shill videos have an awful lot of big tech employees on guard for damage control. I know you guys aren't hurting that bad.
@@fermi3294 durrrrr everyone who disagrees with me is a paid shill. Good luck with hoping that every new game that releases will utilize Crossfire in any meaningful way.
@@ryrin6091 I don't think you're a paid shill, I think you believe all the bullshit you hear and adopt it as fact (*when it lines up with your 'gut'). The fact is, I play all the games I have (practically, anyway) with my 2x RX 580s, getting fucking great performance... with more of a pain setting up each game, but that isn't a deal-breaker for me.
The similarity between Jay and Bill Burr is that their ads are always fun to watch/listen to. Everyone should be advertising this way. Top notch, Jay!
I don't like any in-video ads except Jay's. He pours tons of effort into them and makes them fun. Nice job, dude.
Dude, say what you want about sponsored videos, but god damn Jay commits!!! XD
It's called "going overboard", but I guess you could also call it "committing" as well...
100%, his ads are always shipshape
*gets sponsored* jay: LETS GO OVER THE TOP CRAZY TO GET THE SPONSOR WICKED SALES!!!!
JZ blows the competition out of the water
It’s called reading the script they send along with their money.
From the start, I think SLI and Crossfire were marketed as an upgrade option, but the benefit didn't justify the cost. It was better to just get a newer replacement card.
True, but multi-GPU was also the only way to go beyond the best possible performance of a single part. For example, the quad 1080 Ti SLI setup set a rasterization record that the current 4090 only barely beat, and it still had to be overclocked to do it. That's basically tech from almost a decade ago still managing to take home a win against the best of the modern world. If you were a person who truly wanted the absolute best, with absolutely no concerns outside of performance, then running multiple GPUs has been, and could still be, the only way to take it to the next level past just water cooling and overclocking current tech. There is definitely a place for SLI and Crossfire, even if it is just another niche like custom water cooling, delidding, sanding, liquid-metal thermal 'pasting', and liquid-nitrogen cooling.
Very creepy when a gpu vanishes off the table while you’re talking!
It's now on MY desk! Incredible. I should see Dr. Xavier
Here's your answer, Nathan. Acme's Disappearing, Re-appearing GPU! Boy, that Acme, what a genius!
8:25 9:35
@@battt1718 You are the hero we need, not the hero we deserved
Did you not catch the one at 12:32
*I LIKE BOATS!!!* 1:02
Crashes through the window, best advert ever XD
This explains why I do 4K gaming easily with my 2 1080 Ti SuperOCs; SLI still makes a difference at the 1080 Ti level.. which is why I did not go 2080 Ti; if I had, I would have gone SLI again. So now I figure the new 3080s or 3090s will NOT be SLI cards, for sure.
Hey Pete... I'm running a Gigabyte GTX 1080 Ti Xtreme Waterforce 11GB, contemplating adding a 'used' Gigabyte GTX 1080 Ti Founders Edition 11GB. My reasoning is... it is certainly NOT worth upgrading to a 2080 Ti, & the likely 30xx series of cards will be very pricey (even though desirable)... so for
PS. I'm running a Gigabyte X299 Aorus Gaming 7 Pro m/board with an Intel i7 7800X (o/c to 4.5GHz with a Corsair H60 cooler), a SilverStone 1000W Platinum PSU, and 32GB G.Skill Trident Z RGB 3600MHz (CAS 17), running on a 4K Samsung 8-series TV @ 60Hz.
-----
With my current CPU (28 PCIe lanes)... the 1st GPU would be running @ x16, whilst the 2nd would be @ x8. Once I have a 10+ core CPU (44 PCIe lanes), I could run both @ x16.
@@markhormann Hey Mark, here is my system: Asus ROG Maximus X Code, i7-8086K @ 5.2GHz (all cores) 1.32V,
G.Skill Trident Z RGB 32GB @ 3600, 2x Asus 1080 Ti 11GB Strix Gaming OC in SLI,
M.2 NVMe Samsung 960 EVO 1TB, Phanteks Enthoo Evolv ATX case, Corsair
H150i PRO, RMx 1000W 80+ Gold, Acer XB321HK.
I am gaming at 4K 60Hz; my next upgrade will be a 4K 144Hz monitor with the new 3080/90 Ti card, WHEN they come down in price.. Now to your question.
I play with all settings maxed. I play the following with zero issues:
Witcher 3, Division 1 and 2, Ghost Recon Breakpoint, Metro Exodus, the Dishonored series, all the Far Crys, Middle-earth: Shadow of War, Shadow of the Tomb Raider: The Path Home... plus many more, but this gives you an idea of what I play.. mostly single-player, but LOTS of logged multiplayer on D1, D2 and GR Breakpoint. I will say in some games my 2nd GPU does not show much load; in others it shows more.. I hope this helps you make your decision. I think you would be well served to do just as you were thinking, IMHO. I will go to a single GPU when I upgrade, as Jay suggests, but for the 1080 Tis, they are monsters at it.
8:29 I thought I was going crazy seeing the left most card suddenly disappear
It went into the void
it comes back at 9:39. its okay shhhh
Ghostly... wooowwowowooo
Yeah....wtf? :skep:
You are on drugs
I was gonna buy 2 RTX 2080s, but I used the money as a down payment on a new car instead.
slipknot2k4 That's def more worth it tho, I applaud your supreme self-control.
Either get a 2080 Ti or an RX 5700 XT; the XT performs too close to the RTX 2080 to justify the extra $300+.
@@thetrueinferno7993 no raytracing tho
SamDJayRob Gaming honestly who cares
@@daddytachanka8076 I do ;) Although I hope next-generation GPUs allow better FPS when RT is enabled.
5 years ago I built my first and last SLI system. It was not worth the cost for what I got out of it. Better to buy the best single card you can afford.
Funny, I did two 980s in 2015 as well.... not worth it. At least I got a good deal on the price.
Same here man I had 2 980
Hi! I was just about to build a 2x 980 Ti setup. Can you please tell me why I shouldn't? No game-dev support, so one card just stays idle? Thanks!
Need more DX12 multi-GPU... the scaling in games that support it (Ashes, Deus Ex) was beautiful.
that gdi logo on your profile though! tiberian sun?
I think it would make sense to put the decision in the hands of the game developers, so they can test it all out and verify whether SLI or DX12 multi-GPU gives more performance. Then the devs can get in touch with the driver developers and tell them to disable SLI for their title if, e.g., DX12 is available. That way the devs have it in their hands and can maybe adjust some code to utilize the hardware even better.
@Brad Viviviyal omfg I think you missed the point of this video: newer games will not have multi-GPU support, so no matter what you say, dual GPUs are a waste of money, and IF you spend $2400+ on 2x RTX 2080 Ti you clearly have more money than brains.
winterkills Tiberian Sun.
Didn't even know the new AMD GPUs don't support Crossfire. That's actually kinda disappointing, esp with their PCIe 4 bandwidth potential.
So few game developers code for it in their games, and so few people have SLI/CF-capable systems, that developers have to prioritize single-card code from a budget perspective. Why waste budget on SLI/CF code when the single-GPU install base is a much larger market?
They also don't support Fluid Motion. Useless~
They also don't support Windows 98 or DirectX 7.
What a shame, how will I use my GPU on my new Windows 98 PC?
This is how you sound
Less than .01% of people actually use it, so even if it were perfect, still nobody would use it.
@needabettername
To be fair there, when dual GPU was reasonably viable the general consensus/word of advice was still to avoid it and get one really good GPU.
Even with perfect scaling you wouldn't have a situation where the opposite advice made sense, simply because it wouldn't make business sense to allow that to happen.
Say it was 50% better and 50% more expensive; it would be stuck between the cheaper single card and the more expensive one.
I think multi-GPU setups could still be useful in non-gaming applications. It would be a shame if they stopped supporting them completely.
which kind of applications?
@@dergunter1237 Scientists use it for computations, like physics simulations or fluid dynamics for example.
AI training
I was thinking twin GPUs would be making a comeback with VR: L & R, one card for each eye. RIP!
Oh god no. If those get desynced and one isn't rendering at the speed of the other, you will probably get instantly sick.
@@Rararawr Sounds like a problem, except it isn't, since most of the workload in VR comes from effectively having to render at double the framerate. A single frame of VR is actually quite easy to run; it's just that 90Hz effectively requires 180fps from the GPU. Two GPUs would make sync trivially easy, since each GPU would then have a much smaller workload than in a typical game.
Was just about to say this but unfortunately no big VR developers seem to care about multi GPU support.
I do wonder how well this would work with a Ryzen 3900 or 3950X. Would it be possible to split the load so each chiplet handles an individual GPU, allowing it to work essentially as two separate systems? As the images they are rendering are pretty much the same, I would expect relatively equal frame rates; then just scale both displays to the slower of the two frame rates. I suspect this approach would work pretty well, because it would allow excellent single-threaded performance when necessary, as well as plenty of cores for each GPU in multi-threaded applications. Also, it might help with Windows scheduling problems. Are those still a thing?
@@Rararawr But they don't get out of sync, because the image is still transferred over 1 cable to the HMD. The image from the second GPU needs to be copied over to the main GPU first anyway. And having developed software to support it, I can tell you that it works flawlessly.
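In rough pseudocode, the split this thread describes looks something like the sketch below. Every type and helper here is invented for illustration (a real implementation would sit on D3D12 cross-adapter copies or a vendor VR-SLI extension):

```cpp
#include <cstdio>

enum class Eye { Left, Right };
struct Pose     { float x, y, z; };          // stub head pose
struct EyeImage { int gpuIndex; Eye eye; };  // stub render target handle

struct Gpu {
    int index;
    EyeImage RenderEye(Eye e, Pose) { return { index, e }; }  // stub render
    EyeImage CopyFrom(const Gpu&, EyeImage img)
    {
        img.gpuIndex = index;  // the image now lives on this GPU
        return img;
    }
};

struct Hmd {
    Pose PredictedPose() const { return {}; }
    void Submit(const EyeImage& l, const EyeImage& r)
    {
        // Both eyes leave from the same GPU over the single HMD cable,
        // which is why the two cards cannot visibly desync.
        std::printf("present: left from GPU %d, right from GPU %d\n",
                    l.gpuIndex, r.gpuIndex);
    }
};

int main()
{
    Gpu gpu0{0}, gpu1{1};
    Hmd hmd;

    Pose pose = hmd.PredictedPose();                    // one pose, both eyes
    EyeImage left  = gpu0.RenderEye(Eye::Left,  pose);  // GPU 0: left eye
    EyeImage right = gpu1.RenderEye(Eye::Right, pose);  // GPU 1: right eye

    EyeImage rightOnGpu0 = gpu0.CopyFrom(gpu1, right);  // cross-GPU transfer
    hmd.Submit(left, rightOnGpu0);                      // composited scanout
}
```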
God damn it. The ad is so annoying but I laugh every time. Well played Jay.
Usually ads are annoying, but this one is cringy instead. It also looks like they annoyed him with ad requirements until he snapped.
It's hilarious. I never skip it lmao
The worst part about all of it is now that we’re getting to PCIe GEN5 and hardware that could actually handle two or three 3080s or 3090s, they’re not putting in the effort to make it work anymore
SLI and Crossfire might be dead, but dual-GPU hasn't changed: gaming on the primary (good) card while a second card runs utility monitors allows you to multitask without losing frames.
Idk what you're doing on your secondary displays that it hits performance, but my best guess is playing a second game on them or watching some VERY high-res video with something that is GPU-accelerated.
@@taiiat0 Well, I assume these days it's a memory-limitation problem, but I have had problems with certain programs causing lag spikes without maxing out any stats. I'm running 3 monitors: the 2 main monitors are mostly for gaming (running 2 accounts), and the third is for everything else, running through a cheap GPU, and that has worked really well for me so far. Currently waiting for the high-end Navis, as I need to upgrade the whole system.
We might still see explicit multi-GPU support in applications, but developers will need more time to implement it. And unlike with RTX / PhysX, Nvidia isn't going to be sending people out to help developers accomplish that..
100 MEGA HERTZ panel , (guy in the background) -> DAYYUUUUMMMMMMMMM.
This is why I watch.
Phil. His name is Phil.
God Damn!!!!......Rick: Thanks Noob Noob
Wow. Voodoo cards. I haven't heard that name in a long ass time.
Remember the All In Wonder cards? Those were my fav way back when. I used to route cable, VCR, and even my NES through my PC. Was DVR'ing long before that was even a thing. Good times.
@@francischambless5919 Yep. I had a 9700 Pro from ATI and a Voodoo 3 3000 lol. Those were the days! Except the remote control never worked for me.
I have no idea what you guys are talking about :( lol
@@SangheiliSpecOp Old-ass graphics card manufacturers. AMD bought ATI, and I think Nvidia bought 3dfx. 3dfx made Voodoo cards, and ATI made cards like the 9700 Pro. Voodoo cards were like the first big push for 3D graphics back in the day.
@@drumyogi9281 I can recall both as well. There were also PhysX cards, I vaguely remember. I also have a question for you, if I may: what do you think the GPU market would be like today if 3dfx and its Voodoo cards still existed? I mean, thinking about the GPU possibilities and what could have been on the market today.
Damn. There goes the super sexy aesthetic of two cards. I 'member back when two 980 Tis in SLI was amazing, especially if they were super sexy EVGA Classifieds.
RIP sexy aesthetic.
2x 8800GTX :D
Jesper Andersen Or triple 8800 Ultras being the big-boy setup for a time, and before that dual 7950 GX2s... I still have the cards, but they don't work at all anymore.
I have two identical pineapples running in dual fruit mode, 999999999999999^9999999 FPS
Maybe companies will make blank GPUs like they do with RGB RAM now. AEsthetic
@@ResilientME You could just buy borked ones on Ebay and slap 'em in. Might have to disable that PCIe slot, though, I dunno.
Multi-GPU support is at least temporarily dead for two good reasons.
1) Rendering and optimization techniques that depend on data available between and within a frame. If you need data from the previous frame, you cannot render in parallel; if you rely on data within a frame, you need to do those computations on both cards.
2) Lower-level APIs. DX12 and Vulkan define things at such a low level that the driver cannot do multi-GPU for you, so each game would have to program its own version of multi-GPU support instead of relying on the driver to do the hard work.
Now for the temporary part. Real-time ray tracing has been done on clusters of cards across clusters of computers. Ray tracing doesn't have data dependencies between rays, so in theory a fully ray-traced game could use any number of cards, and the number of rays per frame would scale perfectly. I don't say FPS, since the setup and final output would still be handled by a single card, but the extra cards could allow more to be rendered.
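To make the "no data dependencies between rays" point concrete, here is a toy sketch (the trace() function, resolution, and worker count are illustrative assumptions, not a real renderer): rays shard across workers standing in for GPUs with no communication between them, and only the final composite touches all the data.

```python
# Toy sketch: rays are independent, so a frame's rays can be split across
# N workers (stand-ins for GPUs) that never talk to each other.
from concurrent.futures import ProcessPoolExecutor

NUM_WORKERS = 2
WIDTH, HEIGHT = 640, 360

def trace(ray_ids):
    # Placeholder: each ray is shaded independently of every other ray.
    return [ray_id % 256 for ray_id in ray_ids]

if __name__ == "__main__":
    rays = list(range(WIDTH * HEIGHT))
    shards = [rays[i::NUM_WORKERS] for i in range(NUM_WORKERS)]
    with ProcessPoolExecutor(max_workers=NUM_WORKERS) as pool:
        results = list(pool.map(trace, shards))
    # Only the final composite needs everything, matching the note that a
    # single card still handles setup and output.
    framebuffer = [0] * (WIDTH * HEIGHT)
    for shard, values in zip(shards, results):
        for ray_id, value in zip(shard, values):
            framebuffer[ray_id] = value
```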
So realistically, Nvidia shouldn't have released a card that is both your GPU and your ray tracing, but a card that links to your main GPU and does NOTHING BUT the ray tracing. I think that would have gone over better and would have given a legitimate reason for SLI and such: ray tracing on a single card isn't possible, so here is the RTX 2010; use your 1080 Ti for the frame rate, and the 2010 handles the ray tracing. It also would have meant a more "reasonable" price for the ray-tracing card.
Practical ray tracing requires CUDA cores in addition to RT cores, and they need ultra-high-bandwidth communication, like being on the same die. The performance cost of ray tracing isn't the ray tracing itself; it is the shaders that process what is ray traced. A traditional renderer benefits from adjacent pixels doing similar things and touching nearby data, but with ray tracing that locality is broken.
Currently the main problem with ray tracing is on the software side. Turing is an extremely powerful graphics engine, and only a tiny fraction of what is possible has been utilized, in order to reuse more of the code that was written targeting Pascal cards.
Anyway, I just bought a 2070 Super and could soon test programming it in practice. However, my current project doesn't use RT cores and is mostly about integer compute. And the 1660 Ti has higher peak integer performance than the 1080 Ti. openbenchmarking.org/showdown/pts/clpeak
Turing has many cost-effective improvements that all require software completely different from what was written for Pascal to be fully utilized. My favorite graphics feature would allow Turing to run games at 144 fps even with all the ray-tracing effects on. But it is one new feature that Nvidia warns developers about, because of how much work it is to use. And we have seen how long it has taken developers to implement the "IT JUST WORKS" easy stuff :D
How to be smart like you guys
@@jouniosmala9921 Let me see if I got this right. The issue is not that RT cores aren't fast enough. RT cores trace rays more than fast enough for viable real time ray tracing. The issue is that doing something with those rays once they are traced requires traditional shaders, which have always been built to rely on predictable data usage. The shading performed when ray tracing requires a lot of random data access (meaning completely unpredictable), which neither software nor hardware is able to compensate for effectively. The performance hit from ray tracing is basically because GPU shaders spend a lot of time doing nothing while waiting on data. There is a lot of performance left over, but it is limited by latency.
Simply put, the actual process of tracing rays is easy for RTX GPUs, but doing something useful with them causes a GPU to spend a lot of time idling while waiting on data since GPUs are built with a focus on high bandwidth and not low latency. This is something game developers and hardware manufacturers both need to figure out. I think AMD's RDNA architecture has a caching system designed to help solve this problem. Hopefully it won't be too long before we start seeing ray tracing capable GPUs from them, because I think they will be able to handle it very well. It certainly makes sense to me since next gen consoles are going to be ray tracing with Navi.
Real time ray tracing is still in its infancy, and there is a lot of work left to be done to optimize it. Everybody seems to love hating on ray tracing, but it looks gorgeous and the performance is there. It just isn't easily accessible performance. Even as it is ray tracing isn't awful, but I do hope to see it improve. I have high hopes for the future of ray tracing in games, regardless of what people think right now.
1) Could something like frame interleaving solve this? In fact, that is where SLI originally got its name, Scan-Line Interleave, before Nvidia bought 3dfx and changed it to "Scalable Link Interface". I see the problem you describe as a problem with the AFR scheme.
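Purely as an illustration of the two schemes (a toy sketch, not real driver behavior): scan-line interleave divides one frame between the cards, while AFR alternates whole frames, which is exactly where the previous-frame dependency bites.

```python
# Scan-Line Interleave: even lines -> GPU 0, odd lines -> GPU 1.
# Both GPUs work on the SAME frame, so no previous-frame dependency.
HEIGHT = 8
sli_assignment = {line: line % 2 for line in range(HEIGHT)}

# Alternate Frame Rendering: whole frame N -> GPU N % 2.
# GPU 1's frame may need data produced while rendering the frame before it.
afr_assignment = {frame: frame % 2 for frame in range(4)}

print("Scan-line interleave (line -> GPU):", sli_assignment)
print("AFR (frame -> GPU):", afr_assignment)
```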
For gaming no, for GPU compute tasks, rendering, etc., very useful if your software can utilize both cards.
Outside the gaming realm, multi-GPU is a thing. Especially, in my case, when it comes to rendering-oriented setups. Blender + CUDA, for example.
Deep learning also
@romaneeconti02 He never said that it was. I think we all agree that SLI is dead. The video above was about multi-GPU, not only SLI.
Damn, that ad is freaking awesome. Capt Stabbin just Kool-Aid Mans through the window to throw down the game about sinkin' yo b......ships.
8:28 - 9:38: see the magic disappearing graphics card trick.
Omg Jay was getting so emotional over the death of SLI. RIP dual cards.
RIP SLI, you were barely old enough to drink...
Not quite that old; the first SLI Voodoo cards came out in the late '90s.
Brad Viviviyal legally Drink WHERE?
If SLI was born in the UK, then he would have already been allowed to drink for 3 years now. Legal drinking age is 18 in the UK.
I mean if SLI had kids. What, kids at 13? Not old enough to drink
I had Voodoo 2 12MB cards in SLI back in the day... Quake 2 at 100fps.
But what about VR gaming and streaming on the same computer?
Or 1440p to 4K and 144 Hz rendering?
Edit: What if you want both RTX and a good frame rate?
GD FL3X then you are wrong
You can stream VR pretty well on a single high-end graphics card.
Rendering is pretty CPU dependent, and DLSS seems to be Nvidia's route for good fps with DXR.
For streaming you don't really even need a GPU, just a good enough CPU.
Warships sponsorship? THE BISMARCK AND THE KRIEGSMARINE.
He was made to rule the waves across the seven seas
To lead the war machine
To rule the waves
And lead the kriegsmarine
WG has eternal hate for the Kriegsmarine. Every German ship is gimped.
I thought it was either KantCol or Azur Lane ...
At least he didn't highlight the Derpitz...
2x HD 4890s, 1GB @ 1GHz+ each, in Crossfire! GOOD TIMES!!!
I remember my dual XFX HD 6950s.
Didn't even need to turn on the heater in the winter.
Jason U My two 980s are a pain in the ass in summer; they get the room up into the 40s with ease. But in winter they are great: turning on the PC and starting any game warms the room up quite nicely! :D
@@DrakkarCalethiel Nowadays take two Vega 64s at stock settings :D Nice and cozy.
No, the 4890 was a single GPU card. It was a weird card that came out late and it was replaced by the 5870 after 5 to 6 months.
@@wesw9586 - Nope, the 4870X2 was. The 4890 was a refreshed 4870.
Jay: Don't get two graphics cards to run in SLI
Me: I wonder if I can set up four 690s in SLI to essentially have eight GPUs running at once.
Everyone has known for a long time that SLI and Crossfire don't work with modern games, since they're all ported from consoles, so how the hell would you SLI them? The last game that worked well with SLI was Crysis, and with the remaster they gave up on it... I don't know of any game that does SLI anymore.
The ultimate Minecraft gaming PC.
Precisely why I will be running my dual GTX 1080 SC's into the ground before I upgrade again.
Still using my laptop's GTX 1080s, no need to upgrade... and fuck RTX.
Yep, same here. My dual watercooled 1080s still just shred through everything at 1440p ultrawide. I see no reason to upgrade any time soon, especially since RTX itself is mostly a gimmick for now.
Yep, I bought two FTW liquid-cooled 1080 Ti Hybrids for my next build, which will probably last at least until the 4080 Ti.
Bought my 1060 card. I never play games that need RTX.
I would love to get the 1080 Ti, but because of how badly the RTX cards have been dying, the 1080 Ti costs just as much as, and sometimes more than, the 2080 Ti.
I remember Jay saying he wanted to get into film production; you can definitely tell by that ad. He'd be great.
My question would be: a year later, given the continued slow adoption of 4K, do you still see the potential for pixel count to exceed graphical processing power again?
Unless you're building a literal home theater, 8K doesn't improve visual fidelity even at below-standard viewing distances. For a monitor, despite sitting at point-blank range, you'd need something nearing the size of a living-room TV for most people to notice pixelation. We can increase pixel counts nearly forever, but we're nearing the point of being beyond human perception. The same could be said for framerates, but there is still plenty of room for growth at the top end.
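Rough numbers behind that (a back-of-the-envelope sketch: the ~60 pixels-per-degree figure is a commonly cited approximation of 20/20 acuity, and the monitor size and viewing distance are assumptions):

```python
# Pixels per degree (PPD) of a screen at a given viewing distance.
# ~60 PPD is a commonly cited threshold for 20/20 visual acuity.
import math

def ppd(h_pixels: int, screen_width_m: float, distance_m: float) -> float:
    # Horizontal field of view the screen subtends, in degrees.
    fov_deg = math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov_deg

# Assumed setup: a 27" 16:9 monitor (~0.60 m wide) viewed from 0.6 m.
for name, h_px in [("1440p", 2560), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: {ppd(h_px, 0.60, 0.6):.0f} PPD")
```

At those assumed numbers, 4K already lands above the ~60 PPD mark, which is why 8K buys so little at desk distances.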
RIP SLI. Not-RIP my bank account
Wait till you hear that single top-end GPUs are up to €1.3k now. You used to get a good 4-way SLI setup for that.
Blame publishers for rushing devs to get games out, leaving Xfire and SLI as the last priority.
And its actual popularity: even a few years ago nobody was doing it; it was cheaper to buy one better card than 2 cheaper cards.
@@Tallnerdyguy Except when you're upgrading just the card. I got a 2nd GTX 1080 for $450 when the RTX cards were announced. I could have gotten it even cheaper if I'd bought second-hand. I get much better frames in most AAA titles than if I'd dropped almost twice that on an RTX. Personally, I think that's the real loss here.
Warm Soft Kitty 'cause it's more practical than buying the cards now.
Xfire is not short for CrossFire, the only thing that Xfire and Crossfire have in common is that they both are dead...
@Warm Soft Kitty agreed, and so do developers
I bought two GTX 580s way back in the day and struggled with them all the time. Driver support from game to game was a crapshoot, and micro-stuttering was usually an issue. The heat and power requirements shoot way up too. It would be more beneficial to save that extra dough for something like a G-Sync monitor that increases fluidity, imo.
I think dual-socket cards may make a comeback a la the 5970. AMD is moving back to scalable chiplet-style architectures.
Yes. Scalable on one PCB.
Vulkan...everybody doing eet.
Dude, that's the best ad I've seen all week! It's not often I share ads with others, but that was great. Well done! 10/10
There is a plus side to this. It could open up opportunities for motherboard manufacturers: possibly removing the extra x16 slots and adding different features. Maybe more M.2 or other accessories.
More M.2 would be awesome; I only have 2 and would like a couple more.
"No king rules forever, my son." -Terenas Menethil II
"Father... Is it over?" Sli enthusiasts
- Michael Scott
My first PC build was in 2011 with twin HD 6950s running triple 1080p.
Later on I had triple 7970s with triple 1440p.
I switched those out for triple water-cooled 660 Tis and found the driver support for multi-screen multi-GPU worth the performance hit.
Now I have a single 1070, which struggles on a single 1440p ultrawide.
The price of single top-tier cards is worse than ever too.
While 1080p gaming is easier to get into, high-resolution gaming has gone up in performance requirements faster than GPU power.
Now that it's getting more difficult to increase top-tier power, why are they simultaneously destroying the only other method of achieving it?
A single powerful GPU is the only real viable option to get more performance. Multiple weak GPUs are just not worth it in terms of price and power consumption.
BEST WORLD OF WARSHIPS AD EVER. Almost enough to make me want to play it. xD
you sunk my ship
@@raven4k998 the ship that got sunk was a bot.
an actual murderous PVE bot. I'd keep an eye on him if I worked there, they're programmed to kill even if they're not very good at it
It's fun, try it.
TBH it was kinda toast after 2016. I remember even DOOM having issues with SLI.
Siege, Battlefield, Witcher 3, and GTA 5 all did well with SLI, but having to disable it for games like Doom got really annoying. Might get a 2060 next week, or a nice 1660 Ti.
@@garrettzkool63 Doom in Vulkan works really well with multi-GPU.
Remember the broken promises of DX12? We were promised driverless Multi GPU support
No we were not lol
@@everythingpony 3:56
DX12 supports it, but devs didn't implement it. However, if the next gold standard is 8K, they may start supporting it, as the way DX12 supports more than 1 GPU was massively better. Though SLI would raise frame rates rather well, it didn't really lower lag as much as just getting a better card. Why? Because each GPU starts rendering its frame before the GPU ahead of it has even posted its frame. Though this raises the frame rate, nothing beats a single card that renders each frame faster, all else being equal. So 2 960s may have made you feel good because you could hit 60/144 fps, but a 980 doing it with one card was just better.
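Toy numbers for that latency argument (the frame times are assumptions chosen only to illustrate the commenter's point):

```python
# Why AFR raises fps without cutting per-frame latency (illustrative only).
single_gpu_frame_ms = 16.7   # one mid-range GPU: ~60 fps

# A single GPU that renders twice as fast: fps doubles AND latency halves.
fast_gpu_frame_ms = single_gpu_frame_ms / 2
fast_gpu_fps = 1000 / fast_gpu_frame_ms       # ~120 fps, ~8.3 ms/frame

# Two mid-range GPUs in AFR: frames alternate, so output rate doubles...
afr_fps = 2 * (1000 / single_gpu_frame_ms)    # ~120 fps
afr_latency_ms = single_gpu_frame_ms          # ...but each frame still took
                                              # the full 16.7 ms to render.

print(f"Fast single GPU: {fast_gpu_fps:.0f} fps, {fast_gpu_frame_ms:.1f} ms/frame")
print(f"AFR dual GPU:    {afr_fps:.0f} fps, {afr_latency_ms:.1f} ms/frame")
```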
So the only place it's worth running SLI would be in Free Build Mode in "PC Building Simulator" for the in-game 3DMark score. IRL you get no benefit, except for live streaming, like Jay said at the end.
Really like your sponsorship ads; they're unique and better to watch than some scripted, forced thing.
He should get paid extra for it. :D
With AMD using Infinity Fabric, I think it's only a matter of time before 2 GPUs work in tandem but show up as 1 GPU, just like they have 2 chiplets on Infinity Fabric showing up as 1 CPU.
It may ONLY work with an AMD CPU, but interesting point.
@@Tallnerdyguy They can make it work with GPUs as well if they want. AMD has a dual-GPU card that's recognized as one.
@@scottyhaines4226 Infinity Fabric is driven by Ryzen and the chipset; you can't do it on an Intel motherboard.
This is basically what explicit multi-GPU does, and it is a place where AMD will be going. It would be ideal to simply present the multiple dies to the OS layer as a single GPU block and have the tech to allocate resources within the GPU, but the issue with GPUs is latency, more so than with CPUs, so I think a fair bit of work still needs to be done.
@@Sipheren For datacenters where they will be using dozens of GPUs, sure, but for mainstream, probably not, for 3 reasons. 1. AMD (and all GPU manufacturers) wants you to buy their newest and greatest, which people won't do if they can just buy another card from last generation. 2. AMD doesn't have the resources to implement the drivers and support. 3. People don't use it. Even when the 290X had 85%+ scaling, less than 0.1% of people actually did it, and AMD isn't really known for catering to a select few.
I still remember when 3dfx first used SLI with their Voodoo2 cards; it was such an unreal thing at the time. Then when Nvidia bought the tech from them it went bonkers. But I kind of knew it was dying; single cards are just becoming so powerful that there's literally no need anymore.
Unless you do medical imaging or something.
Crypto mining will happily slurp all GPU cards you have.
That ad lmao. Jay busting through the window
3 years ago I bought a motherboard with 4 PCI Express x16 slots with the intent of adding to my GTX 980. Guess the dream's dead.
I feel VR could be a lot more accepted all around, especially with casual players, if they could use two cheaper GPUs, like two 2060s or something... but I think because a lot of people assume you have to have a powerhouse PC to run VR well, they stray away from it over the expense. A lot of people I talk to think it takes a lot of PC power to do VR.
I built a modest rig for my buddy in 2018 with a 1050 Ti, an i5 4570 (I think?), and some 1866 MHz DDR3, lol. I come back two months later and this boy has a fucking Oculus Rift hooked up to it. I did him a favor and clocked the poor little bastard of a card as hard as it could go... but it was actually a decent experience. Lower settings on most games ran really well, medium on some. I was impressed. Bear in mind I spent like $450-500 on this computer, lol.
I'm still rocking an extreme multi-GPU setup: two Sapphire R9 295X2 8GB GPUs in my rig.
I'm still running a Voodoo 2 SLI.
@@fullmetaljacket7 I remember those, but I don't remember an SLI two-card config.
@Naomi Kitsune How is it for you? I'm tempted to buy two more Radeon Pro Duo 8GB cards (if I buy them I will have, yes, three of them) (stupidly, for gaming).
Damn, even I missed the 100 MHz panel, lmao.
Yeah, and he makes the next mistake with 100,000 FPS... but it would be 100,000,000 FPS :D
@@LiLBitsDK No need to overreact like the guy in the background did. All people make mistakes.
@@getsideways7257 that's what's so fun about it :D
@@LiLBitsDK A guy who has to talk a lot to keep his channel going gets a slip of the tongue, calling Hz MHz, and you find it extremely entertaining? I suppose a slip of the tongue has never happened to you? How old are you?
Get Sideways Slip-ups are funny; if no one was hurt during or after a slip-up, then it's fine. The only person overreacting is you, being over-defensive about someone having a friendly laugh at a slip-up. Hell, even Jay laughed because it was funny.
For games it doesn't make sense, but for rendering it can make a significant difference. You don't even have to join the cards; they can be two separate ones.
Well... I guess it's time I start looking for weird and different stuff I can put in my other PCIe slots then. Maybe a sound card and some extra Thunderbolt 3 port things...
If you want extra stuff, check out the "big Omen" video on the Unbox Therapy channel. That company put an Xbox, PS4, and Switch in a PC case along with the PC.
NVMe RAID card?
Or how about looking for an old PhysX card.
4:38 "[...]2070, 2070 Super and up, does support SLI."
Sorry to correct you Jay but the "normal" RTX 2070 never supported SLI.
Even 2070 Super doesn't support SLI. They support NVLink.
You know what he meant you idiots.
I've had both two-way Crossfire 4850s and 3-way SLI 980 Tis. The Crossfire worked great, but SLI barely made any difference. It is great, though, for when one of the video cards gets old and fails and you have a backup available.
^ Still using two cards, happily.
It's the only option when a single *whatever Ti* isn't enough.
Sometimes you just gotta drown your problems in cash. That said, I would like to see a return of "2 GPUs, 1 Cup" cards. My GTX 690 lasted me a hideously long time, and I never regretted it. I just had to wear hearing protection.
Me: Immediately looks at 2080ti SLI comparisons.
Yeah, I thought someone said the 2080 Ti was scaling BETTER than the last few generations.
I don't think Jay's point was the cards though, I think it's more that new games just aren't utilizing it
@Jesse M I think new games don't support SLI anymore since a single card is good enough.
Using 2 2080 Tis right now; at 1440p I'm reaching 180 fps in GTA V on ultra.
Should mention it's running on 3 monitors too, which fucks up the fps a bit.
It's hard to believe, but GTA V was released in 2013. It still looks amazing.
NEVER! I use older cards to play older games. It's 2022 now, and I get no better joy in this hobby than making it work. The tweaking, endless hours trying to find the hack, then playing the game for a couple of minutes only to watch it crash... but then finally it all comes together, rendering perfectly with higher frame rates. Honestly, it's not about playing in SLI; it's about getting it to SLI. Any game can be SLI'ed. Don't give up; just make the impossible possible.
BEST. SPONSOR AD. EVER!!!
(SO GOOD,... I didn't forward the vid to get past it!)
Most epic self-made ad for a game I've ever seen; totally gonna play it. I was on the fence anyway, since it already looked awesome. Thanks, Jay-z! lol :D
I just finished building an Intel 7700K, 1080 Ti SLI setup with 1000W of liquid-cooling power and a dash of RGB. I know it's not the newest hardware, but it's honest work. I built it over time, always wanting to achieve the dream of having a dual-GPU gaming PC, since it's like having a 2JZ-GTE with twin turbos! This video was really great. Also, just letting you know there are many of us out here still enjoying the nostalgia of an SLI'ed/Crossfired system along with you. I hope devs and GPU manufacturers keep the idea of a twin-turbo gaming PC alive in the future. Keep up the good work Jay!
Sorry for your loss. Words are never adequate. 🌹🌼🌷🌼🌹
I'm still running a pair of 970s in SLI... I feel like I should upgrade, but all the titles I play run great.
Stay with them if they work. I had a single 970 and went to a 1080ti
Got 760s in SLI; they still do good work, and I can feel and see the difference when I turn one of them off.
SLI saved me a lot of money back when the 700 series released.
I guess that's why we can't get proper SLI/NVLink/Crossfire today... because the companies want you to buy their new GPUs every 2 years.
Previous build had a pair of 970s. They played everything at a decent framerate, including the Vive VR headset, without giving me headaches. You'll know it's time to upgrade when you hit a game you can't play.
@@Rob-vl9ux Good point, Rob. I don't have much reason to upgrade other than wanting something newer... but then I see the prices and get turned off. Plus I have a 2-year-old who requires a lot of attention when she's not sleeping, so I don't get nearly as much game time as I did before she was born.
Manufacturers: OK guys, let's stop supporting dual-GPU setups, because we can't keep up with demand, so people will only buy 1 GPU...
Also manufacturers: Let's keep pushing higher-resolution screens and create really demanding, high-visual-quality games...
*mining has entered the chat
A single more powerful GPU is better than two less powerful GPUs.
I'll still always SLI!!! Been doing it since day one!! 1080 Ti SLI is still my champ!! Good vid Jay.
When you tested the RTX 2080 Ti MSI cards using NVLink, you were getting 50 to 60% increases in game benchmarks!!
Yeah, I'm not sure why he's suddenly forgotten that NVLink and SLI are different.
I read research about how we're nearing the end of chip shrinking because of the quantum tunneling that happens as electrons pass through transistors, so in the end it may come back.
Man, I can't afford one GPU let alone two! I've gotta live vicariously through your over the top builds until I win a giveaway 😂
The benefits of having a job and still living with your parents... I can buy whatever I want with nearly no expenses ;)
@@lytheus69 Not exactly worth living with your parents 😂😂😂
@@mc2hrs Really depends on your situation lol.. I'd love to save 1K a month..
@@lytheus69 I bet you have to hold the women back at gunpoint when you tell them your roommates names are mom and dad.
😂😂😂😂
So why are we paying extra $$$ for SLI and Crossfire support on new motherboards?
I don't know anyone who buys boards based on that criterion anymore. Most of the time, boards that support SLI or Crossfire are more expensive and usually just better overall, so some less financially bound customers might go for them. For the most part, though, I don't think anyone cares anymore.
In a world where SLI hadn't died and was supported, the 3090s would actually be 8K capable. But I guess nobody wanted to create support for that; they were more focused on supporting RTX and various other game features.
You could straight up disable multiGPU, and gamers would still claim it improves performance...
Jay: Play that sad walking away song from "The Incredible Hulk".
*Doo doo doo doooooooooooooooo...*
For a second I thought he was going to play the sad trombone "you lose" music from The Price Is Right... they still use that, right? th-cam.com/video/_asNhzXq72w/w-d-xo.html or the other sad trombone th-cam.com/video/yJxCdh1Ps48/w-d-xo.html
as he hitchhikes away, LOL
You can still take advantage of SLI if you do heavy AI / machine learning and write your own code...
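As a minimal sketch of what "write your own code" can look like (assuming PyTorch and two CUDA-visible cards; the tiny model and random data are placeholders), torch.nn.DataParallel splits each batch across the GPUs with no SLI bridge involved:

```python
# Sketch: data-parallel training across all visible CUDA GPUs in PyTorch.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

if torch.cuda.device_count() > 1:
    # Replicates the model on each GPU and scatters each batch across them.
    model = nn.DataParallel(model)
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# One fake training step on random data, just to show the loop shape.
x = torch.randn(256, 128, device=device)
y = torch.randint(0, 10, (256,), device=device)
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.4f}")
```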
I love multi-GPU for the cool factor. If you can afford it, awesome. But for 99% of scenarios, it's not worth it. I've got dual Radeon VII GPUs, and I don't regret it.
Also, Jay, AMD has said that their newer GPUs, starting with the Radeon VII, will not support legacy DX11 CrossFire and will only support explicit mGPU implementations in Vulkan and DX12. So these newer AMD GPUs only support multi-GPU explicitly, through DX12 and Vulkan. With newer games that use multi-GPU, you should still be good.
It's still bullshit, because there are barely any Vulkan and DX12 titles, so I say they should still support it by default.
BS. Useless tech tbh. Amd fanboy
Before I got my 2080 Ti I had a Radeon VII, and it was great, even before I upgraded to my Intel i9-9900K.
@@samgakgimbap9683 So I'm a fanboy for stating what AMD has officially said on the subject? And did I not say that for 99% of scenarios, you should NOT get 2 GPUs?
@@samgakgimbap9683 lol I wasn't defending the company, so obviously not an AMD fanboy, moron. Hahahaha, you made me laugh though. It's not useless, because I have a bunch of titles that support it, regardless of whether newer titles don't.