The other thing, Frog, is Radeon GPUs defo have a more colourful image, it's in their design. Bang4buckPCgamer confirmed this to me, so we're not the only two seeing the difference. I put The Witcher 3 on once I put the XTX back in & it defo is better for my eyes, same on Ratchet & Clank. It's like a more faded colour on Nvidia I find. I did everything to try and change that also but never truly got it to how I wanted tbh.
16GB 5080 is a 1440p max settings card, but only a non-max settings card at 4K ... because some AAA titles are already using >16GB VRAM at 4K. Thus, I won't be buying this card for 4K.
People need to understand that nVidia isn’t a gaming company anymore. What they make in GeForce sales is utterly dwarfed by datacenter and enterprise. Same goes for AMD’s and soon even Intel’s GPU division.
It is no longer 1:1 between what is generated and what is presented. We always had this with texture filtering and such, but DLSS, upscaling, motion interpolation, and other modern methods add new layers that make it very difficult to measure performance or even the resolution. (Parts of the image already had their own separate resolutions, like shadows or reflections, but this goes much further now.) I'm not sure we can do any more objective analysis. Some people will like smoother graphics, others like crisper ones. Some don't perceive lag, others are annoyed by it. It has become multiple levers of tradeoffs with no clear winner.
You make some good points. I don't think it is a 1 to 1 comparison anymore honestly. In my eyes AMD has a very different look and feel when it comes to visuals and performance that I greatly prefer.
@@Frogboyx1gaming Yes. In the past it was a strict API, and what you described would be what you got. The only difference was performance (or feature sets). Today, you basically tell your "intent", and the driver "optimizes" it for whatever priorities they have. The results are similar, but as we have all experienced, no longer the same.
Man, why in the hell has AMD failed to push its 7900xtx? I mean, a 24GB video card for less than $1,000. And believe me, it still has room for improvement, even for ray tracing!
The problem isn’t Nvidia or AMD. GPUs are plenty powerful, even the midrange stuff. The issue is these devs we have today seem to have less work ethic and don’t put in the work required to properly optimize a game. They just shove the game out and expect Nvidia and AMD to optimize it for them through drivers and AI upscaling. Devs need to start doing a better job.
Look, I despise Nvidia more than you know, but your claims don't hold water. I've been gaming at 4K Ultra settings exclusively for years now. Dozens and dozens of games, and there are only a handful of games which can't handle 4K on a high-end 16GB GPU regardless of brand, and those are mostly ray tracing heavy games. There are thousands and thousands of games out there, and if you judge a GPU by the 10 games which don't run smoothly, you're being ridiculous.
Frogman be like "Nvidia is charging way too much." PS5 Pro is 2x the price of a PS5 but not double the power, and Frogman be like "I love my PS5 Pro" 🤣🤣🤣 Why won't Frogman keep the same energy? 🤔
99% of tech youtubers miss the software part of it all. Like when The Last of Us came out, they started claiming it was lack of VRAM that was the trouble... when in reality it was the texture compression leaking memory and filling up the cards. And yes, the cards with more VRAM would last better... but no, it was 8GB of VRAM's fault.
Here is some food for thought that most people don't realize: SteamOS has better performance than Windows does with the same hardware configuration. I believe the biggest problem with most hardware not performing to expectations is how bloated the OS on gaming PCs is. Windows 10, and 11 even more so, is bloated with so much spyware it's ridiculous. This is the very reason Microsoft is actively developing a lighter OS specifically for gaming and handhelds. People don't think about this much, but it is definitely a factor.
im just glad we're finally getting a viable 3rd option with intel's new GPUs. they probably still need another year or two to really fine tune the compatibility stuff but they'll probably catch up to / beat NVIDIA and AMD in the coming years.
That's not how the different classes work. The 5080 will be a 4K class GPU, but not at max settings. Only the 5090 will be a maxed-out 4K class GPU.
5070 - 1440p medium-high settings class
5070 Ti - 1440p max settings / entry 4K class
5080 - 4K medium-high settings class
5090 - 4K max settings class
The 3090 was advertised as the first 8K GPU... In that case any xx80 class card should be for 4K with max settings. Nvidia is moving performance away from the xx60/xx70/xx80 class cards, creating a wider gap between that one top card and the slower cards in the lineup. In 2016 the 1060 had around 50% of the spec and performance of the 1080 Ti; in 2025 the 5080 will have around 50% of the spec of the top Nvidia GPU but 70% of the price... I have an RX 7800 XT with 16GB VRAM and right now I'm playing The Witcher 3 with high-res textures, everything set to Uber+, without RT, at 5120x1440p@60, and depending on the location this 10-year-old game can use 11-14GB of VRAM. This is OK for me; my card cost me around $500, not the $1200 or $1500 Nvidia wants to get... Also, at a time when ultrawide monitors are getting cheaper and cheaper, resolutions like 5120x1440 or 5120x2160 will become more popular, and in that case having 12GB or 16GB of VRAM in an $800-1600 card is like deliberate aging of the product. For me Nvidia is just greedy and it's just a scam.
it is a 4k gpu for previously released games... it's going to be 1440p for games after its release. remember, the 3080 was once a 4k card too, how long did it last? lmao...
The 4090 is a 1440p card. I use it on my main system and would never go to 4K because there already isn't enough FPS at 1440p. The 5090 will also be a 1440p card.
I watched Fabio's video too... The lower 1% lows will be because both of the AMD cards have more VRAM! There is a reason why it is often referred to as a "frame buffer"; they are not faking anything! 😂😂 For example, I run my 3060's VRAM 1GHz faster than stock, and the biggest difference I noticed was my 1% lows being higher, because with more speed comes more bandwidth! AMD has a bandwidth advantage due to MORE VRAM. You seem quick to ignore that in that video, in the games where this didn't matter, the 4080S outperformed the 7900XTX at 4K! A card you label 1440p! But it outperforms the 7900XTX in some games at 4K? Nvidia aren't faking anything, it is just a fact that both AMD cards have more VRAM! Again... "frame buffer!" Why don't we get videos on the AMD leaks? Why do we not get your thoughts on the obvious deception attempt by AMD with their new "Nvidia from Wish" naming scheme? Froggy, you being someone that has made both Xbox and PS content and seen the comments, you should understand what I mean by this... AMD fanboys are the GPU equivalent of PS fanboys, Nvidia users are the GPU equivalent of Xbox users. Yes, Xbox & Nvidia have "fanboys" too, but not in the same way... Most of the retaliation is from them hearing someone blatantly lie about a product to try and discredit the opposition instead of promoting the product they like. I never hear anyone bad-mouth AMD cards while reviewing Nvidia cards; in fact, more often than not, most offer a "cheaper" AMD alternative at the end. AMD reviews tend to turn into hate-on-Nvidia videos! And don't usually end with being offered an Nvidia card alternative. To me this shows that most tech channels push AMD first, but you say the opposite, even though we clearly watch some of the same channels. 🤔🤔😂
The 4080 Super is a decent 4K card but a phenomenal 1440p card. I sold my 4090 and bought a used 4080 Super for $600. I also picked up a 27" 1440p QD-OLED and removed my Samsung 4K IPS monitor. With that said, I enjoy the 1440p QD-OLED more than my 4K display. I think the overall experience is better with 1440p: visually on par, and FPS is better. 1440p OLED > 4K IPS
Most of the time it is Windows that causes problems, and all the additional programs that run in the background. You should really dig into it and optimize it. Afterburner, for example, can also cause severe stuttering. So many possibilities 😵‍💫
Frog, you stay with AMD, I'll stay with NVIDIA, we can't all like the same things. It's pointless to keep comparing. Just like cycling 🚴 I ride... a Giant Propel, a Scott Foil RC, and from Specialized the S-Works Tarmac...
If it isn't a 1440p card out of the gate, UE5 engine games will in time undoubtedly make it a 1440p card. It happened with UE4 before it and it will happen again, and since developers are incentivized by deadlines, they rely on upscalers like DLSS to cover their lack of optimization, further driving the business model of just upgrading every generation. It's so pervasive in the industry as more and more studios switch to UE5, leaving you as a consumer in an environment perfectly suited to punish you for not buying the highest-end card. Every card in Nvidia's stack serves to upsell you to the next card up.
One slightly possible reason might be (and I love AMD all day) that you're using a cohesive system, aka AMD cards on AMD Ryzen CPU systems, whereas most Nvidia owners are using an Intel-based Nvidia system, the graphics being Nvidia and the CPU being Intel, and therefore having synergy. But if you mixed an Nvidia card with an AMD CPU system, maybe it wouldn't work as smoothly in concert, as it's recommended for both card and CPU to match in a system (ReBAR and Smart Access Memory?). So to see what you're possibly seeing and feeling, the out-of-sync, possibly stuttery, jittery feeling and colour mismatch, would be for an Nvidia gamer to have an Nvidia card on a Ryzen-based system.
Nope, just in the last 3 days... FFXVI using 16.9GB while playing at 4K with FSR Quality. Tarkov at 4K was using around 19.5GB, which is maxing out my 7900XT and causing performance issues. This was after the wipe/update and on PVE, so that might be the issue also... 16GB at 4K will be an issue with some games already out and many games in 2025+. Sad thing is I wanted to finally go green after rocking an AMD 5700XT, 6800XT, and 7900XT in a row, but between only getting 16GB and the possible pricing, I think I'm gonna have to wait things out to see what both companies end up doing over the next year.
It won't be a 1440p card. GDDR7 will handle resolutions better and will lower VRAM usage at 4K. I think the 5080 won't age as well as the 1080 Ti, but it won't age as badly as the 4060.
When it comes out it will be a 4K card, but as time goes on and games get more demanding and require more VRAM, it will probably become a high-end 1440p card. I've got PCs with a 6900XT and a 7900XT and two with RTX 3090s, and all of them are 4K/60, 1440p/120 cards. I base my PC builds on the consoles; I just need my PCs to do what my console can't and I'm satisfied.
The 1% low margins are less than 5%. The goal posts keep moving 😂. The funny part is to see all these seals in the comments 😭🤣🤣🤣 People push back because where are your numbers, and numbers don't lie... You should take the L 😂
Well at least he tested it himself, how many other ppl here can say that. But Frog, I would have kept side-by-side footage of this so-called stuttering on the 4080 at 4K vs the 7900XTX; you can't just make comments about it and show no footage.
Yeah I definitely should've kept it. I will be getting a 5070, a 9070XT, and possibly the B770 this year to cover them over the next generation of PC gaming.
1% lows are important indeed, but I guess that's also related to the game optimization itself... we all know many if not all UE5 games stutter like crazy. Both GPUs are good; the 4080S costs more because it has better RT than AMD, everybody knows that... also DLSS. I've had both AMD and Nvidia and imo they both have their advantages. About the 5080, I would prefer it to be 16GB and not cost more than 1200-1300 rather than being 20GB or more and costing 1500-1600...
And about colors, AMD vs Nvidia GPU, I think this is true, AMD has slightly more saturated?! colors. I realised this years ago but I thought I was just crazy and imagining things.
@@perceptixn you're not crazy, I swapped my 6600 and 3060 Ti around quite a lot during the mining craze a few years ago, and the 6600 does have better colors and sharper textures on similar settings.
How about it be 1200-1300 AND be 20/24GB? This is not too much to ask. It is actually the minimum this very capable 4K silicon should be equipped with and if they have to reduce their profit margin to 50 or even 40% then they should.
My guy... why you always just making up false narratives? You mention Fabio's video and the 1% low data is pretty much identical between the 7900XTX and 4080S. 67.5fps for 7900XTX, 66.1fps for 4080S. 1fps difference is literally margin of error and not even worth mentioning. You'll harp on and on about it like it's some win for AMD but objective reality doesn't support that. That's just for raster average... slap on ray tracing and you already know the AMD cards 1% lows are going to be a joke.
I have watched that video and thought Frog would finally step back into reality. I was wrong. I love my Nitro+ 7900XTX, but does Frog realize RT is here to stay? I feel that Frog is in full denial.
@@Ireland-m5p LOL imagine playing Cyberpunk with only raster last gen graphics on a 4080S. Slap on path tracing and the 4080S wins by like 250%. You can see the Alan Wake II results in that video. You can also see the new UE5 engine games like Silent Hill II and Wukong struggle tremendously on the AMD side. Not a good look for the future. Also you apparently can’t read a graph so let me help you. I took the 4K average of 1% lows. It’s an absolute nothing burger.
NOT at all, they sell the least they can for as much as they can. The cut-down from the 4090 and the 5090 is insulting. The 3080/3080 Ti, those were 80-class cards, not the 4080 or the soon-to-be 5080; those are 70-class cards. 16GB of VRAM at 1200-1500 or whatever they are going to charge is a fxk you to gamers, that's it.
See, Fabio prefers AMD but he is straight down the line, he keeps it balanced & he's never wrong on anything; been watching his videos since I got my XTX. That's why he's at the top of the list of ppl I listen to. Quite often if I get issues, I'll ask his opinion just to double check, & he confirmed my issues with the 4080, I knew I wasn't going mad. Another reason why I'm going to get a 9800X3D or whatever is because he told me the 1% lows would be even better and give me smoother gaming. That appeals to me more atm than buying any of those overpriced GPUs.
@Frogboyx1gaming Yeah that's why I reckon I'll opt for the 9950X3D, although it uses an extra 50 watts apparently, but I need to wait for CES for confirmation. My 7950X3D uses the same power as a 7800X3D, so what I saw could just be an overstated TDP. I did find some games work a bit better with 16 cores over 8 with my 7950X3D & it makes me wonder if newer games will take advantage of it also. Anyway, both are neck and neck the majority of the time, both great; the 9800X3D or 9950X3D should be great.
Overall I'm 99% sure now I'm not buying Nvidia next year, because this experience with my 4080 Super has just further proved to me why I don't like the Nvidia experience as much as AMD. Gaming at 4K I feel much safer with AMD, and I'm not paying 2 grand for a 5090. AMD feel more reasonable with their flagship, and I defo prefer their drivers, how their software works, the VRAM. A new CPU with better 1% lows sounds good to me, & after watching some Bang4buckPCgamer videos he said it's much better with the XTX, same as Fabio said.
I like my VR as well, and my VR has been so much better with the 4080 Super, and that is why I changed to Nvidia: loads smoother VR, which is most of my gaming lately tbh. AMD is not great at VR.
To be completely honest his review was very bad. Current NVIDIA GPUs are the best at the highest resolutions. To be honest the RTX 4080 is already better at 4K ray tracing. The 4080 is factually better. You always have to crank the settings up to the highest when testing. You should watch Hardware Unboxed; they are far ahead in testing methodology. Fabio in my opinion is AMD-biased, not neutral. Hardware Unboxed just drop facts; they trash every company when they produce shit. Odyssey runs on my Samsung Super Ultrawide OLED (near-4K pixel count) with an RTX 4080 and 7800X3D at about 70-100 fps native.
Been saying it for months, the 1% lows are worse. Of course I had the usual pushback from "some" Nvidia fans, not all LOL, but finally someone like Fabio touched on it last night with a video & he really knows his stuff. I talked about how Stalker 2, Witcher 3 etc felt worse on my 4080 Super, but on my XTX it felt really smooth, especially once FG went on. That's why I've been saying DLSS FG feels crap; the thing with Nvidia is I feel nothing is as good as they say, it's just selling you something which feels fake. DLSS is really good, only it's not enough for me to buy Nvidia just for that tbh, & with FSR 4 on the horizon even less so. This is actually why I'm prepared to leave all the RT on the table & not use it if AMD cannot run it in certain games. I have learnt lots over the past 4 months comparing the two, most of it I already knew tbh about what I prefer. I've come to the ultimate conclusion now that it's more about raster, VRAM, not feeling restricted in using higher-resolution textures, and having the option to go 4K native in some games also. I know it's not for everyone, but RT doesn't pull me like it does others, I do much prefer AMD's software & want that extra VRAM gaming at 4K 🤷‍♂ Nvidia's tech is good when it works, but I've just been finding more issues than enjoyment.
Didn’t you say a few months ago you were getting the 5090 no matter what? I think you are just creating these videos on the chance it reduces Nvidia prices so you can keep buying them.
My thing is, it seems like you are way lenient on AMD’s prices; they are not much better than Nvidia’s. Better? Yes, but not really all that much, especially at the high end. The 7900xtx and 4080 are basically the same performance in raster, within the margin of error, but in ray tracing it’s not really close at all, so the price difference is pretty proportionate, not perfectly so but close. Plus the workstation stuff too (I don’t give a damn about that lol, but it’s there, and some people absolutely do). I absolutely do not think they deserve any praise for their prices either, especially high end; they get away with as much as they absolutely can... I do not expect the 9070 to be any better on price either.
@ if it hits that price with much better RT and AI upscaling that works well, that won't be completely awful. But my gut's telling me they ain't doing anything less than 700. I hope I'm wrong though.
I guess you just see what you wanna see. I saw the averages over the games at the end, and the 4080 on average raster was ahead at every resolution, and lows were within a few fps. He didn't answer the few times he was asked if SAM was being used. And he could be hitting the Afterburner issue that drops 1% lows on Nvidia that Terra just did a video on. His frame gen lows I'd need to see to understand. His graph made it look terrible, but I'm not sure if that's Afterburner reporting lows differently for Nvidia vs AMD. An example would be me running 80fps with no FG. Then turning on FG, my overlay will report my fps at 140 but my lows at the original 80 (prior to FG), with a very smooth frametime however. I assumed this was normal, but AMD FG appears to report more like real frames.
@Frogboyx1gaming I've no idea 😂.. My comments or reply wouldn't work on your channel for the last 4 hours so that's all I could assume....or your channel was messed up again..
Yes, you're being unfair. According to techpowerup my RTX4080S has a 95fps average at 4K native based on 25 games tested. With DLSS and FG I get 120fps even in the most demanding RT games. The RTX5080 will be even faster, and yet you want to tell people it will be a 1440p card :P. If you consider the RTX5080 a 1440p card, then the RX7900XTX should be considered a 480p card (even at 480p internally the 7900XTX can't run PT games smoothly). It's true, though, that Nvidia likes to gimp their cards. That way they can force people to upgrade their GPUs more frequently. For example, the RTX3070 is still a great card (in terms of performance), but due to VRAM limitations you have to reduce texture detail a lot to get a smooth experience. I knew 8GB VRAM wouldn't be enough in 2020, because even the PS5 had more memory, and you never buy a new GPU with less VRAM than a console at the start of a new generation. 12GB VRAM is however still enough, let alone 16GB. I play games with an OSD and usually see 9-12GB VRAM usage. There are a few games that can use more than 16GB with PT at 4K native with mods, but my RTX4080S can't run such extremely high settings even at 30fps. Your AMD card has 24GB, but with 4x worse PT performance you aren't going to play PT games at those settings anyway. VRAM requirements will skyrocket when developers start porting PS6 games to PC, and that won't happen any time soon (2028/2029). By the time the PS6 launches, both my 4080S and the RX7900XTX will be obsolete anyway, so 24GB on an AMD card is literally pointless outside of professional use. I have heard the RDNA4 RX9700 will "only" have 16GB too. It seems even AMD knows 24GB in a gaming GPU is overkill at the moment.
For the majority of games you will probably never see a difference, but for the games that actually matter the 4080 Super is 100% a 1440p card. Show me these games running with PT at 4K over 60fps, even using DLSS and frame generation.
@@Frogboyx1gaming PT is an experimental feature designed for FUTURE GPUs (PT was even called a technology preview in Cyberpunk). Thanks to DLSS technology however, I can play every single PT game at over 60fps (80-100fps) and still get 4K-like image quality. I know how to tweak the DLSS/reshade settings to get far better results even compared to standard DLAA/TAA, let alone image quality on the PS5Pro, so when I say I get 4K-like image quality I really mean it. Black Myth Wukong (BMW) is the most demanding PT game right now. Here are my built-in benchmark results, but keep in mind my RTX4080S is OC'ed (2925MHz core, 820GB/s memory bandwidth); a stock 4080S would get slightly worse results.
- 4K DLSS Q, very high settings, full RT (PT): 34fps. Not much, but console players sometimes play at 30fps. On PC, 30fps feels even smoother compared to consoles, because all games support LFC. 30fps games on PC are also more responsive, without VSync lag. With FG I would probably get PS5-like performance (around 60fps) while still having a better experience/input latency. DLSS FG works well even below 60fps (unlike FSR FG), and in this particular game FG has lower latency even compared to FG off (because FG activates NVIDIA Reflex and you cannot use this feature separately in BMW).
- 4K DLSS Q + FG, very high settings, medium RT (PT): 76fps
- 4K DLSS P + FG, very high settings, full RT (PT): 86fps. This is the way I prefer to play this game. With my DLSS/reshade settings I get 4K-like image quality and 80-100fps during gameplay. The game is perfectly smooth and responsive at these settings.
- 4K DLSS P + FG, very high settings, medium RT (PT): 103fps
- 1440p DLSS Q + FG, cinematic settings, full RT (PT): 103fps
- 1440p DLSS Q + FG, very high settings, full RT (PT): 112fps
- 1440p DLSS Q + FG, cinematic settings, medium RT (PT): 123fps
- 1440p DLSS Q + FG, cinematic settings, lumen (software RT): 127fps
At 1440p with medium RT I only get 3% worse performance compared to lumen (123fps vs 127fps), so on my PC it doesn't even make sense to play this game without PT. Try to play this game on a 7900XTX and even medium PT will destroy performance compared to lumen lighting. Cyberpunk runs even better. With Psycho RT I get 140-170fps at 1440p DLSS Q + FG, and 110-130fps with PT. At 4K I have to use DLSS Balanced to get smooth fps with Psycho RT (120fps). As for PT, I need to use DLSS Performance + FG to play at 80-100fps. I can't complain about performance in PT games, and the vast majority of games in my library run at 4K native 120fps. Of course, RT games are more demanding, but quite a few run at well over 60fps even at 4K native, for example Shadow of the Tomb Raider, Metro Exodus, Guardians of the Galaxy. Some, like RE3 Remake, run even at 130-200fps. Of course the most demanding UE5 or RT games will not run as well, but thanks to DLSS + FG, even these demanding games run at 4K-like image quality and over 100fps. For example, I get 45-50fps in Hellblade 2 at 4K native, but with DLSS Q + FG, 110-120fps. I'm 100% happy with the performance of my card even in PT games, but please ignore all that and tell me my RTX4080S is a 1440p card. I can say the same absurd thing about your 7900XTX and call it a 720p card, and I'm being generous because I doubt the 7900XTX can run PT games smoothly even at 720p internally.
Nope, the 5080 will be an excellent 1440p GPU, but 4K? Well, in 2 or 3 years maybe yes, but today, when games are extremely unoptimized, I would say 16GB of VRAM doesn't get you that far if you wanna activate DLAA, aka 4K native, and if you activate frame gen on top of that, even less VRAM is left. So yeah, it's a 1440p GPU, because most of the time when you play at 4K you activate DLSS; the same goes for the 4080 Super and the same it will be with the 5080. Like, don't get me wrong, new gen 5080 vs 4080 Super the difference isn't that big, and if Nvidia gave the 5080 around 20GB of VRAM all would be great, but since they won't do that anytime soon, we never know. But the price of even the 4080 Super is the price of a 4K GPU when realistically it is a 1440p GPU, and with the 5000 series this is gonna feel even worse because the price will be around 1200usd imo, maybe a bit more like 1300usd, and the 5090 will be out of hand, but that GPU is no longer for gamers, it's for AI bros, miners, and scalpers.
No, you are just wrong. 16GB is the recommended amount for a 4K card; anything above is really overkill and tbh not even future-proofing, 'cos the card will need upgrading before 24GB is needed. If that were the case, my 4080 Super would be a 1440p card lol. And news: it isn't.
Depends on the game, or if you are a sim player. In X-Plane 12 at 4K I am pushing max VRAM, and in MSFS 2020, in certain situations with 3rd-party aircraft like the FENIX and high-quality payware scenery, VRAM is almost gone on my 4080 Super.
It depends on the game. Some games will use more if more is available. I remember seeing a video that said the new Star Wars game will use up to 20GB of VRAM if you play on an RTX 3090 maxed out.
Easy to explain here: for 4K buy a 5090, for 1440p buy a 5070 Ti. The 5080 doesn't make any sense; it has the same amount of VRAM as the 5070 Ti and almost the same amount of CUDA cores. So it's either a 5090 or a 5070 Ti. I don't think the 5080 will sell that much unless they do a Ti or Super version with at least 20GB of VRAM and a few more cores. Other than that, I don't see this card doing well in the market, especially if the rumored price is real. I mean, of course some fanboys will buy this card for odd reasons, but it's definitely not the best choice of the 50 series.
It's all a matter of perspective. I once joked that the 4090 is an amazing 720p-low card if you don't plan on upgrading for 10+ years.
Why I (personally) chose the 7900xtx nitro+ over a 4080:
- 24GB VRAM, 384-bit bus for considerably better longevity (devs keep optimization effort low and games keep being poorly designed, and it's getting worse and worse)
- RT/PT is still a poorly developed feature Nvidia has tried to sell since the 20 series, and I'm generally not interested in most AAA games unless they are well made and fun (I enjoyed CP2077 and Hogwarts Legacy a lot, for example)
- I prefer native res, no upscaling and fake frames, for 100-180 fps at 4k/1440p in the games I play (BG3, Dead by Daylight, Helldivers 2, PoE2, Witcher 3, TLOU etc.)
- I want a safe, non-experimental power connector that doesn't melt (yes, 4080s melt too, just not as much as the 4090s)
- AMD Adrenalin is far better than Nvidia's control panel, by a mile
- DP 2.1 and not only 1.4a
- overall very reliable core (7000 series) and superb board quality (Sapphire nitro+)
- very good OC/UV potential
- no issues with multi-monitor setups
- potential switch to Linux when Win10 support ends
- the design and RGB are amazing
Those are my reasons for purchasing the 7900xtx nitro+. If you did proper research and the 4080 fits your personal needs better, I'm happy for you. Enjoy!
It seems that the RTX4080 / 4090 both have design flaws. The PCB can crack under the weight of the cooler, and there's also a problem with the connectors melting. Fortunately, Nvidia has fixed the connector problem. The RTX40 series cards (at least most 4080S models) have a redesigned 12V-2x6 power connector that only draws power when the cable is FULLY plugged into the connector. YT experts have tested this connector, and even if you bend the cable at this connector, nothing bad happens. My RTX4080S also has an anti-gravity plate, so the PCB cannot bend no matter what (but I still bought a support bracket just to be 100% sure).
The 7900XTX also has design flaws. Our good friend Frogboy has already replaced his XTX because it blew up. The 7900XTX's chip isn't perfectly flat, and that's a problem because it can only fully touch the cooler in the middle. As a result, the thermal paste dries out very quickly, and then the GPU chip can overheat. Not many experts are willing to repair 7900XTX chips because AMD has made it very difficult to reball these new RDNA3 chips.
As for your other "arguments", I will try to answer some of them later.
- The RX7900XTX has 24GB. Of course that's more compared to the RTX4080S's 16GB VRAM, but it's not something that makes a difference in current games. Most games use 9-12GB VRAM even at 4K. There are a few games that can use more than 16GB VRAM with PT + FG at 4K native, but my card can't even run such extremely high settings. I need to use DLSS to get smooth fps, and then my card is no longer VRAM limited. Your AMD card has 24GB, but with 4x worse performance in PT you aren't going to play the most VRAM-intensive PT games anyway.
VRAM requirements will skyrocket when developers start porting PS6 games to PC, and that won't happen any time soon (2028/2029). By the time the PS6 launches, both my 4080S and the RX7900XTX will probably be obsolete anyway. I have heard the RDNA4 RX9700 will "only" have 16GB too. It seems even AMD knows 24GB in a gaming GPU is overkill at the moment. I would be worried if my card only had 12GB VRAM, but 16GB is still plenty.
- RT is a poorly developed feature? Well, to be fair, some games do have poor RT implementations, but based on what I've tested, most RT games run great and RT does make a difference. When I first played The Witcher 3 with RT, I was shocked at how much better it looked. Lighting with RT was no longer flat, because RT adds indirect lighting / shadows. RT reflections look much sharper and no longer fade during movement. Also, thanks to RT, shadows no longer draw in a few metres in front of the character. It's impossible to ignore such a huge difference unless you really don't care about graphics at all.
In 2018, people could ignore RT because there were only a few games with RT. As of now however, there are over 100 games with RT support and we see new games with RT almost every week (and not just in AAA games, but indie as well).
- You said that you prefer to play at native resolution with real frames, but in raster both cards offer literally the same performance, so IDK why you even mentioned this argument in favor of your card. According to techpowerup's test, the RTX4080S averaged 95 fps across 25 games tested at 4K native, while the RX7900XTX averaged 96 fps. Because both cards offer similar/equal performance, I'm not losing anything, and I can't say the same about the AMD card, because you can't get the same performance in RT games. The RX7900XTX gets destroyed in RT, especially in PT games.
In Black Myth Wukong at 1440p DLSS Q + FG, I only lose 3% performance with medium PT compared to lumen (123fps vs 127fps). Try to run medium PT in BMW on your 7900XTX at 1440p FSR Quality + FG and you will get drastically worse performance compared to lumen (like 4x worse :P).
On my PC I can even play BMW with full RT (PT) and maxed-out settings. I get 103fps in the built-in benchmark at 1440p DLSS Q + FG, and that's still perfectly playable. At 4K I need to use DLSS Performance, but with the latest DLSS 3.8.1 and my reshade settings I still get 4K-like image quality (far better even compared to the PS5Pro version running in its balanced 40fps mode) and 86fps in the built-in benchmark (80-100fps during gameplay). That's still a perfectly playable experience. I doubt even RDNA4 GPUs will be able to run this game as smoothly as my card. Maybe RDNA5, because I read that AMD wants to rebuild their architecture from the ground up and go the Nvidia route (add AI and RT cores).
Playing Cyberpunk with Psycho RT at 140-170 fps at 1440p DLSS Q + FG is an amazing experience. I get 23ms latency at these settings. Even PT runs great at 110-130fps. I'm 100% happy with the performance of my card, and that would not be the case if I had bought an AMD card.
- As for RGB lighting, my card has it as well.
- When it comes to DLSS image reconstruction, it offers the same and often better image quality than native TAA. I tried playing RDR2 with native TAA. The image was extremely blurry, and I wouldn't want to play this game like that. With the latest 3.8.1 DLSS (it's very easy to swap the DLL files), I get MUCH better image quality even at the DLSS Performance setting compared to native TAA in RDR2. No sane person would want to play this game at native TAA when DLSS looks so much better.
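(For anyone curious, the DLL swap really is just a file replace. Here's a minimal Python sketch, assuming the game keeps its DLSS library as nvngx_dlss.dll under the install folder; the paths below are hypothetical, so adjust them and keep the backup:)

```python
import shutil
from pathlib import Path

# Hypothetical paths -- adjust for your setup. Most DLSS games ship the
# library as nvngx_dlss.dll somewhere under the install directory.
game_dir = Path(r"C:\Games\RDR2")
new_dll = Path(r"C:\Downloads\dlss_3.8.1\nvngx_dlss.dll")

for old_dll in game_dir.rglob("nvngx_dlss.dll"):
    backup = old_dll.with_suffix(".bak")
    if not backup.exists():
        shutil.copy2(old_dll, backup)  # keep the original in case of issues
    shutil.copy2(new_dll, old_dll)     # drop in the newer DLSS build
    print(f"replaced {old_dll}")
```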
What's more, I can also use DLSS + DLDSR (AI-powered downsampling that looks the same as DSR x4 but at half the performance cost). With this combination, even the blurriest games like Hellblade 2 look sharp, and I still get 3-5fps more compared to native TAA.
AMD cards can use FSR2, but this image reconstruction technology is definitely inferior to DLSS. In a static image FSR2 can look quite good, but in motion FSR2 has a shimmering problem. Perhaps the AI-powered FSR3 will look better, but for now I try not to use FSR even if a game doesn't support DLSS. In RE4 Remake, even 2880p FSR2 Quality was still shimmering when downsampled to 1440p.
Nvidia FG is also better than AMD FG. Nvidia FG works well even below 60fps, it has less noticeable input lag, and no motion judder. Most people who talk crap about DLSS FG haven't even used this technology on a VRR display. DLSS FG works extremely well and is a very useful technology, because performance can drop below 60fps from time to time even if the average framerate is much higher. Thanks to DLSS FG, I'm no longer worried about sub-60fps dips, because I always get an enjoyable experience.
Many people chose the 7900XTX over the 4080S simply because the AMD card is $100 cheaper, but that's not the case in the long run. I saw Bang4buckPCgamer's YT gameplay videos, and his 7900XTX at 99% GPU usage always draws 465W. My RTX4080S draws between 260-315W depending on the game (at 99% GPU usage), so that's about a 150W difference. My entire PC draws 430W at absolute max. 150W in use for 8 hours (480 minutes) is 1.20 kWh, or 438.00 kWh per year. I have to pay 452zł for that (109 USD at today's exchange rate). In my country, the 7900XTX would definitely end up costing more money in the long run, maybe not the first year because I'm not playing for 8 hours daily, but after 3 years that would be the case.
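(To make that electricity math checkable, here is the same calculation as a tiny Python sketch; the 150W delta and 8h/day are the commenter's own assumptions, and the ~1.03 zł/kWh rate is inferred from their 452 zł / 438 kWh figures:)

```python
# Reproducing the power-cost arithmetic from the comment above.
extra_watts = 150      # claimed 7900XTX vs 4080S draw difference
hours_per_day = 8      # assumed daily gaming time
kwh_per_year = extra_watts * hours_per_day / 1000 * 365  # 1.2 kWh/day -> 438 kWh
rate_pln_per_kwh = 452 / kwh_per_year                    # ~1.03 zł, inferred
print(f"{kwh_per_year:.0f} kWh/year -> {kwh_per_year * rate_pln_per_kwh:.0f} zł/year")
```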
If I were to buy an AMD card, I would wait for the RDNA4 GPUs, as they will offer AI-powered FSR, much better RT performance, and should be much cheaper than the RTX4080S. If AMD can deliver all this, then I might even recommend my sister build a PC with an AMD card over the RTX4080S.
It's pretty wild that in 2024/2025 you ONLY get 16gb of vram when spending over $1,000. Madness.
Pretty trash but let's see how their texture compression works. DLSS and FG have now been copied by the rest of the industry. Bet this is another feature that has the same trajectory.
@@Phil_529 I would rather see no textures used at all! Imagine, like ray tracing used to be: you set up physics-based materials, not a single texture in sight, and it looked more realistic. But it is also a lot more compute intensive. Or we go the Stable Diffusion way: generate a vector cloud that is decoded into a realistic picture. It takes a second to do on a GPU today, but with specialised hardware for this, it might be realtime. I am very interested to see what their neural rendering is anyway :)
@@Phil_529 frame gen is horrible though, way more visual glitches than DLSS upscaling, and also very noticeable input lag in some games, Indiana Jones being one recent example of it... frame gen really is hit or miss. In every game I tried, I could only stand it to the end in one game, and even in that game (God of War Ragnarok) I had annoying glitches during the campaign; in some areas I was forced to disable it completely, like when Atreus is with Heimdall his first time in Asgard. I thought Nvidia frame gen would be a huge improvement over AMD frame gen, since that was one of the reasons I upgraded to an RTX 4000 GPU; turns out I was wrong... with DLSS frame gen you go from 60 fps to the 90s, but the input lag feels like 40-50; it feels like you are playing via cloud sometimes, and it also adds an extra layer of blur to the image.
But but you will get DLSS 4.0 and next gen frame generation and all that crap 😒
It is madness indeed and those prices are bananas, and if Trump puts tariffs on, those prices will go even higher 😂
@@igorrafael7429 agree about frame generation, it is bad. But I like DLSS at the highest quality, cause it is like anti-aliasing on steroids.
FPS is not a measure of smoothness. It is a measure of performance. You can have high FPS but no smoothness. That's why Frame Time is so important.
1% lows
1% lows are derived from frame times.
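(For readers wondering how the two relate, here's a minimal Python sketch of one common way 1% lows are computed from frame times; note that exact definitions vary between tools, some use the 99th-percentile frame time instead:)

```python
# One common recipe for "1% lows": average the slowest 1% of frame times
# and express that as an FPS figure.
def one_percent_low_fps(frame_times_ms):
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                 # the worst 1% of all frames
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms                  # ms/frame -> FPS

# 99 frames at 10 ms plus a single 50 ms hitch: the average FPS still
# looks great (~96), but the 1% low drops to 20 and exposes the stutter.
frames = [10.0] * 99 + [50.0]
print(round(one_percent_low_fps(frames), 1))  # -> 20.0
```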
what do you mean? you get 50% more fps by turning dlss frame gen on, it's free performance, bro, you are a hater, get out.
@@igorrafael7429 Frame gen is designed to make an already playable experience better. Not to make an unplayable game playable.
FG is only as good as the information you give it.
@@igorrafael7429 you are very intelligent
There is one thing I found when I was testing the 4080 vs. the 7900s and 6900XT. The 4080 would have much higher frame rates when displaying black. Black, of course, in a game is the lack of information. I found this when testing Ori and the Will of the Wisps when the scene was almost empty (black), and Rift Breaker at night, where the detail is truncated. The AMD would jump to 150 fps while the NVIDIA would hit 600. This would throw off a counter.
Any card for $1000 and more in 2023/2024 with 16GB or less was a greedy move... in 2025 it will look like a scam...
There was no better alternative when I bought the RTX4080S. AMD cards are useless for my needs (poor RT performance, poor image quality, poor frame generation), so I wasn't even considering AMD, while the RTX4090 24GB was $1000 more expensive than my card while only offering 10-15fps more (that's like the difference between DLSS Quality vs Balanced). I would rather save that $1000 for my next upgrade than pay that much for the RTX4090 to see just a small uplift in performance.
I'm very happy with my purchase. There isn't a single (PT) game that wouldn't run great on my PC, and the price/performance ratio is also very good. According to the techpowerup benchmark, my card gets 95fps at 4K native based on 25 games tested (and my card is OC'ed to 2925MHz core and 820GB/s memory bandwidth, so it should reach the 100fps barrier). The $299 RTX4060 gets fewer fps (85fps) at 1080p. Getting 4x the resolution for 3.3x the price is a fair price to pay.
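(A quick Python sanity check of that resolution/price arithmetic, assuming the launch MSRPs of $999 for the 4080 Super and $299 for the 4060:)

```python
# Checking the "4x the resolution for 3.3x the price" claim above.
pixels_4k = 3840 * 2160
pixels_1080p = 1920 * 1080
print(pixels_4k / pixels_1080p)  # 4.0 -> 4K pushes 4x the pixels of 1080p
print(round(999 / 299, 2))       # 3.34 -> ~3.3x the price at MSRP
```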
It is a 1440p card because in order to get above 60fps in every unoptimized modern UE5 game we will have to use DLSS which renders at 1440p or lower.
That’s a good point!
@@matthuang4632 so is that the same for the 7900xtx?
@@Spratty-wb3is It can run games at 4K natively without ray tracing. But now almost all games have RT, so it's either 4K native without RT, or FSR (1440p) + FG with RT.
@matthuang4632 the XTX doesn't do 60fps in unoptimised games in the same way the 4080 doesn't
They both perform about 45fps.
They both need upscalers to hit above 60fps
If that makes the 4080 a 1440p gpu it also makes the 7900xtx a 1440p gpu.
RT isn't in the equation..
It's a 1440p card because Nvidia is artificially making it obsolete with this 16GB of VRAM, which isn't enough to allow gamers to max out texture settings at 4K.
I'm hearing of this AI texture compression thing, which Nvidia will use to upsell gamers on the idea that it's better than native-resolution textures. I can see the bullshit coming.
Let’s be honest, PC gaming has been giving a lot more diminishing returns when compared to consoles this gen. Consoles have done a better job at closing the gap for dollar/performance this gen. For most games playing on PC is redundant.
I agree with you. There are already games that suck up 18-19GB of VRAM at 4K ultra. They will make the 5080 cry.
The 80 model is what Nvidia offers for the 1440p experience. Anything less than that is a "you're too poor to afford more" model.
Time will tell. I have both AMD and Nvidia. To be truthful, my AMD card has caused me more issues. My games seem to stutter a little more,
some games straight up refused to run correctly, and using my VR headset was frustrating. I never had any issues with Nvidia. However, I will also never rule out either company for future purchases, as the price to performance from my AMD card is very good. At one point I upgraded a 3070 to a 7900 XT and had the same performance, so I sent it back.
The big problem with the next RTX 5080 / RX 9070 XT is 16GB of VRAM. For example, heavy ray tracing requires more than 16GB of VRAM in Indiana Jones.
Yup, shame that AMD gave the new mid-tier flagship only 16GB; that's when I decided to buy a 7900XTX for 4K gaming.
The big problem is maybe the 5080, which will cost over $1000. It's not a problem on the 9070 XT, which will be around $600-650 and marketed as a 1440p GPU.
@@roki977 Yes, I agree
People shouldn't take your videos seriously. Like, it's fun to watch every once in a while, your videos are funny. But you're definitely not someone I would take seriously when it comes to product advice or knowledge.
Did you even update your BIOS, btw? My 4080 Super had loads of issues like that until I updated my BIOS, which was from before the 4080 Super launched. Once I updated the BIOS, all the 1% lows were good and I had no issues.
I have an rtx 3080 10GB vram.
I play on a 3440x1440 34 inch monitor.
I have 140 games, and except for a very small number I play all of them at max settings (ray tracing off, I don't need that). Yes, the only games that have a bit more difficulty are the heavy ones like Cyberpunk; if necessary I turn on DLSS. But at the moment I play all games at max settings with my RTX 3080.
The point is that the RTX 5080 will certainly be playable on a 4K monitor and you will achieve over 60fps in many games (but not in all). But who can buy an RTX 5090? And which card is going to be better than the RTX 5080 for 4K at an affordable price?
The 4080 can play most games in 4K@60 with max settings. The 5080 will be the same but with higher FPS. What’s important is the price point.
Not to mention how well AMD can run emulated games, though most GPUs can handle emulation, plus high refresh rates for the multiplayer experience.
Also, Fabio had several games where there was little difference between the XT and the XTX. That must be a driver issue in those games.
Exactly
As soon as FSR4 matches DLSS, I am going AMD...
FSR is catching up to DLSS. A card with software + hardware upscaling will always do better. AMD's software approach is great, but it's done to sell cards at a cheaper price.
I hope FSR4 is available for 7000 cards.
I just bought my XFX 7900 GRE last November 😅
They are definitely overcharging us. But truthfully, there is additional value for some. Whether it's the lower power consumption, the roughly 30% better ray tracing, the upscaling, or all of the above. AMD is superb though. I welcome all competition; in the end it benefits us gamers. Hoping the new 9070 XT upscaling tech has made enough advancements to warrant Nvidia price drops. Someone needs to keep them on their toes.
Yes but the vram potentially undercuts the RT gains
The 5080 does look like it will be significantly inferior to a 4090 for 4K gaming.
Most of the NVidia features need more VRAM to work and the impact of insufficient VRAM doesn't always show up in short review sessions.
The 5080 might be a good 4k card if they do a super/refresh version with 3GB VRAM chips
A 24GB 5080 might be slightly better than a 4090 for 4K gaming.
If the readily available versions of the 5080 are 1500 bucks, and the 5070TI is available for 900, the 5070TI will be a lot more interesting as a 1440P/4K with DLSS card
The other thing, Frog, is Radeon GPUs defo have a more colourful image, it's in their design
Bang4buckPCgamer confirmed this to me so we're not the only two seeing the difference
I put The Witcher 3 on once I put the XTX back in, and it defo is better for my eyes; same on Ratchet & Clank. It's like a more faded colour on Nvidia I find. I did everything to try and change that but never truly got it to how I wanted tbh
I don't like Nvidia's color science. AMD is significantly more pleasant to look at.
@Frogboyx1gaming Yup
The 16GB 5080 is a 1440p max-settings card, but only a non-max-settings card at 4K... because some AAA titles are already using >16GB VRAM at 4K. Thus, I won't be buying this card for 4K.
I agree.
People need to understand that nVidia isn’t a gaming company anymore. What they make in GeForce sales is utterly dwarfed by datacenter and enterprise. Same goes for AMD’s and soon even Intel’s GPU division.
It is no longer 1:1 between what is generated and what is presented. We always had this with texture filtering and such, but DLSS, upscaling, motion interpolation, and other modern methods add new layers, which makes it very difficult to measure performance or even resolution. (Parts of the image already had their own resolutions, like shadows or reflections, but there is now much more of this.)
I'm not sure we can do any more objective analysis. Some people will like smoother graphics, others like crisper ones. Some don't perceive lag, others are annoyed by it. It has become multiple levers of tradeoffs with no clear winner.
You make some good points. I don't think it is a 1 to 1 comparison anymore honestly. In my eyes AMD has a very different look and feel when it comes to visuals and performance that I greatly prefer.
@@Frogboyx1gaming Yes. In the past it was a strict API, and what you described would be what you got. The only difference was performance (or feature sets).
Today, you basically tell your "intent", and the driver "optimizes" it for whatever priorities they have. The results are similar, but as we have all experienced, no longer the same.
Man, why in the hell has AMD failed to push its 7900xtx? I mean, a 24GB video card for less than $1,000. And believe me, it still has more room for improvement, even for ray tracing!
The problem isn’t Nvidia or AMD. GPU’s are plenty powerful, even the midrange stuff. The issue is these devs we have today seem to have less work ethic and don’t put in the work required to properly optimize a game. They just shove the game out and expect Nvidia and AMD to optimize it for them through drivers and AI upscaling. Devs need to start doing a better job.
I don’t think so. In fact keeping it real from a pc gamers perspective opens eyes to what Nvidia is doing.
Look, I despise Nvidia more than you know, but your claims don't hold water. I've been gaming at 4K Ultra settings exclusively for years now. Dozens and dozens of games, and there are only a handful of games which can't handle 4K on a high-end 16GB GPU regardless of brand, and those are mostly ray tracing heavy games. There are thousands and thousands of games out there, and if you judge a GPU by the 10 games which don't run smoothly, you're being ridiculous.
Frogman be like
Nvidia is charging way2much
Ps5pro is 2x the price of a ps5
But not double the power
Frogman b like
I love my ps5pro🤣🤣🤣
Y wont frogman keep the same energy?🤔
PS5 Pro 256-bit bus for $700. Literally unplayable.
😂
😂 i said the same thing in another video when he said the price of the new cards ain't worth it for what you get 😂😂😂😂
Even Cyberpunk was running better at 4k raster on 7900 xtx than the 4080 super in the Ancient vid
99% of tech youtubers miss the software part of it all. Like when The Last of Us came out, they started claiming it was lack of VRAM that was the trouble... when in reality it was the texture compression that was leaking memory and filling up the cards. And yes, the cards with more VRAM would last better... but no, it was 8GB VRAM's fault.
What game in the background?
Deus Ex.
@kalleanka2228 I don't think so. I was initially thinking a Roman-style game, then maybe something from the Assassin's Creed franchise
Assassin's Creed Odyssey
Here is some food for thought that most people don't realize: SteamOS has better performance than Windows with the same hardware configuration. I believe the biggest problem with most of the hardware out there not performing to expectations is how bloated the OS on gaming PCs is. Windows 10, and 11 even more so, are bloated with so much spyware it's ridiculous. This is the very same reason Microsoft is actively developing a lighter OS for gaming and handhelds specifically. People don't think about this much, but it is definitely a factor.
His earlier review showed a different result. He must have changed the stable of games.
I'm just glad we're finally getting a viable 3rd option with Intel's new GPUs. They probably still need another year or two to really fine-tune the compatibility stuff, but they'll probably catch up to / beat Nvidia and AMD in the coming years.
Not an NVIDIA fan but I have to admit GDDR7 sounds like a decent upgrade on paper. Curious how it will translate to game performance 🤔
That's not how the different classes work. The 5080 will be a 4K-class GPU, but not at max settings. Only the 5090 will be a maxed-out 4K-class GPU.
5070 - 1440p medium-high settings level class
5070 Ti - 1440p max settings / entry 4K level class
5080 - 4K medium-high settings level class
5090 - 4K max settings level class
3090 was advertised as first 8k GPU...
In that case any xx80 class card should be for 4k with max settings...
Nvidia is moving performance away from the xx60/xx70/xx80 class cards, creating a wider gap between the one top card and the slower cards in the lineup.
In 2016 the 1060 had around 50% of the spec and performance of the 1080 Ti; in 2025 the 5080 will have around 50% of the spec of the top Nvidia GPU but around 70% of its price...
I have an RX 7800 XT with 16GB of VRAM and right now I'm playing Witcher 3 with the high-res textures and everything set to UBER+ without RT at 5120x1440@60, and depending on the location this 10-year-old game can use 11-14GB of VRAM.
This is OK for me, my card cost me around $500, not the $1200 or $1500 Nvidia wants to get... Also, at a time when ultrawide monitors are getting cheaper and cheaper, resolutions like 5120x1440 or 5120x2160 will become more popular, and in that case having 12GB or 16GB in an $800-1600 card is like deliberate aging of the product. For me Nvidia is just greedy and it's just a scam.
@@WujoFefer Ultrawide has 7.3M pixels, so it's close to 4K, which is 8.3M pixels. Hardware isn't halving in price like it used to.
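The pixel counts behind those figures, spelled out (taking 5120x1440 as the "super ultrawide" resolution mentioned above):

```python
# Pixel counts behind the "7.3M vs 8.3M" comparison.
resolutions = {
    "1440p (2560x1440)":           2560 * 1440,
    "ultrawide (3440x1440)":       3440 * 1440,
    "super ultrawide (5120x1440)": 5120 * 1440,
    "4K (3840x2160)":              3840 * 2160,
}
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.1f}M pixels")
# super ultrawide ~7.4M vs 4K ~8.3M: close enough that the GPU load is comparable.
```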
It is a 4K GPU for previously released games... it's going to be a 1440p GPU for games released after it. Remember, the 3080 was once a 4K card too, how long did that last? lmao...
The 4090 is a 1440p card. I use it on my main system and would never go to 4K because there is already not enough FPS on 1440p. The 5090 will also be a 1440p card.
I watched Fabio's video too... The lower 1% lows will be because both of the AMD cards have more VRAM! There is a reason why it is often referred to as a "frame buffer"; they are not faking anything! 😂😂 For example, I run my 3060's VRAM 1GHz faster than stock, and the biggest difference I noticed was my 1% lows being higher, because with more speed comes more bandwidth! AMD has a bandwidth advantage due to MORE VRAM. You seem quick to ignore that in that video; in the games where this didn't matter, the 4080S outperformed the 7900xtx at 4K! A card you label 1440p! But it outperforms the 7900xtx in some games at 4K? Nvidia aren't faking anything, it is just a fact that both AMD cards have more VRAM! Again... "frame buffer!" Why don't we get videos on the AMD leaks? Why do we not get your thoughts on the obvious deception attempt by AMD with their new "Nvidia from Wish" naming scheme? Froggy, you being someone that has made both Xbox and PS content and seen the comments, you should understand what I mean by this... AMD fanboys are the GPU equivalent of PS fanboys, Nvidia users are the GPU equivalent of Xbox users. Yes, Xbox & Nvidia have "fanboys" too, but not in the same way... Most of the retaliation is from them hearing someone blatantly lie about a product to try and discredit the opposition instead of promoting the product they like. I never hear anyone badmouth AMD cards while reviewing Nvidia cards; in fact, more often than not most offer a "cheaper" AMD alternative at the end. AMD reviews tend to turn into a hate-on-Nvidia video! And they don't usually end with an Nvidia alternative being offered. To me this shows that most tech channels push AMD first, but you say the opposite, even though we clearly watch some of the same channels. 🤔🤔😂
The 4080 Super is a decent 4K card but a phenomenal 1440p card. I sold my 4090 and bought a used 4080 Super for $600. I also picked up a 27" 1440p QD-OLED and retired my Samsung 4K IPS monitor. With that said, I enjoy the 1440p QD-OLED more than my 4K display. I think the overall experience is better with 1440p, visually on par, and the FPS is better. 1440p OLED > 4K IPS.
Absolutely, brother. I kind of think the 4080 Super is a great 3440x1440 card.
Most of the time it is Windows that causes problems, and all the additional programs that run in the background. You should really deal with that and optimize it. Afterburner, for example, can also cause severe stuttering. So many possibilities 😵💫
My thing between the two is the price. I refuse to pay $1,500 or over.
Frog you stay with amd
I’ll stay with NVIDIA
we all can’t like the same things
It’s pointless to keep comparing
Just like cycling 🚴
I ride…..
Giant propel..
Scott Foil RC, and from Specialized the S-Works Tarmac….
If it isn't a 1440p card out of the gate, UE5 games will in time undoubtedly make it one. It happened with UE4 before, and it will happen again. And since developers are incentivized by deadlines, they rely on upscalers like DLSS to cover their lack of optimization, further driving the business model of just upgrading every generation. It's so pervasive in the industry as more and more studios switch to UE5, leaving you as a consumer in an environment perfectly suited to punish you for not buying the highest-end card. Every card in Nvidia's stack serves to upsell you to the next card up.
One slightly possible reason might be (and I love AMD all day) that you're using a cohesive system, i.e. an AMD card in an AMD Ryzen system, whereas most Nvidia owners are using an Intel-based system with Nvidia graphics and an Intel CPU, and therefore having synergy. If you mixed an Nvidia card with an AMD CPU system, maybe it wouldn't work as smoothly in concert, as it's recommended for the card and CPU to match in a system (ReBAR and Smart Access Memory?). So the way to test what you're possibly seeing and feeling (the out-of-sync, possibly stuttery, jittery feeling and color mismatch) is for an Nvidia gamer to run an Nvidia card in a Ryzen-based system.
It's definitely possible
Nope, just in the last 3 days... FFXVI using 16.9GB while playing at 4K with FSR Quality. Tarkov at 4K was using around 19.5GB, which is maxing out my 7900 XT and causing performance issues. This was after the wipe/update and on PVE, so that might be the issue also... 16GB at 4K will be an issue with some games already out and many games in 2025+. The sad thing is I wanted to finally go green after rocking an AMD 5700 XT, 6800 XT, and 7900 XT in a row, but not seeing more than 16GB, coupled with the possible pricing, I think I'm gonna have to wait things out to see what both companies end up doing over the next year.
It won't be a 1440p card. GDDR7 will deal with resolutions better and will lower VRAM usage at 4K. I think the 5080 won't age as well as the 1080 Ti, but it won't age as badly as the 4060.
When it comes out it will be a 4K card, but as time goes on and games get more demanding and require more VRAM, it will probably become a high-end 1440p card. I've got PCs with a 6900 XT and a 7900 XT and two with RTX 3090s, and all of them are 4K60 / 1440p120 cards. I base my PC builds on the consoles: I just need the PCs to do what my console can't, and I'm satisfied.
The 1% low margins are less than 5%. The goalposts keep moving 😂. The funny part is seeing all these seals in the comments 😭🤣🤣🤣
People push back because: where are your numbers? And numbers don't lie...
You should take the L 😂
Well, at least he tested it himself, how many other ppl here can say that. But Frog, I would have kept side-by-side footage of this so-called stuttering on the 4080 at 4K vs the 7900xtx; you can't just make comments about it and show no footage.
Yeah, I definitely should've kept it. I will be getting a 5070, a 9070XT, and possibly the B770 this year for covering them over the next generation of PC gaming
1% lows are important indeed, but I guess that's also related to the game's optimization itself... we all know many if not all UE5 games stutter like crazy. Both GPUs are good; the 4080S costs more because it has better RT than AMD, everybody knows that... also DLSS. I've had both AMD and Nvidia and imo they both have their advantages... About the 5080, I would prefer it to be 16GB and not cost more than 1200-1300 rather than being 20GB or more and costing 1500-1600...
And about colors on AMD vs Nvidia GPUs, I think this is true, AMD has slightly more saturated(?!) colors. I realised this years ago but I thought I was just crazy and imagining things
@@perceptixn You're not crazy. I swapped around my 6600 and 3060 Ti quite a lot during the mining craze a few years ago, and the 6600 does have better colors and sharper textures on similar settings.
How about it be 1200-1300 AND be 20/24GB? This is not too much to ask. It is actually the minimum this very capable 4K silicon should be equipped with, and if they have to reduce their profit margin to 50 or even 40%, then they should.
My guy... why are you always just making up false narratives? You mention Fabio's video, and the 1% low data is pretty much identical between the 7900XTX and 4080S: 67.5fps for the 7900XTX, 66.1fps for the 4080S. A 1fps difference is literally margin of error and not even worth mentioning. You'll harp on and on about it like it's some win for AMD, but objective reality doesn't support that. And that's just the raster average... slap on ray tracing and you already know the AMD card's 1% lows are going to be a joke.
I have watched that video and thought Frog would finally step back into reality. I was wrong. I love my Nitro+ 7900xtx, but does Frog realize RT is here to stay? I feel that Frog is in full denial.
What are you talking about? There is a 6 fps difference in 1% lows with the 7900 xtx winning in 1% lows in the 15 game average
Even your precious Cyberpunk runs much better at 4k raster on the XTX compared to the 4080 super
@@Ireland-m5p LOL imagine playing Cyberpunk with only raster last gen graphics on a 4080S. Slap on path tracing and the 4080S wins by like 250%. You can see the Alan Wake II results in that video. You can also see the new UE5 engine games like Silent Hill II and Wukong struggle tremendously on the AMD side. Not a good look for the future.
Also you apparently can’t read a graph so let me help you. I took the 4K average of 1% lows. It’s an absolute nothing burger.
@@Phil_529 Fool, AMD is getting better at rays, so cool it. The 1% lows with RT on on the 4080 are bad.
The 5080 will be a 4k card for a little while
Not at all, they sell the least they can for as much as they can. The cut-down compared to the 4090 and the 5090 is insulting. The 3080/3080 Ti, those were 80-class cards, not the 4080 or the soon-to-be 5080; those are 70-class cards. 16GB of VRAM at 1200-1500 or whatever they are going to charge is a fxk you to gamers, that's it.
See Fabio prefers AMD but he is straight down the line, he keeps it balanced & he's never wrong on anything, been watching his videos since I got my XTX
That's why he's at the top of the list of ppl I listen to
Quite often if I get issues, I'll ask his opinion just to double-check, and he confirmed my issues with the 4080
I knew I wasn't going mad
Another reason why I'm going to get a 9800X3D or whatever is because he told me the 1% lows would be even better and give me smoother gaming
That appeals to me more atm than buying any of those overpriced GPUs
I might get a 9850X3D when it comes out; I miss having more cores
@Frogboyx1gaming Yeah, that's why I reckon I'll opt for the 9950X3D
Although it apparently uses an extra 50 watts, but I need to wait for CES for confirmation
My 7950X3D uses the same power as a 7800X3D, so what I saw could just be an overlooked TDP figure
I did find some games work a bit better with 16 cores over 8 on my 7950X3D, and it makes me wonder if newer games will take advantage of that too.
Anyway, both are neck and neck the majority of the time; both the 9800X3D and 9950X3D should be great
Overall I'm 99% sure now I'm not buying Nvidia next year, because this experience with my 4080 Super has just further proved to me why I don't like the Nvidia experience as much as AMD
Gaming at 4K I feel much safer with AMD and I'm not paying 2 grand for a 5090
AMD feel more reasonable with their flagship, and I defo prefer the drivers, how their software works, and the VRAM
A new CPU with better 1% lows sounds good to me, and after watching some bang4buckPC gamer videos, he said it's much better with the XTX, same as Fabio said
Nvidia has only 8 GB of memory, so for me an RX from AMD is far better. Nvidia's problem is that they stayed with 8 GB, that's so old.
I like my VR as well, and my VR has been so much better with the 4080 Super. That is why I changed to Nvidia: loads smoother VR, which is most of my gaming lately tbh. AMD is not great at VR.
To be completely honest, his review was very bad. Current Nvidia GPUs are the best at the highest resolutions. To be honest, the RTX 4080 is already better in 4K ray tracing. The 4080 is factually better. You always have to crank the settings up to the highest when testing. You should watch Hardware Unboxed; they are far ahead in testing methodology. Fabio in my opinion is AMD-biased, not neutral. Hardware Unboxed just drop facts; they trash every company when they produce shit. Odyssey runs on my Samsung super ultrawide OLED (near 4K-like pixel count) with an RTX 4080 and a 7800X3D at about 70-100 fps native.
Been saying it for months 1% lows are worse
Of course I had the usual pushback from "some" Nvidia fans, not all LOL
but finally someone like Fabio touched on it last night with a video & he really knows his stuff
I talked about how Stalker 2, Witcher 3, etc. felt worse on my 4080 Super, but on my XTX they felt really smooth, especially once FG went on
That's why I've been saying DLSS FG feels like crap; the thing with Nvidia is I feel nothing is as good as they say, it's just selling you something which feels fake
DLSS is really good, only it's not enough for me to buy Nvidia just for that tbh, and with FSR 4 on the horizon even less so
This is actually why I'm prepared to leave all the RT on the table and not use it if AMD cannot run it in certain games
I have learnt lots over the past 4 months comparing the two, most of it I already knew tbh about what I prefer. I've come to the ultimate conclusion now that it's more about raster and VRAM, not feeling restricted in terms of using higher-resolution textures, and having the option to go 4K native in some games
I know it's not for everyone, but RT doesn't pull me like it does others. I do much prefer AMD's software and want that extra VRAM gaming at 4K
🤷‍♂️
Nvidia's tech is good when it works, but I've just been finding more issues than enjoyment
I have a 7800xt and it is more than adequate for my wants. I could afford better but I am content with my frame rate for the games I play.
7800XT is a great card
Didn't you say you were getting the 5090 a few months ago, no matter what? I think you are just creating these videos for a chance that it reduces Nvidia prices so you can keep buying them.
My thing is, it seems like you are way too lenient on AMD's prices. They are not much better than Nvidia's. Better? Yes, but not by all that much, especially at the high end: the 7900xtx and 4080 are basically the same performance in raster, within the margin of error, but in ray tracing it's not really close at all, so the price difference is pretty proportionate. Not perfectly so, but close. Plus there's the workstation stuff too (I don't give a damn about that lol, but it's there, and some people absolutely do). I absolutely do not think they deserve any praise for their prices either, especially high end; they get away with as much as they absolutely can... I do not expect the 9070 to be any better on price either.
I expect the AMD 9070XT to be around $650
If it hits that price with much better RT and AI upscaling that works well, that won't be completely awful. But my gut's telling me they ain't doing anything less than 700. I hope I'm wrong though.
I guess you just see what you wanna see.
I saw the average over games at the end, and the 4080 on average raster was ahead at every resolution, and lows were within a few fps.
He didn't answer the few times he was asked if SAM was being used.
And he could be hitting the Afterburner issue that drops 1% lows on Nvidia cards, the one Terra just did a video on.
His frame gen lows I'd need to see to understand.
His graph made it look terrible, but I'm not sure if that's Afterburner reporting lows differently for Nvidia than for AMD.
An example would be for me running 80fps no FG.
Then turning on FG, my overlay will report my fps at 140 but my lows at the original 80 (prior to FG), yet with a very smooth frame time.
I assumed this was normal but AMD FG appears to report more like real frames.
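A toy sketch of the reporting mismatch being described here. This is an assumption about how an overlay might count frames (lows from real frames only, fps from presented frames), not a claim about what Afterburner actually does:

```python
# Assumed model: FG inserts one generated frame between each pair of real frames.
real_ms = [12.5] * 100                                     # 80 fps of real frames
presented_ms = [t / 2 for t in real_ms for _ in range(2)]  # two presented frames per interval

fps_real = 1000 * len(real_ms) / sum(real_ms)                 # 80.0
fps_presented = 1000 * len(presented_ms) / sum(presented_ms)  # 160.0
print(round(fps_real), round(fps_presented))
# An overlay could show ~160 fps while lows computed on real frames stay anchored at 80,
# which would match the "140 fps shown, lows still 80" observation above.
```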
What are you talking about? There is a 6 fps difference in 1% lows with the 7900 xtx winning in 1% lows in the 15 game average
@Ireland-m5p there's no 6fps difference!
I'd suggest you look again
I haven't blocked you brother. Why would I?
@Frogboyx1gaming I've no idea 😂..
My comments and replies wouldn't post on your channel for the last 4 hours, so that's all I could assume... or your channel was messed up again.
Your bias toward AMD and Nvidia is showing its ugly head.............
Yes, you're being unfair. According to techpowerup, my RTX4080S has a 95fps average at 4K native based on 25 games tested. With DLSS and FG I get 120fps even in the most demanding RT games. The RTX5080 will be even faster, and yet you want to tell people it will be a 1440p card :P. If you consider the RTX5080 a 1440p card, then the RX7900XTX should be considered a 480p card (even at 480p internally the 7900XTX can't run PT games smoothly).
It's true, though, that Nvidia likes to gimp their cards. That way they can force people to upgrade their GPUs more frequently. For example, the RTX3070 is still a great card (in terms of performance), but due to VRAM limitations you have to reduce texture detail a lot to get a smooth experience. I knew 8GB VRAM wouldn't be enough in 2020, because even the PS5 had more memory, and you never buy a new GPU with less VRAM than a console at the start of a new generation.
12GB VRAM is, however, still enough, let alone 16GB. I play games with an OSD and usually see 9-12GB VRAM usage. There are a few games that can use more than 16GB with PT at 4K native with mods, but my RTX4080S can't even run such extremely high settings at 30fps. Your AMD card has 24GB, but with 4x worse PT performance you aren't going to play PT games at these settings anyway.
VRAM requirements will skyrocket when developers start porting PS6 games to PC, and that won't happen any time soon (2028/2029). By the time PS6 launches, both my 4080S and RX7900 XTX will be obsolete anyway, so 24GB on an AMD card is literally pointless outside of professional use. I have heard RDNA4 RX9700 will "only" have 16GB too. It seems even AMD knows 24GB in gaming GPU is overkill at the moment.
For the majority of games you will probably never see a difference but for the games that actually matter the 4080 Super is 100% a 1440p card. Show me these games running with PT at 4k over 60fps even using DLSS and frame generation
@@Frogboyx1gaming PT is an experimental feature designed for FUTURE GPUs (PT was even called a technology preview in Cyberpunk). Thanks to DLSS technology however, I can play every single PT game at over 60fps (80-100fps) and still get 4K-like image quality. I know how to tweak the DLSS/reshade settings to get far better results even compared to standard DLAA / TAA, let alone image quality on the PS5Pro, so when I say I get 4K-like image quality I really mean it.
Black Myth Wukong (BMW) is the most demanding PT game right now. Here are my built-in benchmark results, but keep in mind my RTX4080S is OC'ed (2925MHz core, 820GB/s memory bandwidth); a stock 4080S would get slightly worse results.
-4K DLSS Q + very high settings + full RT (PT): I get 34fps. Not much, but console players sometimes play at 30fps. On the PC platform 30fps feels even smoother compared to consoles, because all games support LFC. 30fps games on PC are also more responsive without VSync lag. With FG I would probably get PS5-like performance (around 60fps) while still having a better experience / input latency. DLSS FG works well even below 60fps (unlike FSR FG), and in this particular game FG has lower latency even compared to FG off (because FG activates Nvidia Reflex and you cannot use this feature separately in BMW).
-4K DLSS Q + FG, very high settings, medium RT (PT) 76fps
-4K DLSS P + FG, very high settings, fullRT 86fps. This is the way that I prefer to play this game. With my dlss / reshade settings I get 4K like image quality and 80-100fps during gameplay. The game is perfectly smooth and responsive at these settings.
-4K DLSS P + FG, very high settings, medium RT (PT) 103fps
-1440p, DLSS Q + FG, cinematic settings, fullRT (PT) 103fps
-1440p, DLSS Q + FG, very high settings, fullRT (PT) 112fps
-1440p, DLSS Q + FG, cinematic settings, medium RT (PT) 123fps.
-1440p, DLSS Q + FG, cinematic settings, lumen (software RT) 127fps
At 1440p with medium RT I only get 3% worse performance compared to "lumen" (123fps vs 127fps), so on my PC it doesn't even make sense to play this game without PT. Try to play this game on a 7900XTX and even medium PT will destroy performance compared to "lumen" lighting.
Cyberpunk runs even better. With Psycho RT I get 140-170fps at 1440p DLSS Q + FG, and 110-130fps with PT. At 4K I have to use DLSS Balanced to get smooth fps with Psycho RT (120fps). As for PT, I need to use DLSS Performance + FG to play at 80-100fps.
I can't complain about performance in PT games, and the vast majority of games in my library run at 4K native 120fps. Of course, RT games are more demanding, but quite a few run at well over 60fps even at 4K native, for example Shadow of the Tomb Raider, Metro Exodus, Guardians of the Galaxy. Some, like the RE3 Remake, run even at 130-200fps. Of course the most demanding UE5 or RT games will not run as well, but thanks to DLSS + FG even these demanding games run at 4K-like image quality and over 100fps. For example, I get 45-50fps in Hellblade 2 at 4K native, but with DLSS Q + FG 110-120fps.
I'm 100% happy with the performance of my card even in PT games, but please ignore all that and tell me my RTX4080S is a 1440p card. I can say the same absurd thing about your 7900XTX and call it a 720p card, and I'm being generous because I doubt the 7900XTX can run PT games smoothly even at 720p internally.
I don't trust Fabio. Keep at whatever you think. I agree with you, but if it's what you think, tell us.
Yes. Because you have no idea how the new tech will work... new DLSS, new neural engine this time.
Fuzzy frames! Turn off DLSS.
Nope, the 5080 will be an excellent 1440p GPU, but 4K? Well, in 2 or 3 years maybe, but today, when games are extremely unoptimized, I would say 16GB of VRAM doesn't get you that far if you want to activate DLAA aka 4K native. Also, if you activate frame gen on top of that you have even less VRAM to spare, so yeah, it's a 1440p GPU, because most of the time when you play at 4K you activate DLSS. The same goes for the 4080 Super, and the same will be true of the 5080. Don't get me wrong, from the 4080 Super to the new-gen 5080 the difference isn't that big, and if Nvidia gave the 5080 around 20GB of VRAM all would be great, but since they won't do that anytime soon, who knows. Even the 4080 Super is priced like a 4K GPU but is realistically a 1440p GPU, and with the 5000 series this is going to feel even worse, because the price will be around $1200 imo, maybe a bit more like $1300, and the 5090 will be out of hand, but that GPU is no longer for gamers; it's for AI bros, miners, and scalpers.
No 😅
No, you are just wrong. 16GB is the recommended amount for a 4K card; anything above is really overkill and tbh not even future-proofing, because the card will need upgrading before 24GB is needed. If that were the case, my 4080 Super would be a 1440p card, lol, and news: it isn't.
Depends on the game, or if you are a sim player. On X-Plane 12 at 4K I am pushing max VRAM, and on MSFS 2020 in certain situations with 3rd-party aircraft like the Fenix and high-quality payware scenery, VRAM is almost gone on my 4080 Super.
My 6800 XT pulled 14GB of VRAM usage at 1440p on FFXVI with everything maxed... pretty sure it can go above 16GB if I'm using 4K.
Yeah, not at 4K above 70fps it's not.
It depends on the game. Some games will use more if more is available. I remember seeing a video that said the new Star Wars game will use up to 20GB of VRAM if you play on an RTX 3090 maxed out.
@@wumpratt Worth noting that AMD cards do worse with VRAM management. Techpowerup lists FFXVI as using 13.8GB of VRAM at 4K with frame generation on.
Easy to explain here
4k buy a 5090
1440p buy a 5070 ti.
The 5080 doesn't make any sense: it has the same amount of VRAM as the 5070 Ti and almost the same number of CUDA cores.
So it's either a 5090 or a 5070 Ti. I don't think the 5080 will sell that much unless they do a Ti or Super version with at least 20GB of VRAM and a few more cores. Other than that, I don't see this card doing so well in the market, especially if the rumored price is real. I mean, of course some fanboys will buy this card for odd reasons, but it's definitely not the best choice of the 50 series.
Mid-range NVIDIA cards like rtx 3050 already offer 16 GB of VRAM.
What? RTX 3050 doesn't have 16gb