I think that ray tracing is a gimmick no one needs, certainly not fake frames to compensate for its infancy. No single-player game will live long enough for it to matter, and no multiplayer gamer will ever use it.
I think companies should focus on pure performance, not this stuff. Also, players are making a mistake by buying such extremely overpriced cards.
@@zazo.. Cyberpunk might live long enough looking at those online player numbers on Steam alone.🙂 I assume that mods keep people coming back to this game.
@@larsenmats Ray-traced reflections look so nice tho. I don't care about lighting realism, but reflections make me want to enable ray tracing every time.
I don't see a problem with giving people settings options that go higher than current cards can handle. It's actually what drives hardware makers to give us more and more powerful products. What if there had never been a Crysis?
But the 4080 can most definitely handle Cyberpunk at 4K 60 fps using path tracing. The price you pay is slight artifacts and barely noticeable input lag. People who downplay AI upscaling as if it's something sacrilegious really are spoiled beyond belief. They want the perfect picture quality with high framerates and no compromises, even though path tracing in real time was a pipe dream 5 years ago.
@DrKrFfXx000000000000 Diminishing returns for sure. Just like 8K, it's hard to make things look any better than they already do. I just played through Horizon Zero Dawn... that looked great, and it's like 8 years old at this point.
@@snakeace0 I got a 4080 SUPER, and yeah, it's pricey, but its performance is a beast to me coming from a 2060 SUPER! I don't understand all the talk; it performs well! It plays anything I throw at it at 4K!
That's bullshit. I have a 4080 SUPER and an i7-14700KF, and I run the game on max graphics with no ray tracing on an ultrawide monitor; the game barely hits 100 fps in most locations lmao.
@@finnishpuppy Depends on the game you are playing. I'll upload some videos of my games in 4K and let you decide. But yes, many multiplayer games do get between 80 and 100 fps, though I have not had any issues in 95% of games.
I ordered a 4080 SUPER TUF yesterday: AM5 X670-E TUF, Ryzen 7800X3D, Corsair Vengeance RGB 32 GB (2x16 GB) DDR5 6000 MHz C36, WD Black SN850X 2 TB. Can't wait to see what I get on my end.
It seems like you don't comprehend what path tracing even is, and why it's so expensive to run. Five years ago, you couldn't even dream of running a game like Cyberpunk in real time using path tracing. It is future technology that's already implemented. DLSS with ray reconstruction and frame generation was specifically designed to give the GPU enough headroom to play it at 4K 60 fps using path tracing. The fact that it's even possible is mind-boggling to anyone who knows what number of calculations are being made per second in real time. If you want your pure rasterized experience, then turn off ray tracing. If you want ray tracing, then turn on DLSS. In fact, DLSS Quality mode actually looks better than native these days thanks to a much more stable image, especially for thin and distant objects. The reluctance to use Upscaling is the mark of someone stuck 10 years in the past.
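To make the "headroom" idea concrete, here is some rough arithmetic. Every number below (the native path-traced baseline, the upscaling speedup, the frame-gen multiplier) is a made-up assumption for illustration, not a measured figure for any real GPU or game.

```python
# Rough arithmetic for how DLSS + frame generation creates "headroom".
# All numbers are illustrative assumptions, not benchmarks.

native_4k_fps = 20        # assumed path-traced fps at native 4K
dlss_speedup = 2.5        # assumed speedup from rendering at 1/4 the pixels
fg_multiplier = 2         # frame gen presents roughly 2x the rendered frames

rendered_fps = native_4k_fps * dlss_speedup     # frames actually rendered
presented_fps = rendered_fps * fg_multiplier    # frames shown on screen

# Input latency tracks the *rendered* rate, not the presented one,
# which is why frame gen feels less responsive than a "true" high fps.
print(rendered_fps, presented_fps)
```

This also illustrates the trade-off the thread keeps circling: the presented frame rate doubles, but responsiveness only improves as far as the rendered rate does.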
"The reluctance to use Upscaling is the mark of someone stuck 10 years in the past." thisss man, makes my blood boil every time. Most of these people couldnt tell the difference in a blind test too.
Upscaling was the best joke the snake-oil sales department ever developed. All the tricks to mask the lacking power of today's GPUs lower the image quality, in many cases massively. Ridiculous to render an image with radiosity, global illumination... only to finally let it take a bath in vaseline.
I know from the comment that you missed my point because you didn't watch the whole video. I do use DLSS and think it's great, especially DLSS 3's image quality. But that's not the main issue. The industry is changing rapidly with these technologies, and I'm wondering if it's a positive or negative development. Simply looking to start a debate to learn what the community thinks while sharing my own thoughts on the matter.
@@theivadim You enable DLSS but absolutely refuse to disable ray tracing which is pretty much an experimental tech absolutely not worth 60% of our FPS. 4k DLSS-Q looks equal or better than native 4k (instead you go straight to DLSS-P which looks like crap) and without RT you would have ZERO performance issues. I should know, I play at 4k DLSS-Q 60+ fps on a 3090.
It's not the GPU's fault. No one optimizes games anymore. Every new game is made with DLSS and FG in mind. I'm sick and tired of this. Pure laziness from devs.
Using the tech to push games further, I'm fine with. The fact that some games are being made with these in mind (what about older systems?) just to get 60 fps (MH Wilds) is bad. Won't using these keep the price of consoles down as well? They won't really need to [try] to compete with PCs, and everyone can play games that look/run great. I know the pick-a-side people will always put down what they rejected; there's no stopping that, it seems.
I swear to god, I have a 7800X3D + 4080 SUPER, 32 GB DDR5 6000 MHz, playing Cyberpunk with G-Sync on a 4K TV. Everything on Psycho, including RTX, excluding path tracing, every single possible option on ultra/psycho at 4K. I've also modded in 4K textures, and running DLSS Quality + FG I get a very stable 94 fps.
Maybe I have that Intel CPU instability causing performance degradation🙂 The PC that I'm using is equipped with the i7 14700KF. My Ryzen PC is a bit weak for the 4080 SUPER; it has the 7600X. Or maybe our settings are not 100% matched.🤷‍♂️ Or maybe something else entirely.
Obviously you and I both do, because our CPUs aren't bottlenecking GPU performance like in this guy's video. 80 degrees of CPU heat at 15% utilization lmao, and he asks where his fps are? Evaporated in that heat.
I just ordered almost the same setup: AM5 X670-E TUF plus a 4080 SUPER ASUS TUF, as well as 32 GB DDR5 6000 MHz. Let's see what I get on my end.
Yeah bro, I have a 4080 SUPER and I've been playing Cyberpunk at 144 fps with ray tracing. Frame generation is just the new standard and makes the game play much better.
My 4070 can run The First Descendant at around 48-52 FPS, all maxed out at 1440p. Compared to that, your 4080 seems underwhelming when the FPS difference is like 10 FPS.
The fps-to-dollar ratio above the 4070 SUPER is a catastrophe. Without the usual price gouging, this old-fashioned piece (192-bit/12 GB... where did I read that ray tracing is memory-bound, 1950?) would cost 200 bucks... and everybody should buy the 4080 for $300 instead.
Totally agree! I have the 4070TI Super and I'm not that impressed with it. I want brute-force hardware power without frame gen. It's really disappointing.
On CP2077, I got 23-27 fps with a 12900K (OC), a 4080 SUPER, and 64 GB DDR5 on a 3840x1600 ultrawide at native res + path tracing. With DLSS Quality and frame gen it's ~80-100 fps. Not path traced, but mostly ultra RT, following the optimized settings with a bump here and there. This was in the DLC; the main game isn't as heavy for some reason.
@@varios4883 My first AMD CPU left me disappointed, so I switched over to Intel, that was nearly 10 years ago. Now, I'm switching back to AMD because Intel is leaving me disappointed😂😂
I have a 4080 SUPER and a Ryzen 7 7800X3D, and I get 90-120 fps at max settings at 1440p with ray tracing and path tracing in Cyberpunk, so I don't know what issue you were having.
The most surprising thing about when I got a 4090 is that some games are really hard to run to the point that I'm glad I don't have anything slower. I was originally looking for a 4080 and I'm glad I found a good deal on a used 4090 instead. Also, I have a pair of speakers like yours in the video, Edifier MR4, except mine are black and they sound amazing.
I went from the 4080 to the 4090. I felt the 4080 was an OK card for 1440p, not a 4K gaming card, even if it can run some games at 4K; with these newer games coming out, it's going to struggle at 4K. The 4090 does a much better job at 4K. I have two monitors, a Sony M9 4K 144 Hz and an LG UltraGear 3440x1440 240 Hz OLED, and I think the 4090 is the best at 1440p and will run newer games at 1440p fine for the next two gens.
@@theivadim The problem isn't game optimisation, as some people think it is. It's just that current-gen hardware simply doesn't have the processing power to properly handle ray tracing. Most current ray-traced games run at a low ray count, which causes a noisy image with artifacts, and the performance is still low. Current-gen GPUs are just not powerful enough. More rays mean higher image quality, better lighting effects, and less noise and fewer artifacts. Increasing the number of rays and bounces is the most effective way to increase RT/PT quality. The problem is that the higher the ray count, the more demanding it becomes. Video games still haven't unlocked the full potential of RT graphics. Hardware needs to become more powerful so that it can handle running games at a higher ray count. Then RT would look way better without all the artifacts and noise you see in current ray-traced games. Even the 4090 isn't powerful enough to trace a high number of rays. However, if we are referring to raster performance, I think current-gen high-end GPUs have smashed that, even at native 4K ultra.
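A back-of-the-envelope sketch of why ray count gets expensive so fast. The samples-per-pixel and bounce counts below are illustrative assumptions, not the budget of any real game or GPU.

```python
# Back-of-the-envelope ray budget, illustrating why "just add more rays"
# quickly outruns current hardware. All budgets here are assumptions.

width, height = 3840, 2160       # 4K output
fps = 60
samples_per_pixel = 2            # assumed real-time budget
bounces = 2                      # assumed rays traced per sample path

pixels = width * height
rays_per_frame = pixels * samples_per_pixel * bounces
rays_per_second = rays_per_frame * fps

print(f"{pixels:,} pixels -> {rays_per_second / 1e9:.1f} billion rays/s")
# Doubling either samples or bounces doubles the whole budget, which is
# why higher ray counts for cleaner, less noisy images are so costly.
```

Even this tiny budget of 2 samples and 2 bounces lands around two billion rays per second at 4K 60, before any denoising or shading work.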
@@theivadim RT is the biggest scam to overprice cards I've ever seen. Even though my GPU has it, I never use it, except for like 5 min when I first boot a new game to see how it looks. Snake oil, my friend, snake oil.
Very good explanation and information in this video! I myself have an RTX 4070 Ti SUPER, which is still very expensive in my opinion. When you pay $800-1,000 for a high-end GPU, you expect to play the latest games without upscaling and with ray tracing set to max. Unfortunately, I think UE5 games and others on the latest graphics engines are made with upscaling and frame generation in mind, because natively they're just too demanding and complicated to run. Anyway, you're doing a pretty good job explaining to us enthusiasts how things are with the latest games and GPUs. Keep up the good work!
Thanks for your input. I always thought resolution upscaling would eventually become standard in games. DLSS is impressive now, and I use it in Quality mode most of the time. I'm surprised the shift is happening so quickly. I expected more discussion and debate about it. But I don't see the community discussing it. So, I thought that I'd start.
Of all the people I've met, you're the most clear-eyed about the GPU market. I've been commenting on these same things, and I'm very happy to find someone who explains the limitations of today's cards. Even a 4080S or 4090 has limitations, apart from RT, which, once you try it, never lets you go (it's the true Ultra of games). Man, I'm very happy to see someone with a different take, showing the truth. Keep it up bro, hail from Brazil!
My personal opinion is that since DLSS released, reviewers haven't found a good way to review graphics yet. Obviously DLSS often even improves the image, or at least shifts the downside from, for example, TAA jagginess to smearing. But my greater point would be that the overall image quality achieved should be the focus. To elaborate: 4K DLSS Performance will look and probably even run better than 1440p native, so saying a card is "no 4K card" would be weird. What about 4K DLSS Quality with FG? This will in most instances run and look just fine, most likely an overall way better experience than 4K native. To sum my point up: the number of variables makes it tedious to test and almost impossible to properly compare graphical fidelity from one game to another, but those variables can give us a visual experience we couldn't otherwise have. Thanks to DLDSR/DLSS/FG/Reflex+Boost I play most games at 5K/60 fps with a 4080. They look breathtaking and feel (to me) just fine.
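For anyone curious about the pixel math behind that 4K-DLSS-Performance vs 1440p-native comparison, here is a quick sketch. The per-axis scale factors used are the commonly published DLSS preset values; the performance conclusion in the comments is an assumption, since shaded pixels are only one part of frame cost.

```python
# Internal render resolutions behind the DLSS comparison above.
# Per-axis scale factors for the standard DLSS presets.

def internal_res(out_w, out_h, scale):
    # DLSS renders at a reduced internal resolution, then upscales to output.
    return round(out_w * scale), round(out_h * scale)

presets = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

out_w, out_h = 3840, 2160  # 4K output
for name, s in presets.items():
    w, h = internal_res(out_w, out_h, s)
    print(f"4K DLSS {name}: renders {w}x{h} ({w * h / 1e6:.1f} MP)")

# 4K DLSS Performance shades ~2.1 MP internally vs ~3.7 MP for native
# 1440p, which is why it can run faster while outputting a 4K image.
print(f"native 1440p: {2560 * 1440 / 1e6:.1f} MP")
```

So the "4K DLSS-P beats 1440p native" claim is plausible on raw shaded-pixel count alone, before the upscaler's own (small) fixed cost is added back in.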
Indeed. That is why I don't want to make industry-standard reviews. Because showcasing various graphics settings + resolution upscaling and frame generation is important now. It's a huge part of the experience that simply can't be overlooked.
First of all, you need to post your computer specs. The video card alone doesn't mean you have a system able to use it in full; there are things called CPU bottlenecks, RAM speed, all kinds of stuff that can bring down your fps.
Full PC specs are always in the description. I always pair GPUs with appropriate hardware unless stated otherwise for experiments such as "Can the Ryzen 5 7600X handle RTX 4080 SUPER" which can be found on my channel.
I've had a 4090 since Oct 2022, from Zotac, which is considered one of the worst brands; crazy good GPU. I paid $1,700 for it and it's still the same price. Zero problems. Better than the 4080 by almost 30% in fps.
Great video and a valid point! Nvidia fanboys are going to be mad I guess 😂. The point the video is making is that if you are paying around 1.2k+ on rtx 4080 (non super) or 1k on rtx 4080 super and getting sub 60 fps and forced to use frame generation to get a 60+ fps experience on a “flagship” GPU then yeah something seems a bit fishy. To all the people saying “tweak the settings a bit”… bro the 4080s is meant to be a flagship card and I understand that ray tracing might be demanding but if you are paying that much then I would expect to run the latest games at or near max settings and still get above 60fps with all the features it has without being forced to use frame generation. I have used a RTX 2080 for 5 years and I only swapped recently to RX 7900 XTX (just because it was on a great deal for £765) and I had doubts after my purchase if I missed out on some features but yeah thank you for making a valid point. The only bit of remorse I have I guess is the higher power consumption of the rx 7900 xtx compared to the rtx 4080s but yeah other than that thanks for the great video and keep up the great work!
Just upgraded from a 1050 Ti to a used 1660 SUPER, and the difference is amazing. So sad to see even a 4080S struggle with something it is made for... Epic
The RTX 4080 aged like milk 🥛. I thought the 4080 would be reduced to a 1440p card, but this test shows that for modern games (UE 5.x type, RT, etc.) it is just a 1080p card. Hope the 5080, with its GDDR7 VRAM and new cache, will be a game changer for RT workloads.
Just as NVIDIA increased the number of shaders and RT cores, they lowered the max TDP of the new AD106. It's a strategy also found in their laptop series: many mobile models (up to the 4080M) are sold as 150 W parts but barely draw 100 W under heavy load.
Now people will start defending NVIDIA in the comments, but let's be honest, NVIDIA is asking a lot of money for a graphics card that is the second in their ranking for the current generation. Its raw performance is barely enough for 2K, and to play in 4K, you must use DLSS and frame generation. And for those who say to just turn off ray tracing, at 4K with maximum graphics settings without ray tracing in games like Alan Wake 2 and Hellblade 2, it won't deliver stable 60 frames per second.
Not defending Nvidia. They are a monopoly executing typical monopolistic activity. But the truth is it's the second best GPU out right now. There's nothing forcing Nvidia to make anything better between it and the 4090; they aren't going to compete with themselves. The 7900 XTX is not a real threat, especially with even AMD-sponsored titles like Avatar including baked-in ray tracing. This is the reality of an Nvidia monopoly moving forward until someone can force them to make better products. But still, you can't say the 2nd best GPU is bad, because it's still literally the second best GPU. By that standard, everything except the 4090 is bad. The only way the 50-series midrange will be great is if AMD/Intel kill it in that segment, forcing Nvidia not to gimp it like they did for the 40 series.
I don't know if you're using a PCIe 5 NVMe, but if you are on this MSI Tomahawk board, it pulls the GPU's PCIe link down to 8 lanes. It would make these tests invalid.
I hate to disagree. I mean, I can appreciate the effort put into this, but like some others say, I get almost 200 fps at certain points with my 4080 SUPER at 4K, maxed-out graphics plus some extras, with 155-170 being the average. Still high, though.
Every comment starts out enthusiastic, and then we find out they used DLSS. I have a 4080 SUPER OC and it is a bit disappointing at times tbh. I'm not trying to mess with DLSS on a brand new GPU; it just doesn't look as good with it. A 4070 can be OC'd close to a 4080, so it's not a bad deal.
I think that 4K gaming is still too taxing even without all the fancy features like ray tracing. I just bought the MSI 4080 SUPER Slim version. I play on my old 27" 1440p 144 Hz TN monitor; at this resolution I can turn everything to max even without frame gen. I want to upgrade my monitor to the Alienware 32" 4K OLED 240 Hz in the near future. I know that at 4K I'll have to sacrifice some settings for more fps. If not the 5080, then the 6080 will probably change this. If the 4090 didn't cost nearly twice as much ($1,200 RTX 4080 vs $2,100+ 4090) in my country, I would go for it.
I am averaging 130 fps in Alan Wake 2, 330 in R6 Siege, and 125 in Six Days in Fallujah, all at max settings, running off a 14th-gen i9, DDR5, 6 TB of storage, and 64 GB of RAM.
I'm sorry, but you are really pushing this video in the wrong direction. Even while being aware that DLSS and FG are pushing you toward where you want to be, you're still so nostalgic that you want games processed like they were 10 years ago. In other words: please do the same test with a 4090, a card which costs nearly 2k nowadays and is literally miles ahead of everything the competition like AMD has to offer. After the test, please come back and let us talk. You will be shocked that even with a 4090 you won't get a stable 60 FPS native in games like Alan Wake 2. The graphics card market is changing a lot. Where raw power was once extremely important, nowadays upscaling, frame generation, etc. are getting more and more important. You can totally see this in games like Alan Wake 2 or even Hogwarts Legacy. The top-tier 4090 from Nvidia isn't able to run these games at 4K native, with no frame generation, at a stable 60 fps. Using DLSS/frame gen starts to make those games playable, and developers have really started to rely on this and aren't optimizing their games anymore. And you know what? These kinds of cases aren't happening for the first time in the video game industry. It is completely normal. In the 1990s, early 3D graphics faced significant hardware limitations, preventing real-time rendering of complex models and textures. Developers used pre-rendered backgrounds in games like "Resident Evil" and "Final Fantasy VII" to create detailed environments without overloading the hardware. Techniques like Level of Detail (LOD) and billboarding further optimized performance by reducing polygon counts for distant objects and using 2D textures for certain elements. These creative solutions, similar to modern upscaling and frame generation, allowed the industry to deliver impressive gaming experiences despite technical constraints.
I was asking myself some days ago whether to buy the 4080 SUPER or an AMD card which is approx. 20% cheaper and has more raw power. I decided to go for the 4080 anyway because I think the next KPIs for graphics cards will be software. DLSS/frame gen from NVIDIA is absolutely insane, and they have done very well in the last years. AMD lags in ray tracing, and AMD FSR is getting better and better, but it is still miles behind DLSS imo. In conclusion: native 4K is dead, or maybe was never alive; rendering is simply way too slow currently. It might be that they will fix that, but currently the KPIs for good graphics cards are moving away from raw power / native graphics and toward software! The RTX 4080 is a great 4K card and I can't wait to see what NVIDIA will bring next.
I'm a big fan of DLSS and use it all the time. DLSS Quality on a 4K monitor is my preferred setting. Balanced mode is acceptable when gaming on a budget GPU like the 4070 SUPER at 4K, but I start to have issues with the image quality when using top-tier GPUs. DLSS Performance simply isn't good enough for a $1K+ graphics card. I've had the chance to use an RTX 4090 and know its capabilities firsthand. I might need to get my hands on one again for a more in-depth video. While resolution upscaling and frame generation are undoubtedly the future, I won't accept a significant drop in image quality. The visuals need to justify the high price tag. Otherwise, I might as well opt for a less powerful 70-class GPU and accept more compromises at 4K.
@Garrus-w2h I'm talking about the maximum settings one wants to play at... and that's not 1080p. I'm talking 4K res. Let's see how your AMD card performs in path-tracing games without DLSS or FG. The AMD FSR functionality is ages behind. Also, please consider your monthly electricity bills.
Wise choice. If you already have the 4070 then it's not going to feel as good swapping to a 4080 or 4090 at full price right now. I had a good time using the 4070 even at 4K thanks to DLSS. What about you?
Yeah, it is true, and a sad story. I moved from a 4070 SUPER to a 4080 SUPER hoping to crush all the new games in 4K, but you can forget that; the 4080S is just a 1440p card, and the 4090 can do 4K upscaled. In this generation, at current prices, only two graphics cards are worth buying: the RTX 4070 Ti SUPER at £750 for 1440p, and the RX 7900 XT at £650 for 1440p. One gives you the option of ray tracing, the other doesn't. The rest are just overpriced.
Noo, it's still fine at 4K. Maybe just don't use path tracing in all your games? OK? Wait until GPUs that can actually run path tracing come out.
That is true, but the 4080 is more of a 1440p card, not a true 4K one. That being said, these game developers are getting lazy, releasing games that aren't ready and run terribly on a lot of cards, as long as the games run great on the PS5 or Xbox.
@@0mnikron702 I don't understand how people are calling the 4080 SUPER a 1440p GPU, like WTF? This card literally came out this year, and 80-class GPUs were always 4K high/ultra-settings GPUs. For $1,000 you'd expect the 4080 SUPER to run games at high settings with RT enabled at 60 FPS.
Is it not reasonable to want to play these games if they exist? I ended on a positive note by showing that there are more reasonable games out there (in terms of performance). Not all is lost. But we are being pushed towards using DLSS in some games. Which is not a good thing if we have to use Balanced or Performance DLSS on a 4K screen.
@@Milkykt Alan Wake 2 is not poorly optimised. If you want to call a game poorly optimised, then refer to a game like Jedi Survivor or Ark Survival; that's what you call poorly optimised. Alan Wake 2 is just extremely demanding, and its graphics are insane with path tracing enabled. It's actually well optimised now: hardly any frame spikes, minimal stutters, no constant FPS fluctuation, not a massive difference between average FPS and 1% lows, and GPU load is at 99% most of the time. All of that usually means a game is well optimised. Can't say the same about games such as Jedi Survivor or Ark Survival, unfortunately.
I sold my RTX 4070 SUPER to buy an RTX 4080 SUPER, and I regret it. Sure, performance is higher than my 4070 SUPER, but the graphics settings stay the same (e.g. Cyberpunk 2077, 4K, DLSS on Quality, RT on, path tracing off, with the optimized settings from Hardware Unboxed: volumetric fog medium, screen space reflections medium, clouds medium, color precision low) just to get an average of 80 fps in 4K. So what's the point of buying a new, more expensive card if the resulting graphics settings are the same? I can't raise those medium settings to high or ultra for smooth gameplay; I tried once and the FPS dropped hard. With the RTX 4070 SUPER I averaged 68 fps, and with the 4080 SUPER, 80 fps at the same settings. Visually the same, just a bit more FPS, nothing "WOW!" about it, especially in triple-A games. 68 fps and 80 fps feel similar to me; neither is choppy in motion. Maybe I'm the one who's wrong here, but that's how it feels. Especially with FG on, both cards play the game smoothly at 70-90 fps, but the 4080S is more expensive, bigger, and draws more power (temps are more or less the same, avg 65 C). Or maybe I expected too much from the 4080S. And I can't say the 4080S looks sharper with FG on; it's all the same blurry visual.
What about the RTX 4090's performance uplift compared to the RTX 3090? If you're referring to the rest of the lineup, though, I agree: the performance gains haven't been particularly revolutionary. I don't expect NVIDIA's strategy to change for the RTX 50-series. The 5090 will get a huge uplift and the rest will be just normal.
It's funny how many people have this "waiting" mindset. You have no guarantee of being alive when the next line of cards releases. You need a GPU NOW to enjoy games; you can always upgrade later and sell your old GPU for extra cash.
@@iLegionaire3755 I have bought a 4080 super since this comment and you're absolutely right, it's a baller graphics card for any game on the market these days. Barely breaks a sweat on graphically demanding games on high settings from my experience.
7:00 That is not native, it's native + DLAA, so it is extremely hardware-intense. PS: And you are wrong, DLSS is active, as we can see in your settings, simply set to DLAA... but you are right about frame gen being deactivated. EDIT: Alan Wake II does not support native resolution; it forces upscalers. Instead of picking a setting that increases performance, you selected the most demanding one... 😅 Also, your Cyberpunk 2077 version is not the updated one; the options are different. What platform did you get it from? I can't see that in the video.
GPUs are very costly, and they don't even perform as expected. I bought a new 4070 Ti SUPER 16 GB with every last penny I saved over the last few years, and the result is not satisfying at all 😢
The 4080 Ti was artificially removed from the 4000 lineup so it wouldn't eat into 4090 sales; I know it, it's just 1+1. The 4080 SUPER is slightly better than the 4080, while the 4090 and 4080 are miles apart in games such as Red Dead Redemption 2, and I game at 1080p. So what people do is say, "Okay, then I must buy the 4090." If the 5000 series once again releases without a 5080 Ti, you know exactly why. The XX80 Ti was always the flagship, and the XX90 was for enthusiasts, usually with a larger RAM capacity. Nowadays that's not the case. It's money-driven, so the gap between the 4080 SUPER and the 4090 is immense.
You need to upgrade your SSD to Gen 5, and definitely to more than 1 TB of storage. You'll also get much better gaming performance with a Ryzen 7 7800X3D instead of Intel.
I'm looking forward to upgrading to the 9800X3D later this year. However, I don't see any significant benefits to swapping my i7 for the 7800X3D right now. The overall performance difference isn't substantial. While there are some games that benefit greatly from the larger cache, they seem to be relatively few based on my research.
That's right, the 4080S is very disappointing in performance value. Relying on DLSS/DLAA means running the game at a lower internal resolution, using less VRAM to push more FPS but sacrificing graphics quality. I play on a 65" 120 Hz display, and DLSS looks bad. And HDR looks far better than path/ray tracing combined, without sacrificing performance.
It's the guy's system. I run a 4080 SUPER and get nearly 3 times the frames he does; the only difference is I'm using a 7800X3D and 64 GB of RAM. Intel CPUs are really bad right now, which hurts me to say because I've been exclusively "blue team" for 20 years now, but the 7800X3D smokes every Intel CPU in gaming right now. The 4080 SUPER has no problem reaching 100+ fps at 4K ultra in every game aside from, like, Cyberpunk and Alan Wake 2; most games are closer to 200-300 fps on it. The problem is made a little worse by devs all making games assuming everyone will automatically be using DLSS and frame generation. But don't stress if you bought a 4080 SUPER; this guy's vid is hugely misleading. I own one and easily go above 100+ at 4K ultra in basically everything. I play Cyberpunk at 1440p, but it still looks amazing on a 27-inch monitor; the pixel density is still great.
His camera angle is insane, making his setup look bad. The curved monitor does not look good, and if I were judging by the monitor, I would pass. First impressions are everything.
Well, a few days ago I installed Crysis 3 from 2013, and my RTX 4070 SUPER paired with a 5700X3D cannot run it at max settings at 4K 144 Hz (80-100 fps). An 11-year-old game!! It should be 144 fps all the time, right?????
@@theivadim It would be awesome if you could test old triple-A games from 10+ years ago, when there was no DLSS or FG, and see how modern GPUs do. Thanks for the feedback and keep up the awesome content!
Exactly. I hate videos that say 4080s and 90s are pointless for 1080p. Exotic textures, path tracing, and ray tracing will utilize the card, and you won't melt your 16-pin connector at 1080p.😂
Well, drop to 540p native and problem solved. RT Overdrive is a problem even for the 4090, and the 5090 probably won't perform well enough for your tastes either. 🤣 That being said, it all depends on your settings + performance needs for the 4080 SUPER as a 4K card.
Thank you! Pulling the proverbial trigger on an Aero 4080 SUPER today, pairing it with a 7800X3D. Wanted to wait for the 8000 series, but those will be scalped as well.
I have it, and in Microsoft Flight Simulator it struggles if I run everything on ultra at 3440x1440. I have to use frame gen; without it, I'm barely reaching 40 fps in some places.
All these comments defending this 4070 with a 4080 sticker on it are missing the point. Yes, the card does output these low numbers with RTX on, but RTX is one of its main selling points. It's overpriced for what it does, simple as that.
It was made for 1440p gaming, and every single GPU in the 40 series is like this; even the 4090 would be getting like 60 fps, or maybe 70, in The First Descendant.
@@emy-m3i No it's not. NVIDIA marketed the 4080 SUPER as a 4K GPU. Also, spending $1,000 to game at 1440p is absurd. If you spend $1,000, you'd expect a smooth 4K experience with maxed settings.
@@laszlozsurka8991 If we are being fair, the 4090 in most of these games at 4K is barely doing better, and it's like 2,000 dollars. In Cyberpunk it still gets around 25 to 30 fps at max settings. No GPU is running any of these games at 4K max graphics, no DLSS, above 70 fps.
@@laszlozsurka8991 Yeah, it's not the GPU's fault, it's just really unoptimized. You can't get on the 4080 SUPER's case when the GPU that's $800 more expensive has the same problem.
This is not the GPU's fault. What a silly video. If a 4080 struggles, every card that's cheaper and less powerful will struggle harder. The performance is a reflection of the game you are trying to play.
BINGO! I just bought a 4080 SUPER and, coming from my 2060 SUPER, it's night and day. I'm pretty confident I can get another 5 years out of it, maybe even more! All this talk of "this card sucks": it's the 2nd fastest card out atm, so what does that say for everything beneath it?!😭 People are more upset about the pricing, but what market isn't inflated? Shoot, I got my 4080 for $40 off with a free game, so pretty much $100 of value, and I can't complain; I've been enjoying my rig so much more!
If the 4080 SUPER isn't powerful enough for PC gaming, you are effectively saying the only GPU anyone should even consider is the 4090, after selling both your kidneys on the black market to afford one, or worse, the upcoming 5090, which your grandkids will have to help pay off after you're gone.
A lot of people here are missing the point. I've used the 4080 SUPER for a while, and this guy is spot on: it's not as powerful as we make it out to be. People yap that this is a 4K card, but in reality it's a really great 1440p card. IMO graphics cards haven't evolved to a stage where they blow 1440p out of the freaking water with everything cranked up, including RT. That might sound like an insane thing to say, but it's the truth.
So he's not spot on at all then, is he? Because he said numerous times that it's not even a playable 1080p card. Not to mention being a good 1440p card. And all he's defining "unplayable" with is frame rate numbers when he's played games all his life at half the rates he's calling playable.
2024, why my newest GPU cant run 4K video game in 60fps? 2034, why my newest GPU cant run 8K video game in 60fps? 2044, why my newest GPU cant run 16K video game in 60fps? . . . . Both Xbox and PlayStation: ⚰⚰
The only thing I can think of is that you don't have your drivers updated, or, you know, your CPU could be shit. No one else has these issues, so you're wrong. "CybErPuNK Is An oLd GaME". Cyberpunk is as old as the PS5. Is that old? No.
Unlike what people said in the comment section, you are actually right. Even if you buy one of the most expensive GPUs, you are still not going to be able to play games on max settings.
I think that the concept of "max settings" is shifting.🙂 I bet that in 2-3 years most people will accept that DLSS Quality is a default setting. It'll probably be enabled in settings by default when you first load a game in 2027. I use DLSS Quality all the time at 1440p as well as 4K. The image quality is nice, you get Anti-aliasing included, and FPS is higher. Only Ws right there.
Lmao, I play everything in 4K max settings with a 4080 SUPER / Ryzen 7 7800X3D, and I have a YouTube channel with benchmarks. You can watch my Cyberpunk benchmark: 4K resolution, maxed-out settings, ray tracing on, getting a solid 90 FPS. In games like Spider-Man Remastered at 4K I get a solid 120-130 FPS, RE4 Remake also gets 120 FPS, Black Myth: Wukong 80 FPS, Red Dead 2 max settings 90-100 FPS.
I think we need an RTX 6090 to run fully path-traced games at 4K native 60+ FPS, considering NVIDIA only improves ray tracing performance by up to 70% on average every generation (of course, DLSS and FG can boost it to over 200 FPS).
People won't be happy even then. Even 20 years from now they will still be complaining because PT makes games run at 1000 instead of 2000 FPS at 1080p. Then again, maybe not. By the release of the 7000 series, few shall remember the times before upscaling.
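The generational math in the comments above can be sanity-checked with a quick sketch. Assuming the commenter's figure of roughly 70% more ray tracing performance per generation (an assumption, not a measured number), this estimates how many generations a ~20 FPS native path-traced result needs to reach 60 FPS:

```python
# Hypothetical back-of-envelope: if native path-tracing FPS improves
# ~70% per GPU generation (the figure claimed in the comment above),
# how many generations until ~20 FPS at native 4K reaches a target?
def generations_needed(fps_now: float, fps_target: float, gain_per_gen: float = 0.70) -> int:
    gens = 0
    fps = fps_now
    while fps < fps_target:
        fps *= 1.0 + gain_per_gen  # compound the per-generation gain
        gens += 1
    return gens

print(generations_needed(20, 60))  # 3 generations under this assumption
```

Under that assumption, two generations out (20 × 1.7² ≈ 58 FPS) lands just under 60 and the third clears it comfortably, which roughly matches the "RTX 6090" guess; a smaller per-generation gain pushes the answer out quickly.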
Am I the only one that is still amazed by real-time ray tracing? To me, that technology only existed in rendered anime movies that took years to make.
Lol, it's the second-best GPU you can buy right now. It crushes the 7900 XTX in even light ray tracing workloads, disproportionately to the latter's raster wins. And now even AMD admits ray tracing has become a feature focus of games moving forward, so it can't just be ignored. It is what it is: the overpriced second-best graphics card right now. Nvidia has no competition, so I don't expect them to try to blow everyone away with the 50 series either. They don't have to do much to stay on top. That's why monopolies suck.
It might be a faulty card. I hope it's a faulty card, because if not, then NVIDIA is a joke for providing such a shitty card for $1,000. Perhaps the 7900 XTX would be more suitable for 4K.
What do you think about this?🙂☝
I think that ray tracing is a gimmick no one needs, and certainly not fake frames to compensate for its infancy.
No single-player game will live long enough for it to matter, and no multiplayer gamer will ever use it.
I think companies should focus on pure performance, not this shit. Also, players are making a mistake by buying such cards, which are extremely overpriced.
I agree that ray tracing is mostly a gimmick. Maybe it will be more useful in 5 years.
@@zazo.. Cyberpunk might live long enough looking at those online player numbers on Steam alone.🙂 I assume that mods keep people coming back to this game.
@@larsenmats ray traced reflections look so nice tho. I don't care about lighting realism. But reflection make me want to enable ray tracing every time.
next video: 4090 Is Officially A 720P Monitor!
This is not a card issue it's a game optimization issue. Devs are getting lazy with optimization because of more powerful hardware.
💯💯💯💯 I totally agree 👍
He is using ray tracing on ultra btw.
I don't see a problem with giving people settings options that go higher than current cards can handle. It's actually what drives hardware makers to give us more and more powerful products. What if there had never been a Crysis?
But the 4080 can most definitely handle Cyberpunk at 4K 60 FPS using path tracing. The price you pay is slight artifacts and barely noticeable input lag. People who downplay AI upscaling as if it's something sacrilegious really are spoiled beyond belief. They want perfect picture quality with high framerates and no compromises, even though path tracing in real time was a pipe dream 5 years ago.
But things like The First Descendant don't look anything groundbreaking like Crysis was. This looks like a cookie-cutter game.
@DrKrFfXx000000000000 Diminishing returns for sure. Just like 8K, it's hard to make things look any better than they already do. I just played through Horizon Zero Dawn... that shit looked great, and it's like 8 years old at this point.
@@snakeace0 I got a 4080 SUPER and I'm even like, yeah, it's pricey, but its performance is a beast to me coming from a 2060 SUPER! I don't understand all the talk; it performs well! It plays anything I throw at it at 4K!
My RTX 4080 is sitting here at 4K hitting 120 FPS easily. Seems to be a you issue.
yep
leather jacket suckers xD
He's using max graphics and max RT.
That's bullshit. I have a 4080 SUPER and an i7-14700KF, and running the game on max graphics with no ray tracing on an ultrawide monitor, it barely hits 100 FPS in most locations, lmao.
@@finnishpuppy Depends on the game you are playing. I'll upload some videos of my games in 4K and let you decide. But yes, many multiplayer games do get between 80 and 100 FPS, though I have not had any issues in 95% of games.
I have a Gigabyte 4080S and an R7 7800X3D.
It can run everything at 4K 60-100 FPS with native graphics, and with DLSS even more.
I ordered 4080 super TUF yesterday, am5 x670-E TUF, Ryzen 7800X3D proc, CORSAIR - VENGEANCE RGB 32GB (2x16GB) DDR5 6000MHz C36, WD - BLACK SN850X 2TB. Can't wait to see what I am going to see on my end.
@@diegoriveira3579 That's a bomb setup, ideal for 4K imho. What monitor/TV are you going to use with this, I wonder.
@@complex4059 My monitor is a 3-year-old 1440p Dell, but I will be upgrading to 4K.
The 4080S runs 4K perfectly fine.
I am calling bullshit on this video, the RTX 4080 Super can handle Path Tracing with DLSS.
It seems like you don't comprehend what path tracing even is, and why it's so expensive to run. 5 years ago, you couldn't even dream of running a game like Cyberpunk in real time using path tracing. It is future technology that's already implemented. DLSS with ray reconstruction and frame generation was specifically designed to give the GPU enough headroom to play it at 4K 60 FPS using path tracing. The fact that it's even possible is mind-boggling to anyone who knows what amount of calculations are being made per second in real time. If you want your pure rasterized experience, then turn off ray tracing. If you want ray tracing, then turn on DLSS.
In fact, DLSS Quality mode actually looks better than native these days thanks to a much more stable image, especially for thin and distant objects. The reluctance to use Upscaling is the mark of someone stuck 10 years in the past.
"The reluctance to use Upscaling is the mark of someone stuck 10 years in the past." This, man. Makes my blood boil every time. Most of these people couldn't tell the difference in a blind test, either.
Upscaling is the best joke the snake oil sales department ever developed. All the tricks to mask the lacking power of today's GPUs lower the image quality, in many cases massively. Ridiculous to render an image with radiosity, global illumination... only to finally let it take a bath in vaseline.
@@lorsheckmolseh3345 In the last review I saw, of 14 games, 9 looked equal or better with DLSS. So yeah, gtfo.
I know from the comment that you missed my point because you didn't watch the whole video. I do use DLSS and think it's great, especially DLSS 3's image quality. But that's not the main issue. The industry is changing rapidly with these technologies, and I'm wondering if it's a positive or negative development. Simply looking to start a debate to learn what the community thinks while sharing my own thoughts on the matter.
@@theivadim You enable DLSS but absolutely refuse to disable ray tracing, which is pretty much experimental tech absolutely not worth 60% of our FPS.
4K DLSS-Q looks equal to or better than native 4K (but instead you go straight to DLSS-P, which looks like crap), and without RT you would have ZERO performance issues. I should know, I play at 4K DLSS-Q 60+ FPS on a 3090.
It's not the GPU's fault. No one optimizes games anymore. Every new game is made with DLSS and FG in mind. I'm sick and tired of this. Pure laziness from devs.
fr
Using the tech to push games further, I'm fine with. The fact that some games are being made with these in mind (what about older systems?) just to get 60 FPS (MH Wilds) is bad.
Using these, won't this keep the price of consoles down as well? They won't really need to [try] to compete with PCs, and everyone can play games that look and run great.
I know the choose-my-side people will always put down what they rejected; there is no stopping that, it seems.
Path tracing is demanding no matter how well your game is optimized. Even old games like Quake 3 with path tracing tax GPUs a lot.
I swear to god, I have a 7800X3D + 4080 SUPER and 32GB DDR5 6000MHz, running Cyberpunk with G-Sync on a 4K TV. Everything on Psycho including RT, excluding path tracing, every single possible option on ultra/psycho at 4K. I've also modded it with 4K textures, and running DLSS Quality + FG I get a very stable 94 FPS.
I just quit the game a few minutes ago after a 4-hour session.
Maybe I have that Intel CPU instability causing performance degradation🙂 The PC that I'm using is equipped with the i7 14700KF. My Ryzen PC is a bit weak for the 4080 SUPER. It has the 7600X. Or maybe our settings are not 100% matched.🤷♂️ Or maybe something else entirely.
Obviously you and I both have, because our CPUs aren't bottlenecking GPU performance like in this guy's video. 80 degrees of CPU heat at 15% utilization, lmao, and he asks where his FPS are? Evaporated in that heat.
I just ordered almost the same setup, plus an AM5 X670-E TUF and a 4080 SUPER ASUS TUF, as well as 32GB DDR5 6000MHz. Let's see what I'm going to see on my end.
@@diegoriveira3579 what did you see ?
He has a point: why can't even one high-end graphics card meet the criteria to use all its features? Even the 4090.
Yeah bro, I have a 4080 SUPER and I've been playing Cyberpunk at 144 FPS with ray tracing. Frame generation is just the new standard and makes the game play much better.
It's called letting devs get lazy, and you're open to it.
My 4070 can run The First Descendant at around 48-52 FPS, all maxed out at 1440p. Compared to that, your 4080 seems underwhelming when the FPS difference is like 10 FPS.
The FPS-to-dollar ratio above the 4070 SUPER is a catastrophe. Without the usual price gouging, this old-fashioned part (192-bit/12 GB... where did I read that ray tracing is memory bound, 1950?) would cost 200 bucks... and everybody should buy the 4080 for $300 instead.
Totally agree! I have the 4070 Ti SUPER and I'm not that impressed with it. I want brute-force hardware power without frame gen. It's really disappointing.
I'm not really seeing unplayable games in this video. I'm just hearing a lot of fps number snobbery.
The RTX 4090 gets 20 FPS at 4K Ultra with Ray Tracing Overdrive, without DLSS, in Cyberpunk. I wonder if the RTX 5090 will be able to get 60 FPS without DLSS.
@@Marko-ij4vy From what I have seen, the 4090 gets around 25-30 FPS at native 4K max settings with RT Overdrive Mode in Cyberpunk.
what cpu do you have
In CP2077 I got 23-27 FPS with a 12900K (OC), a 4080 SUPER, and 64GB DDR5 on a 3840x1600 ultrawide at native res + path tracing. With DLSS Quality and frame gen it's ~80-100 FPS. Not path traced, but mostly Ultra RT, following the optimized settings with a bump here and there. This was on the DLC; the main game isn't as heavy for some reason.
These GPUs increase more in price than in performance, I'd say.
That's so weird. I play every game (Fortnite, GOW, GOT, The First Descendant, The Finals) at 4K with ray tracing, getting at least a stable 90 FPS.
Waiting for the 5090/5080 to swap in for my 3080.
Good call🙂
You're gonna have to upgrade your CPU too at that point.
@@theboy2777 Yeah, I'm getting a whole new PC. Also, for the first time I'm gonna join the Ryzen side. I'm done with Intel for now.
@@varios4883 My first AMD CPU left me disappointed, so I switched over to Intel; that was nearly 10 years ago. Now I'm switching back to AMD because Intel is leaving me disappointed😂😂
@@DeepDownInTheOcean Me when every side I switch to suddenly sucks (Intel 13th and 14th gen, and the 9000 series):
I have a 4080 SUPER and a Ryzen 7 7800X3D, and I get 90-120 FPS at max settings 1440p with ray tracing and path tracing in Cyberpunk, so I don't know what issue you were having.
The most surprising thing about getting a 4090 is that some games are really hard to run, to the point that I'm glad I don't have anything slower. I was originally looking for a 4080, and I'm glad I found a good deal on a used 4090 instead. Also, I have a pair of speakers like yours in the video, Edifier MR4, except mine are black, and they sound amazing.
I agree. Great speakers. I'm very happy with them.
I went from the 4080 to the 4090. I felt the 4080 was an OK card for 1440p, not a 4K gaming card, even if it can run some games at 4K; with these newer games coming out, it's gonna struggle at 4K. The 4090 does a much better job at 4K. I have 2 monitors, a Sony M9 4K 144 and an LG UltraGear 3440x1440 240 OLED, and I think the 4090 is the best at 1440p and will run newer games at 1440p fine for the next 2 gens.
Wait, you fucks are unironically spending 1.5 to 2k dollars on a GPU only to play fucking video games?
@@0mnikron702 You're gonna regret that upgrade when a 5080 outperforms your 4090 for less.
@@caribbaviator7058 for less? as in they will be cheaper in price?
Just turn off ray tracing; that solves most of your problems.
True. But we are paying premium to have ray tracing capabilities in this $1000 GPU. So, why would I not want to use it if I paid for it?
@@theivadim The problem isn't game optimisation, as some people think it is. It's just that current-gen hardware simply doesn't have the processing power to properly handle ray tracing. Most current ray-traced games run at a low ray count, which causes a noisy image with artifacts, and the performance is still low. Current-gen GPUs are just not powerful enough. More rays mean higher image quality, better lighting effects, and less noise and artifacts. Increasing the number of rays and bounces is the most effective way to increase RT/PT quality. The problem is that the higher the ray count, the more demanding it becomes. Video games still haven't unlocked the full potential of RT graphics. Hardware needs to become more powerful so that it can handle running video games at a higher ray count. Then RT would look way better without all the artifacts and noise you see in current ray-traced games. Even the 4090 isn't powerful enough to trace a high number of rays. However, if we are referring to raster performance, I think current-gen high-end GPUs have smashed that, even at native 4K ultra.
@@theivadim well yeah the market sucks but so does ray tracing.
@@theivadim RT is the biggest scam to overprice cards I've ever seen. Even though my GPU has it, I never use it, except for like 5 minutes when I first boot a new game to see how it looks. Snake oil, my friend, snake oil.
RT cores are there to be used.
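The ray-count point in the thread above is just Monte Carlo statistics: a path tracer averages random light samples per pixel, and the noise (standard deviation of the estimate) falls only as 1/sqrt(N). A toy sketch, with a made-up "scene" where each ray returns a random brightness (an illustration, not a real renderer), shows why quadrupling the ray count is needed just to halve the noise:

```python
import random
import statistics

# Toy Monte Carlo pixel: estimate brightness by averaging n random rays.
# The standard deviation of the estimate scales as 1/sqrt(n), which is
# why "just add more rays" reduces path-tracing noise so slowly.
def noisy_pixel_estimate(n_rays: int, rng: random.Random) -> float:
    # fake scene: each ray returns a brightness between 0 and 1
    return sum(rng.random() for _ in range(n_rays)) / n_rays

rng = random.Random(42)
trials = 2000
for n in (4, 16, 64):
    estimates = [noisy_pixel_estimate(n, rng) for _ in range(trials)]
    print(n, round(statistics.stdev(estimates), 4))
# the stdev roughly halves each time n quadruples (1/sqrt(N) scaling)
```

This is also why denoisers and DLSS Ray Reconstruction exist: cleaning up a low-sample image is far cheaper than tracing 4x the rays for every halving of noise.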
I ordered 4080 super TUF yesterday, am5 x670-E TUF, Ryzen 7800X3D proc, CORSAIR - VENGEANCE RGB 32GB (2x16GB) DDR5 6000MHz C36, WD - BLACK SN850X 2TB. Can't wait to see what I am going to see on my end.
Looks nice🙂👍 Enjoy!
Very good explanation and information in this video! I myself have an RTX 4070 Ti SUPER, which is still very expensive in my opinion. When you pay $800-1,000 for a high-end GPU, you expect to play the latest games without upscaling and with ray tracing set to max. Unfortunately, I think UE5 games and others on the latest graphics engines are made with upscaling and frame generation in mind, because natively they're just too demanding and complicated to run. Anyway, you're doing a pretty good job explaining to us enthusiasts how things are with the latest games and GPUs. Keep up the good work!
Thanks for your input. I always thought resolution upscaling would eventually become standard in games. DLSS is impressive now, and I use it in Quality mode most of the time. I'm surprised the shift is happening so quickly. I expected more discussion and debate about it. But I don't see the community discussing it. So, I thought that I'd start.
I agree. For a high-end GPU in the $800 to $1,000 price range, the GPU should give better ray tracing performance.
Every time someone says these new GPUs are overkill for 1080p, I laugh my ass off.
Of all the people I've met, you're the most intelligent in really seeing the GPU market. I've been commenting on these same things, and I'm very happy to find someone who explains the limitations of today's cards. Even a 4080S or 4090 has limitations, apart from RT, which, once you try it, never lets you go again (it's the true Ultra of games). Man, I'm very happy to see someone with a different take, showing the truth. Keep it up bro, hail from Brazil!
Thanks!🙂 I’ll keep exploring more GPUs in various scenarios to showcase the raw experience.🤝
My personal opinion is that since DLSS released, reviewers haven't found a good way to review graphics yet.
Obviously DLSS often even improves the image, or at least shifts the downside from, for example, TAA jagginess to smearing.
But my greater point would be that the overall image quality achieved should be the focus.
To elaborate: 4K DLSS-P will look, and probably even run, better than 1440p native. So saying a card is no 4K card would be weird.
What about 4K DLSS-Q with FG? This will in most instances run and look just fine.
Most likely an overall way better experience than 4K native.
To sum my point up: the number of variables makes it tedious to test and almost impossible to properly compare graphical fidelity from one game to another, but those variables can give us a visual experience we couldn't otherwise have.
Thanks to DLDSR/DLSS/FG/Reflex+Boost, I play most games at 5K/60 FPS with a 4080. They look breathtaking and feel (to me) just fine.
Indeed. That is why I don't want to make industry-standard reviews. Because showcasing various graphics settings + resolution upscaling and frame generation is important now. It's a huge part of the experience that simply can't be overlooked.
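For readers wondering what "4K DLSS-P vs 1440p native" in the exchange above actually compares, here is a small sketch using the commonly cited per-axis DLSS render-scale factors (Quality ~2/3, Balanced ~0.58, Performance 0.5). Treat the factors as approximate assumptions, not values from this video:

```python
# Approximate per-axis render-scale factors commonly cited for DLSS modes.
DLSS_SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before DLSS upscales to output."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALES:
    print(mode, internal_resolution(3840, 2160, mode))
# Quality (2560, 1440), Balanced (2227, 1253), Performance (1920, 1080)
```

So 4K DLSS Quality renders internally at 2560x1440 and 4K DLSS Performance at 1920x1080, which is why comparing those modes against 1440p or 1080p native, as the comment does, is a natural framing.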
First of all, you need to post your computer specs. Just the video card doesn't mean you have the system to be able to use the video card fully; there are things called CPU bottlenecks, RAM speed, all kinds of stuff that can bring down your FPS.
Full PC specs are always in the description. I always pair GPUs with appropriate hardware unless stated otherwise for experiments such as "Can the Ryzen 5 7600X handle RTX 4080 SUPER" which can be found on my channel.
I have a 7800X3D and a Gigabyte 4080 SUPER OC, and I get 80-100 in this game at 4K. Maybe there is something wrong with yours???
What game?
cope more leather
lol bs
Of course it's not... the 4080 SUPER is just a re-released 4080 with about 5% more cores.
great video
real review
literally showing live gameplay and settings, displaying how underwhelming a $1K GPU is
Thank you for sharing your thoughts. I appreciate the support!🤝🙂
I’ll still get this over the 4090.
I've had a 4090 since Oct 2022, from Zotac, which is considered one of the worst brands. Crazy good GPU; I paid $1,700 for it and it's still the same price. Zero problems, and better than the 4080 by almost 30% in FPS.
@@GFE. At 70% of the price.
So what do you suggest for an i9 14th gen?
Are you asking for GPU recommendation? The RTX 4080 SUPER or 4090 are the best graphics cards you can get right now.
@@theivadim Please suggest one; I am also confused 😕
Great video and a valid point! Nvidia fanboys are going to be mad, I guess 😂. The point the video is making is that if you are paying around $1.2K+ for an RTX 4080 (non-SUPER) or $1K for an RTX 4080 SUPER, getting sub-60 FPS, and being forced to use frame generation to get a 60+ FPS experience on a "flagship" GPU, then yeah, something seems a bit fishy. To all the people saying "tweak the settings a bit"... bro, the 4080S is meant to be a flagship card. I understand that ray tracing might be demanding, but if you are paying that much, I would expect to run the latest games at or near max settings and still get above 60 FPS with all its features, without being forced to use frame generation. I used an RTX 2080 for 5 years and only recently swapped to an RX 7900 XTX (just because it was a great deal at £765), and I had doubts after my purchase about whether I'd missed out on some features, but yeah, thank you for making a valid point. The only bit of remorse I have, I guess, is the higher power consumption of the RX 7900 XTX compared to the RTX 4080S, but other than that, thanks for the great video and keep up the great work!
Do you use FSR upscaling?
Just upgraded from a 1050 Ti to a used 1660 SUPER and the difference is amazing. So sad to see even a 4080S struggle with something it was made for... Epic.
Yes, it would be a poor upgrade over 1660 Super
@@GameslordXY bro is on somthin'
I'm glad to hear that you're enjoying your new GPU🙂 Thanks for checking out my video🤝
@@BigBoy-ql5rn
Yeah.
I wonder which chemicals he's consuming?
Selfishly keeping them for themselves
@@GameslordXY Hmm... Mmmm.... Oh!
Pardon me, I was absorbed in thought.
The RTX 4080 aged like milk 🥛. I thought the 4080 would be reduced to a 1440p card, but this test shows that for modern games (UE 5.x type, RT, etc.) it is just a 1080p card. Hope the 5080, with its GDDR7 VRAM and new cache, will be a game changer for RT workloads.
Just as NVIDIA increased the number of shaders and RT cores, they lowered the max TDP of the new AD106. A strategy also found in their laptop series: many mobile models (up to the 4080m) are sold as 150W parts but barely draw 100 under strong load.
Doubt that these GPUs only increase in price? Look at the Silent Hill 2 remake, lol. Not even 60 FPS at ultra settings. These cards are getting cooked.
Frame generation is a must nowadays
Now people will start defending NVIDIA in the comments, but let's be honest, NVIDIA is asking a lot of money for a graphics card that is the second in their ranking for the current generation. Its raw performance is barely enough for 2K, and to play in 4K, you must use DLSS and frame generation. And for those who say to just turn off ray tracing, at 4K with maximum graphics settings without ray tracing in games like Alan Wake 2 and Hellblade 2, it won't deliver stable 60 frames per second.
With DLSS Quality it will, while looking BETTER than native. Not using DLSS at that resolution is just honestly a bit retarded.
Not defending Nvidia. They are a monopoly executing typical monopolistic activity. But the truth is its the second best GPU out right now. There’s nothing forcing Nvidia to make anything better between it and the 4090. They aren’t going to compete with themselves. The 7900XTX is not a real threat, especially with developers including AMD sponsored titles like Avatar are including baked in raytracing. This is the reality of an Nvidia monopoly moving forward until someone can force them to make better products. But still, you can’t say the 2nd best GPU is bad because its still literally the second best GPU. By that standard, everything except the 4090 is bad. The only way the 50 series midrange will be great is if AMD/Intel kill in that segment forcing Nvidia not to gimp it like they did for the 40 series.
Nobody cares about nor plays Alan Wake 2. It's a boring walking simulator that is slow with barely any action. The 4080 SUPER is amazing at most games.
Love your videos bro. no nonsense. only facts
I'm glad you like them. Your viewership and support are greatly appreciated!🤝🙂
I don't know if you're using a PCIe 5.0 NVMe, but if you are on this MSI Tomahawk board, it pulls the PCIe on the GPU down to 8 lanes. It would make these tests invalid.
The SSD is gen 4
I hate to disagree, I mean, I can appreciate the effort put into this, but like some others say, I get almost 200 FPS at certain points with my 4080 SUPER in 4K maxed-out graphics, plus some extras. 155-170 is average; still high though.
Every comment starts out enthusiastic, and then we find out they used DLSS. I have a 4080 SUPER OC, and it is a bit disappointing at times, tbh. I'm not trying to mess with DLSS on a brand-new GPU; it just doesn't look as good with it. A 4070 can be OC'd close to a 4080, so it's not a bad deal.
We are far away from GPUs doing native path tracing at 4K 60. In 10 years, maybe, we will have that kind of power.
I think that 4K gaming is still too taxing even without all fancy features like Ray tracing.
I just bought the MSI 4080 SUPER Slim version. I play on my old 27" 1440p 144Hz TN monitor; at this resolution I can turn everything to max, even without frame gen. I want to upgrade my monitor to the Alienware 32" 4K OLED 240Hz in the near future. I know that at 4K I'll have to sacrifice some settings for more FPS; if not the 5080, then the 6080 will probably change this. If the 4090 didn't cost nearly twice the money in my country ($1,200 for the RTX 4080 vs. $2,100+ for the 4090), I would go for it.
At 1080p-1440p you're probably CPU limited with a 4080super.
I am averaging 130 FPS in Alan Wake 2, 330 in R6 Siege, and 125 in Six Days in Fallujah, all at max settings, running an i9 14th gen, DDR5, 6TB of storage, and 64GB of RAM.
i9 14th gen. what a sucker.
I'm sorry, but you are really pushing this video in the wrong direction. Even while being aware that DLSS and FG are pushing you toward where you want to be, you're still so nostalgic that you want to process games like they were processed 10 years ago.
In other words: please do the same test with a 4090, a card which costs nearly 2k nowadays and is literally miles away from everything the competition like AMD has to offer. After the test, please come back and let us talk. You will be shocked that even with a 4090 you won't get a stable 60 FPS native in games like Alan Wake 2, etc.
The graphics card market is changing a lot. Where raw power once was extremely important, nowadays upscaling, frame generation, etc., are getting more and more important.
You can totally see this in games like Alan Wake 2 or even games like Hogwarts Legacy. The top-tier 4090 from Nvidia isn't able to run these games at 4K native, with no frame generation, at a stable 60 FPS. Using DLSS/frame gen starts to make those games playable, and developers really started to rely on this and aren't optimizing their games anymore. And you know what? These kinds of cases aren't happening for the first time in the video game industry; it is completely normal. In the 1990s, early 3D graphics faced significant hardware limitations, preventing real-time rendering of complex models and textures. Developers used pre-rendered backgrounds in games like "Resident Evil" and "Final Fantasy VII" to create detailed environments without overloading the hardware. Techniques like Level of Detail (LOD) and billboarding further optimized performance by reducing polygon counts for distant objects and using 2D textures for certain elements. These creative solutions, similar to modern upscaling and frame generation, allowed the industry to deliver impressive gaming experiences despite technical constraints.
I was asking myself some days ago whether to buy the 4080 SUPER or an AMD card which is approx. 20% cheaper and has more raw power. I decided to go for the 4080 anyway, because I think the next KPI for graphics cards will be: software.
DLSS/frame gen from NVIDIA is absolutely insane, and they have done very well in the last few years. AMD lacks in ray tracing, and AMD FSR is getting better and better, but is still miles away from DLSS imo.
In conclusion: native 4K is dead, or maybe never was alive; rendering is simply way too slow currently. It might be that they will fix that, but currently the KPIs for good graphics cards are moving away from raw power / native graphics and toward software!
The RTX 4080 is a great 4K card, and I can't wait to see what NVIDIA will bring next.
I'm a big fan of DLSS and use it all the time. DLSS Quality on a 4K monitor is my preferred setting. Balanced mode is acceptable when gaming on a budget GPU like the 4070 SUPER at 4K, but I start to have issues with the image quality when using top-tier GPUs. DLSS Performance simply isn't good enough for a $1K+ graphics card. I've had the chance to use an RTX 4090 and know its capabilities firsthand. I might need to get my hands on one again for a more in-depth video.
While resolution upscaling and frame generation are undoubtedly the future, I won't accept a significant drop in image quality. The visuals need to justify the high price tag. Otherwise, I might as well opt for a less powerful 70-class GPU and accept more compromises at 4K.
Not just games. There are many AI models that run only on NVIDIA GPUs. AMD is way behind.
@Garrus-w2h I'm talking about the maximum settings one wants to play at... and that's not 1080p, I'm talking 4K res. Let's see how your AMD card performs in path-traced games without DLSS or FG. The AMD FSR functionality is ages behind. Also, please consider your monthly electricity bills.
I was wanting to sell my 4070 and buy one, but I think I'll wait till the 4090 gets cheaper when the 50 series comes out😊
Wise choice. If you already have the 4070 then it's not going to feel as good swapping to a 4080 or 4090 at full price right now. I had a good time using the 4070 even at 4K thanks to DLSS. What about you?
@theivadim Ya, I'm happy. It's my first PC, with an AMD 7700 CPU.
Wait till the 5070.
I agree. GPUs in this price range should be able to run 4K at 60 FPS or more in 90 to 95% of current games.
It literally does; this guy is just yapping.
Yeah, it is true, and a sad story. I moved from a 4070 SUPER to a 4080 SUPER hoping to crush all the new games in 4K, but you have to forget about that. The 4080S is just a 1440p card; the 4090 can do 4K upscaled.
In this generation, at the current prices, only 2 graphics cards are worth buying:
RTX 4070 Ti SUPER, £750, for 1440p
RX 7900 XT, £650, for 1440p
One gives you the option of ray tracing, the other doesn't. The rest are just overpriced.
No, it's still fine at 4K. Maybe just don't use path tracing in all your games, ok? Wait until GPUs come out that can actually run path tracing.
@@takik.2220 Not enough VRAM already in Ratchet & Clank and Callisto Protocol; Jedi Survivor is on the edge too.
I don't care for ray tracing. My 4080S crushes any game at 4K max settings without any RT.
So you cherry-pick the most unoptimized games? The 4080 is an insane card. What a useless video.
That is true, but the 4080 is more of a 1440p card, not a true 4K card. That being said, these game developers are getting lazy, releasing games that aren't ready and run like shit on a lot of cards, as long as the games run great on PS5 or Xbox.
@@0mnikron702 I don't understand how people are calling the 4080 SUPER a 1440p GPU. Like, WTF? This card literally came out this year, and 80-class GPUs were always 4K high/ultra settings GPUs.
For $1,000 you'd expect the 4080 SUPER to run games at high settings with RT enabled at 60 FPS.
Is it not reasonable to want to play these games if they exist? I ended on a positive note by showing that there are more reasonable games out there (in terms of performance). Not all is lost. But we are being pushed towards using DLSS in some games. Which is not a good thing if we have to use Balanced or Performance DLSS on a 4K screen.
@@0mnikron702 It is a certified 4K card, bud. It's insane how y'all want the 4090 to be the only "true" 4K card when that's TOTALLY FALSE AND WRONG!
@@Milkykt Alan Wake 2 is not poorly optimised. If you want to call a game poorly optimised, refer to something like Jedi Survivor or Ark Survival; that's what poorly optimised looks like. Alan Wake 2 is just extremely demanding, and its graphics are insane with path tracing enabled. It's actually well optimised now: hardly any frame spikes, minimal stutters, no constant FPS fluctuation, no massive gap between average FPS and 1% lows, and GPU load at 99% most of the time. All of that usually means a game is well optimised. Can't say the same about games such as Jedi Survivor or Ark Survival, unfortunately.
I sold my RTX 4070 Super to buy an RTX 4080 Super, and I regret it. Sure, performance is higher than my 4070 Super, but the achievable graphics settings stay the same. For example: Cyberpunk 2077 at 4K, DLSS Quality, RT on, path tracing off, with the optimized settings from Hardware Unboxed (Volumetric Fog: Medium, Screen Space Reflections: Medium, Clouds: Medium, Color Precision: Low), just to average 80 FPS. So what's the point of buying a new, more expensive card if the settings end up the same? I can't raise those Medium settings to High or Ultra and keep smooth gameplay; I tried once and the FPS dropped hard. The 4070 Super averaged 68 FPS and the 4080 Super 80 FPS at the same settings: visually identical, just a bit more FPS. Nothing "WOW!" about it, especially in triple-A games; 68 FPS and 80 FPS feel similar to me, and neither is choppy in motion. Maybe I'm the one who's wrong here, but that's how it feels. Especially with FG on, both cards play the game smoothly at 70-90 FPS, but the 4080S is more expensive, bigger, and draws more power, with temps about the same (avg 65°C). Or maybe I expected too much from the 4080S. The performance is higher, but the settings and visuals are the same, and FG isn't any sharper on the 4080S either: same blurry image.
Glad you know that this was a pretty stupid decision.
Almost bought a 4080. Dang, you made me hold on to my 3090.
I'm shocked that the RTX 4080 Super can't handle high RT settings at 1080p.
Yes it can?
Yes it can; it most definitely can at 4K as well, just not path tracing or Nanite.
CPUs have seen the real gains in the past 4 years compared to GPUs. The last good GPU performance boost was the 3080, IMO.
What about the RTX 4090's performance uplift compared to the RTX 3090? If you're referring to the rest of the lineup, though, I agree: the performance gains haven't been particularly revolutionary. I don't expect NVIDIA's strategy to change for the RTX 50-series. The 5090 will get a huge uplift and the rest will be just normal.
So WTF do I get if not a 4080 Super...
Wait for the 5080 or 5090
This video is total bogus; the NVIDIA RTX 4080 Super crushes games with ray tracing on or off.
It's funny how many people have this "waiting" mindset. You have no guarantee you'll be alive when the next line of cards releases. You need a GPU NOW in order to enjoy games, and you can always upgrade later and sell your old GPU for extra cash.
@@iLegionaire3755 I have bought a 4080 Super since posting this comment, and you're absolutely right: it's a baller graphics card for any game on the market these days. Barely breaks a sweat on graphically demanding games at high settings, from my experience.
7:00 — that is not native, it's native + DLAA, so it is extremely hardware-intensive. PS: And you are wrong, DLSS is active, as we can see in your settings, simply set to DLAA... but you are right about Frame Generation being deactivated... EDIT: Alan Wake II does not support native resolution but forces upscalers. Instead of picking a setting that increases performance, you selected the most demanding one... 😅 Also, your Cyberpunk 2077 version is not the updated one; the options are different. What platform did you get it from? I can't see that in the video.
GPUs are very costly and they don't even perform as expected. I bought a new 4070 Ti Super 16 GB with every last penny I saved over the last few years, and the result is not satisfying at all 😢
You will stay poor forever with that mindset.
In conclusion, the 4080 Super is a slideshow card. As a hardcore PowerPoint user, I suggest this card.
That card is built for 1440/4K and not for 1080p
The 4080 Ti was artificially removed from the 4000 lineup so it wouldn't eat into 4090 sales, I know it.
It's just 1+1.
The 4080 Super is slightly better than the 4080, and the 4090 and 4080 are miles apart in games such as Red Dead Redemption 2, and I game in 1080p.
So what people do is say, "Okay, then I must buy the 4090."
If the 5000 series once again releases without a 5080 Ti, you know exactly why.
The XX80 Ti was always the flagship, and the XX90 was for enthusiasts, usually with larger VRAM capacity. Nowadays, that is not the case.
It's a money driver, so the gap between the 4080 Super and the 4090 is immense.
You need to upgrade your SSD to Gen 5, and definitely more than 1 TB of storage. You'll get way better gaming performance with the Ryzen 7 7800X3D instead of Intel.
I'm looking forward to upgrading to the 9800X3D later this year. However, I don't see any significant benefits to swapping my i7 for the 7800X3D right now. The overall performance difference isn't substantial. While there are some games that benefit greatly from the larger cache, they seem to be relatively few based on my research.
That's right, the 4080S is very disappointing in performance value. Relying on DLSS/DLAA means running the game at a lower internal resolution, using less VRAM to push more FPS but sacrificing graphics quality. I play on a 65" 120Hz display, and DLSS looks like crap. And HDR looks far better than path/ray tracing combined, without sacrificing performance.
80°C CPU at 15% CPU utilization?! Whatever is cooling your CPU, it isn't doing its job. Fix the CPU cooling first, then run benchmarks.
It's the guy's system. I run a 4080 Super and get nearly 3 times the frames he does; the only difference is I'm using a 7800X3D and 64 GB of RAM. Intel CPUs are really bad right now, which hurts me to say because I've been exclusively "blue team" for 20 years, but the 7800X3D smokes every Intel CPU in gaming right now. The 4080 Super has no problem reaching 100+ FPS at 4K ultra in every game aside from the likes of Cyberpunk and Alan Wake 2; most games are closer to 200-300 FPS on it. The problem is made a little worse by devs making games on the assumption that everyone will automatically be using DLSS and frame generation. But don't stress if you bought a 4080 Super; this guy's video is hugely misleading. I own one and easily go above 100+ FPS at 4K ultra in basically everything. I play Cyberpunk at 1440p, but it still looks amazing on a 27-inch monitor; the pixel density is still great.
That mouse sensitivity is insane💀
His camera angle is insane, making his setup look bad. The curved monitor does not look good, and if I were judging by this view of the monitor, I would pass. First impressions are everything.
I've been saying this since the base 4080 dropped, and everyone laughed at me.
Well, a few days ago I installed Crysis 3 from 2013, and my RTX 4070 Super paired with a 5700X3D cannot run it at max settings at 4K 144Hz (80-100 FPS). An 11-year-old game!! It should be 144 FPS all the time, right?????
True. Something is fishy🙂 Could be a game engine limitation of some sort. Did you check GPU utilisation to see if it is 100% used?
@@theivadim It is at 98 to 100% usage. The game runs at around 120 FPS at high settings, though.
Indeed, I'd expect more FPS from such an old game that doesn't look particularly fresh by 2024 standards🙂
@@theivadim It would be awesome if you could test old triple-A games from 10+ years ago, when there was no DLSS or FG, and see how modern GPUs do. Thanks for the feedback and keep up the awesome content!
@@necrodaft thanks for the tip. I'll add this idea to my "maybe" list.🤝
Exactly. I hate videos that say 4080s and 90s are pointless for 1080p. Exotic textures, path tracing, and ray tracing will utilize the card, and you won't melt your 16-pin connector at 1080p.😂
😄
I call BS. I have a 7950X3D with a 4080 Super Expert. It plays Cyberpunk in 4K with the RUT mod at 75 FPS.
Well, drop to 540p native and problem solved.
RT Overdrive is a problem even for the 4090, and the 5090 probably won't perform well enough for your tastes. 🤣
That being said, it all depends on your settings and performance needs.
4080 Super as a 4K card
What is your cpu please
14700KF
Thank you, pulling the proverbial trigger on an AERO 4080 Super today, pairing it with a 7800X3D.
Wanted to wait for the 8000 series, but those will be scalped as well.
Good choice👍 Enjoy!
I have it, and in Microsoft Flight Simulator it struggles if I run everything at ultra at 3440x1440.
I have to use frame gen. Without it, I'm barely reaching 40 FPS in some places.
All these comments defending this 4070 with a 4080 sticker on it are missing the point. The card does output these low numbers with RT on, yeah, but RT is one of its main selling points. It's overpriced for what it does, simple as that.
It’s not bad ?
A $1,000 GPU that can't even max out 4K above 60 FPS... this is why the GPU market is screwed and no one is interested in new GPU releases.
It was made for 1440p gaming, and every single GPU in the 40 series is like this; even the 4090 would be getting like 60 FPS, or maybe 70, in The First Descendant.
@@emy-m3i No, it's not. NVIDIA marketed the 4080 SUPER as a 4K GPU. Also, spending $1,000 to game at 1440p is absurd.
If you spend $1,000, you'd expect a smooth 4K experience with maxed settings.
@@laszlozsurka8991 If we're being fair, the 4090 barely does better in most of these games at 4K, and it's like $2,000. In Cyberpunk it still gets around 25 to 30 FPS at max settings. No GPU is running any of these games at 4K max graphics, no DLSS, above 70 FPS.
@@emy-m3i If $1,000 only gets you a 1440p experience, then you know the GPU market is screwed and game devs need to optimize their games better.
@@laszlozsurka8991 Yeah, it's not the GPU's fault, the game is just really unoptimized. You can't blame the 4080 Super when the GPU that's $800 more expensive has the same problem.
This is not the GPU's fault. What a silly video. If a 4080 struggles, every cheaper, less powerful card will struggle harder. The performance is a reflection of the game you are trying to play.
BINGO! I just bought a 4080 Super, and coming from my 2060 Super it's night and day. I'm pretty confident I can get another 5 years out of it, maybe even more! All this talk of "well, this card sucks" when it's the second-fastest card out right now? Then what does that say about everything beneath it?😭 People are more upset about the pricing, but what market isn't inflated? Shoot, I got my 4080 for $40 off with a free game, so pretty much $100 of value, and I can't complain. I've been enjoying my rig so much more!
@@lightest2385You nailed it. I will never understand the mindset of some people.
If the 4080 Super isn't powerful enough for PC gaming, you are effectively saying the only GPU anyone should even consider is the 4090, after selling both your kidneys on the black market to afford one, or worse, the upcoming 5090 that your grandkids will have to help pay off after you're gone.
A lot of people here are missing the point. I've used the 4080 Super for a while, and this guy is spot on: it's not as powerful as we make it out to be. People yap that this is a 4K card, but in reality it's a really great 1440p card. IMO graphics cards haven't evolved to the stage where they blow 1440p out of the freaking water with everything cranked up, including RT. That might sound like an insane thing to say, but it's the truth.
Well said!
It's a 3070 successor, bro. A midrange card priced at 80 Ti value.
Pretty sure it's a 3090 successor.
Not as powerful as what? The 7900 XTX?
So he's not spot on at all then, is he? He said numerous times that it's not even a playable 1080p card, let alone a good 1440p card. And all he's defining "unplayable" with is frame rate numbers, when he's played games all his life at half the rates he's now calling playable.
I never use ray tracing. I do not need that.
Did you overclock it?
The fact that the "4080" is really a 4070 and the "4090" is in fact a 4080 Ti doesn't make it better.
This entire video is a skill issue.
Scam. $1,000 USD = 4K 140 FPS, and 80 FPS with highly demanding path tracing.
2024: why can't my newest GPU run a 4K video game at 60 FPS?
2034: why can't my newest GPU run an 8K video game at 60 FPS?
2044: why can't my newest GPU run a 16K video game at 60 FPS?
.
.
.
.
Both Xbox and PlayStation: ⚰⚰
There is no GPU that can run high frame rates with ray tracing unless you are using upscaling software, which is cheating to get a higher frame rate.
Dude, it's not a card problem. Are you serious?
If you try the same with a 4090 you will only get 10 more FPS, so these tests make no sense.
The only thing I can think of is that you don't have updated drivers, or, you know, your CPU could be shit. No one else has these issues, so you're wrong. "CybErPuNK Is An oLd GaME"? Cyberpunk is as old as the PS5. Is that old? No.
Unlike what people are saying in the comment section, you are actually right. Even if you buy one of the most expensive GPUs, you're still not going to be able to play games at max settings.
I think that the concept of "max settings" is shifting.🙂 I bet that in 2-3 years most people will accept that DLSS Quality is a default setting. It'll probably be enabled in settings by default when you first load a game in 2027. I use DLSS Quality all the time at 1440p as well as 4K. The image quality is nice, you get Anti-aliasing included, and FPS is higher. Only Ws right there.
Lmao, I play everything in 4K max settings with a 4080 Super / Ryzen 7 7800X3D, and I have a YouTube channel with benchmarks. You can watch my Cyberpunk benchmark: 4K, maxed-out settings, ray tracing on, getting a solid 90 FPS. In games like Spider-Man Remastered at 4K I get a solid 120-130 FPS, RE4 Remake also gets 120 FPS, Black Myth: Wukong 80 FPS, and Red Dead 2 at max settings 90-100 FPS.
Lol, it's most likely getting choked by the CPU... I've got a 4080 Super and an i9-12900K and don't have those problems at all...
Nah, it's a 1440p GPU.
Lock your FPS and turn on frame generation.
FPS limit + frame generation is a bad mix. It doesn't work well. I know because I tried it in many scenarios.
@@theivadim Actually, it does work well with Lossless Scaling, for example, but only while locking your FPS to half.
My 4080 Super is just chilling in my PC getting 150 FPS in that game, so yeah bro, it's a you problem.
I think we need an RTX 6090 to run fully path-traced games at 4K native 60+ FPS, considering NVIDIA only improves ray tracing performance by up to 70% on average each generation (of course, DLSS and FG can boost it to over 200 FPS).
People won't be happy even then.
Even 20 years from now they will still be complaining, because PT will make games run at 1000 FPS instead of 2000 FPS at 1080p.
Then again maybe not.
By the release of the 7000 series, few shall remember the times before upscaling.
@@GameslordXY I think the upcoming RTX 5080 can run path-traced games at 4K 60 with DLSS Balanced, no FG! I'm waiting for it to launch to buy it!
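Taking the ~70%-per-generation uplift figure from this thread at face value, here's a quick back-of-envelope sketch of how many generations it would take to hit native path-traced 4K 60. The ~20 FPS baseline is my own assumption for illustration, not a measured number:

```python
# Hypothetical compounding-uplift estimate. Assumptions (not measured):
#   baseline_fps = 20.0  -> rough native 4K path-tracing figure today
#   uplift       = 0.70  -> ~70% RT gain per generation, per the comment above

def generations_to_target(baseline_fps: float, target_fps: float, uplift: float) -> int:
    """Count how many generations of compounding `uplift` are needed
    for `baseline_fps` to reach `target_fps`."""
    gens = 0
    fps = baseline_fps
    while fps < target_fps:
        fps *= 1.0 + uplift  # one generation: 20 -> 34 -> 57.8 -> 98.3 ...
        gens += 1
    return gens

print(generations_to_target(20.0, 60.0, 0.70))  # -> 3
```

Under these assumed numbers it takes three generations, not two, because two 70% jumps from 20 FPS only reach ~58 FPS. Whether that lands on a "6090" or later depends entirely on the baseline you pick.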
4080S is just fine for me. Y’all too thirsty.
for me everything looks right
Am I the only one still amazed by real-time ray tracing? To me, that technology only existed in rendered anime movies that took years to make.
Turn off fucking Ray Tracing
Lol, it's the second-best GPU you can buy right now. It crushes the 7900 XTX in even light ray tracing workloads, disproportionally to the latter's raster wins. And now even AMD admits ray tracing is becoming a feature focus of games moving forward, so it can't just be ignored. It is what it is: an overpriced second-best graphics card right now. Nvidia has no competition, so I don't expect them to try to blow everyone away with the 50 series either. They don't have to do much to stay on top. That's why monopolies suck.
No one is talking about AMD in this video, you little defending wiener boy.
It might be a faulty card. I hope it is a faulty card; if not, then NVIDIA is a joke for providing a shitty card for $1,000. Perhaps the 7900 XTX would be suitable for 4K.
It’s not faulty 😉
7900 XTX would literally average less than 10 fps in all of these path traced games at 4k.
@@theivadim Why are you winking at him, weirdo?
@@adolfo777nica Bro, it is not that deep 😭🙏
@@kraithaywire Nope, the 7900 XTX is 10 percent faster than the 4080 Super on average.