I was alerted to my Black Myth Wukong performance possibly being too high. I reviewed my full capture and settings used, and it is correct. However, after an uninstall and re-install of the game the performance is much lower, and the only way I can replicate the performance here in BMW at 1440p is if I use 1080p. I have removed the 1440p results for Black Myth: Wukong so the timestamps are now all over the place. I will fix them in due time.
@@Abenteuerlich that sounds like bs... but it doesnt matter...dlss looks like shit most of the time with trailing and especialy on lower resolutions with perfomance on, the clarity of the picture if severely lowered... so yeah you get more fps but also get a smudgy image with weird pixel trailing.... doesnt seem like a move forward to me... ray traycing still needs some time.
@@dender6816 you are not the first noob who has no idea how DLSS works. DLSS does nothing other than use pixel information from already rendered frames, the AI to evaluate how long the pixels can be used for each frame, even DLSS Performance ( 4xUpscaling ) has 8 times the pixel information of the input res from 1920x1080p.
@Abenteuerlich i mean...i didnt say that i dont know what it does..."noob" i said that it doesnt matter cuz it looks shit anyway and that it needs more time..
I'm sorry, but it doesn't, i've tried path tracing at 4k and looks absolutely worse then regular ray tracing, the lightning may be more precise but the loss of quality and mushyness of textures when moving is not worth it
@@tibusorcur user issue, path tracing looks amazing for me. Are you on amd by any chance, those gpus cant path trace at all, and it doesnt even look right, nvidia intended path tracing to be used with ray reconstruction.
If you can't get AT LEAST 60 fps maxed out with the latest top of the line hardware then the game is not ready for release. I don't care what anyone thinks
Path tracing is optional though, and extremely heavy. Even the 5090 probably wont do 60 fps at 1440p native with path tracing enabled, unless the 5090 sees a 50% improvement in ray tracing / path tracing over a 4090.
@@GankAlpaca I see the point you are making but it doesn't apply to 99% of games. Look at DOOM Eternal, you can max it out with ray tracing at 240fps on modern hardware and it looks great. Then you look at these UE5 slop games that hardly look better and can't hit 60fps on a $5k PC
$1200 GPU branded as a Ray Tracing card Running a game at 720p with the advertised feature Or doing 1440p30 "4K gaming" by Nvidia Ray Tracing is gonna end up like Physx, it will have to stop being a marketing tool and actually be integated reasonably without most people even knowing about it
that's honestly what I was thinking, I made the same reference talking to another friend. I'm sure in the next 5+ years we're gonna have a lot of games require RT hardware like Indiana Jones and it'll just be there without even thinking about it.
@@LateNightFire One thing to note, Indiana Jones is using Vulkan raytracing. Actually after deep diving this Vulkan (open gl) is very close to DX12, they can use the same physical hardware on gpus to do these calculations faster. Note, I learned last night also that rdna 3 doesn't actually have 'tensor cores' in the same sense as matrix multipliers, as rdna 3 has WMMA, or wave matrix multiply-accumulate. Its definately a different design But yeah, I feel you. This is whats getting popular in the tech space and will be bigger and bigger until they hit roadblocks
RTX 3000 was second gen of RT card and it make nice working RT instead of RTX 2000, now RTX 4000 is first gen of path tracing GPUs, so waiting for RTX 5000 to have good PT performance (good means 60+ fps with DLSS Q)
For the viewers, in Black Myth Wukong you can get an fps boost using 21:9 aspect ratio to get you over the edge if you need a bit more FPS in a given scenario.
Yeah, 16:9 60fps to 21:9 72fps is a nice 20% boost to overall fps, allowing for a higher dlss preset or higher settings, or just the outright fps boost alone its a neat trick and pretty cinematic as a game with higher fov as a bonus 👍
@@kerkertrandov459 you'd get the opposite effect of burn in, "reverse" burn in, oleds are organic, which means they degrade with use over time, degradation basically = less brighness, in this case those pixels would be permanently brighter than the rest given enough time. If you do it for long enough for it to actually make a big difference, once you play/look at something that isn't 21/9 you'll see bright bars where the black bars normally are. We're talking extreme use cases here, of course, but even if you just do it for say, 100 hours.. if you're sensitive enough or you're looking at the kind of image that makes that apparent, you'll start to see it. If you did it exclusively for 1000 hours from new it's basically like the center of your screen has 1000 hours of use and the top and bottom have zero so it'll pretty much be apparent under any circumstance.
people dont get that Ray Tracing/Path Tracing is literal Movie Tech that took Pixar like 20 servers rendering stuff for days on end. Still does. It will never be fit or ready for gaming probably ever without AI frame gen stuff.
Yeah, path tracing is exceptionally heavy, and even ray tracing. The correct term is real-time ray tracing, and that should give people an idea, but many still dont realize just how computationally expensive it is to calculate these light rays and bounces on the fly. For ray tracing upscaling is required in most games, especially games that make use of per-pixel ray tracing. And then path tracing needs frame generation as the amount of light bounces increase the cost exponentially.
The answer is no, it is not! and I strongly advise not buying the 5080 16GB. Here's what I found: first, my setup is 7800X3D, TUF 4080 Super, 32GB DDR5 6000Mhz, CL30. So in Indiana Jones the great circle. I'm getting 130 FPS at 4k DLSS performance, frame gen on, ray tracing high (not ultra). As soon as I switch to DLSS balanced, I ran out of 16GB VRAM and get 3 FPS... So even before the release of 5080, 16GB is not enough to run DLSS balanced with high ray tracing! This is completely unacceptable, imagine spending thousands and get 3 FPS on DLSS balanced with ray tracing... what a joke
Every generation of Nvidia card in the RTX series has nearly doubled RT performance and 30%-50% raster and I doubt it’s gonna slow down in the next gen either.
Normal Ray-Tracing: The RTX 4000 series absolutely is just a powerhouse. Path-Tracing(REAL RT): The RTX 4000 Series gets brought to its KNEES! So y'all remember when the RTX 2000 Series came out and could barely do Ray-Tracing when it first came out and you could maybe only get away with turning on ONE RT Setting and still get 60FPS. I think the RTX 4000 Series cards are at the same level the RTX 2000 Series cards were at when it comes to PATH-TRACING, you can maybe turn on ONE PT Setting but turn them all on and your GPU will be brought down to it's KNEES. So we might not really see RTX Cards that can actually crush Path-Tracing and get a crap-ton of performance until the RTX 6000 Series Cards come out, the same way we didn't get GPU's that can do Ray-Tracing at 120 FPS until the RTX 4000 Series cards came out.
Are they really, though? I played AW2 at 1440p full path tracing, high-ultra optimised with DLSS Q, and it ran perfectly fine with 90-100fps. 120-140 if I enabled FG, too
@kerkertrandov459 Honestly, I've never tried 4k on a smaller screen, but 1440p feels more than clear enough for me right now, and I can definitely tell the difference between 60fps and 120-150. I might invest in a 4k oled in a few years when high refresh 4k is a real thing and the monitors are more affordable
@@fredstead5652 with a 5000 gpu u can quadruple 4x ur fps thanks to 3 fake frames per 1 real frame, so u only need 60 base fps at 4k to cap out ur 240 fps oled 4k monitor, and dlss can help u with that, the technology is here, unfortunately 4x frame gen it will only work in 5 or so games but hopefully devs add it to more games
I wonder how a console will look like for it to run something like cyberpunk at ultra ray tracing with path tracing. And what the cost of a console like that would look like too.
I understand why people would use a 4K display for video editing, color grading etc but for gaming it’s just not worth it ! Adding path tracing on top of that isn’t feasible without Frame generation ! I would recommend switching to 1440p DLSS Q + path tracing or 4k DLSS Balanced Ray-traced/rasterized !
The graphical fidelity is very very impressive. I've been waiting for 50 series for a while now, and I was hoping to grab a 5090 for 1440p high-refresh rate gaming. I play a lot of different titles, from competitive shooters at 360Hz to single-player titles on a controller, but I mostly tend to enjoy M+K gameplay and 100+fps I feel is the minimum for a decent experience on mouse input. I feel like the 4090 does not deliver that on all games, sadly. That said, I bit the bullet and got the 4090. I should've gotten it sooner, as the gen is ending, but such is life in a 3rd world country. My country's currency might devalue considerably in the coming months. I can't expect 5090 pricing to be anywhere near decent on launch or for at least 3-5 months from its launch, even if US MSRP is acceptable. And when it drops in price, well, might as well bite the bullet once again, as long as it delivers a transformative experience and I can sell the 4090 for close to what I paid for (which surprisingly is lower than US MSRP converted, although not by much). I'm eagerly awaiting its arrival, as I've kept myself from playing a few games that I was anticipating to have a "full" experience. CP2077 and AW2 included, but more recently STALKER 2.
I have a 3090ti, i just started playing Cyberpunk and it can run 1440p with path tracing on at highest settings with DLSS. Pretty surprised at how they managed to get playable PT in this open world game!
Yeah, it's crazy. It started out as a tech demo and now we have fully playable path tracing if you use a bit of DLSS. And the best part is Cyberpunk has very, very few stutters. Sure, it took them a little while to get here, but it is a very fluid open-world experience now.
Who in his right mind can say that 30 / 40 fps is enough? I stepped up recently from 60 hz 1080p to 165hz 1440p and the difference is outstanding. 60fps is minimum.
movies are at 24fps is just a habit. steady 30fps are perfectly fine, in game cinematics it is locked at 30fps and people don't even realize it, I bet you didn't. Of course, more is smoother and, but it is not a necessity, it's only a habit, it is a necessity only for PVP and VR. I prefer photorealistic/eye candy 30fps to medium 60+fps
I have an RTX 4080 laptop and I'm happy to compromise playing at 1080p to get smooth 60fps with path tracing. When I want a smooth 4K 60+ fps, I switch to regular RT or turn it off entirely for DLAA or DLSSQ and enjoy that sharp clear image. It's great either way and I'm happy
Great video, I feel like you read my mind with the things you test sometimes. I've had the opportunity to use a 4070ti 12gb again recently and I the only path tracing game I tried so far is Portal RTX since that is 'free' on steam. To play that at 4k I also had to go to DLSS Ultra performance + FG. With that game Ultra Performance DLSS isn't the worst looking because of the art style and basic graphics, but for more detailed games like in this video it probably wouldn't look the best, even though DLSS UP has improved in recent DLSS versions. The Ultra Performance also comes in handy in terms of controlling the VRAM and ray reconstruction is also really good because it seems like the exact case it was designed for. Ultra Performance DLSS also helps you generate more frames with DLSS FG because of the lower internal res, so everything kind of comes together. Still it's sad that only the 4090 can pretty much do path tracing at a somewhat acceptable quality at 4k, and based on the rumors for next gen, we will only get maybe 2 more GPU's on the level of the 4090 or higher. Maybe there will be more architecture improvements that let something like a 5070 do path tracing much easier than current GPU's.
Thanks for always watching and commenting, I appreciate it 🙏 I believe we might see some decent changes in RT performance with the 50 series. Seeing that there are so many games coming out with RT, and some you cant disable it, I think a big boost to RT performance is required. I am very curious about the 5070, but if it only has 12GB VRAM I'll probably end up getting a 5070 Ti for the channel.
@@lsnexus2080 Yeah, the 5070 leaked specs look kinda bleak. I think it will probably fall about 4070 Ti performance in raster and maybe 4080 performance in RT, if they improve the RT performance quite a bit. Ideally the 5070 should match the 4080 at least, in raster and RT, but based on the leaked specs that's not going to happen.
Great video! Really detailed and thorough. A few notes would be that some combat etc should probably have been tried with these settings, as usually frames drop there, and that the capture quality seems just a tad on the blurry / low bitrate side. Or maybe it's just my eyes idk. Thanks anyway!
Any help on the capture side would be appreciated! I use an Elgato 4K60 Pro MK2 Capture card with OBS, and I record at 75mbps 4K CBR using NVENC. I tried H.264 as well but the quality is similar. When exporting in Adobe I also use 75mbps CBR at 4K, but other people's videos do look better than mine and I honestly dont know what I'm doing wrong?
Also, appreciate the feedback. Constructive criticism is always welcomed, as long as people arent yelling and being condescending, which you werent, so thanks for that!
@@Mostly_Positive_Reviews You need 100mbps bitrate for 4k recordings at 60FPS. If your output refresh rate is above 60hz I'd recommend 125-150mbps for bitrate - huge file sizes but end result is way less broken up on YT
@@SolidBoss7 Thanks, I'll give it a shot. A friend of mine uploads 5K videos at 75mbps and his videos look a lot sharper than mine, with a lot less breakup. The only difference is he records locally using ShadowPlay and I use an external PC with a capture card. I think I am missing some setting in OBS as when I use the Elgato Capture Utility at the same bitrate it looks a lot sharper as well. But unfortunately that utility has a lot less features so it's not really viable for me to only use that. I will try upping the bitrate and upload a test video, see how it goes, thanks!
At 4k you need to use DLSS performance plus FG on top of that to have play with smooth fps with PT. On my RTX4080S in black myth wukong I get 86fps (built in benchmark) at 4K DLSS performance + FG, very high settings and full RT. The game is perfectly smooth and responsive and with the latest 3.7.2 DLL the image quality looks like 4K native to my eyes. At 1440p DLSS quality is enough to run these PT games at over 60fps. In the cyberpunk at 1440p DLSS quality with PT I get around 70 fps, and 110-120fps with FG on top of that. With psycho RT (standard RT) I get 55-70fps at 1440p native, 110-120fps DLSS quality, and 150-170fps. IMO the RTX4080 is enough to play PT games as long you use DLSS features.
Yeah, I'm also not opposed to using DLSS features to get a decent playable experience. In fact, I use DLSS Quality and Frame Generation in basically every game that supports it. I'm able to max out my monitor's refresh rate of 165hz that way, and in games where I'm able to get more than that I cap it and it then provides a decent reduction in power draw.
@@Mostly_Positive_Reviews IMO on RTX series cards it makes no sense to play at native at all. DLSSQ is enough to match TAA native image quality, and sometimes DLSS can even beat the native image. What I however like to do is to combine DLSS balance (or even performance) and DLDSR to get even better image quality (nothing beats downsampling) without performance penalty. In Cyberpunk at 4K DLSS balance + DLDSR 2.25x with ultra RT I get 10fps more compared to native 1440p and the DLSS image quality looks way better (4K like).
@@PabloB888 Yeah, there are very clever tricks that can be used with DLDSR and DLSS combined. I did find that Ratchet and Clank at 4K DLDSR downscaled to 1440p had a huge performance penalty for me on the 4070 Ti, but I am convinced it was a VRAM issue. I tested Ratchet and Clank recently on the 4070 Super and even at 1440p Very High with RT it runs out of VRAM, even before using frame generation.
@cosmicheretic8129 Nvidia recommends at least 40fps to use DLSS FG, and based on my tests, that's the necessary minimum for this technology to work properly. If the game is running below 40 fps, the image starts to stutter like hell, but once you get into the 40 fps range, the image is smooth as butter. On my RTX4080S OC I rarely see 40fps dips though, even in Alan Wake 2 PT. I usually get 60-80fps without FG, but sometimes in the deep forest performance can dip below 60fps and that's when DLSS FG is doing it's magic. Thanks to DLSS FG, I no longer have to worry about sub 60fps dips because I no longer notice them and I always get smooth gameplay.
Here is why ReSTIR DI/GI is so expensive TODAY. The base cost of these techniques are hight because you start tracing rays per pixel. If you play in 4K DLLS Performance mode that mean you trace for each pixel in 1920x1080 grid. Now for example CP2077 shoots 6 ray per pixel up to 2 bounces for each fram of that grid. Now you do the math on it how many calculations that is... The great advantage of this is that although today it is heavy even on the best GPUs this cost is much more scalable then any other technique meaning, that ReSTIR DI/GI traces against everything withing BVH and camera view. If RTX 50series will again double the speed in which rays can be calculated that means you will in theory get double the performance. So RT is actually smart way of how to light the scenes and I wouldn't blame developers wanting to leverage that. With that being said it is important to understand that not everygame necessarily needs RT lighting or reflections. From what I have seen so far the biggest impacts are when the game uses a lot of dynamic lights, changing time of day or dense forests. Other than that you can get away with pre-baked lighting and game will look great in the still shots but as soon as you introduce any dynamic objects in the scene (torches, lights, particles, muzzle flashes...) it will start to show it's videogamy look. My point is that the RT feature have this sort of flat cost which means that future GPUs will not neccesarily need to be more powerful to get the same results and focus might shift from lighting to simulating physics or A.I. for example.
I agree, PT preset in CP2077 does not worth it, since it introduces too big drop in GPU performance. But it is possible to get so-called "optimized" PT settings via PT "Ultra Plus mod". There are quite a few settings here. I personally was using PT20 preset with "Fast" quality. I also enabled custom DLSS scaling at 60% and preset "E". This way I was getting locked 60 fps at 1440p/PT preset/RR (no FG) on my 4070 Ti. The image quality was pretty decent on the 1440p screen. Average performance with this mod was roughly on par with base "RT Ultra" preset, but thanks to enabled PT lighting the game looks quite a bit better in some scenes (especially indoor ones). 4080 can run CP2077 with this mod at either higher internal res (67% should be double) or higher quality settings (adjusted through the mod menu). So, I'd say PT is definitely playable on 4080 S at 1440p. You just need to tune hidden graphics settings a little bit (via the mod).
The mod does wonders for sure. I tested it back in the day on my 4070, havent tested it on the 4080 yet. Will do so soon, because if the performance is similar to RT Ultra it's basically where I'll play the game.
You need FG for PT games at 4K. On my RTX4080S I play Black Myth at 4K DLSS performance + FG, very high settings and max PT I get 86fps in the built in benchmark and 80-100fps during gameplay. With FG input lag is even lower (48ms vs 60ms) because the game is using nvidia reflex and I cant see any artifacts, so there's no point playing without FG. With the latest 3.7.2 DLL and mod that removes excessive sharpening this game has even DLSS performance looks like 4K native to my eyes. With these settings this game looks and runs amazing, so I cannot complain.
my preference is 4k on DLSS performance, and no frame gen. It's normally 45-55fps, which I can live with. but it just shows that the only cards really capable of path tracing at 4k 60fps+ with good rendering are 4090/5080/5090, which is sad but the math requirements are insane so understandable. The exception is Indiana Jones, which runs at 4k, Dlss quality, path tracing at 55-80fps, which is sweet! And honestly it's the best game of the lot.
i see that u have a little of screen tearing have you tried the recommended settings for nvidia gpus? on control panel u should activate gsync on full and windowed screens , vsync on and low latency on ultra those settings help a ton on the latency while using framegeneration.
The capture card is only 60 hz, hence the reason for the screen tearing. If I enable v-sync all benchmarks would be stuck at 60 fps, but I do use g-sync + v-sync on my monitor when not recording 👍
I can play cyberpunk with 70+ fps at 2560*1600 resolution on my Rtx4080 laptop with DLSS quality and frame gen. Everything else is maxed out except for motion blur and crowd density.
yes when vram isnt a question , 50/58/67/77 for render resolutions you can apply dlaa on top of dlss in most games as well if you were wondering. Also frame generation is designed to be turned on at 60fps thats not an opinion its just a fact its not gonna run correctly if its below 60fps or you turn vsync on gsync also does weird shit with it to.
Seeing that it's about 2 weeks before announcement I'd say it's better to wait. What we have seen so far from the leaked specs the 5080 wont outright beat the 4090, at least in rasterization, but we'll have to see what new features are released and also what RT performance looks like. It's possible the 5080 comes close to 4090 in raster, and slightly beat it in RT, but then also cost much less than the 4090 currently does.
movies are at 24fps is just a habit. steady 30fps are perfectly fine, in game cinematics it is locked at 30fps and people don't even realize it, I bet you didn't. Of course, more is smoother, but it is not a necessity, it's only a habit, it is a necessity only for PVP and VR. I prefer photorealistic/eye candy 30fps to medium graphics 60+fps
I'm so confused. Can someone explain why with 4k DLSS Performance in CP77 he had 50fps or so on average, but with Ultra Performance he gained 20-30 frames? I thought UP renders it at 720p, so how could it give that many frames more than Performance which is 1080p render? I was expecting 10 frames at the most. That's usually how many you gain from each tier of DLSS
DLSS Ultra Performance is less than half the amount of pixels at 1080p. ~921000 vs ~2100000. DLSS Ultra Performance is the biggest reduction in resolution compared to other DLSS Presets. Quality = 67% Balanced = 58% Performance = 50% Ultra Performance = 33%
@@Mostly_Positive_Reviews A 5070 with 12gb would be very disapointing. A few years ago Nvidia released a paper on a new algorithm for vram compression, but i don't know if it was or will be implimented. Having an idea work on paper and actually using it are two different things. Also the PS5 Pro has an addition 2gb of ram, so maybe that will push Nvidia to up the vram in their cards.
One can hope for sure. I think it'll be a mistake if they go 12GB again, but we still have 8GB 4060 Ti's that released just recently so it wouldnt surprise me.
Compare image at 17:18 to image 18:17 and tell me this is a difference which will make your gaming experience any better? Honestly? Than compare the FPS... Its all just a joke, placebo, and a way to pull 1000s of USD from your pocket. And thats in one of the THREE games, where RTX actually makes some visual difference.
Hab ne 4080RTX OC und spiele auf nem 34"Uwqhd, IPS, 120hz, Curved, G-sync Pannel, meistens Dlss Qualität, weil mit dlaa in Uwqhd die 4080RTX zu schwach ist, wenn Pathtracing oder Raytracing aktiviert ist!
Hey man! I love your videos. I have a question. The BMW frames seem very, very high! I just cross-checked with multiple other videos on YT and even a 4090 doesn't average around 140-150fps at 1440p RT Very High (DLSS Q + FG). Every other 4080 video shows it averaging around 80-100fps. Could you re-check what's going on? Did the RT settings enable fully or is something else going on? Every other game seems inline with 4080's performance. BMW is the only outlier here.
Hey man, thanks for reaching out. Sure, I can check again, never hurts to double check. I need to re-install BMW again so will provide feedback in the morning, that okay?
@@Mostly_Positive_Reviews Sure man! Thank you so much. And I didn't mean to question your testing, was just curious haha. Love your videos and thank you for the quick response!
@@kraithaywire Not a problem at all. I have a slight suspicion as to what might be causing the differences but I want to double check to make sure. I just went through my full unedited recording to verify the settings and they are correct, and the game was restarted 4 times during my testing. Here's what I think happens, and I'll verify, but I use a 4K capture card for my benchmarks, so the PC detects it as a 4K monitor. Now, if I leave it on 4K Windows Desktop resolution and set the resolution in BMW to 1440p the framerate is quite a lot lower. I read up on it and it seems as if BMW tries to internally upscale to your desktop resolution. So for that reason I set my desktop resolution to 1440p when I test 1440p in that game. It's possible that others tested with the desktop resolution still at 4K. This is just a theory, the game is downloading again now so I can double check because I am curious now as well haha!
@@kraithaywire Okay, so I just retested it and you are right, the 1440p results are too high. The only way I can replicate these framerates is if I set it to 1080p resolution. But in the video you can see the resolution is definitely set to 1440p and DLSS slider is set to 100% resolution scale. The 4K results are the same though, so I have no idea how this happened. Thank you for bringing it to my attention, I have left a pinned comment and I am busy updating my timestamps. Edit: I managed to replicate the issue. If I set the resolution to 1080p and apply, and change it back to 1440p and apply, it doesnt seem to apply correctly. It shows 1440p in all the settings but the performance is identical to 1080p.
Great breakdown kind sir. Currently playing through Alan Wake 2 with my 4080s/7600x build and manage 60fps average with ultra settings and full pt. mind you my cpu and gpu are very well overclocked, I've squeezed everything i could out of both, but I have a quick question. How are you getting the overlay on the right that shows PCL? I dont see that setting in msi afterburner so i was justr curious. Thanks in advance!
Thanks for watching and commenting! The overlay on the right is the Nvidia App, not MSI Afterburner. It's not as granular as MSI Afterburner, but it does have some additional options that MSI AB does not have.
@@Mostly_Positive_Reviews Cheers for the quick reply. I actually don't use GFE, instead opting to just download drivers directly but ill be downloading for that feature, would be a good stat to be aware of. Thanks again!
@@RepublicofODLUM You're welcome. This isnt GeForce Experience though as I had issues with it showing average PC latency lately. The Nvidia App is supposed to combine the feature sets of both GeForce Experience and the Nvidia Control Panel. It's a separate download, you can just search for it on Google.
Sir i have same cpu as yours with 4070 super & Corsair 750w 80+ Gold PSU...in future am willing to upgrade to any 50 series card similar to 4080 in terms of power so can i upgrade or power supply will be an issue ?
My whole system with 2 monitors use about 600w, so probably just below 500W in total for the CPU and GPU + Fans and whatever under heavy load, so 750W will be fine. We'll have to wait and see what Blackwell's actual power draw is to be 100% sure, but I think it will be slightly lower than Ada, so it should be fine.
@Mostly_Positive_Reviews no 50 series wont be low power hungry as per leaks but with considerable performance boost such as 5080 will be 400w with 10% more performance than 4090 if rumors are true✌️
What's wrong with 39 FPS? It's definitely a playable frame rate, even if it's not fully optimal. For example, the PS5's Quality mode is locked at 30 FPS, yet the game is still perfectly playable, although not as smooth as 60 FPS or higher.
Nothing wrong with it, I know plenty people who are fine with playing at 30 fps. I'm just not one of them. I prefer a much higher framerate of about 120 fps, but it's all personal preference.
I play at 4k on a 4090 but I think if you have a 4080S at 1440p you have the best value for money high end experience as the leap to 4k and a 4090 is a big one cost wise. In some ways I kinda wish I had stayed on 1440p just because keeping up with 4k is always going to not be great bank for buck value
One of the reasons I am hesitant to go to 4K myself. Firstly I would need a decent hugh refresh rate 4K monitor, and over here they are pretty expensive. And then you need something to constantly drive it at that resolution. I actually wish I had forever stayed at 1080p60 🤣
@@Mostly_Positive_Reviews Lol yeah for sure. If you don't need to, don't. I think the leap from 1080p to 1440p is very noticeable but from 2k to 4k... not so much
@@MaTtRoSiTy I have a semi-decent 4K monitor here that I use for my benchmarks, but it is only 60hz, as I dont need more than that to show performance at 4K. But that 60hz is what is preventing me from using it as a daily, and then, as I said, maintaining a high framerate at 4K without too many sacrifices becomes very expensive, so I'll stick to 1440p now. Might go to 3440x1440p UW if I can get a good price on one of those monitors, but I am happy for now.
On Black Myth I think I had it on 55 Render with FG Cinematic, Very High PT Felt pretty good on those settings and looked great Feels like FG is a must for Path Tracing games 🤣
Yeah, FG launched pretty much with Cyberpunk's path tracing mode, so that was the idea behind it I believe. And 2 years later we have games using FG as recommended to hit 60 fps hahaha!
@@Mostly_Positive_Reviews 🤣Yeah it does make you laugh.Weirdly it seems to change everything on Cyberpunk because at base fps without FG with Path on. It feels really unplayable and slow, but once you turn on FG it feels a lot more responsive. i couldn't work it out how it does that, FG made vital for Path tracing. I also saw Black Myth that the 4090 wasn't even enough to have DLSS quality render on that over 60fps with Max Path RT, getting like low 40fps FPS before game gen. Outlaws was bad at launch also with that but it's since been improved
I play primarily single-player games. And because I play with mouse/keyboard I tend to flick the camera around a lot, 60 fps is very sluggish for me. I need at least 90 fps minimum to not be distracted and focus on the game.
So you said you're OK with 1440p DLSS balanced but don't want 4K DLSS performance. Kind of a weird take imo. 1440p balanced would be 1506x847 while 4K performance is 1920x1080.
If you compare them side-by-side there are quite a few more visual artifacts with DLSS Performance at 4K, especially in some lights and shadows. Not in all games, but in Cyberpunk for example there is more light flickering, and in games like Indiana Jones in dense foliage the shadows break up more.
@@Mostly_Positive_Reviews I'll have to do some pixel peeping but that doesn't reflect my experience. The 1440p would always look softer simply from being a lower base resolution. I would take 4K DLSS performance mode over 1440p quality (960p) every time I have compared them. I used to have a 1440p monitor and 4K OLED TV so I used to alternate the two pretty regularly. I have since upgraded to a 4K QD OLED monitor so I don't really mess around with 1440p anymore.
It does look sharper, no arguments there. Just fired up Black Myth Wukong and in the first jungle area 4K DLSS Performance has more noticeable flickering in the leaves and foliage than DLSS Balanced at 1440p. The overall image is sharper though but the flickering is the only reason for me. And as I said, it doesnt happen in every game, and DLSS 3.8.10 does seem to help with that in games like Spider-Man remastered, among others.
I've got RTX 3090 with 9800X3D. 120fps native is always better than 60fps with RTX and DLSS. I have no idea how people play on 4K monitors. I guess DLSS is the only way.
Same for me, I'd rather have 120fps without RT than 60 fps with it. But I'm also fine with using DLSS so I use DLSS Quality in pretty much all games that support it.
Even the 4090 you need to go DLSS Balanced or Performance in Cyberpunk I think to get a more optimal FPS with RT overdrive at 4K. Otherwise on DLSS quality you'll get very heavy dips at base, it's defo playable but slower and not the best experience I found I found DLSS Balanced with FG to be fine most of the time actually at 4K with Cyberpunk RT Overdrive on the 4080 Super But for the most optimal DLSS Performance I think, But I prefer balanced to get a good compromise between Image and bit more FPS Hardware unboxed done a video with RT and think the 4090 had a avg of 45 FPS on Cyberpunk with RT Overdrive 4K DLSS Quality before FG, mad when it costs about 2 grand😆 The 4080 for a grand less not bad tbh and you're losing only about 10-12 fps I worked out in difference to the 4090 on Cyberpunk with those settings i think
Yeah, 4K with path tracing is extremely heavy, even on the 4090 as you mentioned. Hoping the 50 series provides a decent bump in RT performance at least, it's definitely needed as there are a lot of new games releasing now where RT is standard and cant be disabled.
@@Mostly_Positive_Reviews Yeah defo needs to be but since Path tracing is so demanding i'm not sure it's going to be enough of a bump up in performance. 5080 Super is what I might get if vram higher
@@Mostly_Positive_Reviews Yeah I just can't justify the prices for the 90 series for myself. If the performance over the 4080 was a massive gap possibly, but theres a line I don't cross in terms of price for performance. Just not worth it for me in the end The 4080 if priced around £1000 and not what their launch price was is about right for me Great card for that price
@@Mostly_Positive_Reviews The other thing is I'm still running on a 750W PSU, probably the 5090 would be going over that. Can't be bothered to change that plus pay the silly price lol
Yet 20% of people chose it. 100+ might not be the minimum for you, but it is for some people. People who buy high-end GPUs to play at 1440p for example.
@aboutthat413 Probably due to the fact that the 60 option was too low for them. I wouldn't know what to vote here either. I will absolutely play a game even if it is hardcapped at 60fps. But I would not be happy voting for it as my desired minimum like here. I generally put myself in that group, I desire high resolution, high fidelity, high framerate images from games. We might say things about over 100, but it is absolutely not true. Everyone would 100% play a game they otherwise love at less than 100fps. We might complain about the framerate, but in the end nothing is unplayable under 100, and everyone who says that is overreacting, and mostly complaining only because it is not the performance they'd like to see. Or the other possible option, they're playing competetive games. But then their opinion doesn't really matter for this, since this is about high graphics. I play at 1440p, and I would never ever go back to 1080p. But is it really unplayable? No. But I would not choose it in a poll about minimum acceptable resolution. It's easy to choose untruthfully is what I think I'm trying to say with this :'D Edit: To add to this absolute novel of a reply: 100+ fps in every game isn't even achievable, unless you play with a 4090 at 1440p medium settings. Which is a weird combo to me, but I guess possible.
I added that option as I know of plenty people who swear by nothing less than 120 fps. If Twitter allowed more poll options I wouldve included more, but because I am limited to 4 options only I wanted to cover the higher refresh rate gamers too. If you look at the comments on that poll there are quite a few that said 120+
Yo why don't you OC the GPU? If you run it at 3000Mhz clock and +900Mhz on the memory, you could probably get 60FPS in DLSS Performance mode or 50FPS in balanced. Put your fans to 100%, OC the card to the maximum that it is stable and then do 20Mhz down and do the benchmark. It's gonna eat more power and make more noise, but WHO CARES? If you bought a 4080S then your power-bill is the least of your concerns.
Not even a 4090 can do that on these games lol. DLSS has got to a point where quality and balanced are virtually identical to native, so its silly to not use it
I personally am not happy with 60fps... 80 would be the lowest I am accepting. But if I were to buy A 1200 DOLLAR BLODDY RTX MARKETED CARD It better do 4k 120 in most* titles.
Could you make a video like this for the 4070 ti super? Im divided betwen going straight for the 4080 or going with the 4070 ti super (which id prefer since i wouldnt have to change my psu also lol)
I traded the 4070 Ti Super for this 4080 Super, so I dont have it anymore unfortunately. That said, the 4080 Super is about 20% faster in these tests in my own testing. So if the 4080S has 120 fps, the 4070 TiS has about 100 fps. And if the 4080S has 72 fps, it means the 4070 TiS has 60 fps. The difference between the GPUs are really not that big, and to get the same FPS on both the difference is usually just a DLSS Preset. For example, if I got 100 fps with the 4080S using DLSS Quality I can achieve almost 100 fps with the 4070 TiS using DLSS balanced. I still like the 4070 Ti Super. It's fast, efficient, 16GB VRAM, and it really is a very good 1440p performer.
@@Mostly_Positive_Reviews Ty for the answer dude it helped a lot! So it should be fine for 1440p Pathtracing with dlss on balanced (which is just fine by me lol)
That's exactly where I played it on Cyberpunk. 1440p High, DLSS Balanced, PT On and DLSS FG. 100+ fps at all times, and in less intensive areas up to 120 fps, which is perfectly fine in terms of latency when FG is used.
@@Mostly_Positive_Reviews Thats perfect man, you saved me about 400 usd (yeah its pretty expensive here in Brazil...) the 4070 ti super is about 1,1k usd
@@Lucasdelima04 Oof, and I thought it was expensive here! The 4080 Super I have now costs just over $1k USD here, and the 4070 Ti Super was just over $800 USD. So here it is kinda okay to pay 20% more for 20% more performance. But there you pay like 40% more, which makes it less attractive :(
Unfortunately I dont have a UWQHD monitor, and if I set a custom resolution to 3440x1440 my capture card recording looks like the matrix as it seems to officially only support 16:9 resolutions. I have tried recording 21:9 and 32:9 but it just never worked well.
I just restarted Metro Exodus Enhanced Edition last week, and man does it ever look good while also still running very well. The 4070 Super was my favourite GPU I've owned this gen.
Cyberpunk freezes my system. I got rid of all mods because game kept crashing It would run f8ne then then freeze....I seen someone with a 4090 have the same issues. He had to turn off RT because it was glitch his card. Have a 4070 super and path tracing looks horrible. Ray reconstruction looks horrible. I think it needs a 4070ti super at least because mine looks bad. The lighting is great but it looks like wet paint when u move. Like moving in puddles of paint. I had 4k60 with rt on psycho just game would run perfectly for hrs then freeze. Un-installed mods and game.
@@Mostly_Positive_Reviewsi think his point is, 4070 can do 1440p. Ofc 4080s can. I use 4070TiS and yes it can do 1440p with dlssQ with FG and having 100-120fps. All on high settings and cinematic textures. There’s no way 4080s cant. And FG latency isn’t really noticeable if you have 60fps without FG. FG is only bad if the framerates are below 60. Those people hating RT or PT are the ones who doesn’t have these cards. Ofc anything below 4070 would be having rough time running RT/PT I don’t need 4k, 1440p is already great experience having amazing light fx and reflections. Than having 4k without anything new to visuals. even if i get 4080 or 4090, sure they can run 4k but i probably run it at 1440p abusing RT/PT instead with high FPS
It's definitely sharper, no doubt about that, but in some games it also seems to flicker more, especially in more foliage heavy and shadowy areas. The trees and leaves in BMW for example shimmers a lot more at 4K performance than 1440p balanced. Same happens with some lights in Cyberpunk.
It works for me rally well. yes you need upscalingand frame gen but who cares. It still looks great. I have vids on my channel. It's good enough for me. b. But people who cannot afford a 4080 super say it isn't possible lol. wouldn't play with it on though. No point.I do play it for a bit. depending how I feel.
🤣🤣🤣 I actually think it does. Alex from DF does most of his PC analysis at 4K DLSS Performance, and in some of the comparisons I have seen it does look better. PSSR does have some issues in a few games though, so let's hope they can sort it out soon.
I saw a video of indiana jones on a 4080 at native 4k (no dlss,fg) with max PT and settings and it got like 12fps lol I saw another video of indie on a 4090 on native 4k max PT to get max 33 fps and dip as low as 28. And this was the only the starting scene.
In my opinion, Path Tracing is only worth it in Cyberpunk. It completely changes the game and I can't play without it. Ray Tracing does not even come close. I've already completed the game and I am waiting to replay it when I upgrade because I refuse to play it without PT. I get 40-50fps at 1080p XeSS performance which is unplayable a lot of the time especially in combat. There is also a mod called 'Ultra Plus' that improves Path Tracing performance and can give you 10-15fps for free with no visual difference compared to Vanilla PT.
Cyberpunk's path tracing is for sure the one with the biggest visual differences. It's very heavy though, but you're right, that mod helps quite a bit.
LOL 4090 user here. playing 1440p 165HZ and 60FPS on high refresh monitor is like a slide show especially when you are use to higher frames like 100. anything below 100 on high refresh seems like slide show. 30FPS is completely unplayable. 60 feels judder, 80 close to ok 100 sweet spot, 120 ideal. Maybe im spoiled but when i pay 1700pounds for the card thats what i expecting.
Yea, one of the most popular monitors is LG 4K 120hz OLEDs. They look incredible with HDR, if you are playing with a 32inch monitor, you almost always need 4K also
Like, Metro Exodus has its hiccups once in a while in 4k path traced but I should go back to it and see if the newer drivers havent smoothed it out Indiana Jones maxxed out is completely fine I just couldn't get Portal RTX to really run at its full potential, thats such a good looking implementation. Oh yeah, cyber punk path tracing. I want to play phantom liberty but sub 'rt' option I can stand the wierdness of certain scenes reflections. RIP CDPR Witcher 4 is up in the air. To me, Warhammer Darktide with RT is just that game. It runs jank. They need to optimize it more IMO. Star Wars Jedi Survivor would run good with whatever RT they had. The game would crash and use 20+gb of vram every 1-2 hours though. At the end of the day, I knew back when I picked up the 6900xt 16gb of vram in 4k was barely enough. Games like Middle Earth Shadow of War were taking up that. The 6900xt wasnt a bad card, my powercolor red devil wouldn't underclock a ton so it didn't really have much more to offer. I was able to play Gaurdians of teh Galaxy with RT on that card, it was playable. Very enjoyable. Made use of RT highly in that game, it fit the thematic style of Marvel. Ill have to download Control again, but its always kind of been janky with RT on the 6900xt. The game does get shiny. 7900xtx vs shadow of the tomb raider = easy. I was able to beat that game with a little RT with the 2060super, back in the day, have to replay again. The 7900xtx doesn't make the visuals a complete gamechanger, but it runs like butter. "nobody wants to die" is playable, maxxed, but Unreal 5 is using Lumen, probably. Yeah, DLSS is probably a more enjoyable upscaler. FSR 3 gets the job done. The upscalers always looked super jank in 1080p, I jumped to 4k and while the imperfections are still noticeable, its a lot lot less and makes so much better use of it in 4k. I got my hands on Allen Wake the OG game, idk if I want to play Alan Wake 2. Deathloop ran fine. Then, theres games like Predator Hunting Grounds which run jank maxxed out without any capability of RT. They are on Unreal. Bodycam at first would run 60+fps native, with upscalers like FSR 3/AFMF/AFMF 2.0 its 120 for me now. but theres visual artifacts. The xtx was matching the 4090 in that game with lumen. The devs have since been tweaking settings At this point I mostly play like Tarkov, sometimes CoD, but people are finally waking TF up. MW2019 last had RT shadows and it actually helped me get kills. IDK Raytracing is up to the individual. Ghostrunner looks good with it, but theres input delay. You kinda need all the FPS you can get with that game, for speed and reaction times. Cherylnobilite or whatever slaps but is still playable. Its a single player FPS, like Stalker 2, I think I run it 60, but anything above 30 is playable so. Howarts Legacy was like 49-59 fps solid. I have the old i9 7980xe I finally picked up on ebay for like 200 bucks FPS chasing is just up to the individual. 4k slaps all. 1440p/1080p is competitive king. Some monitors have hybrid resolutions. like alienware has one with 1080/4k. Should people pay twice the price of an xtx for a 4090? I look at it like this, 1k is my limit. the 2080ti was around 1k. 5-7 years later, its the standard. So, in 3-5 years xtx/4080 will be the entry standard. The problem is VRAM. That will always screw the pooch with too little. So, bang for buck vram won my brain. They can always turn down RT, or make newer software methods, etc. Like with 7000 series Radeon, we don't have machine learning upscalers yet. etc. So for me, I didn't throw anything away that wasn't going to completely change my world. Cyber punk, Portal RTX, maybe Alan Wake, you would have to be some serious casual fans to justify IMO getting a 2k card to run a 60 dollar unoptimized games. I think Indiana Jones is using Vulkan RT. Runs fine like I said. Doom Dark Ages likely will be using it as well. Vulkan RT, DX12 RT, Lumen can all be programmed to make use of RT accelerating cores in by Intel/AMD/Nvidia. Thats why even Intel is chasing it, besides machine learning.
@@sasquatchcrew That's a long list of games :) Those games are normal RT though, not Path Traced. The only Path Traced games currently are Black Myth Wukong, Cyberpunk, Alan Wake 2, and Indian Jones, where you have to toggle path tracing separately. In Indiana Jones the path tracing option is unfortunately disabled on Radeon GPUs. Then there are games like Portal RTX and Quake 2 RTX that are also fully path traced, but they are more showcases really. The 7900XTX is a real solid GPU, dont get me wrong. If I had enough money I would have both a 4090 and a 7900 XTX to make content with. I got the 4090 I currently own for free, so that helped a lot!
Do you REALLY have to say "frames per second" as many times as you do? I think the vast majority of us know when you mention a number, you're referring to FPS.
Shitty time for gaming …. Lazy expensive games… expensive graphic cards… extremly buggy games … unplayable for first 3-6 months… nonstop shader compilation
Congrats Nvidia, congrats game developers; you have single handedly (or double handedly) ruined gaming with this garbage tech that no one can play on under $1600 graphics cards... This era is already failing. I can see Crysis moments in the future where people are revisiting these horrible games with new graphics cards and seeing if they can actually handle all this garbage...
@@Mostly_Positive_Reviews true, but imagine paying four digits for a gpu and still realizing you live in an era where you cant max out, as in, really max out a recent title like you could back in the day when you paired a new game with a new high end gpu (okay, crysis was one exception but it stood out like an oddball for years while now we have many games like this). so they took that feeling from us.
Quite a few games that dont even have RT are still unplayable on their true max settings today. RDR2 fully maxed out with MSAA at its highest setting struggles even on a 4090. 1440p Max is often below 60 fps, and at 4K it gets mid 20's. GTAV on max MSAA settings is also brutal. Sure, it's fine today on a 4090, but GPUs of the time werent close to being able to max out either of these games. Kingdom Come Deliverance when it launched had an Ultra preset which warned you that it is meant for future GPUs, and at the time the top-end GPUs couldnt get playable framerates even at 1440p with that preset. Games not being able to be maxed out is nothing new.
I'm curious to see what the 5090 brings to the table in terms of ray tracing. But you're probably right, the 6090 might he the GPU we are waiting for to use path tracing properly
the noobs in the comment section are hilarious. you sound like the same old fools crying foul as to why the 8800ultra can't play crysis. these tech will become better over time.
6 years later ray tracing is A$$. Nvida harping about path tracing, when raytracing is overhyped and Terrible, for nearly 2k graphics card. Raytracing awfully implemented in most games.
@@SergioCorona960919 I haven’t looked at path tracing, only ray, and I’m not interested in most these modern games either. Most games these days are pretty trash. Better looking, but worse to play, and more boring. I’m looking at either a 4080 super or 7900 xtx sapphire nitro+ GPU with 7800x3d or above. You assume I’m broke, not everyone is super obsessed with gaming. Some of us are men who win in real life, not only video games.
Spending all that money not getting good 4K native gaming is just crazy to me. We’re just not there yet to really get consistent 4K native gaming for the cost. 1440p is the crème de la crème right now.
Yeah, agreed, but the cost is heavy. $1000 for 1440p gaming is pretty expensive, but it is where we are right now with heavy RT / PT. Hopefully the 50 series sees a good improvement in RT, bigger than the jump from 30 to 40 series. But let's see.
You are missing the point, optimization is so crap these days, developers are getting lazy, putting upscalers in pc requirements to hide their botched work pc hardware is not at fault here, look at all the UE5 games on pc all have traversal stutters and frametime issues its just bad for pcs right now
@@COYG94 This is true too, optimization has gone out the window, especially in UE5 titles. Path Tracing is still very heavy though, but I agree that many developers are forgoing optimization and just adding upscaling and frame gen to get their games playable. I personally can handle a lower framerate, because I can do something about it most times. But shader comp and traversal stutter drive me mad. And then you have games like Monster Hunter Wilds that put frame gen in their system requirements to be able to hit 60 fps... That is inexcusable.
@@Mostly_Positive_Reviews black myth wukong use fsr frame gen to mimic 60fps on ps5 too so yah developers are using cheats now and UE5 was suppose to eliminate UE4 issues what a bummer 👎
@@COYG94you yourself is missing the point! if all UE5 games suffer from traversal stutters then that's not a developer issue. that's an Epic issue, you know, the company that made Unreal. and before you throw the word optimization around, understand what it means. visual fidelity will never be free.
So many privileged children gamers here in the comments that haven’t experienced history. 15+ year ago getting 40 fps in the newest AAA pushing graphical boundaries was perfectly normal. You’ve been privileged over the last decade because we have been using the same old technology that came out in 2008 that has gotten easier and easier to render. You used to HAVE to buy a new card every year or 2 because you didn’t have the right shader model or DirectX version. Here’s the thing, we are entering a completely new era. Ray Tracing is ahead of its time because Jensen is an absolute mad lad. I didn’t think we would get path tracing only a few short years later. If your response is “it doesn’t really look better” nobody cares. Stfu and buy an AMD card if you don’t care and can’t see the difference. I get it the cards or expensive but here’s the reality. Businesses are paying an arm and a leg for these things. I know it sucks but that’s what we get for relying on a fucking island to export 80% of the world’s silicon devices for the last 30 years. Prices are here to stay and the only way to slow the pace is massive infrastructure investments here in America.
RTX 4000 doesn't appeal to me mainly because of ``Frame Generation'' - Fake FPS and also because the games are less fun in much worse conditions than in 2018-2020. How many times they look worse. I will exchange the RTX 3080 for a bargain 4070 Ti Super or wait for the RTX 5070 250-270W. And I consider ultra settings with Ray tracing stupid. On high detail with Ultra textures and object rendering, games look pretty cool and run 30-60% better than on ultra with Ray Tracing.
So the result is that NO, you cant. Because to get there, he always used upscaling, and playing at 720p and 60 fps on 1000 USD card does not cut it for me. Enabling FG to get 60ms latency, which takes u back to latency of 30 fps, does not cut it for me neither. You ppl need to wake up.
I was alerted to my Black Myth Wukong performance possibly being too high. I reviewed my full capture and settings used, and it is correct. However, after an uninstall and re-install of the game the performance is much lower, and the only way I can replicate the performance here in BMW at 1440p is if I use 1080p. I have removed the 1440p results for Black Myth: Wukong so the timestamps are now all over the place. I will fix them in due time.
ray tracing made !4K GAMING! into !1440p rendered at 1080p GAMING! :D
That is...pretty accurate 🤣
Even in 1080p DLSS performance mode you have more Pixel Information than native 2160p thats the magic behind temporal Supersampling aka DLSS!
@@Abenteuerlich that sounds like bs... but it doesnt matter...dlss looks like shit most of the time with trailing and especialy on lower resolutions with perfomance on, the clarity of the picture if severely lowered... so yeah you get more fps but also get a smudgy image with weird pixel trailing.... doesnt seem like a move forward to me... ray traycing still needs some time.
@@dender6816 you are not the first noob who has no idea how DLSS works. DLSS does nothing other than use pixel information from already rendered frames, the AI to evaluate how long the pixels can be used for each frame, even DLSS Performance ( 4xUpscaling ) has 8 times the pixel information of the input res from 1920x1080p.
@Abenteuerlich i mean...i didnt say that i dont know what it does..."noob" i said that it doesnt matter cuz it looks shit anyway and that it needs more time..
It's really hard to show on TH-cam, how different things look with path tracing enabled. Cyberpunk looks crazy in person.
Yeah, unfortunately that's true. Path tracing in Cyberpunk is just on another level.
I'm sorry, but it doesn't, i've tried path tracing at 4k and looks absolutely worse then regular ray tracing, the lightning may be more precise but the loss of quality and mushyness of textures when moving is not worth it
@@tibusorcur what GPU are you using? I think GPU also affect how good RT and PT can be. It's not exactly same across GPUs
@@hongvietnguyen5487 4070 ti super
@@tibusorcur user issue, path tracing looks amazing for me. Are you on amd by any chance, those gpus cant path trace at all, and it doesnt even look right, nvidia intended path tracing to be used with ray reconstruction.
If you can't get AT LEAST 60 fps maxed out with the latest top of the line hardware then the game is not ready for release. I don't care what anyone thinks
Path tracing is optional though, and extremely heavy. Even the 5090 probably wont do 60 fps at 1440p native with path tracing enabled, unless the 5090 sees a 50% improvement in ray tracing / path tracing over a 4090.
Ultra settings are supposed to be overkill and are made for future when there is better hardware
@@GankAlpaca I see the point you are making but it doesn't apply to 99% of games. Look at DOOM Eternal, you can max it out with ray tracing at 240fps on modern hardware and it looks great. Then you look at these UE5 slop games that hardly look better and can't hit 60fps on a $5k PC
@@yeeee859 Doom eternal is 5 years old. We already got the hardware to run it at ultra. Don't get me wrong I also don't like the ue slop
@@GankAlpaca On release it ran like that. It's probably the best example of a truly optimised game
$1200 GPU branded as a Ray Tracing card
Running a game at 720p with the advertised feature
Or doing 1440p30
"4K gaming" by Nvidia
Ray Tracing is gonna end up like Physx, it will have to stop being a marketing tool and actually be integated reasonably without most people even knowing about it
yes phsyx yes
that's honestly what I was thinking, I made the same reference talking to another friend. I'm sure in the next 5+ years we're gonna have a lot of games require RT hardware like Indiana Jones and it'll just be there without even thinking about it.
@@LateNightFire One thing to note, Indiana Jones is using Vulkan raytracing.
Actually after deep diving this Vulkan (open gl) is very close to DX12, they can use the same physical hardware on gpus to do these calculations faster.
Note, I learned last night also that rdna 3 doesn't actually have 'tensor cores' in the same sense as matrix multipliers, as rdna 3 has WMMA, or wave matrix multiply-accumulate.
Its definately a different design
But yeah, I feel you. This is whats getting popular in the tech space and will be bigger and bigger until they hit roadblocks
RTX 3000 was second gen of RT card and it make nice working RT instead of RTX 2000, now RTX 4000 is first gen of path tracing GPUs, so waiting for RTX 5000 to have good PT performance (good means 60+ fps with DLSS Q)
You already have that, it becomes an issue when they shove PT into a UE5 title that's a tech demo
I get that with rtx4060 60+ fps with path ray tracing
@@jacklamat6315 Yeah but at what resolution? 1080p I'm guessing
@@bruvcuzfam 4k 60fps 1440p 80-140fps
@@jacklamat6315 In which game, tetris? If it does get that in any modern game, then it's with some crazy upscaling
For the viewers, in Black Myth Wukong you can get an fps boost using 21:9 aspect ratio to get you over the edge if you need a bit more FPS in a given scenario.
Yeah, that's 100% true. In some places the FPS boost is actually quite decent.
Yeah, 16:9 60fps to 21:9 72fps is a nice 20% boost to overall fps, allowing for a higher dlss preset or higher settings, or just the outright fps boost alone its a neat trick and pretty cinematic as a game with higher fov as a bonus 👍
just don't do that if you´re using any kind of oled screen (unless it's a 21:9 monitor, of course)
@@definitelyfunatparties why not?
@@kerkertrandov459 you'd get the opposite effect of burn in, "reverse" burn in, oleds are organic, which means they degrade with use over time, degradation basically = less brighness, in this case those pixels would be permanently brighter than the rest given enough time. If you do it for long enough for it to actually make a big difference, once you play/look at something that isn't 21/9 you'll see bright bars where the black bars normally are.
We're talking extreme use cases here, of course, but even if you just do it for say, 100 hours.. if you're sensitive enough or you're looking at the kind of image that makes that apparent, you'll start to see it. If you did it exclusively for 1000 hours from new it's basically like the center of your screen has 1000 hours of use and the top and bottom have zero so it'll pretty much be apparent under any circumstance.
people dont get that Ray Tracing/Path Tracing is literal Movie Tech that took Pixar like 20 servers rendering stuff for days on end. Still does. It will never be fit or ready for gaming probably ever without AI frame gen stuff.
Yeah, path tracing is exceptionally heavy, and even ray tracing. The correct term is real-time ray tracing, and that should give people an idea, but many still dont realize just how computationally expensive it is to calculate these light rays and bounces on the fly.
For ray tracing upscaling is required in most games, especially games that make use of per-pixel ray tracing. And then path tracing needs frame generation as the amount of light bounces increase the cost exponentially.
The answer is no, it is not! and I strongly advise not buying the 5080 16GB. Here's what I found: first, my setup is 7800X3D, TUF 4080 Super, 32GB DDR5 6000Mhz, CL30. So in Indiana Jones the great circle. I'm getting 130 FPS at 4k DLSS performance, frame gen on, ray tracing high (not ultra). As soon as I switch to DLSS balanced, I ran out of 16GB VRAM and get 3 FPS... So even before the release of 5080, 16GB is not enough to run DLSS balanced with high ray tracing! This is completely unacceptable, imagine spending thousands and get 3 FPS on DLSS balanced with ray tracing... what a joke
there are rumors about a version of the 5080 with 24GB. If not the only other option is to go all-in and buy the 5090 with 32GB vram
Every generation of Nvidia card in the RTX series has nearly doubled RT performance and 30%-50% raster and I doubt it’s gonna slow down in the next gen either.
Wow that's rough. The 3090 got 24 gigs almost 5 years ago! That's really coming in handy now.
User error, need to lower the texture pool and it'll work fine
@@Kinslayu Yes, you're right, I got in a habit of using max textures, but still, my point was a 2024 title is capable of surpassing 16GB already.
Normal Ray-Tracing: The RTX 4000 series absolutely is just a powerhouse.
Path-Tracing(REAL RT): The RTX 4000 Series gets brought to its KNEES!
So y'all remember when the RTX 2000 Series came out and could barely do Ray-Tracing when it first came out and you could maybe only get away with turning on ONE RT Setting and still get 60FPS. I think the RTX 4000 Series cards are at the same level the RTX 2000 Series cards were at when it comes to PATH-TRACING, you can maybe turn on ONE PT Setting but turn them all on and your GPU will be brought down to it's KNEES. So we might not really see RTX Cards that can actually crush Path-Tracing and get a crap-ton of performance until the RTX 6000 Series Cards come out, the same way we didn't get GPU's that can do Ray-Tracing at 120 FPS until the RTX 4000 Series cards came out.
Yea I agree, my 3070 was horrible at raytracing so I don’t expect the 50 series to be much better than the 40 series for path tracing
Are they really, though? I played AW2 at 1440p full path tracing, high-ultra optimised with DLSS Q, and it ran perfectly fine with 90-100fps. 120-140 if I enabled FG, too
@@fredstead5652 1440p is barely better than 1080p, 4k is where it makes a huge difference (163 vs 109 ppi, a 50% increase over 1440p)
@kerkertrandov459 Honestly, I've never tried 4k on a smaller screen, but 1440p feels more than clear enough for me right now, and I can definitely tell the difference between 60fps and 120-150. I might invest in a 4k oled in a few years when high refresh 4k is a real thing and the monitors are more affordable
@@fredstead5652 with a 5000 gpu u can quadruple 4x ur fps thanks to 3 fake frames per 1 real frame, so u only need 60 base fps at 4k to cap out ur 240 fps oled 4k monitor, and dlss can help u with that, the technology is here, unfortunately 4x frame gen it will only work in 5 or so games but hopefully devs add it to more games
I wonder how a console will look like for it to run something like cyberpunk at ultra ray tracing with path tracing. And what the cost of a console like that would look like too.
Paying $1000 for 30FPS is crazy. Then you add DLSS-FG blur and slop? The coping is hard. Nvidia fans are a legit plague to the PC Gaming community.
Why does a video make you so angry?
I understand why people would use a 4K display for video editing, color grading etc but for gaming it’s just not worth it ! Adding path tracing on top of that isn’t feasible without Frame generation ! I would recommend switching to 1440p DLSS Q + path tracing or 4k DLSS Balanced Ray-traced/rasterized !
why no?
The graphical fidelity is very very impressive. I've been waiting for 50 series for a while now, and I was hoping to grab a 5090 for 1440p high-refresh rate gaming. I play a lot of different titles, from competitive shooters at 360Hz to single-player titles on a controller, but I mostly tend to enjoy M+K gameplay and 100+fps I feel is the minimum for a decent experience on mouse input. I feel like the 4090 does not deliver that on all games, sadly. That said, I bit the bullet and got the 4090. I should've gotten it sooner, as the gen is ending, but such is life in a 3rd world country. My country's currency might devalue considerably in the coming months.
I can't expect 5090 pricing to be anywhere near decent on launch or for at least 3-5 months from its launch, even if US MSRP is acceptable. And when it drops in price, well, might as well bite the bullet once again, as long as it delivers a transformative experience and I can sell the 4090 for close to what I paid for (which surprisingly is lower than US MSRP converted, although not by much).
I'm eagerly awaiting its arrival, as I've kept myself from playing a few games that I was anticipating to have a "full" experience. CP2077 and AW2 included, but more recently STALKER 2.
I have a 3090ti, i just started playing Cyberpunk and it can run 1440p with path tracing on at highest settings with DLSS. Pretty surprised at how they managed to get playable PT in this open world game!
Yeah, it's crazy. It started out as a tech demo and now we have fully playable path tracing if you use a bit of DLSS. And the best part is Cyberpunk has very, very few stutters. Sure, it took them a little while to get here, but it is a very fluid open-world experience now.
Who in his right mind can say that 30 / 40 fps is enough?
I stepped up recently from 60 hz 1080p to 165hz 1440p and the difference is outstanding. 60fps is minimum.
movies are at 24fps is just a habit. steady 30fps are perfectly fine, in game cinematics it is locked at 30fps and people don't even realize it, I bet you didn't. Of course, more is smoother and, but it is not a necessity, it's only a habit, it is a necessity only for PVP and VR. I prefer photorealistic/eye candy 30fps to medium 60+fps
I have an RTX 4080 laptop and I'm happy to compromise playing at 1080p to get smooth 60fps with path tracing. When I want a smooth 4K 60+ fps, I switch to regular RT or turn it off entirely for DLAA or DLSSQ and enjoy that sharp clear image. It's great either way and I'm happy
The main goal is to enjoy it, whether you play at 1080p or 4K or path tracing. As long as you enjoy it, it doesnt matter what anyone else says.
Great video, I feel like you read my mind with the things you test sometimes. I've had the opportunity to use a 4070ti 12gb again recently and I the only path tracing game I tried so far is Portal RTX since that is 'free' on steam. To play that at 4k I also had to go to DLSS Ultra performance + FG. With that game Ultra Performance DLSS isn't the worst looking because of the art style and basic graphics, but for more detailed games like in this video it probably wouldn't look the best, even though DLSS UP has improved in recent DLSS versions. The Ultra Performance also comes in handy in terms of controlling the VRAM and ray reconstruction is also really good because it seems like the exact case it was designed for. Ultra Performance DLSS also helps you generate more frames with DLSS FG because of the lower internal res, so everything kind of comes together.
Still it's sad that only the 4090 can pretty much do path tracing at a somewhat acceptable quality at 4k, and based on the rumors for next gen, we will only get maybe 2 more GPU's on the level of the 4090 or higher. Maybe there will be more architecture improvements that let something like a 5070 do path tracing much easier than current GPU's.
Thanks for always watching and commenting, I appreciate it 🙏
I believe we might see some decent changes in RT performance with the 50 series. Seeing that there are so many games coming out with RT, and some you cant disable it, I think a big boost to RT performance is required.
I am very curious about the 5070, but if it only has 12GB VRAM I'll probably end up getting a 5070 Ti for the channel.
We’re not even sure if 5070 could beat the current 4070 Ti Super with 16gb of Vram with a die of 4080
@@lsnexus2080 Yeah, the 5070 leaked specs look kinda bleak. I think it will probably fall about 4070 Ti performance in raster and maybe 4080 performance in RT, if they improve the RT performance quite a bit. Ideally the 5070 should match the 4080 at least, in raster and RT, but based on the leaked specs that's not going to happen.
Great video! Really detailed and thorough. A few notes would be that some combat etc should probably have been tried with these settings, as usually frames drop there, and that the capture quality seems just a tad on the blurry / low bitrate side. Or maybe it's just my eyes idk. Thanks anyway!
Any help on the capture side would be appreciated! I use an Elgato 4K60 Pro MK2 Capture card with OBS, and I record at 75mbps 4K CBR using NVENC. I tried H.264 as well but the quality is similar. When exporting in Adobe I also use 75mbps CBR at 4K, but other people's videos do look better than mine and I honestly dont know what I'm doing wrong?
Also, appreciate the feedback. Constructive criticism is always welcomed, as long as people arent yelling and being condescending, which you werent, so thanks for that!
@@Mostly_Positive_Reviews You need 100mbps bitrate for 4k recordings at 60FPS. If your output refresh rate is above 60hz I'd recommend 125-150mbps for bitrate - huge file sizes but end result is way less broken up on YT
@@SolidBoss7 Thanks, I'll give it a shot. A friend of mine uploads 5K videos at 75mbps and his videos look a lot sharper than mine, with a lot less breakup. The only difference is he records locally using ShadowPlay and I use an external PC with a capture card.
I think I am missing some setting in OBS as when I use the Elgato Capture Utility at the same bitrate it looks a lot sharper as well. But unfortunately that utility has a lot less features so it's not really viable for me to only use that.
I will try upping the bitrate and upload a test video, see how it goes, thanks!
At 4k you need to use DLSS performance plus FG on top of that to have play with smooth fps with PT. On my RTX4080S in black myth wukong I get 86fps (built in benchmark) at 4K DLSS performance + FG, very high settings and full RT. The game is perfectly smooth and responsive and with the latest 3.7.2 DLL the image quality looks like 4K native to my eyes. At 1440p DLSS quality is enough to run these PT games at over 60fps.
In the cyberpunk at 1440p DLSS quality with PT I get around 70 fps, and 110-120fps with FG on top of that. With psycho RT (standard RT) I get 55-70fps at 1440p native, 110-120fps DLSS quality, and 150-170fps.
IMO the RTX4080 is enough to play PT games as long you use DLSS features.
Yeah, I'm also not opposed to using DLSS features to get a decent playable experience. In fact, I use DLSS Quality and Frame Generation in basically every game that supports it. I'm able to max out my monitor's refresh rate of 165hz that way, and in games where I'm able to get more than that I cap it and it then provides a decent reduction in power draw.
@@Mostly_Positive_Reviews IMO on RTX series cards it makes no sense to play at native at all. DLSSQ is enough to match TAA native image quality, and sometimes DLSS can even beat the native image. What I however like to do is to combine DLSS balance (or even performance) and DLDSR to get even better image quality (nothing beats downsampling) without performance penalty. In Cyberpunk at 4K DLSS balance + DLDSR 2.25x with ultra RT I get 10fps more compared to native 1440p and the DLSS image quality looks way better (4K like).
@@PabloB888 Yeah, there are very clever tricks that can be used with DLDSR and DLSS combined. I did find that Ratchet and Clank at 4K DLDSR downscaled to 1440p had a huge performance penalty for me on the 4070 Ti, but I am convinced it was a VRAM issue. I tested Ratchet and Clank recently on the 4070 Super and even at 1440p Very High with RT it runs out of VRAM, even before using frame generation.
Using FG with 4k native base fps for these is too laggy which makes 1440p the sweet spot even fit the 4080.
@cosmicheretic8129 Nvidia recommends at least 40fps to use DLSS FG, and based on my tests, that's the necessary minimum for this technology to work properly. If the game is running below 40 fps, the image starts to stutter like hell, but once you get into the 40 fps range, the image is smooth as butter. On my RTX4080S OC I rarely see 40fps dips though, even in Alan Wake 2 PT. I usually get 60-80fps without FG, but sometimes in the deep forest performance can dip below 60fps and that's when DLSS FG is doing it's magic. Thanks to DLSS FG, I no longer have to worry about sub 60fps dips because I no longer notice them and I always get smooth gameplay.
Here is why ReSTIR DI/GI is so expensive TODAY. The base cost of these techniques are hight because you start tracing rays per pixel. If you play in 4K DLLS Performance mode that mean you trace for each pixel in 1920x1080 grid. Now for example CP2077 shoots 6 ray per pixel up to 2 bounces for each fram of that grid. Now you do the math on it how many calculations that is...
The great advantage of this is that although today it is heavy even on the best GPUs this cost is much more scalable then any other technique meaning, that ReSTIR DI/GI traces against everything withing BVH and camera view. If RTX 50series will again double the speed in which rays can be calculated that means you will in theory get double the performance. So RT is actually smart way of how to light the scenes and I wouldn't blame developers wanting to leverage that.
With that being said it is important to understand that not everygame necessarily needs RT lighting or reflections. From what I have seen so far the biggest impacts are when the game uses a lot of dynamic lights, changing time of day or dense forests. Other than that you can get away with pre-baked lighting and game will look great in the still shots but as soon as you introduce any dynamic objects in the scene (torches, lights, particles, muzzle flashes...) it will start to show it's videogamy look.
My point is that the RT feature have this sort of flat cost which means that future GPUs will not neccesarily need to be more powerful to get the same results and focus might shift from lighting to simulating physics or A.I. for example.
I agree, PT preset in CP2077 does not worth it, since it introduces too big drop in GPU performance. But it is possible to get so-called "optimized" PT settings via PT "Ultra Plus mod". There are quite a few settings here. I personally was using PT20 preset with "Fast" quality. I also enabled custom DLSS scaling at 60% and preset "E". This way I was getting locked 60 fps at 1440p/PT preset/RR (no FG) on my 4070 Ti. The image quality was pretty decent on the 1440p screen.
Average performance with this mod was roughly on par with base "RT Ultra" preset, but thanks to enabled PT lighting the game looks quite a bit better in some scenes (especially indoor ones).
4080 can run CP2077 with this mod at either higher internal res (67% should be double) or higher quality settings (adjusted through the mod menu).
So, I'd say PT is definitely playable on 4080 S at 1440p. You just need to tune hidden graphics settings a little bit (via the mod).
The mod does wonders for sure. I tested it back in the day on my 4070, havent tested it on the 4080 yet. Will do so soon, because if the performance is similar to RT Ultra it's basically where I'll play the game.
How do set it completely? I been using last few days can’t lie very confused on how to set the mode i want permanently
I exploded with laughter when I saw this $1000 gpu give 17 fps lol.
i have a 4080 laptop which is equivalent to a 4070 desktop and it handles cybeprunk so well! full RT gets me around 60-90fps.
For someone that needs 80-90fps minimum and who won't use frame gen or upscale from lower than 1440p, RT is not viable.
Interesting that Wukong is actually the least demanding of them all at 1440p, yet at 4K they all need DLSS Ultra Performance to break bast 60 fps.
Yeah, I think it has to do with the BVH quality and the resolution of rays, and how much that increases at 4K.
You need FG for PT games at 4K. On my RTX4080S I play Black Myth at 4K DLSS performance + FG, very high settings and max PT I get 86fps in the built in benchmark and 80-100fps during gameplay. With FG input lag is even lower (48ms vs 60ms) because the game is using nvidia reflex and I cant see any artifacts, so there's no point playing without FG. With the latest 3.7.2 DLL and mod that removes excessive sharpening this game has even DLSS performance looks like 4K native to my eyes. With these settings this game looks and runs amazing, so I cannot complain.
my preference is 4k on DLSS performance, and no frame gen. It's normally 45-55fps, which I can live with. but it just shows that the only cards really capable of path tracing at 4k 60fps+ with good rendering are 4090/5080/5090, which is sad but the math requirements are insane so understandable. The exception is Indiana Jones, which runs at 4k, Dlss quality, path tracing at 55-80fps, which is sweet! And honestly it's the best game of the lot.
i see that u have a little of screen tearing have you tried the recommended settings for nvidia gpus? on control panel u should activate gsync on full and windowed screens , vsync on and low latency on ultra those settings help a ton on the latency while using framegeneration.
The capture card is only 60 hz, hence the reason for the screen tearing. If I enable v-sync all benchmarks would be stuck at 60 fps, but I do use g-sync + v-sync on my monitor when not recording 👍
I can play cyberpunk with 70+ fps at 2560*1600 resolution on my Rtx4080 laptop with DLSS quality and frame gen. Everything else is maxed out except for motion blur and crowd density.
Laptop 4080s are quite different from desktop ones, so that's good to hear!
yes when vram isnt a question , 50/58/67/77 for render resolutions you can apply dlaa on top of dlss in most games as well if you were wondering. Also frame generation is designed to be turned on at 60fps thats not an opinion its just a fact its not gonna run correctly if its below 60fps or you turn vsync on gsync also does weird shit with it to.
regular ray tracing i would say 100+ fps, but if we're talking full on path tracing 60fps is understandable given how demanding it is
I'm a PC noob but I wonder if someone could recommend if I should get a 4090 or wait for the 5080 for gaming?
Seeing that it's about 2 weeks before announcement I'd say it's better to wait. What we have seen so far from the leaked specs the 5080 wont outright beat the 4090, at least in rasterization, but we'll have to see what new features are released and also what RT performance looks like. It's possible the 5080 comes close to 4090 in raster, and slightly beat it in RT, but then also cost much less than the 4090 currently does.
wait for 9990 ti ultra ray tracer super
movies are at 24fps is just a habit. steady 30fps are perfectly fine, in game cinematics it is locked at 30fps and people don't even realize it, I bet you didn't. Of course, more is smoother, but it is not a necessity, it's only a habit, it is a necessity only for PVP and VR. I prefer photorealistic/eye candy 30fps to medium graphics 60+fps
The answer is no uppsampling is ugly, and we arent even talking 4k here for a card that cost 1350 eur that is bad.
1000$ gets you decent 1440p GPU these days.. PT is just in few games and it needs rtx 5090 to run well at native 1440p, this 4080 is like 720p PT gpu.
I'm so confused. Can someone explain why with 4k DLSS Performance in CP77 he had 50fps or so on average, but with Ultra Performance he gained 20-30 frames? I thought UP renders it at 720p, so how could it give that many frames more than Performance which is 1080p render? I was expecting 10 frames at the most. That's usually how many you gain from each tier of DLSS
DLSS Ultra Performance is less than half the amount of pixels at 1080p. ~921000 vs ~2100000. DLSS Ultra Performance is the biggest reduction in resolution compared to other DLSS Presets.
Quality = 67%
Balanced = 58%
Performance = 50%
Ultra Performance = 33%
@@Mostly_Positive_Reviews Oh, that makes sense now, thanks!
You're welcome! 👍
Cool. I'm looking forward to getting a 5070 or a used 4080, whenever the 50 series comnes out.
I'm very curious about the 5070 myself. If rumours are true and it only has 12GB VRAM I'll probably get a 5070 Ti for the channel.
@@Mostly_Positive_Reviews A 5070 with 12gb would be very disapointing. A few years ago Nvidia released a paper on a new algorithm for vram compression, but i don't know if it was or will be implimented. Having an idea work on paper and actually using it are two different things. Also the PS5 Pro has an addition 2gb of ram, so maybe that will push Nvidia to up the vram in their cards.
One can hope for sure. I think it'll be a mistake if they go 12GB again, but we still have 8GB 4060 Ti's that released just recently so it wouldnt surprise me.
@@Mostly_Positive_Reviews Yeah, It would suck.
@@walter274 DLSS4 being NTC makes a lot of sense. i just hope it wont require per game implementation and i hope it will be available on all RTX cards
Compare image at 17:18 to image 18:17 and tell me this is a difference which will make your gaming experience any better? Honestly? Than compare the FPS... Its all just a joke, placebo, and a way to pull 1000s of USD from your pocket. And thats in one of the THREE games, where RTX actually makes some visual difference.
Have you switch to MSI afterburner from CapFrameX?
I alternate between them. I have some issues with CFX in some games since I updated to 24H2.
Hab ne 4080RTX OC und spiele auf nem 34"Uwqhd, IPS, 120hz, Curved, G-sync Pannel, meistens Dlss Qualität, weil mit dlaa in Uwqhd die 4080RTX zu schwach ist, wenn Pathtracing oder Raytracing aktiviert ist!
Yeah, DLSS Quality is basically a must when RT is involved.
Hey man! I love your videos.
I have a question. The BMW frames seem very, very high! I just cross-checked with multiple other videos on YT and even a 4090 doesn't average around 140-150fps at 1440p RT Very High (DLSS Q + FG). Every other 4080 video shows it averaging around 80-100fps. Could you re-check what's going on? Did the RT settings enable fully or is something else going on? Every other game seems inline with 4080's performance. BMW is the only outlier here.
Hey man, thanks for reaching out. Sure, I can check again, never hurts to double check. I need to re-install BMW again so will provide feedback in the morning, that okay?
@@Mostly_Positive_Reviews Sure man! Thank you so much. And I didn't mean to question your testing, was just curious haha.
Love your videos and thank you for the quick response!
@@kraithaywire Not a problem at all. I have a slight suspicion as to what might be causing the differences but I want to double check to make sure. I just went through my full unedited recording to verify the settings and they are correct, and the game was restarted 4 times during my testing.
Here's what I think happens, and I'll verify, but I use a 4K capture card for my benchmarks, so the PC detects it as a 4K monitor. Now, if I leave it on 4K Windows Desktop resolution and set the resolution in BMW to 1440p the framerate is quite a lot lower. I read up on it and it seems as if BMW tries to internally upscale to your desktop resolution. So for that reason I set my desktop resolution to 1440p when I test 1440p in that game. It's possible that others tested with the desktop resolution still at 4K. This is just a theory, the game is downloading again now so I can double check because I am curious now as well haha!
@@kraithaywire Okay, so I just retested it and you are right, the 1440p results are too high. The only way I can replicate these framerates is if I set it to 1080p resolution. But in the video you can see the resolution is definitely set to 1440p and DLSS slider is set to 100% resolution scale. The 4K results are the same though, so I have no idea how this happened. Thank you for bringing it to my attention, I have left a pinned comment and I am busy updating my timestamps.
Edit: I managed to replicate the issue. If I set the resolution to 1080p and apply, and change it back to 1440p and apply, it doesnt seem to apply correctly. It shows 1440p in all the settings but the performance is identical to 1080p.
RTX 4080 Super costs $1500 in the Philippines
That sucks 😔 I got mine for $1100 but our prices are always higher than the US.
Great breakdown kind sir. Currently playing through Alan Wake 2 with my 4080s/7600x build and manage 60fps average with ultra settings and full pt. mind you my cpu and gpu are very well overclocked, I've squeezed everything i could out of both, but I have a quick question. How are you getting the overlay on the right that shows PCL? I dont see that setting in msi afterburner so i was justr curious. Thanks in advance!
Thanks for watching and commenting! The overlay on the right is the Nvidia App, not MSI Afterburner. It's not as granular as MSI Afterburner, but it does have some additional options that MSI AB does not have.
@@Mostly_Positive_Reviews Cheers for the quick reply. I actually don't use GFE, instead opting to just download drivers directly but ill be downloading for that feature, would be a good stat to be aware of.
Thanks again!
@@RepublicofODLUM You're welcome. This isnt GeForce Experience though as I had issues with it showing average PC latency lately. The Nvidia App is supposed to combine the feature sets of both GeForce Experience and the Nvidia Control Panel. It's a separate download, you can just search for it on Google.
Sir i have same cpu as yours with 4070 super & Corsair 750w 80+ Gold PSU...in future am willing to upgrade to any 50 series card similar to 4080 in terms of power so can i upgrade or power supply will be an issue ?
My whole system with 2 monitors use about 600w, so probably just below 500W in total for the CPU and GPU + Fans and whatever under heavy load, so 750W will be fine. We'll have to wait and see what Blackwell's actual power draw is to be 100% sure, but I think it will be slightly lower than Ada, so it should be fine.
@Mostly_Positive_Reviews no 50 series wont be low power hungry as per leaks but with considerable performance boost such as 5080 will be 400w with 10% more performance than 4090 if rumors are true✌️
for most cards, even the next gen cards for games of late 2025..heavy unofficial optimisation mods will be calling.. -and more vram-
What's wrong with 39 FPS? It's definitely a playable frame rate, even if it's not fully optimal. For example, the PS5's Quality mode is locked at 30 FPS, yet the game is still perfectly playable, although not as smooth as 60 FPS or higher.
Nothing wrong with it, I know plenty people who are fine with playing at 30 fps. I'm just not one of them. I prefer a much higher framerate of about 120 fps, but it's all personal preference.
4k gaming is the Native resolution without DLSS gameplay. Where you getting at 60fps all the time.
It is not a 4k gaming
I play at 4k on a 4090 but I think if you have a 4080S at 1440p you have the best value for money high end experience as the leap to 4k and a 4090 is a big one cost wise. In some ways I kinda wish I had stayed on 1440p just because keeping up with 4k is always going to not be great bank for buck value
One of the reasons I am hesitant to go to 4K myself. Firstly I would need a decent hugh refresh rate 4K monitor, and over here they are pretty expensive. And then you need something to constantly drive it at that resolution. I actually wish I had forever stayed at 1080p60 🤣
@@Mostly_Positive_Reviews Lol yeah for sure. If you don't need to, don't. I think the leap from 1080p to 1440p is very noticeable but from 2k to 4k... not so much
@@MaTtRoSiTy I have a semi-decent 4K monitor here that I use for my benchmarks, but it is only 60hz, as I dont need more than that to show performance at 4K. But that 60hz is what is preventing me from using it as a daily, and then, as I said, maintaining a high framerate at 4K without too many sacrifices becomes very expensive, so I'll stick to 1440p now. Might go to 3440x1440p UW if I can get a good price on one of those monitors, but I am happy for now.
On Black Myth I think I had it on 55 Render with FG Cinematic, Very High PT
Felt pretty good on those settings and looked great
Feels like FG is a must for Path Tracing games
🤣
Yeah, FG launched pretty much with Cyberpunk's path tracing mode, so that was the idea behind it I believe. And 2 years later we have games using FG as recommended to hit 60 fps hahaha!
@@Mostly_Positive_Reviews 🤣Yeah it does make you laugh.Weirdly it seems to change everything on Cyberpunk because at base fps without FG with Path on. It feels really unplayable and slow, but once you turn on FG it feels a lot more responsive. i couldn't work it out how it does that, FG made vital for Path tracing. I also saw Black Myth that the 4090 wasn't even enough to have DLSS quality render on that over 60fps with Max Path RT, getting like low 40fps FPS before game gen. Outlaws was bad at launch also with that but it's since been improved
I play primarily single-player games. And because I play with mouse/keyboard I tend to flick the camera around a lot, 60 fps is very sluggish for me. I need at least 90 fps minimum to not be distracted and focus on the game.
exactly
Absolutely. No exceptions.
Turn on frame generation, at 60FPS baseline it'll get you there and feel buttery smooth.
@@anitaremenarova6662 AI smearing will make everything better.
@@Moonmonkian depends on the implementation, Nvidia's FG doesn't often have issues.
So you said you're OK with 1440p DLSS balanced but don't want 4K DLSS performance. Kind of a weird take imo. 1440p balanced would be 1506x847 while 4K performance is 1920x1080.
If you compare them side-by-side there are quite a few more visual artifacts with DLSS Performance at 4K, especially in some lights and shadows. Not in all games, but in Cyberpunk for example there is more light flickering, and in games like Indiana Jones in dense foliage the shadows break up more.
@@Mostly_Positive_Reviews I'll have to do some pixel peeping but that doesn't reflect my experience. The 1440p would always look softer simply from being a lower base resolution. I would take 4K DLSS performance mode over 1440p quality (960p) every time I have compared them.
I used to have a 1440p monitor and 4K OLED TV so I used to alternate the two pretty regularly. I have since upgraded to a 4K QD OLED monitor so I don't really mess around with 1440p anymore.
It does look sharper, no arguments there. Just fired up Black Myth Wukong and in the first jungle area 4K DLSS Performance has more noticeable flickering in the leaves and foliage than DLSS Balanced at 1440p. The overall image is sharper though but the flickering is the only reason for me.
And as I said, it doesnt happen in every game, and DLSS 3.8.10 does seem to help with that in games like Spider-Man remastered, among others.
I've got RTX 3090 with 9800X3D. 120fps native is always better than 60fps with RTX and DLSS. I have no idea how people play on 4K monitors. I guess DLSS is the only way.
Same for me, I'd rather have 120fps without RT than 60 fps with it. But I'm also fine with using DLSS so I use DLSS Quality in pretty much all games that support it.
Path tracing is useless until high end gpus can run it native 4k60fps
Even the 4090 you need to go DLSS Balanced or Performance in Cyberpunk I think to get a more optimal FPS with RT overdrive at 4K.
Otherwise on DLSS quality you'll get very heavy dips at base, it's defo playable but slower and not the best experience I found
I found DLSS Balanced with FG to be fine most of the time actually at 4K with Cyberpunk RT Overdrive on the 4080 Super
But for the most optimal DLSS Performance I think, But I prefer balanced to get a good compromise between Image and bit more FPS
Hardware unboxed done a video with RT and think the 4090 had a avg of 45 FPS on Cyberpunk with RT Overdrive 4K DLSS Quality before FG, mad when it costs about 2 grand😆
The 4080 for a grand less not bad tbh and you're losing only about 10-12 fps I worked out in difference to the 4090 on Cyberpunk with those settings i think
Yeah, 4K with path tracing is extremely heavy, even on the 4090 as you mentioned. Hoping the 50 series provides a decent bump in RT performance at least, it's definitely needed as there are a lot of new games releasing now where RT is standard and cant be disabled.
@@Mostly_Positive_Reviews Yeah defo needs to be but since Path tracing is so demanding i'm not sure it's going to be enough of a bump up in performance. 5080 Super is what I might get if vram higher
@@RJTHEGAME Yeah, that would probably be the go-to upgrade for current 4080 owners. I reckon the 5090 is going to be astronomically priced again.
@@Mostly_Positive_Reviews Yeah I just can't justify the prices for the 90 series for myself. If the performance over the 4080 was a massive gap possibly, but theres a line I don't cross in terms of price for performance. Just not worth it for me in the end
The 4080 if priced around £1000 and not what their launch price was is about right for me
Great card for that price
@@Mostly_Positive_Reviews The other thing is I'm still running on a 750W PSU, probably the 5090 would be going over that. Can't be bothered to change that plus pay the silly price lol
i saw rtx4070 1440 gaming before, with framegen its doing well
Nearly all RT is too expensive. The returns are negligible even if you can stomach the artifacts from DLSS.
Jump from 60fps to 100fps on the poll is quote extreme.
60 is fine, but above 75 is desireable. 100 is plenty already, definitely not a mimimum.
Yet 20% of people chose it.
100+ might not be the minimum for you, but it is for some people. People who buy high-end GPUs to play at 1440p for example.
@aboutthat413 Probably due to the fact that the 60 option was too low for them. I wouldn't know what to vote here either. I will absolutely play a game even if it is hardcapped at 60fps. But I would not be happy voting for it as my desired minimum like here.
I generally put myself in that group, I desire high resolution, high fidelity, high framerate images from games. We might say things about over 100, but it is absolutely not true. Everyone would 100% play a game they otherwise love at less than 100fps. We might complain about the framerate, but in the end nothing is unplayable under 100, and everyone who says that is overreacting, and mostly complaining only because it is not the performance they'd like to see.
Or the other possible option, they're playing competetive games. But then their opinion doesn't really matter for this, since this is about high graphics.
I play at 1440p, and I would never ever go back to 1080p. But is it really unplayable? No. But I would not choose it in a poll about minimum acceptable resolution. It's easy to choose untruthfully is what I think I'm trying to say with this :'D
Edit: To add to this absolute novel of a reply: 100+ fps in every game isn't even achievable, unless you play with a 4090 at 1440p medium settings. Which is a weird combo to me, but I guess possible.
I added that option as I know of plenty people who swear by nothing less than 120 fps. If Twitter allowed more poll options I wouldve included more, but because I am limited to 4 options only I wanted to cover the higher refresh rate gamers too. If you look at the comments on that poll there are quite a few that said 120+
@@Mostly_Positive_Reviews That's always annoying when a poll doesn't let you input enough options, I know the feeling.
@@ZenTunE- Yep, ideally I wouldve liked to have 6 options with increments of 15 or 20.
Yo why don't you OC the GPU? If you run it at 3000Mhz clock and +900Mhz on the memory, you could probably get 60FPS in DLSS Performance mode or 50FPS in balanced.
Put your fans to 100%, OC the card to the maximum that it is stable and then do 20Mhz down and do the benchmark.
It's gonna eat more power and make more noise, but WHO CARES? If you bought a 4080S then your power-bill is the least of your concerns.
Well half of the DLSS and FPS magic is FrameGen ;)
I know many people dont like these technologies, but I really do. I mostly play with DLSS Quality and Frame Gen enabled.
U cant benchmark with a cpu as shitty as that, need at least a 7800x3d
I've been playing Cyberpunk with PT just fine with a 4070ti super. As a 4K card the 4080 is weak though
Nah it's gotta be 4k ultra with ray tracing. No dlss or frame gen. So the answer is NO.
Not even a 4090 can do that on these games lol. DLSS has got to a point where quality and balanced are virtually identical to native, so its silly to not use it
dlss is just upscaling and if you don't see difference that only means your monitor is to small and your resolution is to high
I personally am not happy with 60fps... 80 would be the lowest I am accepting. But if I were to buy A 1200 DOLLAR BLODDY RTX MARKETED CARD It better do 4k 120 in most* titles.
Could you make a video like this for the 4070 ti super? Im divided betwen going straight for the 4080 or going with the 4070 ti super (which id prefer since i wouldnt have to change my psu also lol)
I traded the 4070 Ti Super for this 4080 Super, so I dont have it anymore unfortunately. That said, the 4080 Super is about 20% faster in these tests in my own testing. So if the 4080S has 120 fps, the 4070 TiS has about 100 fps. And if the 4080S has 72 fps, it means the 4070 TiS has 60 fps. The difference between the GPUs are really not that big, and to get the same FPS on both the difference is usually just a DLSS Preset. For example, if I got 100 fps with the 4080S using DLSS Quality I can achieve almost 100 fps with the 4070 TiS using DLSS balanced.
I still like the 4070 Ti Super. It's fast, efficient, 16GB VRAM, and it really is a very good 1440p performer.
@@Mostly_Positive_Reviews Ty for the answer dude it helped a lot!
So it should be fine for 1440p Pathtracing with dlss on balanced (which is just fine by me lol)
That's exactly where I played it on Cyberpunk. 1440p High, DLSS Balanced, PT On and DLSS FG. 100+ fps at all times, and in less intensive areas up to 120 fps, which is perfectly fine in terms of latency when FG is used.
@@Mostly_Positive_Reviews Thats perfect man, you saved me about 400 usd (yeah its pretty expensive here in Brazil...) the 4070 ti super is about 1,1k usd
@@Lucasdelima04 Oof, and I thought it was expensive here! The 4080 Super I have now costs just over $1k USD here, and the 4070 Ti Super was just over $800 USD. So here it is kinda okay to pay 20% more for 20% more performance. But there you pay like 40% more, which makes it less attractive :(
Why didn't test UWQHD :C ?
Unfortunately I dont have a UWQHD monitor, and if I set a custom resolution to 3440x1440 my capture card recording looks like the matrix as it seems to officially only support 16:9 resolutions. I have tried recording 21:9 and 32:9 but it just never worked well.
You're using a PoS 14600K, no wonder you're getting literally zero frames.
4070 Super here
I have not tried any serious game with RT like CyberPunk. I have Metro Exodus waiting.
I just restarted Metro Exodus Enhanced Edition last week, and man does it ever look good while also still running very well. The 4070 Super was my favourite GPU I've owned this gen.
Cyberpunk freezes my system. I got rid of all mods because game kept crashing It would run f8ne then then freeze....I seen someone with a 4090 have the same issues. He had to turn off RT because it was glitch his card.
Have a 4070 super and path tracing looks horrible. Ray reconstruction looks horrible. I think it needs a 4070ti super at least because mine looks bad.
The lighting is great but it looks like wet paint when u move. Like moving in puddles of paint.
I had 4k60 with rt on psycho just game would run perfectly for hrs then freeze. Un-installed mods and game.
Why wouldn't it be? My 4070 can do it at 1440p with DLSS Quality and Framegen, surely the 4080Super can.
Many people dont want to use framegen so I was exploring all options to cater for all preferences.
@@Mostly_Positive_Reviewsi think his point is, 4070 can do 1440p. Ofc 4080s can.
I use 4070TiS and yes it can do 1440p with dlssQ with FG and having 100-120fps. All on high settings and cinematic textures. There’s no way 4080s cant. And FG latency isn’t really noticeable if you have 60fps without FG. FG is only bad if the framerates are below 60. Those people hating RT or PT are the ones who doesn’t have these cards. Ofc anything below 4070 would be having rough time running RT/PT
I don’t need 4k, 1440p is already great experience having amazing light fx and reflections. Than having 4k without anything new to visuals. even if i get 4080 or 4090, sure they can run 4k but i probably run it at 1440p abusing RT/PT instead with high FPS
The 4080 super is a 1440p RT / PT card, not 4k.
$1000 for a 1440p card is crazy
@@mutantkilla1440no way it costs only 1k
maybe we can play path tracing with a rtx 6090 yea 5090 might stuggle with path tracing idk am just saying
I have this all playable on my 4070 laptop as well. ❤
pay 1000 dollars for a card to run 40 or 60 fps
Well now it's $1500 so even less desirable
Oof, where do you live? Luckily it's still around $1100 here for a more entry-level model, but the Asus ROG models go for almost $2000 😔
Hard mode: Raytracing benchmark a game that was NOT paid for by nvidia.
Show me a path traced game not sponsored by Nvidia and I'll gladly benchmark it.
He also recently did a 20 game benchmark that included multiple ray tracing tests as well. Guess you didnt see that huh?
$k performance is a higher resolution than 1440p quality/balanced. I wonder why you find the visual quality to be different for you in a negative way.
It's definitely sharper, no doubt about that, but in some games it also seems to flicker more, especially in more foliage heavy and shadowy areas. The trees and leaves in BMW for example shimmers a lot more at 4K performance than 1440p balanced. Same happens with some lights in Cyberpunk.
@@Mostly_Positive_Reviews Odd, considering 4k performance is a higher resolution than 1440p quality.
It works for me rally well. yes you need upscalingand frame gen but who cares. It still looks great. I have vids on my channel. It's good enough for me. b. But people who cannot afford a 4080 super say it isn't possible lol. wouldn't play with it on though. No point.I do play it for a bit. depending how I feel.
I wonder if performance dlss on 4080 looks better than quality mode on ps5 pro 😂
🤣🤣🤣
I actually think it does. Alex from DF does most of his PC analysis at 4K DLSS Performance, and in some of the comparisons I have seen it does look better. PSSR does have some issues in a few games though, so let's hope they can sort it out soon.
16:12 Loosing textures there big time. On $1000 GPU. Well played.
dlss ultra performance, what do you expect?
I saw a video of indiana jones on a 4080 at native 4k (no dlss,fg) with max PT and settings and it got like 12fps lol
I saw another video of indie on a 4090 on native 4k max PT to get max 33 fps and dip as low as 28.
And this was the only the starting scene.
In my opinion, Path Tracing is only worth it in Cyberpunk. It completely changes the game and I can't play without it. Ray Tracing does not even come close. I've already completed the game and I am waiting to replay it when I upgrade because I refuse to play it without PT. I get 40-50fps at 1080p XeSS performance which is unplayable a lot of the time especially in combat. There is also a mod called 'Ultra Plus' that improves Path Tracing performance and can give you 10-15fps for free with no visual difference compared to Vanilla PT.
Cyberpunk's path tracing is for sure the one with the biggest visual differences. It's very heavy though, but you're right, that mod helps quite a bit.
Path tracing, HDR on an oled. Like entering another world.
LOL 4090 user here. playing 1440p 165HZ and 60FPS on high refresh monitor is like a slide show especially when you are use to higher frames like 100. anything below 100 on high refresh seems like slide show. 30FPS is completely unplayable. 60 feels judder, 80 close to ok 100 sweet spot, 120 ideal. Maybe im spoiled but when i pay 1700pounds for the card thats what i expecting.
I guess 4080S will be my new buy in a few months. I've got a 3080 10GB since 2021.
I would highly wait. rtx 50 series will be announced monday night @ 9:30 est.
Is anyone actually playing at 4k?
Yea, one of the most popular monitors is LG 4K 120hz OLEDs. They look incredible with HDR, if you are playing with a 32inch monitor, you almost always need 4K also
Yes because my 4070 is
I mean, my 7900xtx traces like a 3090 but its good enough in games for 4k
Not path tracing though?
Like, Metro Exodus has its hiccups once in a while in 4k path traced but I should go back to it and see if the newer drivers havent smoothed it out
Indiana Jones maxxed out is completely fine
I just couldn't get Portal RTX to really run at its full potential, thats such a good looking implementation.
Oh yeah, cyber punk path tracing. I want to play phantom liberty but sub 'rt' option I can stand the wierdness of certain scenes reflections.
RIP CDPR Witcher 4 is up in the air.
To me, Warhammer Darktide with RT is just that game. It runs jank. They need to optimize it more IMO.
Star Wars Jedi Survivor would run good with whatever RT they had. The game would crash and use 20+gb of vram every 1-2 hours though.
At the end of the day, I knew back when I picked up the 6900xt 16gb of vram in 4k was barely enough.
Games like Middle Earth Shadow of War were taking up that. The 6900xt wasnt a bad card, my powercolor red devil wouldn't underclock a ton so it didn't really have much more to offer.
I was able to play Gaurdians of teh Galaxy with RT on that card, it was playable. Very enjoyable. Made use of RT highly in that game, it fit the thematic style of Marvel.
Ill have to download Control again, but its always kind of been janky with RT on the 6900xt. The game does get shiny.
7900xtx vs shadow of the tomb raider = easy. I was able to beat that game with a little RT with the 2060super, back in the day, have to replay again. The 7900xtx doesn't make the visuals a complete gamechanger, but it runs like butter.
"nobody wants to die" is playable, maxxed, but Unreal 5 is using Lumen, probably.
Yeah, DLSS is probably a more enjoyable upscaler. FSR 3 gets the job done.
The upscalers always looked super jank in 1080p, I jumped to 4k and while the imperfections are still noticeable, its a lot lot less and makes so much better use of it in 4k.
I got my hands on Allen Wake the OG game, idk if I want to play Alan Wake 2.
Deathloop ran fine.
Then, theres games like Predator Hunting Grounds which run jank maxxed out without any capability of RT. They are on Unreal.
Bodycam at first would run 60+fps native, with upscalers like FSR 3/AFMF/AFMF 2.0 its 120 for me now. but theres visual artifacts. The xtx was matching the 4090 in that game with lumen. The devs have since been tweaking settings
At this point I mostly play like Tarkov, sometimes CoD, but people are finally waking TF up. MW2019 last had RT shadows and it actually helped me get kills.
IDK Raytracing is up to the individual.
Ghostrunner looks good with it, but theres input delay. You kinda need all the FPS you can get with that game, for speed and reaction times.
Cherylnobilite or whatever slaps but is still playable. Its a single player FPS, like Stalker 2, I think I run it 60, but anything above 30 is playable so.
Howarts Legacy was like 49-59 fps solid.
I have the old i9 7980xe I finally picked up on ebay for like 200 bucks
FPS chasing is just up to the individual. 4k slaps all.
1440p/1080p is competitive king. Some monitors have hybrid resolutions. like alienware has one with 1080/4k.
Should people pay twice the price of an xtx for a 4090?
I look at it like this, 1k is my limit. the 2080ti was around 1k. 5-7 years later, its the standard.
So, in 3-5 years xtx/4080 will be the entry standard.
The problem is VRAM. That will always screw the pooch with too little. So, bang for buck vram won my brain.
They can always turn down RT, or make newer software methods, etc.
Like with 7000 series Radeon, we don't have machine learning upscalers yet. etc. So for me, I didn't throw anything away that wasn't going to completely change my world.
Cyber punk, Portal RTX, maybe Alan Wake, you would have to be some serious casual fans to justify IMO getting a 2k card to run a 60 dollar unoptimized games.
I think Indiana Jones is using Vulkan RT. Runs fine like I said. Doom Dark Ages likely will be using it as well.
Vulkan RT, DX12 RT, Lumen can all be programmed to make use of RT accelerating cores in by Intel/AMD/Nvidia. Thats why even Intel is chasing it, besides machine learning.
@@sasquatchcrew That's a long list of games :)
Those games are normal RT though, not Path Traced. The only Path Traced games currently are Black Myth Wukong, Cyberpunk, Alan Wake 2, and Indian Jones, where you have to toggle path tracing separately. In Indiana Jones the path tracing option is unfortunately disabled on Radeon GPUs.
Then there are games like Portal RTX and Quake 2 RTX that are also fully path traced, but they are more showcases really.
The 7900XTX is a real solid GPU, dont get me wrong. If I had enough money I would have both a 4090 and a 7900 XTX to make content with. I got the 4090 I currently own for free, so that helped a lot!
That's a 4080 super performance 🤦♂️ 🤦♂️
lol crazy, 720p gaming in 2024
Do you REALLY have to say "frames per second" as many times as you do? I think the vast majority of us know when you mention a number, you're referring to FPS.
18 frames per second? what is this? thats the console experience
a technology that I will never understand because there are no cards capable of running this shit
over price gfx for extra performance...better buy a console for best in value....may not get the best but able to play the game decently
Shitty time for gaming …. Lazy expensive games… expensive graphic cards… extremly buggy games … unplayable for first 3-6 months… nonstop shader compilation
Congrats Nvidia, congrats game developers; you have single handedly (or double handedly) ruined gaming with this garbage tech that no one can play on under $1600 graphics cards... This era is already failing. I can see Crysis moments in the future where people are revisiting these horrible games with new graphics cards and seeing if they can actually handle all this garbage...
Path tracing is optional, nobody is forced to use it.
@@Mostly_Positive_Reviews true, but imagine paying four digits for a gpu and still realizing you live in an era where you cant max out, as in, really max out a recent title like you could back in the day when you paired a new game with a new high end gpu (okay, crysis was one exception but it stood out like an oddball for years while now we have many games like this). so they took that feeling from us.
Quite a few games that dont even have RT are still unplayable on their true max settings today. RDR2 fully maxed out with MSAA at its highest setting struggles even on a 4090. 1440p Max is often below 60 fps, and at 4K it gets mid 20's.
GTAV on max MSAA settings is also brutal. Sure, it's fine today on a 4090, but GPUs of the time werent close to being able to max out either of these games.
Kingdom Come Deliverance when it launched had an Ultra preset which warned you that it is meant for future GPUs, and at the time the top-end GPUs couldnt get playable framerates even at 1440p with that preset.
Games not being able to be maxed out is nothing new.
If you enabling FG or rocking DLSS performance it means the GPU is nowhere near capable
games like wukong will be amazing at native 144 fps when the rtx 6090 comes out
No GPU is capable, then.
They would be if games had at least an ounce of optimization@@Minarreal
5090 maybe, 6090 probably.
I'm curious to see what the 5090 brings to the table in terms of ray tracing. But you're probably right, the 6090 might he the GPU we are waiting for to use path tracing properly
the noobs in the comment section are hilarious. you sound like the same old fools crying foul as to why the 8800ultra can't play crysis. these tech will become better over time.
I was just looking into this. It literally took tech about 10 years for us to have Crysis playable at full details, when we had the 1080ti released.
@ exactly and even 10years after, crysis still beat games in graphics fidelity in 2017.
6 years later ray tracing is A$$. Nvida harping about path tracing, when raytracing is overhyped and Terrible, for nearly 2k graphics card. Raytracing awfully implemented in most games.
@@presmokan just say you broke man Cause pt looks amazing
@@SergioCorona960919 I haven’t looked at path tracing, only ray, and I’m not interested in most these modern games either. Most games these days are pretty trash. Better looking, but worse to play, and more boring.
I’m looking at either a 4080 super or 7900 xtx sapphire nitro+ GPU with 7800x3d or above.
You assume I’m broke, not everyone is super obsessed with gaming. Some of us are men who win in real life, not only video games.
Spending all that money not getting good 4K native gaming is just crazy to me. We’re just not there yet to really get consistent 4K native gaming for the cost. 1440p is the crème de la crème right now.
Yeah, agreed, but the cost is heavy. $1000 for 1440p gaming is pretty expensive, but it is where we are right now with heavy RT / PT. Hopefully the 50 series sees a good improvement in RT, bigger than the jump from 30 to 40 series. But let's see.
You are missing the point, optimization is so crap these days, developers are getting lazy, putting upscalers in pc requirements to hide their botched work pc hardware is not at fault here, look at all the UE5 games on pc all have traversal stutters and frametime issues its just bad for pcs right now
@@COYG94 This is true too, optimization has gone out the window, especially in UE5 titles. Path Tracing is still very heavy though, but I agree that many developers are forgoing optimization and just adding upscaling and frame gen to get their games playable. I personally can handle a lower framerate, because I can do something about it most times. But shader comp and traversal stutter drive me mad.
And then you have games like Monster Hunter Wilds that put frame gen in their system requirements to be able to hit 60 fps... That is inexcusable.
@@Mostly_Positive_Reviews black myth wukong use fsr frame gen to mimic 60fps on ps5 too so yah developers are using cheats now and UE5 was suppose to eliminate UE4 issues what a bummer 👎
@@COYG94you yourself is missing the point! if all UE5 games suffer from traversal stutters then that's not a developer issue. that's an Epic issue, you know, the company that made Unreal.
and before you throw the word optimization around, understand what it means. visual fidelity will never be free.
So many privileged children gamers here in the comments that haven’t experienced history. 15+ year ago getting 40 fps in the newest AAA pushing graphical boundaries was perfectly normal.
You’ve been privileged over the last decade because we have been using the same old technology that came out in 2008 that has gotten easier and easier to render. You used to HAVE to buy a new card every year or 2 because you didn’t have the right shader model or DirectX version.
Here’s the thing, we are entering a completely new era. Ray Tracing is ahead of its time because Jensen is an absolute mad lad. I didn’t think we would get path tracing only a few short years later. If your response is “it doesn’t really look better” nobody cares. Stfu and buy an AMD card if you don’t care and can’t see the difference.
I get it the cards or expensive but here’s the reality. Businesses are paying an arm and a leg for these things. I know it sucks but that’s what we get for relying on a fucking island to export 80% of the world’s silicon devices for the last 30 years. Prices are here to stay and the only way to slow the pace is massive infrastructure investments here in America.
RTX 4000 doesn't appeal to me mainly because of ``Frame Generation'' - Fake FPS and also because the games are less fun in much worse conditions than in 2018-2020. How many times they look worse.
I will exchange the RTX 3080 for a bargain 4070 Ti Super or wait for the RTX 5070 250-270W.
And I consider ultra settings with Ray tracing stupid. On high detail with Ultra textures and object rendering, games look pretty cool and run 30-60% better than on ultra with Ray Tracing.
So the result is that NO, you cant. Because to get there, he always used upscaling, and playing at 720p and 60 fps on 1000 USD card does not cut it for me. Enabling FG to get 60ms latency, which takes u back to latency of 30 fps, does not cut it for me neither. You ppl need to wake up.