I swear, there's often no difference between the highest and 2nd highest preset screenshots. I would have to take the 2 screenshots and take a difference with GIMP to see what's up. Might be that they're at 1080p though, there may be an actual difference on higher resolutions.
It's really hard even for me at 4K to see the difference between ultra and high in the YouTube video... and don't forget YouTube compresses videos to hell and back.
Jedi Survivor looked (to me) to have the biggest improvement going from high to ultra. But to tell the truth, I'm shocked at how small a difference these changes make. Been a gamer since the 90's (1890's, feels like) and like you said, the gap keeps closing. It used to make quite a large difference going from "high" to "highest" (or equivalent) settings.
It really annoys me that most benchmarking channels on YouTube only test on ultra settings, especially when the videos make it look like the game's not playable on certain hardware. For example, looking for Forza Horizon 5 benchmarks with the RTX 3060, most videos make it seem like you can't play at 4K 60fps with the card, when in reality you get 70+ fps at 4K if you just turn the graphics down to ultra and maybe use DLSS Quality. In general, 4K benchmarks with RTX cards should always show DLSS as well.
For Cyberpunk, I would like to point out one thing not obvious from the footage in the video. High preset (at least at launch) had Contact shadows off and this made a big difference for character faces, without it they looked a lot flatter. The setting came with a performance hit, but was IMO worth it, so my actual settings ended up being based on the High preset, but with Contact shadows turned on and Aniso 16x. I think that after some updates (after I finished the game) they changed this (broke it up into more than one setting, and I'm no longer sure which preset does what).
One thing that would presumably be different would be things like effects when action's occurring, like explosions or magic spells. I already agree that there's not enough difference between max settings and the next best to justify the performance downgrade most of the time. It's just that with effects, it makes me wonder if max settings might actually end up seeing lower lows as a result.
I can't remember the video I saw once comparing Ultra vs high (or equivalent), but the point he made was that it often came down to psychological reasons rather than the actual results, which often didn't really matter.
What about comparing a High/Medium mix against a High/Ultra mix? I also kinda think Medium nowadays is less dogwater than it used to be, and a Medium-High mix of settings is a good balance that I find heavily underrated.
What I do is use the low to medium preset and then set the texture quality to high or ultra. It looks way better than low textures, with less of an fps drop compared to setting everything to high or ultra. You can confirm with a test of your own.
I remember the first Forza Horizon port running kinda bad on PC, but only if you ran on Ultra; it ran great on High. The dev said something along the lines of "Ultra is meant for taking pictures, High is meant for actually playing the game" and I've stuck with that mantra ever since.
I love to watch Digital Foundry for the technical aspects. But when they have to slow down the footage while zoomed in so you can see the difference, and you only notice it after they've pointed it out and you're looking straight for it, that tells me it barely makes a difference in game. Shimmering from bad AA, on the other hand, can be distracting even in the peripheral field of view. In the 90s, you could tell from across the room if someone was still running a 3D game in software mode, or had one of these new fancy 3D accelerators, with all smooth and warp-free textures.
I'm not sure if a 3050 is the best card to show this, since games will sometimes load in lower resolution textures regardless of your settings if your card can't handle it, from what I understand... I think. I don't know.
The trouble with GFX comparisons on YT is the compression. Even at 1440p, the artifacting around the car is awful. Can't wait for AV1 to be a standard. No one has looked into it from what I've seen, but let's hope the sound gets improved too... although it does seem to be a YT thing, so maybe not.
I'd rather set the preset to high, then increase the anti-aliasing and visual clarity enhancers like filtering, lower the texture settings depending on how much VRAM I have, set some unimportant settings to high or medium, and max out lighting/shadows/AO; a balanced 60-100 fps in games with quality visuals.
My 980ti ran everything I needed it to (at 1080p) and almost everything I wanted to play ran at max settings (I don't care for a lot of games that came out after 2016 so it makes sense) ... until I ran Doom Eternal. I think I played through that game at medium settings and overclocked my card a bit (which still looks great). But last year when the market was decent, I snagged a 6950XT. I still like running old hardware when possible. I have a blower R9 290 that sits in a spare rig for LAN parties... Loud, but still crushes most games we play. I always ask people to look at their most recent games to justify an upgrade. CoD WaW, CSGO, Risk of Rain, TF2, Borderlands 1, Saints Row 2/3/4. Portal 1/2. Fallout 3/NV. Bioshock series. Half Life. Plus more indie games than I could ever count... Most of these will run on a R9 290/390 or RX 480 from AMD... And from Nvidia a 970 or 1060 will still play lots of amazing titles (scored a good deal on a 1070 for a build I did for a friend a few months ago). Which might explain why the 1060 was top of the Steam hardware chart for years.
Even if my rig is capable of Ultra, there are some settings that just seem silly. For example, "shadow resolution" seems unimportant. Sure I don't want pixelated shadows, but I've yet to see a game where I need to set shadow resolution above "medium" at most. Then again, it's not the dumbest/most useless setting I've seen (e.g. DOF, subsurface scattering).
Shadow resolution depending on the method used can be far more noticeable at higher render resolutions than low. Same with volumetrics or any other setting with a resolution independent of render res. If each setting level is a static resolution rather than a % then lower will look fine at 1080p but can be really distractingly ugly at 4k. Oftentimes high matches 4k though so ultra is more than needed. Then there are games that tie resolution and draw distance into a single setting. It can be really annoying seeing the shadows being painted in only a metre in front of the character.
I'm generally an 80/80 Ti class GPU buyer, and this time (I'm learning DaVinci Resolve) I opted for an RTX 4090, yet I still only game at high settings (never ultra), as the performance hit, and even frame dips, occur more in many titles, which I'd rather avoid. Also, other than shadow and lighting definition differences, these aren't noticeable in the vast majority of titles when you're moving around the game... imo anyhow.
This will vary from game to game for sure, but high settings look good with ultra textures in most games. In games like Assassin's Creed Odyssey/Valhalla, for example, you'll notice things like terrain, fog, clutter and water when lowered, but even then at high settings rather than very high they still look really good. I go with ultra if my GPU coasts through it, otherwise I don't mind tuning a few settings. That's the beauty of PC gaming over console gaming: you can fine tune the game to run as smooth and look as good as possible on your hardware. At least when the ports aren't released needing a dozen patches to get them running well. Great video mate, don't see these ones very often.
We are now in a period where graphical fidelity has advanced so much that ultra settings have become obsolete, as evidenced by how microscopic the differences between high and ultra are in this video. You can only improve visuals so much and we are really hitting diminishing returns. At this point ultra is simply for bragging rights, nothing more.
TIP FOR OLD BUDGET GPU GAMERS: to increase fps in PC games, just turn your shadows to low and anti-aliasing to FXAA or equivalent. If the game supports ambient occlusion, please turn it on cuz it makes a huge difference in visuals without sacrificing fps. TEXTURES ALWAYS ON MEDIUM-HIGH. And this works for every PC game.
It depends on hardware and monitor choice. If you're shelling out more than 1k for a card then you'd better be able to turn on all the eye candy. Also, if you have a nice high refresh rate 4K monitor then lower settings definitely won't look as good. But if you're more budget or "lower high end" oriented (emphasizing the quotation marks cause anything over $400 seems to be people's limit), then medium to high is the way to go for best clarity. Low to medium is good for those on the lower budget side of things. For the most part, textures make the biggest difference between low and ultra in most games. Another one, in games with more than two options for it, is ray tracing: if that is turned on it does make the game look different compared to base.
As a 1440p player I tend to use the medium or high preset and just crank up textures and filtering to match my 12GB of VRAM. Then I look for some extra optimization based on my configuration.
The biggest difference in a static image seemed to be Forza, but when you play you still see nothing. So unless your card can run the highest settings with no trouble, there isn't much reason to bother.
I'm happy with my RTX 3050 & I don't mind playing all AAA games on high instead of ultra settings for better performance. Thx for testing my card in TLOU1 & Hogwarts Legacy; I thought 8GB VRAM was not enough for them at the highest settings, but I was wrong, so now I can buy both of them without regret. But before that, do u think 16GB of RAM is enough?? Or should I upgrade to 24GB? I hope u'll answer me, I rly appreciate it :))
I think it's much better to do a comparison by flicking back and forth between the shots, rather than show a transition or side-by-side. Also, upload in as high quality as possible at 4k.
The reason why is pretty simple: we hit the quality wall for raster graphics at 1080p with the GTX 16xx/RX 5xx series GPUs. The future for raster graphics is in higher resolutions. 4K is the arena where AMD and Nvidia have chosen to compete now, and what most "Ultra" settings are targeted at. There's little room to add more that isn't annoying, lost at 1080p, or only looks good when done with ray tracing anyway. To the point you could probably get away with using a Quadro P4000 with "RTX Experience" (Nvidia's Quadro equivalent to GeForce Experience) to game comfortably @ 1080p (cheap and power efficient to boot) for years to come.
So around a 30% fps increase in many games with barely any visual difference. I usually run my games with ultra textures and the rest on high. My Red Devil 7800 XT is a beast with those settings at UWQHD.
*If u play on PC, don't use the presets for video settings anyway.* Most games can easily get a 10-30 fps increase just by switching 3 or 4 settings from Ultra to High (in some games even going down to medium doesn't change the visuals on some options). For example, if u want to play RDR2 at 4K Ultra at more than 60 FPS you're gonna need a beefy 3070/4070, maybe even an xx80. After tweaking with the settings a little bit, I bet you could even run it at 4K with a 3060/4060 at 70-80 fps without any noticeable visual difference (unless you take a screenshot and zoom x800 lol). And if like me you're lazy af, look up "(game name) HUB optimized settings", then copy the settings into the game's video settings & voilà.
Usually there is a power difference too; though not a huge factor, it can be something to track when changing ultra vs high vs med vs low, especially if there is a framecap.
Honestly, I've been playing Metro 2033 + LL (Redux versions) because they're some of the only games natively supported on my M1 MacBook Air, and I'm truly blown away at just how incredible even these old games look on the Medium preset. I literally can't notice a difference between Medium and Very High.
FH5 is the only game shown where I would recommend Ultra or Extreme if you can afford it. The difference in the LOD of the vegetation is huge between low/med/high and ultra/extreme; even playing on high settings you can see how the LOD of the trees changes in front of your face, while on ultra the transition is much smoother.
The difference that my eye can see is in how the light and shadows work. In Forza Horizon 5 on Ultra, the mountains seem more visible than on the Extreme setting, maybe due to a shadow change. The lighting changes seem less visible, so it might be subjective (you can argue it); for example, in TLOU Part 1 on the High setting the light seems blurrier than on the Ultra setting.
Cyberpunk's SSR setting is brutal on higher settings and has almost a ray-tracing-level performance hit. Turning it down to high while keeping the rest of the ultra settings gets you around 90% of the performance improvement between ultra and high.
To be fair, in Forza Horizon 4 & 5 the differences between presets are pretty big, and for the 20fps drop I feel like the sacrifice is worth it, if your fps is above 60 all the time of course. In motion you can't see the difference that clearly, but while steady or in photo mode the difference is significant.
Thanks for the video! I often think about this very subject when setting up a new game install, and I'm in agreement with you; Sometimes it just doesn't make sense - from a fidelity standpoint - to go to the highest preset. I, personally, don't even use the highest presets on new releases with a 5800X3D, 3080 Ti, and 32GB of memory. I'll select the highest preset at first, then go disable any anti-aliasing (especially reflection AA), reduce shadows to their second-highest value, reduce cloud/weather effects to their second-highest value, and disable the subjective post-effects that I don't personally enjoy (e.g. motion blur). In lieu of AA, I simply use resolution upscaling, be it in-game or through the nVidia control panel; FXAA softens textures, TAA causes smearing in motion, and MSAA redraws some geometry X-amount of times - so best just to use upscaling (in my situation) to reduce jaggies. Cheers!
It almost seems some developers put these options in PC games for the sake of them being there, rather than because most people will notice any difference. Even in older games, if you play at a higher resolution than most were able to back in the day, some of the AA settings become pointless the higher you go. Then there are more modern games, and some of them look very good even on lower settings. Although I remember back at one point lowest settings pretty much meant stripping almost all textures and details out of the game until it didn't even resemble what it was supposed to look like.
I think 90% of the time, unless you're someone who pauses at each frame to analyse the visuals, you won't notice a difference, but I'd be lying if I said whacking everything on Ultra didn't feel good.
Haha very true for both statements
Yeah, maxing everything out is fun tbh. This is why, after building a new PC, the first thing I do is revisit older titles.
Just like you can't tell the difference from 144hz to 165, 240, 300... you guys swear you see a difference even with 100fps and above
@@jponz85 But we do, our eyes are just built different, if you can't, I feel sorry for you
My laptop can run Ultra at >70fps, but I still only use High or Very High in single player games, and Low in multiplayer.
Anytime I use Ultra settings... it's just for screenshots, selfies/photo modes, and recording in-game cutscenes, because that's the only time it feels really good to use anyway.
In my experience, the only Ultra setting that actually makes a difference and that I recommend is the Ultra Texture Quality setting in RDR2. If you put that at Ultra but set everything else to Medium, the game still looks utterly fantastic, and it will run pretty well.
Yeah my sentiments exactly with red dead
In any game textures can be maxed to the limit of your vram.
Ultra textures in RDR2 are the equivalent of the console version, and anything below that is pretty bad.
Shadows definitely high, but otherwise yeah.
Otherwise it's dogwater pixelation; not that it's bearable on lower end hardware tbh.
Nice to see that "Maxing out" games is still unnecessary.
Yeah definitely
I thought it was less unnecessary in old times when even the slightest improvement in graphics felt big.
In most cases it is, but at higher resolutions you can notice changes easily; that's the thing, and the reason ULTRA exists.
Always has been
@@DragonOfTheMortalKombat Yeah, when they started including things like anti-aliasing and ambient occlusion, it felt pretty sweet to be able to turn those things on.
The other great thing about older gaming is that older games cost a fraction of the price. One of the benefits of being 2-3 years behind is that great games are €10.
Story of my life and I work in IT lol
there's a whole subreddit for this sort of thing called "patient gamers"
Yeah totally agree,
here in Australia, while new games are generally $90-$150 AUD, the whole Metro series is on sale on Steam at the moment; it cost me under $20 AUD for all 3 games with DLCs,
Plus GOG are always having a sale & there have been some interesting games for the EPIC Weekly Freebie.
This is what I do. The games being way cheaper is one thing, but they also have been patched, fixed, and optimized by the time I buy them. And I don't have to sell a kidney to buy the latest Nvidia high end rig to play a couple of AAA games on ultra.
@@beardalaxy Purchased GTAV in 2015 for PC & still waiting for GTAVI, lol
got the chance to finally play rdr2 and it came set to ultra by default. Ran the benchmark, got 60fps with some drops to 45~50. Then I tried Digital Foundry's optimized settings and got 100fps+ and couldn't even tell the difference in graphics
The biggest killer of fps for me was MSAA; imo TAA does a great job with the slider turned halfway up.
@@mrducky179
For me, it was screen-space reflections (SSR)
Most games that use SSR end up taking a big performance hit, for almost no gain in the quality of water reflections
@@mrducky179 I really, really wish MSAA was still an option in video games. I fucking LOVE MSAA, I don't care about the performance hit. TAA looks gross.
@@beardalaxy TAA is a disgrace for picture quality.
What are your PC specs?
Nowadays games have become so demanding even on low settings that a mixture of medium + high settings is a good balance between performance and quality. Good informative video by the way, learnt a lot!
The problem is that despite modern games still being demanding on low, most still look terrible, with muddy textures that rival PS1 era games yet somehow take up 4 to 6 gigs of VRAM. You'd expect it to look like ultra from 10 years ago, but no, not even close. That's why I'd like to see a comparison between modern low settings vs old ultra settings, visually and performance-wise.
@@Snow.2040 Oh BenchmarKing, yeah that's good
@@dantemeriere5890 Truer words have never been said.
It's not the textures' fault but the shaders or particle effects that you can't turn off; in TLOU, for example, the game renders each leaf with its own lighting.
I don't know if it's in all Assassin's Creed games, but Origins has a nice interface when adjusting graphical settings, showing you the differences for each setting. Then you can decide whether you'd really miss the 'tad sharper text from a great distance' or 'extra puffy clouds'. And then it also has a performance graph running while you play to show exactly the loss or gain.
DOOM Eternal has the best graph data, give it a try.
It's in Odyssey too.
and AC Valhalla
The newer Resident Evil games also have showcase screenshots for each graphical setting, which is super handy!
Ubisoft is really good with stuff like this. The earliest game I can recall it in is Ghost Recon Wildlands, and it's been in quite a few Ubi (and other) games since then.
What is also worth mentioning is that if you play games that don't require high framerates, you can lock the framerate and still benefit from lower settings in the form of decreased power consumption and temperatures. Especially if you use a power hungry card.
True, also undervolting could help
Absolutely. I have a 3060Ti and a 120hz oled tv, yet I always play at 1440p, locked at 60fps (+undervolt and mild underclock on cpu, but heavy overclock on memory). My GPU is usually running relatively silently at 85% usage, at 70-75°c. Great for my electricity bill and the longevity of the GPU too.
I always undervolt and cap the framerate. Why melt your card to give yourself screentear?
@@armyofninjas9055 That's for benchmark purposes; for regular play, undervolting and an fps cap are highly recommended for any game, just with different numbers.
There's also the fact that common settings like SSR, AO, Shadows and Volumetric effects are the most resource intensive these days and should be the first ones to crank down when tweaking.
Shadows can be a huge drain on FPS. In my favorite indie game, Grim Dawn, the difference between off and Very High (or Ultra) is around 20 or so fps.
Thank you for this video. We always see videos tested with high end machines but this area of hardware is where tests need to be made the most, not the high end stuff. So, thank you 😊
There's always a point past which you start seeing diminishing returns, and that's the line between high and ultra, which generally costs a 20-30% performance hit or more. And the days of just setting everything to low-medium or high are gone: there are settings that cost no performance whether you put them on ultra, high or low, but others that cost a lot of performance yet still look roughly the same as the lower options, which is weird. That's why YouTubers who benchmark and compare every setting from low to ultra and find the optimal points between performance and visual fidelity have been a must these past few years.
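To put a 20-30% hit like that into concrete numbers, here's a minimal sketch (hypothetical fps values, not figures from the video) of how an fps drop from High to Ultra translates into extra per-frame cost:

```python
# Minimal sketch with hypothetical numbers: what a High -> Ultra fps drop
# costs in frame time.

def frame_time_ms(fps: float) -> float:
    """Time budget of a single frame at a given framerate."""
    return 1000.0 / fps

high_fps, ultra_fps = 90.0, 65.0  # example values only

extra_ms = frame_time_ms(ultra_fps) - frame_time_ms(high_fps)
hit_pct = (high_fps - ultra_fps) / high_fps * 100

print(f"Ultra costs ~{extra_ms:.1f} ms more per frame ({hit_pct:.0f}% fewer fps)")
```

Seen that way, the question becomes whether the visual difference is worth a few extra milliseconds on every single frame.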
Hardware Unboxed used to do optimized settings guides, testing what gave the most bang for the buck (their RDR2 video is still the best), but they stopped. DF does the same but in less detail, and Alex has been sleeping and hasn't done it for many of the new releases, so we're kinda stuck in the dark on what settings are actually best.
Major PC games this year so far have been lackluster and horribly optimized. Notable major titles: Forspoken, Dead Space remake, Hogwarts Legacy, Wo Long: Fallen Dynasty, The Last of Us Part 1, Star Wars Jedi: Survivor. They all needed multiple patches after launch to get to a playable state. Not worth deep-diving into "optimized" settings when they are so broken on launch day.
Idk if you've noticed, but almost every AAA PC port has launched so unbelievably unoptimized that changing settings doesn't fix anything, which is why YouTubers who normally make optimized settings guides for games have not covered some of the more recent broken ports, such as Jedi Survivor for instance.
Really like how you try so hard just to present the facts with as little spin or personal opinion as possible. Another great video.
I have a 3080 and never use Ultra settings on newer games. The visual gains are too small for the sacrifice in framerate.
Yeah I think I could have a 4090 and still avoid ultra lol. Just a habit at this point
@@RandomGaminginHD I would take a 4090 to run at 4K with high FPS. I'll gladly turn things down if they have no effect on visuals but boost FPS when turned down.
Lol I play some games at 1080p low on a 3080
I have a 2080 Ti; playing games on ultra means I have less FPS and I always try to hit at least 90 when possible. Still playing at 1080p too lol.
I got a 4070 here and mostly play on high unless it's an older game.
Interesting how little difference there is between the ultra & high settings in terms of graphics but quite a bit of a jump in performance.
Done some tests on my specs - RTX 3060 Ti / R7 5700X / 64GB (2x32) DDR4
RT (where available) is off for all tests. Also turned off is motion blur, DOF & bloom for personal preference.
Readings taken from CapFrameX. Numbers from left to right are average FPS, 1% & 0.1%.
Assassin's Creed Valhalla
Ultra - 101.4 / 65.4 / 42.8
Very High - 110.8 / 76.9 / 60.3
Borderlands 3
Badass - 94.7 / 76.6 / 9.8 (!?)
Ultra - 100.1 / 81.3 / 58.8
CP 2077
Ultra - 90.6 / 63.8 / 52.7
High - 96.4 / 65.3 / 54.7
FH5
Extreme - 91.2 / 75 / 66
Ultra - 102.6 / 79 / 64.7
Hogwarts Legacy
Ultra - 65.5 / 41.3 / 30.4
High - 69 (nice) / 38.8 / 26.1
Spider-Man: Miles Morales
Max - 99 / 62.1 / 41.6
High - 111 / 68.5 / 54.6
Shadow of the Tomb Raider (SMAA4x)
Max - 113.8 / 81.4 / 53.2
High - 127.9 / 96.7 / 69.7
Watch Dogs: Legion
Ultra - 86.2 / 68.9 / 59.3
Very High - 103.1 / 76.1 / 68.8
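For anyone wondering how figures like these are produced: below is a minimal sketch of one common way to derive average fps and 1% / 0.1% lows from raw frame times. The exact formulas vary between tools (CapFrameX offers several variants), so treat this as an assumption about the general approach, not that tool's actual method.

```python
# Sketch: derive average fps and percentile "lows" from captured frame times.
# One common definition is shown (frame time at the worst 1% / 0.1% mark,
# converted back to fps); real tools may average the worst frames instead.

def fps_metrics(frame_times_ms):
    times = sorted(frame_times_ms)                  # slowest frames at the end
    avg_fps = 1000.0 * len(times) / sum(times)

    def low_fps(worst_fraction):
        # fps implied by the frame time at the start of the worst slice,
        # e.g. worst_fraction=0.01 -> roughly the 99th-percentile frame time
        idx = min(len(times) - 1, round(len(times) * (1 - worst_fraction)))
        return 1000.0 / times[idx]

    return avg_fps, low_fps(0.01), low_fps(0.001)

# Hypothetical capture: mostly ~10 ms frames with a few slow spikes.
sample = [10.0] * 990 + [25.0] * 9 + [40.0]
avg, low1, low01 = fps_metrics(sample)
print(f"avg {avg:.1f} fps, 1% low {low1:.1f} fps, 0.1% low {low01:.1f} fps")
```

The low percentiles are what capture stutter like the Borderlands 3 result above; averages alone can hide it.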
graphical settings in 2023:
low: looks good! 182fps
ultra: would you look at that extra pebble! 13fps
Subscribed on this video. Your content is just perfect these days, keep it up.
Thank you :)
Two bits I noticed:
Forza's Extreme preset had a notable difference in that the car cast shadows on itself, where it didn't on Ultra, which was one of the more visible differences.
Hogwarts Legacy was the other. The high preset had this annoying fog that I think was the greatest difference in looks between any of the examples. It would annoy me to hell if I ever noticed it in a game.
Also, the thing with stutters is definitely an issue of stuff loading in; bigger textures cause a bigger stutter. Nothing groundbreaking here, but it can be a big factor in certain games that try to load things on the fly when they get paired with old HDDs, or even SATA SSDs, where the data rate just doesn't meet the game's demands. But that's honestly on the game, as it's an optimization issue, and the textures could be cached ahead of time into RAM if enough is available... which in this case it certainly was.
Forza on Extreme (max) added fog, but looked much better than Ultra.
In Hogwarts Legacy the fog is also heavier on Ultra (max).
In Forza the position of the sun is different...
I have seen many split decisions on the highest setting versus the next one down, but most tech tubers would surmise that the highest setting is more for the photography people out there who like to take a huge amount of screenshots than for playability, as the fidelity is, like you said, a lot more noticeable in a still image. Most of the time on my 6700 XT at 1440p I will use the setting one down, unless of course the game can be cranked to the highest without much trouble. Great video yet again, and it really just confirms my thoughts on how I already do things.
@@RocketRenton My 6700 XT cost £600 when I had no choice but to replace my 1660 Ti. If it had been 3 months later I would have got the 6800 XT for the same price, however I needed it at the time lol. I would defo favour the 6700 XT over the 3060 Ti any day. With the way things are going with VRAM, AMD saw this coming a long time ago, which is why most of their reasonably priced cards are 10GB or above. I fitted my son a 6700 with 10GB as his budget wouldn't accommodate the XT variant, but he plays at 1080p so no big issue. The only thing Nvidia cards do slightly better is DLSS, however there really is not much difference between FSR and DLSS when you're actually into the game. For me team green lost my confidence a long time ago; I've been team red for a while now and will probably stick that way in the future. My next upgrade is going from a 3600X to a 5800X3D to maximise my AM4 platform until AM5 gets over its hiccups.
How many fps do you get in recent titles using the RX 6700 XT at 1440p resolution and high graphics? The problem with benchmarks nowadays is that they use the highest graphics option, while I am just gonna use the high setting. Also, apparently video recording takes some fps (some argue that's false and some argue it's true), so I want information from a real owner (I want to buy an RX 6700 XT in the near future, or maybe something better if the market goes right).
@@jalma9643 I've never done much recording with it, but I would assume I wouldn't be able to run the very highest whilst recording as it probably would take a performance hit. However, for instance with Hogwarts Legacy I get around 75-100 fps on max settings with FSR on. If there have been any dips I have not noticed them whatsoever. With games that people play online like CoD and Fortnite I wouldn't know, but people use competitive settings so one can assume there is more than enough grunt for those. One of the main games I do play is Farming Simulator 22 (yes I know, it's just farming), however I'm able to play that completely maxed out with 200% res scale, so essentially 4K, with changes in the ini to extend the render distance by 4x, and I'm getting a comfortable 60-75fps on average, more than enough for me. The cost to performance is defo in AMD's corner for that gen of cards. I cannot fault this card in any way. The only thing I would suggest is if you're an RT fanboy then it's probably not the card for you; though it's capable, it's not very strong at it in any way, but I don't use it so that feature to me is irrelevant. I have the XFX Quick and it's really quite quiet under load, even OC'd, so for me it's a winner and I would highly recommend it as a med/high end card.
Me watching in 480p: Ultra settings really are useless.
Actually you can often get away with low settings but textures at max, and it looks really good in my opinion. Recently did that with Crysis Remastered. I almost did not notice a difference; only when driving cars in third person FOV did I notice some pop-in. I bet I could increase one of the video options just one notch and get rid of that. And the FPS gain was massive, by the way.
I was just playing this on my 4090 at maxed settings with ray tracing and got about 120 fps, sometimes higher depending on the area.
This was a fantastic video... You've actually changed my mind on ultra settings... I usually just go for the absolute best visuals, but I quite literally can't tell ANY difference in most of these comparisons.
There is a difference. Don't forget YouTube compresses videos.
@@djam7484 A difference that isn't worth the fps loss.
@@EngineerMikey5 never said anything about it being worth the fps loss or not. just that there is a difference..
@@djam7484 I didn't say that you said that either... I was just adding to it.
@@EngineerMikey5 Why did you tag me then, when this has no relevance to my original comment? Sounds like you're just trying to be a smartass about it.
I think High is enough most of the time. I even went to get my glasses halfway through the video as I wasn't noticing anything that different between the options. The main thing I noticed is that ultra sometimes seems to have better lighting in things like Hogwarts, but that seemed to be about all.
Yeah, YT compression also does not help. Anyway, I agree, the biggest difference is probably lighting/shadows, but it's not significant during gameplay.
Big difference in the draw distance in Hogwarts. In Forza you can see a difference in lighting/reflections in a pause screen, but in game you ain't gonna be looking at that.
Sometimes, I won't lie, I can't tell if ray tracing is on or not unless it's mentioned in the video, because things are not ray traced enough in a lot of games...
Ultra is simply a marketing ploy to make people spend more on GPUs. Very few games are actually "Ultra."
There are instances where there is a very clear difference in quality between Ultra and High (not necessarily that they're WORTH it, depending on who you are, but if you're looking for them you can find them). YouTube compression, however, says "nah... these are the same picture".
I went from a 1080p display to a 4K display recently and noticed resolution is infinitely more important, since you won't see the tiny details in the difference between high and ultra without being at a higher resolution. I tried staring at a wall in 1080p and 4K, and going between them is like turning quality from low to ultra. In 1080p I saw a massive cobbled wall. In 4K I saw the fingerprints of the man who placed the bricks there and was able to estimate the date the wall was constructed. (Both at max settings.)
Edit - After getting used to 4k, i do wonder how i ever was able to see anything in 1080p. Sounds silly but my god it looks mushy now.
I never see the differences in Ultra vs High comparisons on YouTube, but when I play games myself at native resolution, without the YouTube compression, it always looks noticeably better.
Couldn't really tell the difference...
Exactly 😁
You'll be able to tell that 10 FPS difference though
To be fair, the last time a game really looked massively different at the highest preset was Crysis, of all games. Everything else is just "future proofing" by increasing things that you can't see without pixel peeping. Ultra used to mean something and now it's just a term.
Yeah remember the difference in Crysis between low and ultra? It was like two different games 😂
Ultra isn't noticeable because high is already good graphics in 2023.
I usually start with the High preset, then adjust settings if I see object/LOD pop-in, pixelly shadows, or blurry textures
I think the textures in Hogwarts Legacy actually looked crisper on high than ultra
I would love a similar video for medium settings. I think you can push some older hardware quite a bit and not lose that much in quality.
I think this comparison should have been done with a 3060 instead of a 3050; the 8GB VRAM buffer might mean that games are silently downgrading or compressing textures to fit within the VRAM buffer. Hogwarts Legacy and Forspoken have been shown to do that, leading people to think that the game looks uncharacteristically bad on their 8GB GPUs.
This is interesting. He should try again with a higher GPU.
Yes, I recently switched from an RTX 2060 (6GB VRAM) to an RX 7900 XTX, and Hogwarts Legacy looks much better at the same settings.
I remember playing games like Sekiro at 15 fps at 800×600 on the lowest settings with my Radeon HD 6670.
That's tough
That was literally me, up until this year when I bought an actual gaming PC.
The funny thing is that I managed to beat Isshin with only 12fps.
You can't imagine my feeling when I fought him again at 60fps.
How's your PC now? Just curious if your situation improved.
The fact that I played Crysis Remastered on maximum settings, including RT, despite using a 1650S. My justification was "I finished this game dozens of times, I don't care that I'm getting a stable 15fps, I want maximum visuals".
It was something...
Everything above 10 fps is doable I would say; 15 fps is a premium experience.
@@fiece4767 hey, as long as it's stable, I can bear anything
OG Crysis looks better than the remaster and runs better too.
@@eniff2925 exactly why I did what I did. agreed on that, original looks so much better
I got into PC gaming back in 2018 and I always followed the "never use ultra settings" rule, because I always had lower-end GPUs (1050 Ti, 1660, 1660 Ti). But last week I upgraded to a 4070, which is a massive leap in performance from the 1660 Ti I've been using for a few years. Now that I have the GPU horsepower that can max out games even at 1440p, I'm finally able to just set all graphics settings to max. Definitely feels good. Though I will admit that with all games, there really isn't any significant or noticeable difference between ultra and high graphics - unless you're comparing raytracing on vs raytracing off visuals, of course. And even then, in some games you can't even tell the difference between raytracing on vs off. But it does still feel good to know I can just set everything to ultra, turn on raytracing, and just play the game while maintaining a 60+ frames per second performance.
This is why I optimise my settings or find others who have already done so, even though I can max games out. Unless it's LOD settings, which I max out instantly just to reduce as much pop-in as possible. Can't wait for Nanite to help drastically reduce that.
I agree with other comments. The only "ultra" or truly max setting I ever enable is texture quality because I have the VRAM for it, and it's the most noticeable improvement with the smallest impact on frame rate. Lighting settings are by far the most intense and usually past "medium" or "high" you're just giving up half your frames for tiny differences that you can barely notice.
I think in most cases in this video I liked the lower preset more than the higher one; probably the better illumination makes stuff look shallower in the higher preset.
The differences are more noticeable at higher resolutions. Yeah, you won't see much at 1080p, but things like lighting, textures and shadows make a great jump at 4K; you can CLEARLY tell the presets apart.
Having actually worked for small development startup studios, it really comes down to the quality of the assets the team is using or the capabilities of the engine. Some tie the fidelity of the assets and textures to the resolution, so the graphic quality is effectively fixed, while others actually change the entire game: textures, filtering quality, draw distance, hair, lighting, etc. To really notice, you have to look at a wide variety of games. Games from big, talented studios like Naughty Dog will be harder to spot because their art direction and animations are so good. You need to look at indie and non-AAA games to notice a lot more.
Ultra settings are usually built with future hardware in mind. Typically the best experience comes from mimicking the settings consoles use, if applicable.
Lots of console games often have the best mix of visuals to performance given their hardware. Replicating those settings on a PC and potentially getting higher framerates with your better hardware is nice.
On my 3080 and 5800X3D I play Forza Horizon 5 using the settings the Series X version has, but with the game running at 60 instead of 30 since I have the extra power.
Max settings above the console version don't really look all that much better.
Like you said, mixing up settings and going with a custom balanced preset is always the best choice. For example, the ultra and high settings in Cyberpunk are very similar in terms of visual fidelity, and the biggest difference is reflections (SSR): ultra SSR is very expensive to turn on and costs more frames than ray-traced reflections, so it's a no-brainer to turn it off.
Keep in mind that chromatic aberration is automatically enabled on the ultra preset and disabled on the high preset. It adds a lot of blur around the edges of the image on the ultra preset; using ultra without it might be a closer call.
Seems like the VRAM usage changed when switching between the highest and second-highest settings. That's something to consider if you're running close to your max VRAM.
my GT 710 can do 720p with FSR Ultra Performance and get 2 fps
Better than 1 fps haha
At least it got "ultra".
@@RandomGaminginHD true... Better than nothing I guess
@@DragonOfTheMortalKombat 🥲
You got me on the first half😂
Worth noting that quite often using lower settings will increase CPU usage as the framerate goes higher. That can actually be seen in this video.
Most settings only affect the GPU; only certain settings will affect the CPU (like draw distance, physics/animation, ray tracing). With a low-end CPU, you might actually find yourself CPU-limited, and lowering the settings might not even help.
One thing I've learned is often when you are actually playing a game, you just don't notice a lot of this stuff. If my computer can run a game maxed at over 120FPS at 1440p, I'll go ahead and crank the settings. If not, I'll dial them back. But, I've actually dialed some settings back before even if my system was handling it fine, because I wanted to reduce the load on the video card to make it run cooler and quieter, especially during certain times of year. Maxed settings are nice until your room turns into an oven.
I've always had things on medium-to-low settings throughout my years on PC; I just never saw the minor graphical improvements as worth losing frames for. Current rig is a 2070 Super & 9700K, and I'll likely keep doing this even if I won the lottery
Do you remember the old days when each level of graphical fidelity looked like a completely different game? Recently I played Heretic 2 for nostalgic reasons. Great-looking game for the time, especially with Glide or OpenGL, but you wouldn't want to play it on medium or low.
Had that and Hexen. Fun games.
I swear, there's often no difference between the highest and second-highest preset screenshots. I would have to take the two screenshots and diff them in GIMP to see what's up. Might be that they're at 1080p though; there may be an actual difference at higher resolutions.
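If anyone wants to try that diff idea without opening GIMP, here's a rough Python sketch using Pillow (the filenames are just placeholders, and it assumes both screenshots are the same resolution):

```python
from PIL import Image, ImageChops, ImageStat

# Load the two screenshots to compare (placeholder filenames)
high = Image.open("high_preset.png").convert("RGB")
ultra = Image.open("ultra_preset.png").convert("RGB")

# Per-pixel absolute difference; identical pixels come out black
diff = ImageChops.difference(high, ultra)

# A quick "how different are these really" number per colour channel
print("mean difference per channel:", ImageStat.Stat(diff).mean)

# Brighten the diff so the changed areas are actually visible, then save it
diff.point(lambda v: min(255, v * 8)).save("diff_amplified.png")
```

If the mean comes back close to zero and the amplified diff is mostly black, the two presets really are as close as they look.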
It's really hard even for me at 4K to see the difference between ultra and high in the YouTube video... and don't forget YouTube compresses videos to hell and back
Jedi Survivor looked (to me) to have the biggest improvement going from high to ultra. But to tell the truth, I'm shocked at how small a difference these changes make. Been a gamer since the 90's (1890's, it feels like) and, like you said, the gap keeps closing. It used to make quite a large difference going from "high" to "highest" (or equivalent) settings.
Jedi Survivor also does crap like lowering the resolution scale without telling you when changing graphics presets.
It really annoys me that most benchmarking channels on YouTube only test on ultra settings, especially when the videos make it look like the game's not playable on certain hardware.
For example, looking for Forza Horizon 5 benchmarks with the RTX 3060, most videos make it seem like you can't play at 4K 60fps with the card, when in reality you get 70+ fps at 4K if you just turn the graphics down to ultra and maybe use DLSS Quality.
In general, 4K benchmarks with RTX cards should always show DLSS as well.
For Cyberpunk, I would like to point out one thing not obvious from the footage in the video. High preset (at least at launch) had Contact shadows off and this made a big difference for character faces, without it they looked a lot flatter. The setting came with a performance hit, but was IMO worth it, so my actual settings ended up being based on the High preset, but with Contact shadows turned on and Aniso 16x.
I think that after some updates (after I finished the game) they changed this (broke it up into more than one setting, and I'm no longer sure which preset does what).
One thing that would presumably be different is effects when action is happening, like explosions or magic spells. I already agree that most of the time there isn't enough difference between max settings and the next step down to justify the performance hit. It's just that with effects, it makes me wonder if max settings might actually end up seeing lower lows as a result.
I can't remember the video I once saw comparing ultra vs high (or equivalent), but the point he made was that it often came down to psychology rather than the actual results, which often didn't really matter.
What about a comparison mixing high with medium, and high with ultra? I also kinda think medium nowadays is less dogwater than it used to be, and a medium/high mix of settings is a good balance that I find heavily underrated
Is it just me or do some games actually look sharper with reduced settings?
If that's the case, the higher setting probably turns on temporal anti-aliasing, which can make the image blurry
What I do is use a low-to-medium preset, then set the texture quality to high or ultra. It looks way better than low textures, with less of an fps drop compared to setting everything to high or ultra. You can confirm with a test of your own.
I remember the first Forza Horizon port running kinda bad on PC, but only if you ran it on ultra; it ran great on high, and the devs said something along the lines of "Ultra is meant for taking pictures, high is meant for actually playing the game", and I've stuck with that mantra ever since.
I love to watch Digital Foundry for the technical aspects. But when they have to slow down the footage while zoomed in so you can see the difference, and you only notice it after they've pointed it out and you look straight at it, that tells me it barely makes a difference in-game. Shimmering from bad AA, on the other hand, can be distracting even in your peripheral vision.
In the 90s, you could tell from across the room if someone was still running a 3D game in software mode or had one of those fancy new 3D accelerators, with all the smooth and warp-free textures.
damn i love ur channel, great in-depth analysis and commentary, thanks!
Texture quality: Ultra
Everything else: Low
If game looks good and not lagging: leave it
If game looks bad and lagging: delete it
Your voice is so calming to listen to
I'm not sure if a 3050 is the best card to show this, since from what I understand games will sometimes load lower-resolution textures regardless of your settings if your card can't handle them... I think. I don't know.
The trouble with GFX comparisons on YT is the compression. Even at 1440P, the artifacting around the car is awful.
Can't wait for AV1 to become standard. No one has looked into it from what I've seen, but let's hope the sound gets improved too... although it does seem to be a YT thing, so maybe not.
I'd rather set the preset to high, then increase the anti-aliasing settings and visual clarity enhancers like filtering, lower the texture settings depending on how much VRAM I have, set some unimportant settings to high or medium, but push lighting/shadows/AO to the max: a balanced 60-100 fps in games with quality visuals.
I've been following this tip for a long time, my favorite to lower is draw distance when it actually makes a difference!
My 980ti ran everything I needed it to (at 1080p) and almost everything I wanted to play ran at max settings (I don't care for a lot of games that came out after 2016 so it makes sense)
... until I ran Doom Eternal. I think I played through that game at medium settings and overclocked my card a bit (which still looks great).
But last year when the market was decent, I snagged a 6950XT.
I still like running old hardware when possible. I have a blower R9 290 that sits in a spare rig for LAN parties... Loud, but still crushes most games we play.
I always ask people to look at their most recent games to justify an upgrade.
CoD WaW, CSGO, Risk of Rain, TF2, Borderlands 1, Saints Row 2/3/4. Portal 1/2. Fallout 3/NV. Bioshock series. Half Life.
Plus more indie games than I could ever count...
Most of these will run on a R9 290/390 or RX 480 from AMD...
And from Nvidia a 970 or 1060 will still play lots of amazing titles (scored a good deal on a 1070 for a build I did for a friend a few months ago). Which might explain why the 1060 was top of the Steam hardware chart for years.
*Insert meme*: "Corporate needs you to find the differences between this picture and this picture"
😂
Even if my rig is capable of ultra, there are some settings that just seem silly. For example, "shadow resolution" seems unimportant. Sure, I don't want pixelated shadows, but I've yet to see a game where I need to set shadow resolution above "medium" at most. Then again, it's not the dumbest/most useless setting I've seen (e.g. DOF, subsurface scattering).
Shadow resolution depending on the method used can be far more noticeable at higher render resolutions than low. Same with volumetrics or any other setting with a resolution independent of render res. If each setting level is a static resolution rather than a % then lower will look fine at 1080p but can be really distractingly ugly at 4k. Oftentimes high matches 4k though so ultra is more than needed.
Then there are games that tie resolution and draw distance into a single setting. It can be really annoying seeing the shadows being painted in only a metre in front of the character.
I'm generally an 80/80 Ti class GPU buyer, and this time (I'm learning DaVinci Resolve) I opted for an RTX 4090, yet I still only game at high settings (never ultra), as the performance hit and even the frame dips that occur more often in many titles are things I'd rather avoid. Also, other than differences in shadow and lighting definition, these settings aren't noticeable in the vast majority of titles when you're moving around the game... imo anyhow
It's like you tested this on my gaming laptop. I've got the same specs. How nice of you
This will vary from game to game for sure, but high settings with ultra textures look good in most games. In games like Assassin's Creed Odyssey/Valhalla, for example, you'll notice things like terrain, fog, clutter and water when lowered, but even then, on high rather than very high, they still look really good. I go with ultra if my GPU coasts through it; otherwise I don't mind tuning a few settings. That's the beauty of PC gaming over console gaming: you can fine-tune the game to run as smooth and look as good as possible on your hardware. At least when the ports aren't released needing a dozen patches to get them running well. Great video mate, don't see these ones very often.
We are now in a period where graphical fidelity has advanced so much that ultra settings have become obsolete, as evidenced by how microscopic the differences between high and ultra are in this video. You can only improve visuals so much, and we are really hitting diminishing returns. At this point ultra is simply for bragging rights, nothing more.
You can notice a big difference at 4K, and YouTube compression isn't good at all, so it's harder to spot the details.
The best settings are optimized settings, which can be found on YouTube for each game.
Can you do a comparison between low and high settings? Nowadays low settings also look pretty good!
TIP FOR OLD BUDGET GPU GAMERS: to increase fps in PC games, just turn your shadows to low and anti-aliasing to FXAA or equivalent. If the game supports ambient occlusion, please turn it on cuz it makes a huge difference in visuals without sacrificing fps. TEXTURES ALWAYS ON MEDIUM-HIGH. And this works for every PC game.
Ambient occlusion makes a big fps difference when GPU-limited though
@@ketrubkiryu chan
@@ketrub Maybe in some games that I haven't played. But most of the time it's not a big difference. And it makes games more beautiful
It depends on hardware and monitor choice. If you're shelling out more than 1k for a card, then you'd better be able to turn on all the eye candy. Also, if you have a nice high-refresh-rate 4K monitor, lower settings definitely won't look as good. But if you're more budget or "lower high end" oriented (emphasizing the quotation marks because anything over $400 seems to be people's limit), then medium to high is the way to go for the best clarity. Low to medium is good for those on the lower budget side of things. For the most part, textures make the biggest difference between low and ultra in most games. Another one is ray tracing: if that's turned on, it does make the game look different compared to the base visuals.
Sometimes I lock games at 30 or 45 fps just so that I can enjoy the highest settings.
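For what it's worth, a frame cap is basically just the game sleeping off whatever is left of each frame's time budget, which is why the GPU (and the power draw and heat) gets to relax even at maxed settings. Here's a bare-bones, engine-agnostic sketch of the idea in Python, where render_frame is only a stand-in for the real rendering work, not any actual game's API:

```python
import time

TARGET_FPS = 45                 # the kind of cap mentioned above
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame():
    # stand-in for the real game/render work
    time.sleep(0.005)

while True:
    start = time.perf_counter()
    render_frame()
    # sleep off whatever is left of this frame's time budget
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)
```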
If you're at 1080p, sometimes there's no reason to; the details are minuscule in that case
As a 1440p player I tend to use the medium or high preset and just crank up textures and filtering to match my 12GB of VRAM. Then I look for some extra optimization based on my configuration
The biggest difference in a static image seemed to be Forza, but when you actually play you still see nothing.
So unless your card can run the highest settings with no trouble, there isn't much reason to bother.
I'm happy with my RTX 3050 and I don't mind playing AAA games on high instead of ultra settings for better performance. Thanks for testing my card in TLOU1 and Hogwarts Legacy; I thought 8GB of VRAM wasn't enough for them at the highest settings, but I was wrong, so now I can buy both without regret. But before that, do you think 16GB of RAM is enough, or should I upgrade to 24GB? I hope you'll answer me, I'd really appreciate it :))
Is the bitrate weirdly low for anyone else? The video looked super blocky for me despite being on higher resolutions
I think it's much better to do a comparison by flicking back and forth between the shots, rather than show a transition or side-by-side. Also, upload in as high quality as possible at 4k.
The reason why is pretty simple, we hit the quality wall for raster graphics at 1080p with the GTX 16xx/RX 5xx series GPUs. The future for raster graphics is in higher resolutions. 4k is the arena where AMD and Nvidia have chosen to compete now, and what most "Ultra" settings are targeted at.
There's little room to add more that isn't annoying, lost at 1080p, or only looks good when done with ray tracing anyway. To the point that you could probably get away with using a Quadro P4000 with "RTX Experience" (Nvidia's Quadro equivalent to GeForce Experience) to game comfortably at 1080p (cheap and power-efficient to boot) for years to come.
So around a 30% fps increase in many games with barely any visual difference. I usually run my games with ultra textures and the rest on high. My Red Devil 7800 XT is a beast with those settings at UWQHD.
*If you play on PC, don't just use the presets for video settings*
Most games can easily get a 10-30 fps increase just by switching 3 or 4 settings from ultra to high (in some games even going down to medium doesn't change the visuals for some options)
For example, if you want to play RDR2 at 4K ultra at more than 60 fps, you're gonna need a beefy 3070/4070, maybe even an xx80
After tweaking the settings a little bit, I bet you could even run it at 4K on a 3060/4060 at 70-80 fps without any noticeable visual difference (unless you take a screenshot and zoom in x800 lol)
And if, like me, you're lazy af, look up "(game name) HUB optimized settings" and copy those into the game's video settings & voilà
Usually there is a power difference too; it's not a huge factor, but it can be something to track when changing between ultra, high, medium and low, especially if there is a frame cap.
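If you want to actually watch that while flipping presets, a quick and dirty way (Nvidia cards only, and the one-second interval is an arbitrary choice) is to poll nvidia-smi for power draw and VRAM use:

```python
import subprocess
import time

# Ask nvidia-smi for power draw and VRAM usage in plain CSV
QUERY = [
    "nvidia-smi",
    "--query-gpu=power.draw,memory.used,memory.total",
    "--format=csv,noheader",
]

while True:
    out = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
    print(time.strftime("%H:%M:%S"), out)  # e.g. "123.45 W, 6123 MiB, 8192 MiB"
    time.sleep(1)
```

Run it in a second window, change a setting in-game, and you can watch the wattage and VRAM move in close to real time.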
Honestly, I've been playing Metro 2033 + LL (Redux versions) because they're some of the only games natively supported on my M1 MacBook Air, and I'm truly blown away at just how incredible even these old games look on the Medium preset. I literally can't notice a difference between Medium and Very High.
FH5 is the only game shown where I would recommend ultra or extreme if you can afford it. The difference in the LOD of the vegetation between low/med/high and ultra/extreme is huge; even playing on high settings you can see the LOD of the trees changing right in front of you, while on ultra the transition is much smoother.
Great video, you've got a new subscriber.
What's your daily rig? If there even is one, since I assume you have all of them at your disposal.
The difference my eye can see is in how the light and shadows work. Like in Forza Horizon 5 on ultra, the mountain seems more visible than on the extreme setting, maybe because its shadows change. The lighting change seems less visible, so it might be subjective (you can argue it); for example, in TLOU Part 1 on the high setting the lighting seems blurrier than on the ultra setting.
Some settings are scaled differently based on screen resolution so the difference between high and ultra might be different on higher resolutions.
Cyberpunk's SSR setting is brutal at its higher levels and has almost a ray-tracing-sized performance hit. Turning it down to high while keeping the rest of the ultra settings gets you around 90% of the performance improvement of going from ultra to high.
To be fair, in Forza Horizon 4 & 5 the differences between presets are pretty big, and I feel the 20fps drop is a worthwhile sacrifice, provided your fps stays above 60 of course. In motion you can't see the difference that clearly, but when standing still or in photo mode the difference is significant.
Thanks for the video! I often think about this very subject when setting up a new game install, and I'm in agreement with you; Sometimes it just doesn't make sense - from a fidelity standpoint - to go to the highest preset. I, personally, don't even use the highest presets on new releases with a 5800X3D, 3080 Ti, and 32GB of memory. I'll select the highest preset at first, then go disable any anti-aliasing (especially reflection AA), reduce shadows to their second-highest value, reduce cloud/weather effects to their second-highest value, and disable the subjective post-effects that I don't personally enjoy (e.g. motion blur). In lieu of AA, I simply use resolution upscaling, be it in-game or through the nVidia control panel; FXAA softens textures, TAA causes smearing in motion, and MSAA redraws some geometry X-amount of times - so best just to use upscaling (in my situation) to reduce jaggies. Cheers!
It almost seems some developers put these options in PC games for the sake of having them there rather than because most people will notice any difference. Even in older games, if you play at a higher resolution than most people could back in the day, some of the AA settings become pointless the higher you go.
Then there are more modern games, and some of them look very good even on lower settings. Although I remember that at one point, the lowest settings pretty much meant stripping almost all the textures and detail out of the game until it didn't even resemble what it was supposed to look like.
That's why optimization guides are amazing; you can boost your performance by 20-40% without losing any important visuals.