@@steinkoloss7320 The longevity is not about heat death but the ability to play future games without upgrading hardware. My point was that buying the hardware you'll need in the future is pretty pointless, because you can get better hardware for the same money in the future. Get the hardware for the already existing software/games.
@@steinkoloss7320 absolutely right. It’s not about settings vs frame rate, it’s about overall utilization. I try to balance my GPU to about 90% max, and capping frame rates helps with that.
I think the Half-Life series has certain things in it that make it better than modern games. When was the last time you picked up a wrench in a video game? What about a random piece of wood after smashing a box? Some of these things go a long way toward making that game feel better than more modern games.
@@bland9876 Yup, I don't really understand the need to play new triple A titles. There's very little innovation in terms of gameplay with these. I tend to either play older games, or go indie, and in both cases a 3070 is usually enough to play at max settings 4K 90-120FPS.
Bugatti*, and no. The point is to stroll through the riviera going 2 mph and flex on them poors. If others don't see you, then there's no point OR you actually bought it to drive on mountain passes or something
@@Anankin12 If I had that kind of money then yea I probably would buy some stupidly expensive flashy car and then pay someone to drive me around in it as I don't have a licence.
or you could buy an Aventador or F430 for a fraction of the price and the majority of non-car people won't notice the difference. In fact I think most girls who don't know about cars will say the Ferrari is more expensive than the Bugatti
@@carlosmspk Not quite the same, I agree - but people will sometimes buy a GPU that they don't need (e.g. an rtx 3090 for minecraft) which is, in a way, flexing money. All I'm saying is, people do (sort of) buy GPUs for flexing. Any expensive, popular object is bound to be used to show off.
I have these concepts that go further down than that (I kid you not as a game developer), would you care to honor my future work in that direction with your opinion about the stuff I'll give you for free to have? One of those should be hardware as well :)
I find it so weird that you CONSTANTLY see Shadow of the Tomb Raider used for benchmarks, even to this day, but I don't really see many people play it or make other content about the game. It's like it's just a benchmarking tool.
Well it’s a really good game in its own right, but it’s one of those games you really only play once through and that’s it. There really isn’t much more content to make after doing a playthrough, and the lore and story are very upfront, so no videos explaining them are needed. The Survivor trilogy is amazing though, and one of the few times a studio nails a reboot series.
AA is pointless, anisotropic stuff can bugger off, sharpness filters and stuff can bugger off, and most importantly get rid of motion blur and depth of field. But depending on the game and VRAM you could just be running out of VRAM, so what GPU is it?
@@commanderoof4578 one thing not to get rid of is texture filtering, especially at higher resolutions where it's easier to spot the lack of detail. But mostly yes, AA is hard to recommend unless you have a huge excess of performance
@@_._shinonome_._ Depends on the game. If the game uses 1080p or lower-res textures, then yes, it's going to look bad to turn it off. But if a game uses 1440p, 4K, or in some cases 4096x4096 textures, then it's a setting that is pointless
The differences are less prominent these days. Go back 10-20 years, there were usually night and day differences between High and Ultra settings. And even Low/Medium quality have come a LONG way these days as well. Those back then looked especially awful.
That's because the main difference back in the day was that Ultra almost always meant uncompressed textures, which was a really big deal back in the era of much lower GPU RAM quantities.
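For anyone curious why uncompressed textures were such a big deal for VRAM, here's a rough back-of-envelope sketch. Every number in it is an illustrative assumption (a 4096x4096 texture, 4-byte RGBA versus a roughly 4:1 block-compressed format), not a figure pulled from any actual game.

```python
# Rough texture-memory arithmetic, assuming a 4096x4096 texture with a full
# mipmap chain. All sizes are illustrative assumptions, not real game data.

def texture_bytes(width, height, bytes_per_pixel, mipmaps=True):
    """Approximate size of a texture in bytes; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base

uncompressed = texture_bytes(4096, 4096, 4)  # RGBA8, 4 bytes per pixel
compressed   = texture_bytes(4096, 4096, 1)  # ~4:1 block compression, ~1 byte per pixel
half_res     = texture_bytes(2048, 2048, 4)  # dropping to 2K quarters the uncompressed size

print(f"uncompressed 4K texture : {uncompressed / 2**20:5.1f} MiB")
print(f"block-compressed 4K     : {compressed / 2**20:5.1f} MiB")
print(f"uncompressed 2K texture : {half_res / 2**20:5.1f} MiB")
```

The point is just that a single uncompressed high-res texture can cost roughly four times what its compressed or half-resolution version does, which added up fast on cards with 1-2 GB of VRAM.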
This is basically why I just upgrade my setup every 5-6 years and save money in between. In that time the games have changed like heaven and earth, and the hardware has as well. If you upgrade every year the changes will be there, but definitely less noticeable. I prefer the thrill and excitement of a larger difference to get me back into modern gaming over the less noticeable gradual one.
Same. I built my first PC at 15 with an AMD Bulldozer CPU and a GTX 950. I got a job and at 16 I completely redid it with a Ryzen 1600X and a GTX 1080. I sold that to my girlfriend's brother and he still uses it for heavy daily gaming, meanwhile at 21 I just got a steal on an NZXT pre-built with a Ryzen 5800X (liquid cooled) and an RTX 3080 Ti. The leaps in quality every time have been mindblowing, but the longevity of the hardware I'm upgrading from is also really admirable
Had consoles from PS1 to PS4, then bought a Zephyrus G15. Just got done updating everything and I'm scared to buy a game 😂 I know there's no returning to console after this
Jokes aside, the dumb thing about preordering is paying for something that couldn't be independently reviewed first. That is not always the case for products on drop, only when they offer a product exclusively for the first time ever
One of my best friends made me realize this when it comes to games. "You don't always need the highest settings. Having a smoother experience is much more worth it in most cases", and I finally understand what he meant. Plus if your hardware isn't in the highest end to begin with, you have to set realistic expectations for yourself. Once you accept that fact it all becomes so much easier
Yes, playing GTA 5 at 13 fps on the lowest settings, lowest resolution and shit like that is much better when you know that's the best you can get out of a game you spent a whole day downloading
I think better, more realistic graphics have their days numbered as a marketing tool. I think we're reaching a point where improving graphics makes less and less difference, and the better looking games are those with good art styles which maybe aren't as realistic but still look good. That doesn't mean the race for better processors or GPUs is going to end, because in those games what can get a lot better is the number of things happening on the screen at the same time, and we're still a long way from a realistically crowded city, for example.
I've found that if you play some older games at max settings it adds a lot of value, but yeah, when it comes to new games it only really goes so far and the extra upgrade isn't super noticeable.
I don't know what kind of phone you have but my phone has a higher resolution than 1080p. I know this because I mirrored my phone onto my computer and I set it to go pixel to pixel and I couldn't see the entire phone screen even with my monitor rotated sideways. S7 edge
That'll be me when i get to finally play final fantasy 15 at a playable framerate without it looking like trash.. Okay maybe not trash but high/ultra is really impressive looking to the point where I think it's how the game should look.
@@Aghar74 basically from medium settings onwards in War Thunder there would be plenty of foliage, trees, bushes, terrain deformation, etc. that made it harder to spot enemies in battle, but on this setting called Ultra Low Quality (ULQ for short), most of those simply do not exist, which makes it easier to spot enemies overall. For example someone could be hiding behind some thick bushes and tall grass, but still get spotted anyway by someone a whole kilometer away because the spotter uses the ULQ setting.
@@casualguy634 Just imagine : Last man in squad, Hiding in a Bush, Trying to Clutch. Enemy at ULQ : TF is this guy doing crouching in middle of the map, Haa Amature 😂 ("Shoots")
The main difference between Very High and Ultra or Extreme is actually between 4K textures and HD ones, and so on. There are web pages that show the differences, and NVIDIA's is one of them!
The "ultra high" settings, especially for textures and such, aren't really "added". In many cases they're just the base truth from which the lower quality options are generated. For instance "ultra" textures may mean "use the textures as shipped with the game", while "medium" textures are just lower resolution versions that are generated by down-scaling them. Having an "ultra" option basically costs no development time at all. What does cost a lot of development time is creating optimizations and tricks that make running the game on lower end machines possible - and still look good. Obviously there's exceptions where higher quality just means bigger numbers/more particles/higher range. But then having that option just is a question of "you might as well".
@@meghanachauhan9380 The baseline is (supposed to be) meant in terms of development; of course it still gets called an ultra-high setting (not too sure if the user you replied to exactly understood any of that either, though)
The point of the video is: why create the texture at 4K at all? It didn't just come into existence in 4K. Why not leave everything at 1K and then only include up to 1K in the game? You can't say the game just was 4K to start with. That is a choice made by someone, one that this video is arguing didn't need to be made
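Picking up the point a couple of comments up about lower presets being generated from the source asset, here's a minimal sketch of that idea, assuming Pillow is available; the preset names, divisors and file name are made up for illustration and aren't any engine's real pipeline.

```python
# Minimal sketch: the "ultra" texture is the asset as authored, and the lower
# presets are just downscaled copies of it. Names and divisors are assumptions.
from PIL import Image

PRESETS = {"ultra": 1, "high": 2, "medium": 4, "low": 8}  # resolution divisor per preset

def build_texture_presets(source_path: str) -> None:
    src = Image.open(source_path)  # e.g. the 4096x4096 texture as authored
    for name, divisor in PRESETS.items():
        w, h = src.width // divisor, src.height // divisor
        out = src if divisor == 1 else src.resize((w, h), Image.LANCZOS)
        out.save(f"{source_path}.{name}.png")

# build_texture_presets("bricks_albedo.png")  # hypothetical asset name
```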
@@kashif4463 The CPU is unlikely to be the bottleneck; in most modern games the GPU is usually the limiting factor for FPS until you get above 150-300 FPS, when the CPU is required to feed a lot of data, but this usually only happens in easy-to-run e.g. e-sports titles (like CS:GO above 300 fps). But obviously, see how it goes; that's a fairly well-balanced setup, and from Linus's review the 3060 is a perfect sweet spot and a big generational improvement. You can always upgrade in the future when the time and pricing are right. But yeah, the 11400 and 6 cores is more than adequate for gaming. Even i7-4790 or i7-6600 era CPUs from 5+ years ago (4th/6th gen, and we're now on 11th gen) will run decently with a modern graphics card and don't hold it back too much in most games. It also depends on your resolution, as higher res taxes the GPU more, while higher FPS at lower res may stress the CPU more
@@Michael-op1lj Thank you for the response. Looking at the new 12th gen Intel CPUs, I might just wait for the B660 motherboards to release and get the 12400F.
I struggle to find differences past high settings that don't require analyzing frames, especially at higher resolutions. I think a smoother experience is far superior to eye candy that tanks your fps
One could say "game distance units". It's a little arbitrary. If they accidentally said feet instead of meters, it doesn't matter. Unless the game is collaborating on engineering a Mars rover.
@@JB52520 Like Angolin said, it does matter. A meter does not equal a foot and a foot does not equal a meter. The measurement systems are very different, and thus it does matter no matter what the scenario is
@Angolin The blocks are 1m square, so Steve is actually around 5' 7", and trees 9ft. A meter is roughly 3ft. The scale makes sense realistically, but it doesn't really matter. The whole Minecraft world could be miniscule and it wouldn't change anything from the player's perspective as long as everything sticks to the same scale.
What I find very, very important to leave enabled is Anisotropic Filtering. It makes a huge difference to textures in the distance (even mid distance) in e.g. Dirt Rally 2.0
also things like crowd density, any settings that reduce pop-in, and shadows that look decent enough. There are usually some settings that only eat up VRAM and don't hurt performance at all; for example in OW2 anisotropic filtering can be maxed out for a couple hundred megabytes of VRAM at no performance impact
As someone who works with graphics, AO looks best and most realistic on mid settings or lower, and shouldn't be overdone. In my experience it's caused so many artifact glitches when making games. Best to keep it low.
I'm always licking the walls for cover in my games and you can definitely see the texture resolution difference there! (it honestly is a bit surprising this video glossed over how situational some high settings can be)
Yea, like on truck sim for instance, higher resolution means you don't have to use the interior zoom to read your dials. At 4K they are nice and clear, but at 1080p the bigger gauges are a little blurry and the smaller ones are impossible to read without the zoom. Not needing to zoom in dramatically improves quality of play.
@@toshiroyamada2443 pretty sure the main focus here was on AAA and big-name game titles, which truck sim isn't a part of. Also yeah, in most games you'd not notice the difference, as already mentioned; after all, the information here is supposed to be for the general audience and it's the viewer's job to differentiate the rest
Not to get too off topic, but you should try not to hug cover as it greatly reduces the angles you can see (and thus your reaction time to someone coming around a corner) while not protecting you much more than if you were a few feet away.
View distance can make a huge impact on the general feeling of the game. I remember editing config files in Gothic III, so i could see herds of animals running along the horizon. That was awesome, even justified the bad frames inside towns lol
That's a good point; when I played the 3DS port of Dragon Quest VIII, the biggest thing I didn't like compared to the PS2 release was the view distance. Watching trees and mountains pop in shot my immersion, and a lot of the appeal
@@notorioussteve5000 hmm… I played it a looong time ago and haven't finished. Maybe will try again soon, thanks. Back then it was a laggy mess, at least on my PC.
Completely grazed past the future-proofing point. When an engine is capable of cranking up the graphics to ridiculous settings, some day everyone will be able to run them and the game will age well into the future.
This is the only relevant answer to this. What you may not be able to run at all or properly now, you may be able to run just fine in the future because of how PCs are. If it weren't for consoles hamstringing PCs, we'd have more future-proofed games, as sometimes entire games were built with the future in mind(usually based around an engine a developer plans to use for future games or license out).
Exactly. It's a joy revisiting old games on max settings. It's actually clear many devs don't go crazy enough with settings, because to max them you often have to go into config files to tweak the settings. But it's worth it.
exactly. Bought a 780 Ti back when it was a year old and perfect for playing games at the time, but soon after the VRAM was just not enough on it, especially when going from 1080p to 1440p
@@ruekurei88 Consoles don't hamstring PC games, quite the opposite in fact: they provide the lion's share of game profits, allowing bigger budgets and more effort to be put into those high-end PC settings used to sell to console players with bullshot false advertising, and they set an ever-increasing baseline for minimum specs that goes up with every generation. If we didn't have consoles, all PC games would be made with half the budget and built with a baseline target of the integrated GPUs on the budget PCs and laptops that far more people own than high-end GPUs.
Only if you cap your fps. If you don't cap fps, it can actually make your room hotter because the CPU is working harder to queue more frames for the GPU. This is especially noticeable if you are using water cooling
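To illustrate why capping helps, here's a toy frame-limiter sketch; the target number and the "sleep away the leftover frame budget" idea are purely illustrative, not how any particular engine or driver implements its limiter.

```python
# Toy frame limiter: instead of rendering as fast as possible, the loop sleeps
# for whatever is left of the frame budget, so the CPU idles rather than
# queuing extra frames. Purely illustrative.
import time

TARGET_FPS = 120
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds per frame

def run_capped(render_frame, seconds: float = 5.0) -> None:
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()  # one frame's worth of work
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)  # idle instead of starting the next frame early

# run_capped(lambda: time.sleep(0.004))  # a fake 4 ms "frame", capped to 120 fps
```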
I feel like Ultra is for when we wanna max a game to see it in full glory. I will sometimes max out a game and have it run at 25 FPS temporarily so I can look at how the game CAN look, then I change it back to something that'll run at 90 fps or so and play the game that way.
Yeah, it's noticeable if you are trying to get screenshots, but even then just barely. Texture quality matters if you are trying to read small things on signs or something. The only time I noticed an actual difference in gameplay switching from ultra to high was in Shadow of the Tomb Raider. Funnily enough it was the shadow resolution; it made the leaves on the trees look weird.
I mean, in the end it's about what you're willing to sacrifice to game on your PC... Every PC has a limit after all... If one is fine with low settings... So... Depends on the person
Bless the devs at bungie for allowing me to run my laptop 1050 and still play the game at a little over 30 frames, they’re watching out for all of us 🙏
The higher you go, the lower the returns. It's always been like that in almost any aspect of life. Trying to reach the top five percent of what's achievable is usually reserved for those who go after a hobby hard.
Indeed. And in fact, it used to be same with overclocking. Modern platforms just don't have any of the headroom. But back with stuff like LGA775, LGA1366, AM3, it took some work to get your $150 CPU to be faster than the $600 CPU.
For WoW that's not true, because you want at least medium detail on spells so you can clearly see them, and even projected textures on objects and surfaces so you know whether to step out of them or onto them.
This really, really makes me think of Freespace 2. Not that the game was any showstopper by anyone's imagination, but the extra detail and graphics they put in really shined (relatively for the era) in later years when cheaper hardware could use those settings. I feel like this video completely overlooks the majority of us who can't afford high end hardware and who also don't buy a game the instant it comes out...
I get where you're coming from. I too played the Freespace games only years after their prime when they were sold in the cheap jewel cases on the budget shelf. That said, I do think we're in an era of diminishing returns regarding graphics. The technology can DO lots of amazing things, but mostly it's rehashing the same core gameplay loops over and over again with ever more levels of prettiness. A revolution in gameplay is, IMO, far more important right now than a revolution in graphics.
That's literally what I do. If I can't tell the difference visually, I'm keeping it off. I don't need to run my rig at full load for no visual benefit. Also, the controls get heavy the higher you set the graphics, and the game becomes less smooth.
@@eurosonly Yea. And sometimes I can see a difference, but it just looks "different", not necessarily better, at which point maybe I go with the version I prefer, or just save the frames.
@@senor135 Yeah, it reminds me of sound mixing. Some adjustments just "feel" slightly different and you can't place why, but you might prefer one over the other. I notice with shadows and ambient occlusion that even if the effect is hard to "see", it still adds to the visual polish and therefore immersion (assuming it's not tanking your FPS)
Let's be real, the only ones who care about those ultra graphics shenanigans are the ones shit-talking you for buying a "cheap" GPU, because apparently if u can't play triple-A games at ultra settings while opening 69 Chrome tabs, your GPU sucks big time
Even a test that satisfies you will prove that while you are engrossed in a game you won't notice. As he said in the video there are diminishing returns the higher you go. Oh also your monitor has to match the res you want to use or you are not getting the true res. For example I have an old monitor that produces 1280x1024 at max. You can set a HD youtube video at 1080p and it will stream it just fine but you won't get true 1920x1080. So if you want to watch a video at high settings say 4K your monitor must be 4K or you will not see it at 4K.
It is funny though that they decided to pick Shadow of the Tomb Raider, which is probably one of the easiest games out there to spot the differences in (Rise, the previous one, also). If anyone in here says they can spot the difference between high and very high in the video, it's a lie; even at 4K the compression (and the fact that it's 1/4 of the screen) won't show the difference. Like Dan said, when you're engrossed in the game you don't notice small details, and coming back to why it's funny they chose Tomb Raider: it's because in this case you DO notice, even if you're engrossed. There's A LOT of downtime in Tomb Raider, and I remember fiddling with the settings to get the most out of my system; very high didn't run very smoothly, but high did, and I tweaked stuff up until it stayed okay. Because it took a few minutes of testing, I really saw the difference. Another series where you can spot differences across all settings is the newer Assassin's Creed games (Origins and up). In every game that has a bit of downtime you can spot the differences, especially shadows from a moving object and a moving light source, and hair. At 4K, even texture quality is noticeable. Not even going to mention ray tracing, because it's stark obvious when it's on/off, and so are its quality levels
I usually just play on the third or second highest preset for games. "Max" settings only exist to make it seem like your device isn't up to standard anymore. RDR2: Very High, 100+ FPS; Ultra, only 60-70 FPS. I cannot even tell a difference between the two settings.
Sometimes, it depends on the setting. Draw distance, for example, really makes a huge difference. It varies from game to game, but it's usually worth maxing out this setting. Sample games: Witcher 3, Rise of the Tomb Raider, Assassin's Creed: Valhalla, GTA 5, and Deus Ex: Mankind Divided.
tbh I do it similarly; my RX 570 8GB struggles a bit, so I play on medium or low settings with high textures, as the 8 gigs are enough... for now. Definitely seeking an upgrade though
You literally could replace this whole video by saying 24 GB of VRAM is how you max textures without issues lol. People would be surprised to find out the VRAM target they set in games means nothing. I can confirm I have played games on ultra that used over 19 GB of VRAM.
I recently hooked my PC up to an LG C1 OLED, and even though in some games I've had to drop graphics settings slightly going from 1440 to 4K, my god, the difference in clarity, sharpness, colour, contrast etc. is amazing. I couldn't care less about maybe playing games on High instead of Ultra, or maybe a mix of the two; I'll take that resolution jump and overall picture quality over a few settings being maxed any day.
If you understand how Integer Scaling works, you don't need High settings; you can keep everything on low. Integer Scaling only works at native resolution, and it can't work on its own. Say for example your game only runs at 720p 60fps, but you wanna play at a full 3840x2160. All you gotta do is change the Windows display to 3840x2160, then place your game in windowed mode. Then you need a separate program to turn that 720p windowed game screen into a 3840x2160 windowed version; Magic Bordless can do this. 3840x2160 windowed only is important. Then you can apply Integer Scaling, and that original 720p with everything on low will fill the full resolution with no blur. Or you could do it the simple method and use Lossless Scaling. I prefer doing it the hard way because the picture looks sharp and crisp. Here's the simple method using Nvidia Image Scaling with Lossless Scaling: th-cam.com/video/Cc9FzYSlsp0/w-d-xo.html To me Integer Scaling is better after I figured out how to use it. The way Lossless Scaling has it set up is an incorrect way of using it, so do it the way I taught you if you wanna use Integer Scaling.
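Setting aside the windowing gymnastics, the core of integer scaling itself is just pixel repetition at an exact whole-number factor, which is why 1280x720 maps onto 3840x2160 (a clean 3x) with no blur. A minimal sketch using NumPy, with a random frame standing in for the game:

```python
# Integer scaling as pixel repetition: every source pixel is repeated N times
# horizontally and vertically (nearest-neighbour at an exact integer factor).
import numpy as np

def integer_scale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Upscale an (H, W, C) image by an integer factor with pixel repetition."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

frame_720p = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)  # stand-in frame
frame_2160p = integer_scale(frame_720p, 3)
print(frame_2160p.shape)  # (2160, 3840, 3): exactly 4K UHD, every source pixel stays sharp
```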
Same. I was skeptical about the quality jump from 16:10 1050p up to 16:9 1440p, but I have to say, I felt it. The details that come with the higher number of pixels are not to be ignored. I've tried 4K and 8K as well and the difference is definitely there; much less of a thing when going from 4K to 8K tbh, at least for now. Maybe in the future, when we buy a GPU and add other things to it in order to make a gaming PC, aka when textures are like 64Kx64K, there might be a practical point to playing at 8K, but for now I'll say that 4K is an edge case when it comes to quality. When you walk slowly and take in the environment at 144 fps, sure, you will see the difference between 2K and 4K, but in any high-speed scenario it just disappears.
@100Bucks image resolution and graphics settings are two different things. Your resolution can be 8k if you like but if your game is on a Low settings preset, the fidelity of graphics will inherently be lower. On a low preset, you will see low poly models, very low texture resolutions and texture filtering, bad shadows, little to no anti-aliasing (which I admit high resolutions can somewhat alleviate) and little to no particles or vfx. So you'll see a very high resolution potato. Best graphics to performance result is a combination of resolution and the right graphical settings.
Tbh I don't care what the game looks like, but there's a lack of satisfaction when you spend 2000 dollars on a computer and you're not maxing everything out
Here I am on my Dell Precision 7760 (i9 11950H, 4k 120hz screen, RTX A4000, 64gb ram, and 1tb ssd) that just arrived at my house a few hours ago... I just want to slide everything to the right and see some nice fps :) But I obviously didn't get it for games; I can't wait to finally be able to render Blender Cycles with good settings and have good viewport performance in cycles
God, I couldn't agree more. Spent roughly 2k on my PC only for Warzone to give me frame drop issues; ended up needing an SSD and some more RAM 🤦🤦 so dumb.
You are very right. Just recently I installed GTA IV and am enjoying the hell out of it on max graphics (apart from render distance scalers) on my 4k TV. It's such a blast, and it almost feels like playing a completely different game - aliasing is barely noticeable, everything looks crystal clear and runs at 60 fps, completely different than when I played it on 720p with medium graphics at probably 30 fps.
I genuinely wonder if The Stanley Parable Ultra Deluxe was poking fun at the idea of ultra settings making little to no noticeable difference in the way a game looked.
But what if the speedometer can't handle all of the speed? That's actually a clever analogy lol Speedometers give you a maximum but it's always over what the car can do. And whatever GPU you have may not be able to play certain games at certain fidelity levels with good or even playable FPS.
@@bradhaines3142 well sure in cars it will cost you a bit of the lifecycle but as long as your Pc hardware doesn't overheat you're good to go. I've never seen my RTX 3070 surpass 70°C after hours on full load and that isn't a worrying temperature for GPUs. If you stress your car to the max of course the wear from vibrations and so on will increase but most modern mid range cars are perfectly fine to drive at 180-220kph on the autobahn without issues and you still won't be the fastest one there (of course only drive as fast as traffic and weather safely allow).
30 fps was awesome back in the day, might have been playing at 480, but the textures blew anyways. Having played the sprite based shooters, even the 20fps range was quite playable.
Never had an RTX-enabled card, but any time I'd mess with game settings, the one that made one of the biggest differences and easily had the biggest FPS impact was anti-aliasing. Never played a game that didn't completely tank when you turned AA from whatever the base setting was to 4x AA or higher. Sure as hell was more pleasing on the eye though.
Like "Tech Deals" said already and some of "you" might know this already ... "High is for GAMING, ultra is for SCREENSHOT" ... i game on high since i had a ATI Radeon HD 4770, and for me there is no need to go higher since the textures looks the same, and mid game, u barelu have the time to spot or see "those changes" ... but, that is "subjective" ... good informative video though :)
Sorry dude but the last part is simply untrue. When I play RDR2 at 4K ultra, I definitely take my time to just roam the open world aimlessly and take in the nice graphics. If my GPU can put out acceptable framerates why shouldn’t I play at ultra
In PvP settings I always go with the lowest settings, with exceptions for model distance and actual distance, and if you can turn effects up or down I leave them high. For RPGs or single-player games, sometimes co-op, I opt for 80 to 120 fps, so whatever mix of settings I need to get there. For certain games I opt for 60 fps and try to get it as stable as I can.
I'm playing Portal 2 on my GTX 970 at maxed-out settings, and I really appreciate that those settings exist, because I can get higher quality. So that's also an argument: playing games once newer hardware is released, people can get better quality.
For awhile I was upgrading every generation, almost yearly. That actually led me to becoming pretty cynical about it. For awhile my rule of thumb was buy the $500ish card, every other generation. But that stopped being applicable with the 20-series tiered price increases. And now AIB prices are just dumb.
Reminds me of people who spend thousands for hi-fi audio components for such a barely noticeable incremental improvement over their previous high end system. Some folks seem to enjoy buying and showing off gear more than actually enjoying content.
I play almost every game at minimum settings and I can’t tell the difference because my resolution is also so low I can’t make out the lack of detail. What I can make out, however, is how many frames I’m getting, by the oversized frame counter sitting in the top right that I generally regard as my current score.
At what resolution do you play? Even at 1280x720p you can surely see differences between Low or "Ultra" settings, just thinking about lighting. And at such low resolutions you should also have enough performance to go for Medium settings at least, even integrated graphics chips should be able to do 720p60 most of the time.
@@dennisjungbauer4467 My GT 740M begs to differ. Can't get 30 fps in Wildlands; 23-28 is the best it will do, and it's still playable in single player IMO. Guess I may need to tweak some more settings.
I upgrade every 4 years to the second-best level of hardware of that generation; it's always been half the price of the uppermost top tier and I've always managed to keep everything maxed out. I like having a cinema quality to my games. 11700K, 3080, 1440p
"cinema quality"? So, you play at 24-30 fps, lol? Just kidding. Actually that was exactly what I did back in the days when I could only afford entry level Hardware. But once I got a mid-range GPU and hit 60-ish in new games, I could never go back! XD
It depends heavily on the setting. Ray tracing is incredibly noticeable, and also heavily taxes a game. The most expensive hardware is for enthusiasts. Enthusiasts always spend hundreds and sometimes thousands of dollars to eke out a few percentage points of improvement.
probably the same reason as some crazy people: to keep the government from reading its mind. In a previous video Linus's teleprompter gained sentience, so the 3070 probably did too
I feel like a fair amount of the time, removing some advanced features makes the game look a lot better. Like, I always remove motion blur, since it makes a 144Hz monitor feel redundant when I can't enjoy the smoothness.
If it's any consolation, motion blur is not a realistic thing anyway. Motion blur was a problem early cameras had; we just got used to it from movies, to the point that when better cameras came out that didn't have that problem anymore, people felt like the movies recorded with them were weird / less professional. Video games just followed suit and added it because, well, it's what we're used to seeing when looking at a screen.
@@gozutheDJ The movies thing he talked about is accurate. That's why movies are in 24fps and not 60fps even though the cameras are advanced enough to record 60fps video. We like the choppiness of 24fps better. Makes the action scenes more... badass.
I've been playing video games for almost 26 years now. I've never in my life played a game on ultra when the game came out. And 99% of my games are played in the medium range or on high, but always looking for the best performance. So I don't get why gamers nowadays have that crazy obsession with high definition.
You grew up with Mario and Tetris. We grew up with CoD, Battlefield, Crysis and other good-looking games. It's not some crazy obsession lmao. This is what we expect
Hey LTT, this is one of the BEST VIDEOS you've uploaded IN MONTHS! Please give Riley a raise... The truth is finally out there and many of us poor folks can finally feel like normal people. (I bet this was Riley's idea)
I paid for the whole GPU, I'm gonna use the whole GPU.
Yeah!
You have to utilise 24 gigs of vram
Your GPU is being fully utilized anyway to generate higher FPS (Given that the game is optimized and you didn't set a limit)
You also paid for the whole monitor
@@windowsxpprofessional GOOD! Another reason for upgrade
@@windowsxpprofessional Thats not true whatsoever
I prefer to game at 480p Ultra settings. That way I get to satisfy my primal need to slide things to the right, but still get 144 fps.
I think using 1080X760 is best for a normal guy
that's the true PRO GAMER MOVE!
@UCt_z7JPmzid3Na0Ss3R3V2g wait what
but 4K 24 fps is the way the producer intended it #cinematic
I could really care less about how it looks or plays. The main appeal of playing at max settings is keeping your room nice and toasty in the winter.
How much less could you care?
Yea how much less could you care bro?
Lmao
my pc blows out cold air
Katsup been real silent about how much less he could care.
Future proofing. Highest settings always cause issues with "current, average" hardware, but playing a decade old game with highest settings on your new rig can be pretty sweet!
I mean to be fair you could do it with a 1051ti lol
@@anjanrajamani8821 Lol good luck running Red Dead Redemption 2 on anything but low on 1080p resolution
exactly lol. played dishonoured 1 on highest setting on my gtx 960 when it first came out and god damn!
@@anjanrajamani8821 what no, it released close to 2019!
Yes exactly! It's not the biggest difference between High and Extreme, but it might look like a way bigger difference on whatever monitor I have in a decade or two.
Back in 2004, I remember watching my friend play Far Cry at a subpar frame rate (he had it all maxed out). He took a bathroom break, I went into the settings and turned the graphics down from Ultra to High or whatever the presets were called, he returned, and didn't notice anything different until much later, when he wondered why the game felt smoother. That taught me that people (even graphics nerds) are way more likely to notice the difference between 30fps and 40fps than between Ultra and High settings.
I repeated this multiple times with multiple games over the next decade - same outcome each time. People don't notice any quality improvements but suffer low frame rates for no obvious reason.
😂😂haha
Smooth performance is always better than high graphics.
I really like 4K resolution, and anti-aliasing is not strictly necessary at that resolution; combined with lower settings it can yield good performance on "1080p" GPUs. Therefore I think reviewers should benchmark lower graphics presets.
2004? i think you mean 2014
@@eliasahawneh6199 The original Far Cry in 2004, one by CryTek. It was a great year for PC games - Doom 3, Half-Life 2, Far Cry, Halo 2, The Sims 2, Rome Total War, NFS Underground 2...so many great titles.
I figure that Ultra settings are mostly for aging games. A year or two down the line, when a mid-level machine can run the game, you can max it out and it'll still look good.
Something I can agree with, especially as resolutions get higher
I'd agree with that sentiment if Ultra actually did look significantly better, but that isn't really the case these days. Games like Crysis or The Witcher 2 came with Ultra settings that actually were a noticeable visual improvement. Those settings were straight up impossible to run at an acceptable framerate back when the game released, but they were a good example of settings built for future hardware.
I thought the whole point of the video was that the difference was negligible/not perceivable. I would assume this will be true regardless of how many years you wait.
Hell yeah, Far Cry 4 looks great on ultra settings
Right. And I don't think there's anything at all wrong with devs "future proofing" their games by unlocking graphical options intended for later generations of cards. Like, if your engine can handle 20 gazillion particles at 8K, but current hardware can only render 5 gazillion at 4K with a decent framerate, why handicap the game? Leave the options in, so it'll look nicer in the future.
I especially liked how GTA V on PC handled this, with a separate graphical options menu specifically for those "extra" settings which weren't realistically playable on consumer hardware in 2015, but are totally do-able a few years later. I think that helped prevent slider envy among players.
Riley: can u spot the difference? Low to medium or medium to high, sure. YouTube compression: no
Came here to say this too.
@@CattyRayheart I watched it on my cracked phone screen lol
Would have been a perfect opportunity to advertise floatplane :D
Too bad they didn't post to Vimeo
lol yeah, really makes any quality comparison video kinda flawed
I've found out through experience that setting everything to high instead of ultra causes an average fps increase of 30% - 40% while maintaining the appearance of the graphics practically unchanged
It actually depends on your screen resolution. For modern games on a 2K monitor, you should use at least high settings. On a 4K monitor, you need ultra settings for the best visuals.
Is high = ultra? On less than a 1080p screen I can tell you it isn't. In Forza Horizon 5, with car details set to high everything is way more blurry: the car badge is blurry, and even the contours and any exposed parts are more blurry than normal, so yes, ultra is needed. 1080p on a less-than-1080p (900p) screen looks bad; it's good looking but not as immersive as 1440p or 4K.
Well imma try this in COD then. Currently playing at 2K and ultra settings; I usually get a capped 60 FPS (smoothest experience imo), but sometimes the fps drops very badly in specific graphically intense biomes. I'll try and see how changing this one setting improves my framerate
@@MaxwelI60 fps in CoD is less than desirable for competitive players
I love going back to play all the older games my old systems could never play on max graphics whenever I upgrade, always the best feeling
What games are on your list?
Exactly. Ultra settings have exactly two purposes: to make the screenshots for the box / any other promo material, and to let the game "grow" with future PCs. A game is meant to look _good_ on normal/medium settings, _great_ on high settings, and even a bit better than that on max/ultra settings once the hardware needed becomes affordable. Low settings are meant for systems which barely satisfy the requirements and should look OK, but the main focus is on keeping a game playable.
Playing the old games on the new PC, with higher settings, is one of the best feelings. Too bad that today's industry loves to produce incompatible OSes.
The "Win10 touchscreen patch" from 2 years back, anyone? It forced just about everybody to rewrite their software, just to remain usable after the patch. Without a patch, you can't scroll the map around; it just jitters back and forth uselessly. Yes, MS failed at something as simple as processing mouse input properly.
@@uujgndn Elder Scrolls was one for me. I started out on console with Oblivion and later got a cheap laptop that only just managed to run it. It wasn't until I got my first real PC in 2015 that I was able to play them on higher settings. Even now I still revisit Oblivion and Skyrim from time to time and play through them. Big nostalgia hits for me, particularly Oblivion, the game that sucked me into the franchise. Damn that horse armor DLC, I still feel ripped off.
It is very satisfying to see 400+ frames on a game I used to run at 20 lol
I love playing Half Life now with less than one second loading screens. I remember my first PC taking forever. It sucked when I had to go back through one just to grab ammo or something.
The RE engine's graphic options are so good for visualizing not only the graphical outcome, but the load on the PC. I wish more games did this
Oh yeah definitely .. RE ENGINE AND it's optimization is world class imo 🔥🔥
@@DarkGamerA tell that to re8
@@Gazzoosethe1 huh ?
@@DarkGamerA runs like shit
@@DarkGamerA when you can control the scenes its much easier to optimise - rather than open world type games.
You've turned my brain into the world's most advanced sponsorship segment prediction system
One day our offspring will be born with this skill... evolution
fr fr xD
use sponsor Block extention
FACTS
Big faccs
See the thing is, once upon a time, ultra graphics settings were a way of future proofing a game. Back when improvements in processing power in CPUs and GPUs were resulting in SUBSTANTIALLY different graphical results in games. I'm talking about differences like going from 640x480 resolution to 1920x1080 resolution, which is going from 'Yes I can tell that's a cube, those collection of pixels over there are definitely in the shape of one' to 'Whoa I can see the individual blades of grass...'. Back when GPUs were only just starting to achieve things like per pixel lighting effects, and every bit of processing power improvement in per pixel processing was allowing for doing way more interesting stuff in the lighting equations. Back when which GPU you had determined if your PC could do AA or SSAO or normal mapping.. like.. AT ALL... or if it could do it at 60fps.
Nowadays, buying a 3090 Ti and running a game at ultra settings grants you upgrades from 'pixels so small you almost can't see them' to 'pixels so small you definitely can't see them' or 'reflections that look almost accurate' to 'reflections that raytraced to be absolutely accurate'. And for these modest improvements you need an insane increase in processing power.
The fact is, and this is a horror story for AAA game developers: GRAPHICS HAS PEAKED.
We're at the point where something like Cyberpunk 2077 can run at 720p on a handheld gaming PC for 3 hours with decent quality settings, and it looks amazing. A beautiful stunning city that feels like something out of a movie, running on a handheld device in just 15W.
We're at the point where the biggest bottleneck in game development is no longer questions like 'How many polygons can the GPU handle?' or 'How many enemies can we get on screen?' but questions like 'How many more A-list Hollywood actors can we sign on to do voice over work?' and 'How few hours of sleep can our developers survive on before we might get sued?'.
We're at the point where games from 10 years ago still look absolutely gorgeous. That wasn't the case 20 years ago. 20 years ago, if you played a game from 1992 in 2002, it looked like absolute dog****. Now, a 2012 game in 2022? Still looks fantastic.
Graphics has peaked! We don't need MOAR TEXTURES MOAR SHADERS MORE POLYGONS etc etc. We're past that point now. Now it's time for game developers to focus on the more critical aspects of their games. Gameplay, story, art, sound, music, improving the customer experience through smaller download sizes, fairer prices, lowering power demands for low end devices, improving accessibility options, etc.
Which is why this is a nightmare horror story for AAA game developers. Now they actually have to focus on the quality of their games to stand out. We're in the dying breaths of the era of gaming where making something look 'higher resolution / more detailed' was enough to make it sell.
Fr
Very well said. I agree with the sentiment, except for the characterization as a "Nightmare Horror Story" for AAA game developers. It may be for execs who can't throw money at a game to make it sell better. But to be honest, I don't think that was ever really the case.
Limitation inspires creativity, and a lot of successful games in the past dealt with hardware limitations in unique ways that boosted the experience of the audience beyond what was expected. The era where hardware limitations affect game design to a point where developers have to make design decisions with those tradeoffs at the forefront is mostly over. But this means that developers that really would rather focus on gameplay, story, and art design, etc. have so much more freedom to explore. It also means that as experience in the industry collects, smaller game studios are more capable of providing high fidelity experiences to an audience.
I'm not trying to say the industry is perfect, or excuse crunch culture called out above, but looking around at all the small studios popping up, led by experienced developers that left AAA studios like Blizzard and Bethesda, gives me hope for the future of AAA Quality game development. I do not see anything resembling a nightmare here
Facts
Never have truer words been spoken. This is why a lot of Nintendo games are so high in quality: they don't waste their time and resources on pointless graphical fidelity. Instead they shift their focus to the real experience that players actually care about. Two of the best games this decade came from Nintendo (BotW and Mario Odyssey) and literally run on a shitty tablet whose hardware is extremely obsolete by modern standards, yet they obliterate most games with higher graphical bs that are probably 15 times more demanding, in terms of sheer scale, scope and gameplay fluidity. Also, even with all the crazy graphical advancement today, some devs still manage to make their games look like absolute garbage. WD Legion is the worst example of this. Even with all the ridiculously demanding ray tracing the game still looks bland, boring and just plain ugly horseshit.
Disagree. Path tracing proves graphics have a long way to go
5:09 will haunt my nightmares for years to come
Mr Beast comme nted on my latest vi d in my playlist!! omg I’m so shocked I’m crying right now 😭 😭
Same
obunga
@@Anonymous-dz6ep mr beast sent me 1000 bitcoin all i had to do was send him 1 bitcoin to show I was trustworthy
"Ooh ooh, ah ah, me like sliding."
A+ to the person that rendered that flying hotdog, your work didn’t go unnoticed
Thank you
Time stamp plss
@@illuminatigaming177 2:40
"You're going to enjoy a $70 steak more than a $1.50 hotdog."
You don't know me.
Would you rather have a $70 steak or 46.6666666666666 hot dogs🤣
Here's a fun fact: Contrary to popular belief, the meat found in hotdogs IS NOT the disgusting remains of the animal that was killed, but rather, the parts of the animal that didn't quite make the cut (no pun intended) during the process of making steaks (which, in case you were unaware, are glued together). In other words, if a steak is "Grade A+" meat, then hotdog meat would be "Grade A" meat.
@@DarkArachnid666 Sure, but a 1.5 dollar hotdog probably has shit soggy or dry bread, old ass hot dog, bad quality condiments etc. (At least from the cheap hotdogs I've had in several cities)
You get most of the value of expensive cuts by cooking them properly, and you can definitely get some insane hotdogs by getting good quality ingredients across the board.
@@DarkArachnid666 not quite, grade A meat can still be sold at a higher margin than hotdogs
The only reason I go out drinking these days is for the hotdog / kebab / burger (delete as appropriate) at the end of the night...
For me, it’s about fidelity and longevity. Like some other commenters here, I too build for a 5-6 year machine, or more as I usually am able to use them for 7-9.
Like a piano, 80% of what you play only utilizes 50-60% of the keys at best, but when you come across those pieces that require the full range you’re glad you have them.
On the other hand, your hardware will last longer if you operate it within its limits as opposed to at or slightly beyond.
And COVID-19 pricing aside, if you get a barely-fast-enough GPU today, you can get another "barely fast enough" GPU 4 years into the future which is faster for the games of that time, and the total cost of the two GPUs is less than the high end GPU today.
If you do not need 100% of the GPU today, do not get the expensive GPU.
A GPU doesn't die slower just because you reduced the settings. Your framerate went up, and your GPU is still working hard.
@@steinkoloss7320 The longevity is not about heat death but about the ability to play future games without upgrading hardware. My point was that buying the hardware you'll need in the future is pretty pointless because you can get better hardware for the same money in the future. Get the hardware for the already existing software/games.
@@MikkoRantalainen I meant the top comment xD
@@steinkoloss7320 absolutely right. It’s not about settings vs frame rate, it’s about overall utilization. I try to balance my GPU to about 90% max, and capping frame rates helps with that.
I will say this: draw distance significantly impacts immersion.
i say poor shadows and ambient lighting impacts immersion
I think the Half-Life series has certain things in it that make it better than modern games. When was the last time you picked up a wrench in a video game? What about a random piece of wood after smashing a box? Some of these things go a long way to make that game feel better than more modern games.
@@bland9876 You should check out Boneworks bud. That very same effect of half life but in a new age vr game.
@@bland9876 Yup, I don't really understand the need to play new triple A titles. There's very little innovation in terms of gameplay with these. I tend to either play older games, or go indie, and in both cases a 3070 is usually enough to play at max settings 4K 90-120FPS.
@@seeibe i have had no issues with an R9 390 running games at 1080p max settings.
Like doom 2016, Wolfenstein new order, and Fortnite chapter 2.
Riley: "Is it really worth buying a Buggatti to show off on your 5 minute commute?"
Me: "Isn't that the point?"
Bugatti*, and no. The point is to stroll through the riviera going 2 mph and flex on them poors. If others don't see you, then there's no point OR you actually bought it to drive on mountain passes or something
@@Anankin12 If I had that kind of money then yea I probably would buy some stupidly expensive flashy car and then pay someone to drive me around in it as I don't have a licence.
It's a bad analogy, since no one buys GPUs for flexing. Lots of people buy cars for flexing...
or you could buy an Aventador or F430 for a fraction of the price and the majority of non-car people won't notice the difference. in fact i think most girls who don't know about cars will say the Ferrari is more expensive than the Bugatti
@@carlosmspk Not quite the same, I agree - but people will sometimes buy a GPU that they don't need (e.g. an rtx 3090 for minecraft) which is, in a way, flexing money. All I'm saying is, people do (sort of) buy GPUs for flexing. Any expensive, popular object is bound to be used to show off.
Over the last couple of years my in-game recommended settings went from Medium to Low to Text-Based.
Mr Beast comme nted on my latest vi d in my playlist!! omg I’m so shocked I’m crying right now 😭 😭
@@Anonymous-dz6ep sorry, was your comment so bad that he scolded you? 😶
Damn yo, time to break out Nethack. :-)
Colored text, or b&w? :)
I have these concepts that go further down than that (I kid you not as a game developer), would you care to honor my future work in that direction with your opinion about the stuff I'll give you for free to have? One of those should be hardware as well :)
I find it so weird that you CONSTANTLY see Shadow of the Tomb Raider used for benchmarks, even to this day, but I don't really see many people play it or make other content about the game.
It's like it's just a benchmarking tool.
Well it's a really good game in its own right, but it's one of those games you really only play once through and that's it. There really isn't much more content to make after doing a playthrough, and the lore and story are very upfront so no videos explaining them are needed. The Survivor trilogy is amazing though, and one of the few times a studio nails a reboot series.
My video card is so underpowered that when I try to play a new game I have to turn off anti-aliasing, shadows, textures, and my computer.
AA is pointless, anisotropic stuff can bugger off, sharpness filters and stuff can bugger off and most importantly get rid of motion blur and depth of field
But depending on game and Vram you could just be running out of Vram so what GPU is it?
@@commanderoof4578 one thing not to get rid of is texture filtering, especially at higher resolutions where its easier to spot lack of detail but mostly yes, AA is hard to recommend unless you have a huge excess of performance
@@_._shinonome_._ depends on game if the game uses 1080p or lower res textures yes its gonna be bad to turn it off
But if a game uses 1440p, 4k or in some cases 4096x4096 then its a setting that is pointless
@@commanderoof4578 agreed
DLSS is here to save us. Though I do miss sparse grid super sampling antialiasing.
The differences are less prominent these days. Go back 10-20 years, there were usually night and day differences between High and Ultra settings.
And even Low/Medium quality have come a LONG way these days as well. Those back then looked especially awful.
That's because the main difference back in the day was that Ultra almost always meant uncompressed textures, which was a really big deal back in the era of much lower GPU RAM quantities.
@@superslash7254 it is still uncompressed, just not for consoles
1:27 is certainly worth a read if you didn't notice it the first time, gave me a good chuckle.
Wow good catch! 😆
Ray Charles Cores, 420GB G69X Memory.
Inspect element is strong with this one!
Low meme performance that ain't good at all I live on HQ memes
James charles cores
Glad someone else caught that. The RC cores caught my eye and I had to investigate.
I really like going to old games and being able to max them out. It’s great to say “ I wasn’t able to do this x years ago”
This is basically why I just upgrade my setup every 5-6 years and save money in-between. In that time the games have changed like heaven and earth, and the hardware has as well. If you upgrade every year the changes will be there but definitely less noticeable. I prefer the thrill and excitement of a larger difference to get me back into modern gaming than the less noticeable gradual one.
Same. I built my first PC at 15 with an AMD Bulldozer CPU and a GTX 950. I got a job and at 16 I completely redid it with a Ryzen 1600X and a GTX 1080. I sold that to my girlfriend's brother and he still uses it for heavy daily gaming; meanwhile at 21 I just got a steal on an NZXT pre-built with a Ryzen 5800X (liquid cooled) and an RTX 3080 Ti. The leaps in quality every time have been mindblowing, but the longevity of the hardware I'm upgrading from is also really admirable
Had console since PS1-PS4 and bought a zephyrus G15 just got done updating everything and I’m scared to buy a game😂 I know there’s no returning back to console after this
@@gullyboydreamz50 can't go wrong with the Zephyrus G15 my guy, that's what I use as a gaming laptop when I'm away from my tower
Well damn same here. I'll build a new PC every 6-8 years.
@@cheedam8738 Nice to see so many people do the same as me
“Often you get suckered into preordering…”
“Our sponsor is drop”
Jokes aside, the dumb thing about preordering is paying for something that couldn't be independently reviewed first. That is not always the case for products on drop, only when they offer a product exclusively for the first time ever
@@TheSchwarztrinker dont get into mechanical keyboards xD
@@onepoh4680 Yeah. the mechanical keyboard scene is a scam. They only care about fidelity, and sound. Performance is their last priority.
One of my best friends made me realize this when it comes to games. "You don't always need the highest settings. Having a smoother experience is much more worth it in most cases", and I finally understand what he meant. Plus if your hardware isn't in the highest end to begin with, you have to set realistic expectations for yourself. Once you accept that fact it all becomes so much easier
Yes, playing GTA 5 at 13 fps on all the lowest settings, lowest resolution and shit like that is much better when you know that's the best you can get from a game you spent a whole day downloading
@@jovan.v7690 by that point get a new gaming machine cuz its probably atleast 15 years old
@@freshmarcent2741 it's 11 years old and i am getting a new pc, i am just waiting for it to be delivered
@@jovan.v7690 that is good to hear hopefully your gaming experience gets better
@@freshmarcent2741 thanks
I think better, more realistic graphics have their days numbered as a marketing tool. I think we're reaching a point where improving graphics makes less and less difference, and the better looking games are those with good artstyles which maybe aren't as realistic but still look good. That doesn't mean the run for better processors or GPUs is going to end, because in those games what can get a lot better is the number of things happening on the screen at the same time, and we're still a long way from a realistically crowded city, for example.
Mom: "Don't worry Monke Riley doesn't exist he can't hurt you"
Monke Riley: *5:08*
Underrated comment💯😂
🤣
This should be pinned 😂😂
@@DjGoHamGaming I love your videos bro
Monke Riley is god
"Oo oo LALA ...Me like sliding"
:-Riley Murdock
This line is gold
Mr Beast comme nted on my latest vi d in my playlist!! omg I’m so shocked I’m crying right now 😭 😭
@@Anonymous-dz6ep Thank you for lying, no one cares
@@Anonymous-dz6ep this is why u have 0 subs.
@@minelelol13244 he really did!!!!! Check it!! 😭 😭
Every time i get a new gpu: "DAYYYUM THIS GAME LOOKS NICE ON ULTRA ... Now back to low, i want those frames back."
Good thing you can do that because I have a 1050ti
My gpu: "u want frames?"
My CPU: "No"
Me: " :( "
@@steezy_dreezy7389 you have a pc...
huh... I have a 8 year old laptop
@@aidangambura7067 what is your cpu?
boi if this ain't the truest thing 😂😂😂😂😂
I've found that if you play some older games at max settings it adds a lot of value but yeah when it comes to new games it only really goes so far and the extra upgrade isn't super noticable.
Can you spot the difference ?
Me on my 1080p phone screen: "you're absolutely right, all game settings are a scam"
Most people play in 1080p tho.
@@Pieteek 1080p 300Hz masterrace
@@mr9293 Just ordered the x17 in 360hz 1080. 3070, 32gb ram, 2tb SSD. Gonna be sick.
I don't know what kind of phone you have but my phone has a higher resolution than 1080p. I know this because I mirrored my phone onto my computer and I set it to go pixel to pixel and I couldn't see the entire phone screen even with my monitor rotated sideways. S7 edge
I watched on my 4K monitor and couldn't tell the difference in that scene.
my favorite day ever is getting a new pc and going through my list of steam games that used to play like shit to blast full graphics on
you have defined me a week ago
I know right? It's like getting revenge. So satisfying.
Agreed
That'll be me when i get to finally play final fantasy 15 at a playable framerate without it looking like trash.. Okay maybe not trash but high/ultra is really impressive looking to the point where I think it's how the game should look.
@@vinicius401 i need to make a list in steam called "ill see you in 3 years bitch" for all the games i cant play now lol
Low settings can give a competitive advantage
*laughs in war thunder where it gives you the ability to literally see through almost everything*
Would please elaborate on that, in regards to war Thunder.
@@Aghar74 Phly made a video about it some years ago: th-cam.com/video/JhZZf_knsAk/w-d-xo.html
@@Aghar74 basically on medium settings onwards in War Thunder there would be plenty of foliage, trees, bushes, terrain deformation, etc that make it harder to spot enemies in battle, but on this setting called Ultra Low Quality (ULQ for short), most of those simply do not exist, which makes it easier to spot enemies overall.
For example someone could be hiding behind some thick bushes and tall grass, but still get spotted anyway by someone a whole kilometer away because the spotter uses ULQ setting.
@@casualguy634 Sometimes on fps games too
@@casualguy634
Just imagine : Last man in squad, Hiding in a Bush, Trying to Clutch.
Enemy at ULQ : TF is this guy doing crouching in middle of the map,
Haa Amature 😂
("Shoots")
The main difference between Very High and Ultra or Extreme is actually the difference between 4K textures and HD ones, and so on. There are web pages that show the differences and NVIDIA's is one of them!
"oohh oohh ah ah, me like sliding"
-Riley Murdock, 2021
That edit scared the fuck out of me.
Nightmare fuel.
Are there random tech tips compilations?
The "ultra high" settings, especially for textures and such, aren't really "added". In many cases they're just the base truth from which the lower quality options are generated. For instance "ultra" textures may mean "use the textures as shipped with the game", while "medium" textures are just lower resolution versions that are generated by down-scaling them.
Having an "ultra" option basically costs no development time at all. What does cost a lot of development time is creating optimizations and tricks that make running the game on lower end machines possible - and still look good. Obviously there's exceptions where higher quality just means bigger numbers/more particles/higher range. But then having that option just is a question of "you might as well".
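To illustrate the point above, here is a minimal sketch (in Python, using Pillow) of how lower texture presets can be derived from the authored asset simply by downscaling. The file path and tier names are hypothetical, and real engines typically bake compressed mip chains offline rather than resizing at load time:

```python
from PIL import Image

def make_texture_tiers(path):
    """Derive lower presets from the authored 'ultra' asset by downscaling;
    the lower tiers are generated, not separately authored (illustrative only)."""
    ultra = Image.open(path)  # e.g. the 4096x4096 source texture shipped with the game
    high = ultra.resize((ultra.width // 2, ultra.height // 2), Image.LANCZOS)
    medium = ultra.resize((ultra.width // 4, ultra.height // 4), Image.LANCZOS)
    return {"ultra": ultra, "high": high, "medium": medium}
```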
Exactly. The ultra setting is what the game's graphical baseline is. If your hardware is current then you can run games on ultra fine.
@@daemonthorn5888 I don’t think so dude medium is console quality
@@daemonthorn5888 i dunno mate. I can't think of a company who'll make ultra their baseline and have their games run terribly on consoles
@@meghanachauhan9380 "baseline" is (supposed to be) meant in terms of development; of course it still gets called an ultra-high setting (not too sure if the user you replied to exactly understood any of that either, though)
The point of the video is, why create the texture at 4K at all? It didn't just come into existence in 4K. Why not leave everything at 1K and then only include up to 1K in the game? You can't say the game just was 4K to start with. That is a choice made by someone, one that this video is arguing didn't need to be made
High/medium is usually where I aim with settings, maybe ultra if it's an old game that's not hard to run.
How long do you think a 3060 ti with 11400f could do 1440p 60 fps for at high to medium settings?
@@kashif4463 you'll be good for 4-5 years.
@@ytbeasty6770 even with the 11400f?
@@kashif4463 The CPU is unlikely to be the bottleneck; in most modern games the GPU is usually the limiting factor for FPS until you get above 150-300 FPS, when the CPU is required to feed a lot of data, but this usually only happens in easy-to-run e.g. e-sports titles (like CS:GO above 300 fps). But obviously, see how it goes; that's a fairly well-balanced setup and from Linus's review the 3060 is a perfect sweet spot and a big generational improvement. You can always upgrade in the future when the time and pricing are right. But yeah, the 11400 and 6 cores is more than adequate for gaming. Even i7-4790 or i7-6600 era CPUs from 5+ years ago (4th/6th gen, and we're now on 11th gen) will run decent with a modern graphics card and don't hold it back too much in most games. It also depends on your resolution, as higher res taxes the GPU more, while higher FPS at lower res may stress the CPU more
@@Michael-op1lj Thank you for the response. Looking at the new 12th gen Intel CPUs, I might just wait for the B660 motherboards to release and the 12400F.
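For what it's worth, the rule of thumb in the reply above can be written down as a rough heuristic. This is only a sketch of the idea; the 95% thresholds are an assumption, not a hard rule:

```python
def likely_bottleneck(gpu_util_pct, busiest_cpu_core_pct):
    """Rough rule of thumb: a saturated GPU sets the frame rate; a pinned single
    CPU core (while the GPU has headroom) means lowering graphics won't help much."""
    if gpu_util_pct >= 95:
        return "GPU-bound: lower settings or resolution to raise FPS"
    if busiest_cpu_core_pct >= 95:
        return "CPU-bound: graphics settings won't change much"
    return "Neither saturated: check frame caps, V-Sync, or engine limits"

print(likely_bottleneck(98, 60))  # -> GPU-bound
```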
I struggle to find differences that don't require analyzing frames past high settings, especially at higher resolutions. I think a smoother experience is far superior to eye candy that tanks your fps
Cranking up that Minecraft render distance makes it feel more immersive and make the world feel much larger
Eh, Max distance with dlss and raytracing enabled makes the world feel more alive lol
Me laughs in 8chunks
Places that felt huge now feel tiny at 40+ chunks of render distance...
@@LapLandSystem if only Minecraft could make use of all of my cpu cores :(
@Shrek Minecraft Bedrock edition will run amazingly on an Epyc but Minecraft Java will run on a single core hogging power from the boost clock
They really included feet as a reference length in Minecraft
I know lmaooooo 😂
One could say "game distance units". It's a little arbitrary. If they accidentally said feet instead of meters, it doesn't matter. Unless the game is collaborating on engineering a Mars rover.
@@JB52520 Then Steve is 1.76 feet tall? Most trees are only about 6 feet tall? Think, if you even can.
@@JB52520 Like Angolin said, it does matter. A meter does not equal a foot and a foot does not equal a meter. The measurement systems are very different and thus it does matter no matter what the scenario is
@Angolin
The blocks are 1m square, so Steve is actually around 5' 7", and trees 9ft. A meter is roughly 3ft.
The scale makes sense realistically, but it doesn't really matter. The whole Minecraft world could be miniscule and it wouldn't change anything from the player's perspective as long as everything sticks to the same scale.
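Since the thread above is converting between blocks, metres, and feet, here is a tiny worked example tying it to render distance as well. It assumes the usual 1 m blocks and 16-block chunks:

```python
CHUNK_BLOCKS = 16          # a Minecraft chunk is 16 blocks on a side
FEET_PER_METRE = 3.28084   # each block is roughly a 1 m cube

def render_distance_feet(chunks):
    """Convert a render distance in chunks to an approximate distance in feet."""
    return chunks * CHUNK_BLOCKS * FEET_PER_METRE

print(round(render_distance_feet(40)))  # 40 chunks -> 640 m -> about 2100 ft
```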
I really love Riley's writing style and the way he hosts. One of the best not only at LTT, but in the whole YouTube tech sphere
Riley is my favorite, my wife only watches TechQuickie's with me if he's hosting the episode. Hmmmm
This video was written by Plouffe
Mr Beast comme nted on my latest vi d in my playlist!! omg I’m so shocked I’m crying right now 😭 😭
@@Anonymous-dz6ep stop crying
His hair is glorious too.
What I find very very important to be left enabled is Anisotropic Filtering. Makes a huge difference to textures on the distance (even mid distance) in e.g. Dirt Rally 2.0
also things like crowd density, any settings that reduce pop-in, shadows that look decent enough. There's usually some settings that only eat up VRAM and don't hurt performance at all; for example in OW2 anisotropic filtering can be maxed out for a couple hundred megabytes of VRAM at no performance impact
The first 7 minutes of this video:
"Settings have diminishing returns because of diminishing returns"
Gotta say, Riley absolutely crushed this one. Fantastic hosting!
Yeah, he is the best for the job. Linus should let Riley host more on this channel, a lot more
A wise man once said "Ultra is for screenshots, very high is for Playing"
An addendum to that "High is for Enjoying" - Sun Tzu
And he continued "Medium is for competitive"
Tech Deals often says it
@@LuisFelipe-pq9lr *low
@@stickboslightning *potato
I always max out the ambient occlusion. It has the greatest visual impact next to texture resolution.
As someone who works with graphics, AO is best and more realistic on mid settings or lower and not be overdone. In my experience it's caused so many artifact glitches when making games. Best to keep it low.
I'm always licking the walls for cover in my games and you can definitely see the texture resolution difference there! (it honestly is a bit surprising this video glossed over how situational some high settings can be)
Yea, like on truck sim for instance, higher resolution means you don't have to use the interior zoom to read your dials. At 4K they are nice and clear, but at 1080p they are a little blurry on the bigger gauges and impossible to read on the smaller ones without the zoom. Not needing to zoom in dramatically improves quality of play.
@@toshiroyamada2443 pretty sure here the main focus was on AAA and big name game titles which truck sim isn't a part of. Also yeah in most games you'd not notice the difference as already mentioned, after all the information here is supposed to be for the general audience and it's the viewers job to differentiate the rest
Not to get too off topic, but you should try not to hug cover as it greatly reduces the angles you can see (and thus your reaction time to someone coming around a corner) while not protecting you much more than if you were a few feet away.
View distance can make a huge impact on the general feeling of the game. I remember editing config files in Gothic III, so i could see herds of animals running along the horizon. That was awesome, even justified the bad frames inside towns lol
That's a good point. When I played the 3DS port of Dragon Quest VIII, the biggest thing that I didn't like compared to the PS2 release was the view distance. Watching trees and mountains pop in shot my immersion, and a lot of the appeal
Gothic 3 on maximum possible settings looks great even today imo.
@@notorioussteve5000 hmm… I played it a looong time ago and haven't finished. Maybe will try again soon, thanks. Back then it was a laggy mess, at least on my PC.
Agreed. I have a SICK Crysis cfg file.. goes FAR above Ultra and to this day games still don't look as good as my version of Crysis..
Oblivion
Completely grazed past the future-proofing point. When an engine is capable of cranking up the graphics to ridiculous settings, some day everyone will be able to run them and the game will age well into the future.
This is the only relevant answer to this. What you may not be able to run at all or properly now, you may be able to run just fine in the future because of how PCs are. If it weren't for consoles hamstringing PCs, we'd have more future-proofed games, as sometimes entire games were built with the future in mind(usually based around an engine a developer plans to use for future games or license out).
Exactly. It's a joy revisiting old games on max settings. It's actually clear many devs don't go crazy enough with settings, because to max them you often have to go into config files to tweak the settings. But it's worth it.
exactly. Bought a 780ti back when it was a year old and perfect for playing game at the time, but soon after the vram was just not enough on it, especially when going from 1080p to 1440p
*cough* 7:12 *cough*
@@ruekurei88 Consoles don't hamstring PC games, quite the opposite in fact: they provide the lion's share of game profits, allowing bigger budgets and more effort to be put into those high end PC settings used to sell to console players with bullshot false advertising, and they set an ever increasing baseline for minimum specs that goes up with every generation.
If we didn't have consoles, all PC games would be made with half the budget and built with the baseline target of integrated GPU's on the budget PC's and laptops far more people own than they do high end GPU's.
I'd also add that lowering settings can (sometimes significantly) lower noise from fans and the amount of heat generated.
Only if you cap your fps, if you don't cap fps it can actually make your room hotter because the cpu is working harder to queue more frames for the gpu. This is especially noticeable if you are using water cooling
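The idea behind the frame cap mentioned above is simply to spend the surplus frame time idling instead of rendering extra frames. A minimal sketch of such a limiter, where the render callback and the cap value are placeholders, not any particular engine's API:

```python
import time

def run_capped(render_frame, fps_cap=90):
    """Render no faster than fps_cap by sleeping away whatever time is left
    in each frame's budget instead of queuing extra frames."""
    frame_budget = 1.0 / fps_cap
    while True:
        start = time.perf_counter()
        render_frame()  # placeholder for drawing one frame
        spare = frame_budget - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)  # idle time means less heat and fan noise
```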
I feel like Ultra is for when we wanna max a game to see it in full glory, I will sometimes max out a game and have it run at 25 FPS temporarily so I can look at how the game CAN look, then I change it back to something thatll run at 90fps or so and play the game that way.
IMHO, Ultra is for finally playing the game 10 years after it came out, when you finally got around to play it.
Yeah, it's noticeable if you are trying to get screenshots, but even then just barely. Texture quality matters if you are trying to read small things on signs or something. The only time I noticed an actual difference in gameplay switching from ultra to high was in Shadow of the Tomb Raider. Funnily enough it was the shadow resolution; it made the leaves on the trees look weird.
You must have an older card. 90fps?
F THAT
@@JanTuts that 10 years was hyperbole for 2 years right?
@@matthewalvarez6884 Let me put it this way: I _have_ Skyrim... :|
“What’s with him?”
“Slider envy..”
"Low settings dont look too bad nowadays"
Oh wait till you see my Destiny 2 become Terraria.
Destiny 2 minimum settings looks not good lmao. Low settings look fine but why would I play with them in any game
Yes but Destiny is more designed to go as low as possible for really low END PC's xD it just means much wider audience
I mean in the end it's about what you are willing to sacrifice to game on your PC.... Every PC has a limit after all.... If one is fine with low settings.... So... Depends on the person
Idk what you're talking about bud
Bless the devs at bungie for allowing me to run my laptop 1050 and still play the game at a little over 30 frames, they’re watching out for all of us 🙏
Riley saying that Low to Medium or Medium to High is obvious, meanwhile the 4 look just exactly the same to me
The higher you go, the lower the returns. It's always been like that in almost any aspect of life. Trying to reach the top five percent of what's achievable is usually reserved for those who go after a hobby hard.
Indeed. And in fact, it used to be same with overclocking. Modern platforms just don't have any of the headroom. But back with stuff like LGA775, LGA1366, AM3, it took some work to get your $150 CPU to be faster than the $600 CPU.
That was deep. I needed that. Thanks Leon.
@@nomad7317 You're most welcome 👍🏻
Low for competitive
High for fun
Ultra for flex
Mr Beast comme nted on my latest vi d in my playlist!! omg I’m so shocked I’m crying right now 😭 😭
@@Anonymous-dz6ep nobody is gonna fall for this you useless child.
@@ItssMitch Don't respond at all. Ignore it.
For WoW that's not true, cause you want at least medium details on the spells so you can clearly see them, and even projected textures on objects and surfaces so you don't or do step out/on them.
@@nax4448 that's a minority tho
I love how Tarran just sounds like he is complaining that he cant see the difference 😂
[Troll] Game Devs watching Tarran's macro videos "but we can't see the difference". ;)
This really, really makes me think of Freespace 2. Not that the game was any showstopper by any stretch of the imagination, but the extra detail and graphics they put in really shined (relatively, for the era) in later years when cheaper hardware could use those settings. I feel like this video completely overlooks the majority of us who can't afford high end hardware and who also don't buy a game the instant it comes out...
I get where you're coming from. I too played the Freespace games only years after their prime when they were sold in the cheap jewel cases on the budget shelf.
That said, I do think we're in an era of diminishing returns regarding graphics. The technology can DO lots of amazing things, but mostly it's rehashing the same core gameplay loops over and over again with ever more levels of prettiness.
A revolution in gameplay is, IMO, far more important right now than a revolution in graphics.
if i cant see the difference, especially in motion, then i'm just losing frames for no reason. "ultra" can bite me.
That's literally what I do. If I can't tell the difference visually I'm keeping it off. I don't need to run my rig on full load for no visual benefits. Also, the controls get heavy the higher you set the graphics to and the game becomes less smooth.
@@eurosonly yea. and sometimes i can see a difference, but it just looks "different" not necessarily better, at which point maybe i go with the version i prefer, or just save the frames.
@@senor135 Yeah, it reminds me of sound mixing. Some adjustments just "feel" slightly different and you can't place why-but you might prefer one over the other. I notice with shadows and ambient occlusion stuff that even if the effect is hard to "see" it still adds to the visual polish and therefore immersion (assuming it's not tanking your FPS)
Use custom settings. I hope all games moving forward will explain which graphics settings impact performance and by how much.
Let's be real, the only ones who care about those ultra graphics shenanigans are the ones shit talking you for buying a "cheap" gpu, because apparently if u can't play AAA games at ultra settings while opening 69 chrome tabs, your gpu sucks big time
Tech Quickies: *Compresses 4 screens to fit a single frame*
Also Tech Quickie: Spot the Difference, You Can't!
Actually it's an LTT video.
And then also TH-cam compression
Even a test that satisfies you will prove that while you are engrossed in a game you won't notice. As he said in the video, there are diminishing returns the higher you go. Also, your monitor has to match the res you want to use or you are not getting the true res. For example, I have an old monitor that tops out at 1280x1024. You can set an HD YouTube video to 1080p and it will stream just fine, but you won't get true 1920x1080. So if you want to watch a video at high settings, say 4K, your monitor must be 4K or you will not see it at 4K.
It is funny tho, that they decided to pick Shadow of the Tomb Raider, which is probably one of the easiest games out there to spot the differences in (Rise - the previous one - also). If anyone in here says they can spot the difference between high and very high it's a lie. Even at 4K the compression - and the fact that it's 1/4 of the screen - won't show the difference. Like Dan said, when you're engrossed in the game you don't notice small details, and coming back to why it's funny they chose TR, it's because you DO notice, even if you're engrossed, in this case. There's A LOT of downtime in TR, and I remember fiddling with the settings to get the most out of my system; very high didn't run very smoothly, but high did, and I tweaked stuff up until it stayed okay. Because it took a few minutes of testing, I really saw the difference.
Another series you can spot difference across all settings are the newer Assassin's Creed games (Origin and up). Every game that has a bit of downtime you can spot the differences, specially shadows from a moving object and moving light source, and hair. At 4K, even texture quality is noticeable. Not even going to mention Ray-Tracing cuz it's stark obvious when it's on/off and their quality levels
Monkey Riley is horrifying!... Someone make a meme out of that stat!
I usually just play on the third or second highest preset for games. "Max" settings only exist to make it seem like your device isn't up to standard anymore.
RDR2 - Very High - 100+ FPS
Ultra - only 60-70 FPS.
Cannot even tell a difference between the two settings.
Sometimes, it depends on the settings. Draw distance, for example, really makes a huge difference. It varies from game to game, but it's usually worth maxing out this setting. Sample games: Witcher 3, Rise of the Tomb Raider, Assassin's Creed: Valhalla, GTA 5, and Deus Ex: Mankind Divided.
5:08 nightmares of this for the rest of my life, thanks...
thank you for the jump scare warning
To me the golden rule is having a decent amount of VRAM. It's a "free visual upgrade" if you have enough to hold higher res textures
So, the opposite of free, since you have to pay more for more of it.
It's not that easy. More VRAM doesn't mean your card can benefit of it at some point.
tbh i do it similarly, my RX 570 8GB struggles a bit so i play on medium or low settings with high textures, as the 8 gigs are enough........ for now.
definitely seeking an upgrade though
Two words: CUDA cores
You literally could replace this whole video by saying 24 GB of VRAM is how you max textures without issues lol. People would be surprised to find out the VRAM target they set in games means nothing. I can confirm I have played games on ultra that used over 19 GB of VRAM.
I recently hooked my PC up to an LG C1 OLED and even though in some games I've had to drop graphics settings slightly going from 1440p to 4K, my god, the difference in clarity and sharpness, colour, contrast etc is amazing. I couldn't care less about playing games on High instead of Ultra or maybe a mix of the 2; I'll take that resolution jump and overall picture quality over a few settings being maxed any day.
If you understand how integer scaling works, you don't need high settings. You can keep everything on low. Integer scaling only works at native resolution; it can't work on its own. Say for example your game only runs at 720p 60fps, but you wanna play at a full 3840x2160. All you gotta do is change the Windows display to 3840x2160, then place your game in windowed mode. Then you need a separate program to turn that 720p windowed game screen into a 3840x2160 windowed version. Magic Bordless can do this. 3840x2160 windowed only is important. Then you can apply integer scaling. That original 720p with everything on low will scale up sharp with no blur. Or you could do it the simple way and use Lossless Scaling. I prefer doing it the hard way because the picture looks sharp and crisp. Here's the simple method using Nvidia Image Scaling with Lossless Scaling: th-cam.com/video/Cc9FzYSlsp0/w-d-xo.html To me integer scaling is better after I figured out how to use it. The way Lossless Scaling has it set up is an incorrect way of using it, so do it the way I taught you if you wanna use integer scaling.
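For the curious, "integer scaling" as described above is essentially nearest-neighbour upscaling by a whole-number factor, so each source pixel maps to a clean block of output pixels instead of being blurred by interpolation. A minimal NumPy sketch of that idea (the dummy frame is just for demonstration):

```python
import numpy as np

def integer_scale(frame, factor):
    """Nearest-neighbour upscale by an integer factor: each source pixel becomes
    a factor x factor block, so edges stay sharp with no interpolation blur."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

frame_720p = np.zeros((720, 1280, 3), dtype=np.uint8)  # a dummy 1280x720 frame
print(integer_scale(frame_720p, 3).shape)  # (2160, 3840, 3), i.e. exactly 3840x2160
```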
Same. I was skeptical about the quality jump from 16:10 1050p up to 16:9 1440p, but I have to say, I felt it. The details that come with the higher number of pixels are not to be ignored. I've tried 4K and 8K as well and the difference is definitely there. Much less of a thing when going from 4K to 8K tbh, at least for now. Maybe in the future, when we buy a GPU and add other things to it in order to make a gaming PC, aka when textures are like 64kx64k, there might be a practical point to playing at 8K, but for now I'll say that 4K is an edge case when it comes to quality. When you walk slowly and take in the environment at 144 fps, sure, you will see the difference between 2K and 4K, but in any high speed scenario it just disappears.
@100Bucks image resolution and graphics settings are two different things. Your resolution can be 8k if you like but if your game is on a Low settings preset, the fidelity of graphics will inherently be lower. On a low preset, you will see low poly models, very low texture resolutions and texture filtering, bad shadows, little to no anti-aliasing (which I admit high resolutions can somewhat alleviate) and little to no particles or vfx. So you'll see a very high resolution potato.
Best graphics to performance result is a combination of resolution and the right graphical settings.
Can you enable Auto-Generated Closed-Captions? Even if it’s not perfect, it’s really useful for non-native English speakers. Thanks 🙏
And partially deaf people, like me. I have a world of hurt without captions.
You can do it globally by going into chrome settings and turning Live Captions ON
Ikr i don't like turning up my audio
@@mohammedwasib8910 not on mobile tho
I think TH-cam just takes time to process the automatic captions, similar to the higher resolutions
"Ultra for screen shots high for playing" ---- Tech Deals
I do that
>Textures: Ultra
>everything else: low/off
yep... it's gamer time
That's all you need sometimes 😂
I find in most games you can just put everything except AA and textures on low and itll look amazing.
@@Polyvalent imo shadows / lighting settings in general make a HUGE difference, even when you conversely lower texture quality
In warzone this works, you can put textures on ultra and lighting low or viceversa and it works
Shadows on medium is always nice to have if you can afford it
Great video, felt blessed watching you talk
Tbh I don't care what the game looks like, but there's a lack of satisfaction when you spend 2000 dollars on a computer and you're not maxing everything out
Sounds like you have a spending problem.
@@Shadowdoctor117 Sounds like you have a problem with people buying nice things.
Here I am on my Dell Precision 7760 (i9 11950H, 4k 120hz screen, RTX A4000, 64gb ram, and 1tb ssd) that just arrived at my house a few hours ago... I just want to slide everything to the right and see some nice fps :)
But I obviously didn't get it for games; I can't wait to finally be able to render Blender Cycles with good settings and have good viewport performance in cycles
God I couldn't agree more. Spent roughly 2k on my PC only for Warzone to give me frame drop issues; ended up needing an SSD and some more RAM 🤦🤦 so dumb.
@@Shadowdoctor117 $2000 for a pc sounds reasonable in these times. Now the people paying $2000 for a 3080/3090 alone are the spending problems lol
Never trust a cable tie salesman.
Or an elf.
@@godnessy Or Count Grey
It adds replayability to games. I often break out old games, go max settings and be like: "Darn, that looks pretty good".
You are very right. Just recently I installed GTA IV and am enjoying the hell out of it on max graphics (apart from render distance scalers) on my 4k TV. It's such a blast, and it almost feels like playing a completely different game - aliasing is barely noticeable, everything looks crystal clear and runs at 60 fps, completely different than when I played it on 720p with medium graphics at probably 30 fps.
@@DelaHazey hello, i have a problem with shadow render distance in both GTA 4 and 5, any fix?
I feel setting the bumpscosity slider all the way to the right really makes the game look ten times better. Glad I have my 3090
I genuinely wonder if the Stanley parable ultra deluxe was poking fun at the idea of ultra settings making little to no noticeable difference in the way a game looked.
In Germany we say: I paid for the whole speedometer, so I'm gonna use the whole speedometer. Same goes for my GPU.
But what if the speedometer can't handle all of the speed?
That's actually a clever analogy lol Speedometers give you a maximum but it's always over what the car can do. And whatever GPU you have may not be able to play certain games at certain fidelity levels with good or even playable FPS.
@@danimayb right, my Buick Speedo says 160 is the top... but it's electronically limited to 125... but but 160!!!!
In Soviet Russia car fork you!
@@bradhaines3142 well sure in cars it will cost you a bit of the lifecycle but as long as your Pc hardware doesn't overheat you're good to go. I've never seen my RTX 3070 surpass 70°C after hours on full load and that isn't a worrying temperature for GPUs. If you stress your car to the max of course the wear from vibrations and so on will increase but most modern mid range cars are perfectly fine to drive at 180-220kph on the autobahn without issues and you still won't be the fastest one there (of course only drive as fast as traffic and weather safely allow).
@@Cardthulhu if you play 1440 you can easily max most games tbh.
Ultra for screenshots high for gaming
Good call
I am high all time
Haha ultra until I bought my first 1440p monitor
The Anno 1800 way
For those of us who remember 640x480 being “ultra” anything is amazing….
When this and 25 fps were top tier
I remember I was playing Counter-Strike 1.3 on it
i used to play at this resolution and was even getting around 15-20 fps in GTA SA. now im playing any game at at least 720p on med - ultra
800x600 and 1024x768, maan. Running Blood and Duke Nukem 3D at 800x600 lowered the fps so freaking much, and the difference was close to invisible.
30 fps was awesome back in the day, might have been playing at 480, but the textures blew anyways. Having played the sprite based shooters, even the 20fps range was quite playable.
1080p resolution with medium settings along with 60fps are more than enough for me.
Never had an RTX enabled card, but any time I'd mess with game settings, the one that made one of the biggest visual differences and easily had the biggest FPS difference was anti-aliasing. Never played a game that didn't completely tank when you turned AA from whatever the base shit setting was to 4x AA or higher. Sure as hell was more pleasing on the eye though.
OOOOOooooohhhhh!!! That 3070 has "diamond" stuff on it. Looked like someone took crinkled tinfoil and superglued it to it. Bling it up everyone!
I was thinking the exact same😂
@@sanctuary_of_mango yes
Yeah it's quite ugly . But I think it's supposed to be like ice, and it glows
Like "Tech Deals" said already and some of "you" might know this already ... "High is for GAMING, ultra is for SCREENSHOT" ... i game on high since i had a ATI Radeon HD 4770, and for me there is no need to go higher since the textures looks the same, and mid game, u barelu have the time to spot or see "those changes" ... but, that is "subjective" ... good informative video though :)
That guy is pretentious.
I can respect what was said in this comment
with a hd 4770 you cant go higher graphics lol
Sorry dude but the last part is simply untrue. When I play RDR2 at 4K ultra, I definitely take my time to just roam the open world aimlessly and take in the nice graphics. If my GPU can put out acceptable framerates why shouldn’t I play at ultra
How old of a game are you playing if you can go with high graphics settings on a Radeon HD 4770?
In PvP I always go for the lowest settings, with exceptions for model distance and actual distance. And if you can turn effects up or down, I leave them high. For RPG or single player games, sometimes co-op, I opt for 80 to 120fps, so whatever mix of settings I need to get there. For certain games I opt for 60fps and try to get it as stable as I can.
I'm playing Portal 2 on my GTX 970 at maxed out settings, and i really appreciate that those settings exist, because i can get higher quality. So that's also an argument: playing games when newer hardware is released, so people can get better quality.
such a good game, i'm playing through it now too.
For a while I was upgrading every generation, almost yearly. That actually led me to becoming pretty cynical about it. For a while my rule of thumb was buy the $500ish card every other generation. But that stopped being applicable with the 20-series tiered price increases. And now AIB prices are just dumb.
2:38 I'm not going to enjoy paying for that 70 dollar steak more than the hotdog.
I'd rather buy 5 hotdogs, ngl.
Reminds me of people who spend thousands for hi-fi audio components for such a barely noticeable incremental improvement over their previous high end system. Some folks seem to enjoy buying and showing off gear more than actually enjoying content.
1:26 "enhanced Ray Charles [RC] Cores" I love the editors
Lol
I play almost every game at minimum settings and I can’t tell the difference because my resolution is also so low I can’t make out the lack of detail. What I can make out, however, is how many frames I’m getting, by the oversized frame counter sitting in the top right that I generally regard as my current score.
At what resolution do you play? Even at 1280x720p you can surely see differences between Low or "Ultra" settings, just thinking about lighting. And at such low resolutions you should also have enough performance to go for Medium settings at least, even integrated graphics chips should be able to do 720p60 most of the time.
The lowest resolution possible is still very playable in most games imo
@@dennisjungbauer4467 my GT 740M begs to differ. Can't get 30fps in Wildlands; 23-28 is the best it will do, and it's still playable single player IMO. Guess I may need to tweak some more settings.
I upgrade every 4 years to the second best level of hardware of that generation, and it's always been half the price of the uppermost top tier and I've always managed to keep everything maxed out. I like having a cinema quality to my games
11700k
3080
1440p
3990x
a6000
4320p
"cinema quality"?
So, you play at 24-30 fps, lol?
Just kidding. Actually that was exactly what I did back in the days when I could only afford entry level Hardware. But once I got a mid-range GPU and hit 60-ish in new games, I could never go back! XD
It depends heavily on the setting. Ray tracing is incredibly noticeable, and also heavily taxes a game. The most expensive hardware is for enthusiasts. Enthusiasts always spend hundreds and sometimes thousands of dollars to eke out a few percentage points of improvement.
Riley: Talking about max settings and stuff
Me *wondering why the Rtx 3070 is covered in tin foil*
probably the same reason as some crazy people: to keep the government from reading its mind. in a previous video linus's teleprompter gained sentience, so the 3070 probably did too
LTT : Gaming at high settings is really dumb .
Me with Intel HD Graphics : I beg your pardon , what did you say ?
Mr Beast comme nted on my latest vi d in my playlist!! omg I’m so shocked I’m crying right now 😭 😭
I can’t hear you with my 640x480 5fps gaming
@@Anonymous-dz6ep cry harder
True Zen is when you only have an Intel HD and are okay with that.
*cries in ATI Radeon 3470*
*Riley does good segue to sponsor*
Linus: I raised that kid.
Segue
@@dinelkap lol i typed it fast
He raised him from trash and turned him into treasure.
@@bluephreakr words of wisdom
"How to Collectively Make Every Single PC Gamer Mad with 1 Video"
I feel a fair amount of the time that removing some advanced features makes the game look a lot better. Like, I always remove motion blur since it makes a 144Hz monitor feel redundant when I can't enjoy the smoothness.
If it's any consolation, motion blur is not a realistic thing anyway. Motion blur was a problem early cameras had, we just got used to it from movies to the point that, when better cameras came out that didn't have that problem anymore, people felt like the movies recorded with them were weird / less professional. Video games just followed suit and added it because well, it's what we are used to see when looking at a screen.
Of course you remove motion blur; motion blur is implemented specifically because it improves the image of low framerate videos
@@gozutheDJ the movies thing he talked about is accurate. That's why movies are in 24fps and not 60fps, even though the cameras are advanced enough to record 60fps videos.
We like the choppiness of 24fps better. Makes the action scene, more... badass.
@@gozutheDJ That's for the ancient times. Currently they have no limitations. And yet movies come in 24fps because of the above said info.
@@gozutheDJ More than 90% of the movies are shot on digital cameras. It's rare to use films now. Only for a very specific task they would use it.
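As a side note on the motion blur discussion above: camera-style blur is essentially movement accumulated over one exposure, which a renderer can fake by averaging consecutive frames. A rough sketch of that accumulation idea, assuming same-sized uint8 frames (not any particular engine's implementation):

```python
import numpy as np

def accumulate_blur(frames):
    """Approximate shutter-style motion blur by averaging consecutive frames,
    smearing anything that moved between them."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    return stack.mean(axis=0).astype(np.uint8)

# usage with dummy frames
frames = [np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8) for _ in range(8)]
print(accumulate_blur(frames).shape)  # (4, 4, 3)
```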
I've been playing videogames for almost 26 years now. I've never in my life played a game on ultra when the game came out. And 99% of my games are played on medium or high, but always looking for the best performance. So I don't get why gamers nowadays have that crazy obsession with high definition.
Me too
Especially now that low graphics already look great.
You've grown up with Mario and tetris
We grew up with CoD, Battlefield, Crysis and other good looking games
It's not some crazy obsession lmao. This is what we expect
@@random22453 Graphics have hit the point of diminishing returns 15 years ago.
@@MicroChirp not true
Hey LTT. This is one of the BEST VIDEOS you've uploaded IN MONTHS! Please give Riley a raise.... The truth is finally out there and many of us poor folks can finally feel like normal people. (I bet this was Riley's idea)
If I have the GPU, I'm going to make the most of it. So most of the time I get the settings as high as possible without going under 60fps.