I grew up with the Genesis, PlayStation, and N64, where slowdowns and 16 fps were common, and PC gamers cobbled together crazy builds just to PLAY games. The modern PC gaming community just seems like it's, for a large majority, people with well-paying jobs or kids with wealthy parents running 3080s and 90s and complaining about not hitting 144 fps on ultra with RTX on. The world feels different, man.
Akash Jaiswal that's actually a dumb idea, because at that resolution you won't get all the effects your rig processes on ultra settings, like high-resolution textures and stuff
MikaHR100 supersampling looks better than your display's actual resolution, but it isn't as good as running at that resolution natively. So why pay the same performance cost for smaller gains?
mosquitor !! What I said is exactly what he said in the video, so why the fuck are you watching the video if you disagree? Seems you're the dumb one, not me. Fucking noob
Even though developers are making games runnable on low settings, hardware still can't keep up with the latest games: now every game has to be larger than 100 GB for some reason.
Considering the speed of hardware evolution, you'll be able to play those games at 4K with ultra-high-def textures and RTX in 2-3 years, no problem. The reason they do it is the same reason you get remakes and remasters of old games. Or should I say the exact opposite: so they stay viable over a longer period of time. It is good, necessary and smart. People who haven't played some games, and people who revisit their favorite games a few years later, will enjoy them even more then. Like, I just bought Titanfall 2 and play it on literal max, and it's mindblowing how good the game looks and how good it actually is. And it's a 2016 title. Most underrated game of the decade, and the multiplayer is so good I have no clue why anyone even plays CoD anymore. Now answer me this: why can a Bugatti Chiron go 450 km/h if the speed limit is 150?
@@DerinTheErkan Well, I can't argue with you on that one... I sell my rig every 2 years and buy a new one. But there are used computers all over the place for 70-80% of the price just because they've been used a few months. There is nothing wrong with them, and most sellers would even let you test them. But the smartest thing is to set up your own rig part by part. You can also save about 10-20% of the price just by doing it yourself and not buying a prebuilt computer. There are guides on the internet on how to do it and which components to use. All in all, just a little time and effort can save you hundreds of $$$.
Thing is, those sizes are pretty inflated from all the MTX content like skins and character/weapon variants and other useless stuff like that. But also from all the extra LODs (levels of detail) being generated for the ever-increasing number of assets. Have you ever wondered how games manage to have so many graphics options that let you scale down to potato levels? It's all based on pre-generated LODs for everything in the game. You might make a 4K texture (that is, 4096x4096 pixels), but the game engine also generates "mip-maps", smaller versions of the texture that can be streamed in and out as needed based on both view distance and quality settings. And all those extra textures are going to add up.
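To put rough numbers on the mip-map point above, here is a small Python sketch (illustrative arithmetic only, not from the video or the comment) showing that a full, uncompressed mip chain adds roughly a third on top of the base texture's memory:

    def mip_chain_bytes(width, height, bytes_per_pixel=4):
        """Total bytes for an uncompressed texture plus its full mip chain."""
        total = 0
        w, h = width, height
        while True:
            total += w * h * bytes_per_pixel
            if w == 1 and h == 1:
                break
            w, h = max(w // 2, 1), max(h // 2, 1)
        return total

    base = 4096 * 4096 * 4              # mip 0 alone: 64 MiB
    full = mip_chain_bytes(4096, 4096)  # whole chain: about 85 MiB
    print(full / base)                  # about 1.33, i.e. roughly +33% per texture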
Come on, investing in a hobby is not an "excuse" for anything. And you're thanking him for "finally" pointing out what, exactly? That gaming is viable on mid-range hardware without cranking up the settings to the max? What a shocking revelation.
Kris Molinari Well, ok. He is saying that he feels ultra isn't meant to be the standard or something to be sought after. But personally, I use my own custom settings to add the extra bonuses (AA or other cool effects) while staying at the 'high' setting. But ultra is still nice. I mean, I recently got Skyrim (not the Special Edition, though that doesn't mean much), set it to ultra, and it looks like a modern game! I mean, the HD texture pack helped, but still. Ultra settings, I feel, are for games that are going to last and last, not games that rotate constantly.
Unless you are like me and can easily see the differences between high and ultra in games. It honestly depends on what game I'm playing: I can happily play modded Skyrim at 30 fps if it's pretty. If it's a competitive game, I'll try to hit 60 minimum. Ultra settings are definitely not excuses.
Nah, people will still complain if they run their games on Extra Ultra and get over 150 fps. They want more no matter what. Me, if I can run the game and get a decent 25-40 fps, I'm happy. Even if it means playing potato games.
I'm happy with 50 fps... on TF2... with an ultra-low config... The game still lags though, usually because my RAM is 2 GB and TF2 takes almost all of it. Sometimes TF2 even crashes from running out of RAM if I run Google Chrome too.
You have a very strong point, except for the part where you say benchmarks should not be run at the highest settings. The point of a benchmark is to push the system to its limits and see how cards compare under the kind of stress they might see over the 2-4 year lifespan someone owns the card. The framerates of a 1050 and a 2080 Ti on lowest settings will be much closer because it is not hard to run most games; there the CPU bottleneck comes in, which exists no matter what. Therefore benchmarks should always be on the highest settings. Video games make their graphics beautiful because people care about how the game looks. I play almost all games I own on lowest settings to get that sweet 144 fps, but many people prefer quality over frames, not to mention streamers. Also, not everyone plays first-person shooters where you get an advantage on lowest settings. Slower games, like the one you showed in the video, The Witcher 3, were built around beautiful graphics and a story that immerses you. Why would anyone want to be running around a low-poly world if they could make it look beautiful by changing the graphical settings? (Assuming their hardware can handle medium to high settings.) Do I even need to mention virtual reality games coming in the future, where ultra won't just look good? They may be so believable that you may forget you're in a game. These are my two cents; I do agree higher fps is king, but there are some criticisms still here! Have a nice day!
Battlestar Frakyrie I wouldn't want to go for Vega, since Volta is going to be out next year. Vega is a nice card, but I personally wouldn't go for it. However, you do you, I guess.
@Desertcamel @Cheeki Breeki For 2018 AMD has Vega 2.0 to compete with Volta, and it will not have xx70 and xx80 performance, but higher: imgur.com/a/4s3nV. Volta is not gonna slaughter AMD until the GPUs are actually out.
I think this is more of a misunderstanding of the purpose of those benchmarking graphs. Usually the priority is **not** to represent a "realistic" use case scenario, but rather, it is to make the differences between the products being compared as big as possible so that you can have a fair comparison of each product's maximum potential. This means that benchmarkers deliberately create GPU-limited scenarios when comparing GPUs, which means turning up the resolution and graphics settings, and CPU-limited scenarios when reviewing CPUs. I feel like people often misunderstand this, which can mislead them. Benchmarking graphs like these are primarily supposed to be used for relative comparison, not really for absolute numbers; as said in the video there may often be better-performing options that look similar, that's just not the point of the benchmark. Furthermore, in reality the differences between these products may in many scenarios be less pronounced than in the benchmarks, because the benchmarks are deliberately designed to make these differences apparent. I really hope that more people will keep these things in mind when interpreting benchmarking graphs and making buying decisions.
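A toy model of that GPU-limited vs CPU-limited point, with made-up numbers (the function and figures below are purely illustrative, not from the comment): the observed frame rate is roughly capped by whichever stage is slower, which is why two very different GPUs can post near-identical results at low settings.

    def observed_fps(cpu_limit, gpu_limit):
        # The slower of the two stages caps the frame rate.
        return min(cpu_limit, gpu_limit)

    cpu_limit = 300  # frames per second the CPU/engine can prepare (hypothetical)

    for name, gpu_low, gpu_ultra in [("mid-range GPU", 450, 70), ("high-end GPU", 900, 140)]:
        print(name,
              "| low settings:", observed_fps(cpu_limit, gpu_low),   # both read ~300, gap hidden
              "| ultra:", observed_fps(cpu_limit, gpu_ultra))        # 70 vs 140, gap visible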
Also, some ultra effects are actually annoying to play with, for example overused DOF or especially motion blur. Dear developers: stop putting motion blur in your games! It's annoying af.
Stuntkoala WTF NO. Developers, don't listen to "Stuntkoala", who thinks that just because he hates motion blur it must be true for everyone else! I love motion blur. As a matter of fact, I look for that setting in any game I play. It makes the game more cinematic and more fluid-looking. And the motion just looks badass. I can't understand how people hate it so much. I also read motion blur is supposed to be a realism setting. Uh, because in real life when you move your eyes, there's a blur? Duh.
El Libertoso Like he said, it looks more artistic; I guess you wouldn't be able to see ripples from there, and the look on ultra is more realistic. Don't get me wrong, I think the water looks better on normal.
Artistic difference. Sometimes more realistic stuff looks worse than drawn stuff. Take that FPS he was playing here: would it look better if it was actual footage of some guy? No. Our 3D artistic creations are now more desirable than what is real. Have you seen real people in 4K? They look horrible.
I think that's down to SSR (screen-space reflections). See, the brownish rocks on the right are reflected in the water, resulting in a muddy-looking river, while on the "normal" / "high" setting it's static water with some specular highlights. Honestly, I've always preferred not to use SSR, since its artifacts and performance drops are immensely annoying and not worth the questionable increase in visual quality.
That's because the one on the right is closer. Watch a bit further and the very high and ultra ones look pretty much the same. Or seemingly closer anyway, not sure what that is.
Absolutely well said. Additional graphical fidelity starts to become a negative at the point where the drop in frame rate is more noticeable than the supposed "better" visuals.
This video just woke me up. For years I haven't bothered with anything but ultra because that is how you are playing "badass". I am going to try turning down some settings and get better fps.
Before: high settings were the only ones that looked good, so high was the best. Today: everything normal or above looks the same, except that everything above normal runs slower, so normal is the best, despite high and ultra also existing.
I feel like as a consumer you just need to be aware of the purpose of a benchmark: it's to show the difference and see what each GPU is capable of. What you're requesting is optimization guides on what mix of settings is best for a balance between image quality and performance. They're two different things.
hell walker Why would anyone want to play 4K at 30 fps, especially in an action game? This is really nothing new though; people who don't choose to educate themselves usually prioritize resolution over fps.
FPS is always the most important factor in action games, especially ones such as CS:GO. It's all well and good being able to run a game at 4K on ultra, ya da ya da, but if you're only running 30 fps you are not going to do well.
I'd rather have a benchmarking "standard" (like the ultra presets) that every media outlet uses, even if it doesn't reflect the way you actually want to play a game, because: 1. I can compare benchmarks between different magazines, websites, users etc. 2. I just use benchmarks to compare the performance of a GPU, not the performance of a game. If I want to know how to get the most fps at good quality, I can look at a lot of guides online. 2b. I (and certainly 95% of the community) would buy a game even if I can't run it on ultra, because I don't want to miss out on a great game and because I can certainly run it if I am within the recommended system requirements.
I think standardizing the benchmarks is more important than getting a super accurate indication of performance off the bat. Imagine the struggle of getting a good comparison between hardware when everyone uses slightly different settings, oh boy.
Should reviewers show how hardware runs at a number of settings? E.g., a new AMD card gets 70% of the performance of its rival; if you want to match the rival in terms of fps, you need to lower the settings to, say, medium (figures made up for the example). As a PC enthusiast, I would honestly like that in a review.
Or they could put a little more effort in and benchmark BOTH ultra and medium/high settings. People who buy a 1060 aren't looking to play on ultra settings so a REVIEW shouldn't show only ultra. It is retarded and misleading.
Yeah you've just convinced me to go with an ultrawide (which I've always wanted to game on) when I get my RX 5700XT. I've been hearing so many mixed opinions on whether that card will be enough for a decent 3440x1440 experience and the benchmarks are okay but not amazing. It's so true though, if it can run the most demanding AAAs of today on ultra at 40-60 then I'll be able to achieve 60+ at high/medium in future AAAs for a few years yet.
It has come to the point where I barely care enough about the settings to even fiddle with them. I'll usually go in and see what I can turn down just to get more fps. Some games look fine on low, medium and high, so I have no reason to play it on ultra.
My only regret from buying an R9 285 when it came out was that I got one with only 2 GB of VRAM. That severely limits the one quality setting I care the most about, and that's texture quality/resolution.
I accidentally put it on low and didn't even notice for like 3 levels. The optimization on Doom is insane. I built a $300 Steam Machine with used parts that can run it at above 90 fps (I think on high, but I don't want to lie).
Couldn't agree more, Phil. I am still using a GTX 480 and people can't even believe that I can run games like GTA V, The Witcher 3 and Crysis 3 at 1080p. They think that I am a liar. I don't see a reason to upgrade; I don't need more power (for now). The freaking marketing made everyone think that you can't play AAA games with older or just lower-end hardware.
Exactly. I plan on getting a 1440p monitor soon, and I've been told that I absolutely need to upgrade my GTX 970 because there's no way I can run games at 1440p, but of course I can, a 970 is still a fantastic card. I hope to try out my old GTX 750 and HD 6670 at 1440p just to see what's possible.
Gou Well, I had a 480 and upgraded to a 1060 this year just because all of the games you listed didn't run properly even on the lowest settings (except Crysis, because I haven't played it) and it got obnoxiously loud and hot. But it was a pretty nice card, since I had it for like 5 years and it never caused serious problems.
***** That could be why it runs better. But I guess the card just gives up after 3 years of heavy use every day (~4h on workdays, 8-9h on weekends) and 2 years of irregular use. And the cooling fins on the side of the card are just brown or light black now.
I thought the purpose of ultra settings was for when a game is 14 years old and our PCs are monstrously powerful compared to the PCs we had at the time. My computer's pretty old now, my hardware's failing, and the components are old and underpowered. But I can run old games like Half-Life 2 (2004), or something along those lines, on ultra with 32x anti-aliasing perfectly fine. It's honestly awesome for that; it's just games post-2012 where I start having issues, as my computer just can't do it anymore, and I have to reduce the settings to high or medium, and for even more recent games I have to go low and very low, and even then I can't get 60.
I mean, that's a good point. I had an RX 550 2GB, which is a cheap, budget-oriented graphics card from 2017, and while playing Need for Speed Underground 2 I could put it on max settings and the max resolution it supports, hit the fps cap (140 fps) with a stable framerate, while still using at most 70% of the GPU in the worst-case scenario. Still, the game looks pretty good to me even though it's a game from 2004. Back then, however, you needed a high-end graphics card to be able to run the game at max settings, max resolution, and a playable framerate. Nowadays you can do it even on integrated Intel HD graphics (you would be getting about 30 fps in that case, though that is definitely *playable*).
Motion Blur: Off.
Anti-Aliasing: Off.
Depth of Field: Off.
V-Sync: Off.
Shadows: Down.
Physics: Down.
View/Render (LOD) distance: Medium to High.
Textures and similar settings rarely change drastically between even just High to Very High, and Very High to Ultra.
FOV: High.
Borderless Windowed Mode.
Bloom/HDR: Depends on the Game.
We Good Boys.
5 years on and I'm here feeling like an old man talking about vinyl. "Why did you take 25% more of my GPU and an extra 75 flipping WATTS to make the game look WORSE?" I mean, having been a kid, I tend to defer to kids with regards to "taste" and "culture" as it's very rare for anything I liked 20 years ago to be relevant today... but do high schoolers actually prefer to have motion blur turned on, for example? I personally hate it.
@@antt2228 Simulating motion blur in a high-FPS game is strange to me since, in this context, it's a camera artifact caused by long exposure times, to my understanding. Our eyes do have something akin to motion blur (although at around a 50 FPS-equivalent shutter speed, and you can think of our eye as a top-quality, noise-free camera even in low lighting conditions, with ultra-high resolution near our focal point), but we'd perceive such motion blur anyway in a game or video playing at a high enough frame rate, even without simulated blur.
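For a rough sense of how exposure time turns into on-screen blur, here is a tiny Python sketch using the film-style 180-degree shutter rule and a made-up object speed; the numbers are illustrative only, not measurements from any game.

    def exposure_seconds(fps, shutter_angle_deg=180):
        # A 180 degree shutter exposes for half of each frame's duration.
        return (shutter_angle_deg / 360.0) / fps

    def blur_length_px(speed_px_per_s, fps, shutter_angle_deg=180):
        # Distance an object smears across the screen during one exposure.
        return speed_px_per_s * exposure_seconds(fps, shutter_angle_deg)

    speed = 2000  # hypothetical object crossing the screen at 2000 px/s
    for fps in (30, 60, 144):
        print(fps, "fps ->", round(blur_length_px(speed, fps), 1), "px of blur per frame")
    # 30 fps -> 33.3 px, 60 fps -> 16.7 px, 144 fps -> 6.9 px:
    # the higher the frame rate, the less blur there is left to simulate.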
I agree with Chatos. Framerate is even more important on a large screen. I left 60 Hz behind with the original PlayStation in the 90s; even DOS ran at 75. No way am I ever going back to 60 Hz. 4K 120 Hz needs to come way down in price before it will make sense.
I always tweak every setting of every game to the minimum level that keeps a given graphical element visible in gameplay. Siege, for example: shadows on low are turned off, on medium they turn on, and from then on they just get higher-res, so there's no reason to go beyond medium for that one. It's the way I get the best visual/performance ratio; works every time.
I could actually see a case where that might give you an edge as a gamer even if you can run ultra at 60+ FPS, since no shadows means you can see things more clearly. Shadows make things dark and offer less visual/light information to our eyes. Only time I think a shadow helps us functionally (not aesthetically) is to improve spatial awareness, but not all games require that (ex: first-person games don't require that nearly as much as third-person since you're already directly seeing through the character's eyes).
YouTube automatically translated this video's title to Spanish for me, and due to Spanish grammar the meaning is left ambiguous and could be interpreted as "Settings ultra-suck", which I find amusing heheh.
I completely agree. Ever since the games industry overtook the movie industry, the target audience changed slowly over the years to the point where now your loyal, true gamers have been completely shunned. Now is the age of the casual gamer, where games are made to look and feel like the games you used to play. Everything is about graphics rather than immersion, when they both should go hand in hand. The most visually impressive games today are just dead inside.
Even at 1080p I turn off AA. Ok, SOMETIMES I turn it on when the game itself has "ugly edges" (dunno how exactly to describe what I mean... bad English skills, native German, sry). The Witcher 3, for example: the world is so overflowing with details that I barely even noticed whether AA was activated or not (except at its highest setting, but that resulted in 10-15 fps). Turning off AA hasn't diminished my whole (awesome) experience with Geralt at all. I got 40-55 fps at customized max settings with only 8 gigs of RAM, a Radeon HD 7970 and an i5-2500K @ 1080p.
What the fuck are customised max settings even supposed to be? They're either maxed out (in which case you're not getting 55 fps with a 7970) or they're customised, in which case they're not max settings.
The only thing that looks better with Ultra is the settings panel.
@@2kliksphilip Replying to a reply on a 3-month-old comment on a 2-year-old video. You're too good for us, Philip.
@@LOLquendoTV Sure humbling him, aren't you.
Everything in its proper place
@@2kliksphilip reminder to dELete ThIS
@@Kier4n99 why tho lmao
here I am trying to notice differences between the images before realizing I watched the whole video in 480p 30 fps on youtube
fish.
Cyranek I don't understand why people care so much about something so subtle.
It is just materialism. "Always looking for more." You can't miss something that you don't know about.
Use Magic Actions addon, set auto HD to 1080p and night-mode feature, better your lives by 150%
+Necris: The whole point here is that these people weren't able to tell the difference between 480p and HD, until they were notified by some setting, just like in most cases you wouldn't be able to tell if you're playing on "Very High" or "Ultra" unless you went to check the setting (or Ultra skipped frames...)
2:16 Jesus, man. That's a Tortoise, they can't swim.
What did you expect from an ignorant peasant who says consoles can make pretty graphics?
eh wat? Tortoises can swim. Or is this the joke? o.o
turtle swim tortoise walk
Reminds me of that 4chan "Jeb!" meme.
@chefar you're pretty ignorant aren't you? Consoles can make decent graphics, they just can't make them 60 fps or 1080p.
One thing really bugs me: as processing power has gone up, games, game engines and software have neglected optimization.
They haven't neglected optimization, it's just that games aren't necessarily optimized to run on PC hardware, but rather on console hardware.
Håkon T. No. That's the case for only a select few games.
Frank R. Haugen It's because studios are pressured into tight release schedules by their publishers. They simply don't get the time they need to polish the game.
@@hakont.4960 they use the same hardware
@@saltyitalianguy3243 What?
Where can i download a gtx 1080 ti?
Deep web > Twitter > Download GTX 1080 Ti.
Dark Storm 1080 ti is a waste of money. Just save that money and invest in 128gb ram. Much better. Now just download the ultra fast speeds for the ram and you can run practically any game. 100%
Why not just torrent the RAM? No point wasting money when so many RAM download sites are scams anyway
thanks downloading gtx 1080 ti .......... ultra settings i m coming
Easy it's www.superdeluxeultramodernfreedownloads,co,uk,com,net,ru,de,org,gov/graphix/nvidia/gtx90000ti
20 fps on lowest settings boyz
Why are we here
Ah I thought I missed the train!
M8 i gots r7 240 i get at max 10 fps on low in fortnite.
Poor IGFX thread I was once here. :'( rip
Ethan L I have same gpu
I've read several times in the last few years that developers target 1080p high as the standard. Ultra is rarely optimised; it's just "let's see how ridiculously we can push every setting".
Ultra basically increases shader and map resolutions: more defined shadows, more precise shaders... mostly shit you don't even notice while playing.
There are shadow technologies way more advanced than just High. Take AC: Syndicate: shadows on high look good, but they are still crap compared to shadows on PCSS Ultra. So ultra settings do something.
I RuBiiX I Weird, because I almost never get hitches in any games I play at ultra settings. And I mean none. I have a 1080 Ti. Must be an SLI problem or something wrong with your PC or PC settings.
I RuBiiX I Keep in mind Watch Dogs was intentionally made to look worse by the devs to make the console port look better by comparison, and it also had some weird shit going on to begin with. Anddddd most games only utilize quad-core CPUs, so that could be another problem you're having. Oh, and on top of all that, SLI is just fidgety to begin with.
The vast majority of gamers are using 1080p or lower with low-end GPUs. So yeah, developers target 1080p medium-high. And what you'll learn if you dig a bit deeper is that "ultra" settings are first and foremost intended for taking screenshots, not for actually playing the game.
I can even turn shadows off, but if I see grass pop in, it breaks the immersion for me. I'd just like MORE CONTROL over graphics. Super-low-quality grass with no pop-in would make me so happy!
Trev Blah Just play minecraft.
The only time I notice that the tall grass and pebbles on the ground exist is when they pop up 5 feet in front of me. Why modern games don't just give you an option to turn off terrain decoration is beyond me, especially considering that this was often an option in older games.
@@greasehardbody4018 Just play minecraft.
@@stickystudios5318 you want to play rdr2 how you want? Play minecraft!
Wait thats a different game...
That's a weird fetish
I remember the good old days where 15 fps on low settings for CoD1 was tolerable for me.
Insane how we have changed isn't it? Lmao
I can't stand below 60 fps now
So you can't stand anything below 60 fps, yet you tolerate watching a YouTube CoD gaming video that is 30 fps?
@@KARAENGVLOG Watching a game is completely different from playing it. Frame rates in-game change how responsive the game is; YouTube is just watching the footage, so it's fine.
@@scooty727 Fair enough, keeping in mind that more fps = more input available, so you can bash your keys and somehow all of them are registered.
Would I rather have fps or graphics?
It doesn't matter, my PC can't have either of those.
Same
ahh xddddddd
I'm sure there's a guide on how to get good fps and good graphics on a toaster with a monitor.
Me too.
If it is multiplayer then fps
If it is single player then graphics
That shadow pop-in on GTA V is almost unacceptable; even though the actual objects are loaded in, it still makes the terrain look like it's morphing into shape in front of you.
Ikr
That exists even on max settings, and what's worse is the engine can't handle being stored on anything less than a 7200 RPM hard drive... yet Watch Dogs 2 runs perfectly for me with it installed on a 5400 RPM drive...
dril_house I think it looks pretty nice
Well, to be fair, 5400 RPM hard drives are totally obsolete. Even 7200 RPM drives are being phased out as SSD prices keep decreasing year after year. I myself got a 500 GB SSD that I have been using for a year with no issues for $120.
Definitely more expensive, yes, but the ends justify the means for the unbelievable 200% improvement over HDDs. And the more they are adopted and mass-produced, the cheaper they will become, as manufacturers seeking to fulfill the demand for SSDs find cheaper and more efficient ways to produce them.
Besides, the industry cannot keep utilizing old technology forever... that would be like Andrew Carnegie saying "The Bessemer process is great, sure... but I'll just stick with my old tech because I like it more and I am used to it."
Again, why does a newer game that people call poorly optimized run better on the slower drive for me than GTA V does?
"Ultra settings aren't that great"
Me watching in 144p over cellular data: I'll take your word for it
Imagine the low settings in 144p with YouTube compression
@@aryabratsahoo7474 DAMN!
Am I the only one amazed that you cut down a tree by throwing a chicken at it?
It was the shooting from the enemies that destroyed it.
Science Leponi Crysis was clearly ahead of its time.
Maximum poultry.
You mean the bullets from the enemies that shot the tree? How do people watch a video and then get what they saw wrong so often? No wonder eyewitness testimony is basically worthless... lol!
Ya, good joke, but I'm pretty sure that was the guys on the jeep shooting the fucking .50 cal machine gun at you :P
When ultra isn't enough, you dive into the ini file and start fucking with those settings :)
grass density 9000 lol
Over 8000! Do you even Manga?
Skyrim modding is so good baby.
oh, I used to find myself doing the same things when low settings weren't enough...
I never knew you could fuck with virtual items :P
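For anyone wondering what "fucking with the ini file" in the thread above actually looks like in practice, below is a rough sketch of the kind of lines people edit in Skyrim's ini files. The section and key names are written from memory and the values are made up for illustration, so treat it as a hint of the idea rather than a copy-paste recipe.

    [Grass]
    ; Lower value = denser grass, well beyond what the in-game presets expose.
    iMinGrassSize=20

    [Display]
    ; Extends shadow draw distance past the preset values (at a performance cost).
    fShadowDistance=10000.0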
Our standards have slowly Ryzen
Yellowstone I see what you did there sir
i don't know whether to laugh or sigh XD
lol, which is ironic given Ryzen's rather lackluster game performance. Amazing rendering chip tho!
sigh..
I wouldn't say it's lackluster when it "only" scores 200-something FPS vs the Intel chips' roughly 300 FPS at low res, and almost identical at higher res. I mean, 300 Hz monitors don't even exist.
I guess I should be proud and keep my potato settings
Why don't you get a gf instead?
Google Is Trash What kind of question is that, lol, out of nowhere
Don't watch muh videos and don't scrubskribe I'm not. Can't even play Don't Starve Shipwreck and League of Legends at the lowest setting.
Is that any excuse to type like that? :)
Agreed. But the PC gaming community is increasingly becoming the "It's not 200fps on max graphics OMFG unplayable" people.
Textures - Ultra
Motion blur - Off
Everything else: high/very high 👌
and shadows on medium :D
@@EasternUNO If it's a Bethesda Game ;)
Bloom - Off
We have a loser here
@@JarthenGreenmeadow no!
What I do when I get a new game is I first set every setting to maximum, test the game and if the FPS feels too low I'll gradually decrease the settings until I find a good compromise between quality and performance.
Isn't that how everyone does it?
@@Z4J3B4NT yeah pretty sure that's the whole point of doing it lol
Nah, I normally click detect settings; if it runs well I increase what matters to me and check the FPS hit, and if it runs badly I decrease shit that I know is costing too much.
And I'm here with intel hd graphics trying to run Mario Kart Wii with 4x MSAA Anti Aliasing...
at least your profile picture is a cute pokemon
MSAA? That's not the real N64 experience...Resolution down to 640x480 and you are good to go
WHAT CAN IT HANDLE?
-Mario Kart 64: 8x MSAA anti-aliasing, 1.5x native resolution
-Mario Kart Wii: 2x SSAA (or 2x MSAA if playing online), 1.5x native resolution
I also have intel HD graphics, and a tower that's too small to fit anything else into it :(
....and 3-5 FPS in multiplayer. Good times. Maybe that's why I can't understand all this "everything under 60 FPS is unplayable" talk...
That's the beauty of PC, isn't it?
That you can tweak the settings as you want, according to your system's limits.
Consoles have restrictions.
PC is freedom.
That's the beauty? That you have console graphics now on PC?
Russia Good, you just don't have a good PC.
You didn't watch the video here.
so do it, consoles and pc look identical now
So what, this was always the case on consoles? It also used to be 720p vs 1080p, but this also is now fixed on consoles and beyond.
Crysis on the lowest settings looks like Halo 1 on the highest settings wtf?
Computexpert Nope, there are pretty badass graphical games on PC...
Computexpert good try, but no
Raziel de Zang He's saying that Halo 1 wasn't out on PC, only Xbox, and HZD wasn't out on PC either, so you can't say what graphics setting it is.
Microsoft didn't want to release Halo 1 on the PC. They weren't going to let Halo 1 on the PC show up the Xbox. Note how we still don't have Halo 3 on PC yet. lol
Jimmy Savile there's Halo CE for PC my dude.
"...unless you're a console gamer..." FEEL THE BURNNNNNNN
Pedro Lanhoso Apply Cold water to burned area.
this comment ruined my day
Such a brave opinion he expressed. /s
I don't get it? Every developer prioritizes consoles, and atm the PS4 has all the highest-rated exclusives, like Bloodborne (92 on Metacritic), Uncharted 4 (highest-rated game of 2016), Horizon, Nioh, and it's got many upcoming games like The Last of Us 2, Death Stranding, Red Dead 2, Spider-Man, God of War, etc.
But consoles run at 60 tho
Never understood the graphics plonkers' mindset: I will take smoothness and high fps over ultra settings any day.
Why not both?
@@guitarskooter money dumbass
I'll take stable fps over high fps any day.
mjp152 Really depends on the preferred genre of game: for competitive play I want as high an fps as possible, but if I'm playing single-player I'm happy with anything above 60 fps.
@@coolminer1231 *above 12fps
Title: Ultra Settings Suck. Thumbnail: A Rooster.
Brilliantly British.
OH MY GOD THAT'S AMAZING
I turned up the video resolution so I could notice. The buffering time gave me plenty of time to.
*V-sync has joined the game.*
V-sync has been kicked by Freesync (reason: You suck).
@@B20C0 AMD sucks, FACT. What happens when one company can't compete with the other? The other company puts out a bunch of stupid crap because it doesn't know what to do with itself and then AMD still can't catch up at all because it's overheating garbage that wishes it was something that it isn't.
@@samsh0-q3a Wow, there we have a fanboy ladies and gentlemen :D
You are aware it's not the 2000s anymore, right? But sure, go ahead and pay the idiot tax.
@@samsh0-q3a
People didn't buy AMD GPUs even when they were waaaaay ahead of nvidia.
This is why they're having a hard time competing, you simpleton.
@@samsh0-q3a sucks to be as misinformed as you
nnNNNNNOOOOOO THAT BREED OF TORTOISE CANT SWIM YOU HEARTLESS MONSTER
Ultra settings are for when you wanna take pictures and stare at something.
Spooky
Videogame graphics are getting so good that even medium or high settings look amazing, and I'm starting to find 144 Hz to be a far better experience than ultra settings. Heck, even higher resolutions are starting to become a better experience than higher graphics settings.
Look at me... 1080p is a dream for me; I can play games at 720p (very old games).
My standard is 640x480 and 800x600... each time I play a newer game on my PC I can hear my Pentium CPU cry...
I prefer 144Hz at lower settings rather than a locked 60FPS at ultra. I've now rendered myself terrified of playing Rocket League at 60fps.
IdontKnowWhyAmiPuttingThisLongSentenceAsMyName __ I had the same experience... bruh....... :(
IdontKnowWhyAmiPuttingThisLongSentenceAsMyName __ You don't know what suffering is; my BGA-socket Celeron can't even run CS:GO at a decent resolution. I can hear its fanless CPU cri evertiem.
CrazyWeeMonkey A classic example of this is when you see original PlayStation/Nintendo games over 15-20 years old in 4K: they're truly beautiful. You aren't focused on the quality of textures or the graphical fidelity, but simply lose yourself in the frame clarity and pixel-edge definition that complement the amazing games that came out. Take Spyro in 4K, for example!
I'm over here running silly World of Tanks at 20 fps, and the developers are upgrading the graphics in later updates.
I wish fantasy lands were real.
lol why are you even playing on PC if your computer is that bad xD
My PC is 4 years old and I can run WoT on maximum settings at about 50 fps.
4 years is not old. Mine is 6 and can dish out the same. It's about the components and little upgrades over time (example: an SSD).
Dude, my computer is about 8-9 years old. I can play CS:GO at 30 fps, and when I get into a smoke it drops to 15.
cause pc games are better than console games?
Where is the fun in running a modern game on ultra settings using the most powerful graphics card? The *REAL* fun is trying to run a modern game at the lowest settings on a shitty graphics card at a considerable FPS (considerable = 20 or more lel).
Now *THAT* is what I call fun! It gives you such a feeling of accomplishment!
fun for me is playing the damn game, not messing around with graphical settings.
Although from the messing-with-PCs point of view, I agree. I actually had fun trying to make Forza Horizon 3 playable on my i5 + 650M laptop last year. At first it was running at like 14 fps on average at the lowest settings and the lowest available resolution, which was unplayable (720p, something I thought was absurd; I wanted to run it at 800x480 and wasn't even given the choice, and since it's a UWP game, I couldn't even edit the settings manually :|).
But I got an external cooler and my laptop stopped throttling, and from there I was able to overclock the GPU to further increase performance. Then I noticed that changing from the cockpit to the bumper cam also gave a huge boost to performance. I ended up with a 25-30 fps average, with the lowest drops on the most demanding parts of the map going to 18 fps and the less demanding parts hitting 60. I was able to work with that until I came back home.
Same for me tbh. What I like about building PCs is getting a pc that's capable of doing PRECISELY what I want for the lowest possible price. In my case, it was playing Overwatch at 144fps while recording, and I got that for about 400-500€.
Nvidia geforce 210
Played
Overwatch. lowest setting
Fucking resolution at lowest
Oh b0i
I did it with a 240, but it went at like 20fps.
9500 gt, i3 2100 in csgo
Before: 40-50 fps
After overclock, driver update, messing with pc settings: 100-150 fps
Face it: Framerate is MORE IMPORTANT than Graphics these days in gaming. This is why consoles are bashed by PC gamers for being mostly only 30 FPS.
My rig sports an Nvidia GeForce GTX 660, and the "Minimum Requirements" for Doom says a GeForce GTX 670, but I managed to launch the game and play it at the lowest possible settings, and guess what? It still plays at around 60 FPS and I had a good time without complaining too much about the graphics!
Just wanted to say that.
Im playing on a 144hz monitor and everytime I see 30 fps I get the chills.
Lol gotta try it on my gtx 550 ti
@Sword
I have no use for 4K. I have a 1080p 240 Hz monitor with a 4790K and a 1080 Ti. I haven't played any modern game at lower than 100 fps.
Not really THAT much of an accomplishment, considering Doom is praised for its optimization. Also, you should switch to something more modern; even a GTX 980 would be okay. Trust me.
No, as long as the game can keep 60 FPS, graphics are wayyy more important. Unless you're a competitive gamer. I'm sure 99.999% of gamers don't need more than 120 Hz, even for VR.
Would you rather play a Hollywood-graphics-quality game at 60 fps or Fortnite at 240 fps?
If everyone had been as crazy about FPS in the past as some gamers are today, we would be playing at potato graphics @ 240 Hz.
My MOST powerful computer can only run Half-Life 2 at playable frame rates at the lowest settings and resolution. Why am I watching this?
The whole time I thought I had one of the worst computers..
Does your computer have a "Ready for Y2K!" sticker on it?
Only 3 thumbs up, bc ur jk is way above all these children; gotta get at their parents, silly.
I bet you can buy better computers for $99 people...
My pc lags even before I open anything, I always have to wait a minute for google chrome to open
always keep fps over quality. unless it looks like a total disaster..
or, if there is screen tearing
Palaash Atri 3000 FPS in fallout 4 is too much
This is why I don't listen to people that insist my GTX 1080 ti will be useless in 2 years. I'm playing at 1440p, and I have no problem turning settings down to high or medium. I think I'll be okay for quite a while.
Using a 770 GTX here, and I get by. Usually on High settings, too.
GT 740. I get by just fine on Medium and sometimes even High settings. Doom runs great on Medium and is still playable on High.
Hell, my Fury X still destroys all games at 1440p... if it wasn't for games not recognizing the difference between HBM and regular memory, I'd be fine playing almost all games at 4K. But alas, some games only see that I have 4 GB of memory and don't factor in that 4 GB of HBM is like having 6-7 GB of GDDR5... Still, I'm holding off as long as I can, because Nvidia's options are waaaay overpriced even without crypto mining, and AMD's option has yet to show me mature driver performance -_-
Omar Bedouin Yes, people say that. They're usually talking about 4k with all ultra settings though, hence why I made this comment.
damn Omar snapped
Update: Ray Tracing Has Joined The Chatroom
fps has joined the chat
I hate motion blur...
I can stand motion blur. What I hate is fps drops xD
depth of field is also cancer
Motion blur makes me dizzy every time I turn fast.
Dover I like it when I'm not playing FPS games
Just bought a $200 card and installed it in my old potato PC, plus some new fancy shit like the case, and now I can play new games PERFECTLY. For only $200. We need to stop this shit of paying thousands of dollars just to play with more grass.
Ummm... this comment is wrong in many ways:
-Will it have a bottleneck?
-Is the power supply enough for the GPU?
-Is the motherboard going to make the GPU better or worse?
Etc...
David gomez deossa
- It doesn't have a bottleneck
- The power supply is more than enough
- The motherboard is good
This computer was $500 when I bought it; it came with no graphics card. I've been upgrading it for years.
So true. I had completely forgotten that lower game settings exist below ultra. Remembering this will definitely influence my future hardware upgrades.
Zenthex no u
Puro Ultra 4K Yonii!
oooo, "furfag!" that's an insult i haven't heard of in a really long time! what is this? 2007?
look at his profile picture my dude
***** yeah a fucking furfags
Are you really asking game journalists to do genuine research? You are crazy.
In other words, you don't need the latest graphics card to play your games in an acceptable way.
My girlfriend showed me this video... I dumped her and threw all of her shit out of MY apartment. I have never been more offended in my life. What the fuck will the internet think of me if I game at medium to high settings? I live in America, for god's sake!!!
6-011101 yes because playing in 500000k is the way to go
Well, good for you. It shows how proud you are of your stupidity. And don't get me riled up about this "America" shit.
This joke
Your head
I seriously hope that comment was a joke.
Get that triple-Titan setup to work for its money.
How about people stop boasting about how they can run games on ultra, and git gud at the games they are playing.
Berenstein Bear
I mean he has a point. PC gamers going on about pretty visuals "muh gwaphics" and how awesome the graphics look. Doesn't matter how amazing the game looks if the player behind the controls still plays like a nooblord.
Berenstein Bear 2 months ago bro
I agree
Tell that to the Battlefield graphics whores who play with the Hellfighter all the time and think they have any skill at all whatsoever.
Respect! Ya bitc*es. Stop throwing money at it, you poorly skilled people, and beat the Dark Souls franchise. Also Bloodborne. As well as the original Driver. Did I mention Street Legal Racing: Redline?
Yes, even though I completely agree on this, I'd still rather look for ultra benchmarks as a guaranteed Average FPS... I think a better way to do this is to show both ultra and "optimal" benchmarks, but that would also cause problems as "optimal" is different for everyone...
15 years ago I was so happy having like 20-30 fps when playing games (my PC was pretty bad back then lol). Now that I have tasted ultra at 60 fps... it kind of made me feel like I'd been living under a rock lol
I think 15 to 20 years ago we didn't even realise that FPS mattered... And yeah, my PC was pretty bad too, but those were kinda the good old days.
The "modern games look good even on low" statement goes both ways. With a few exceptions like War Thunder, It usually just raises the bar for hardware entry and results in little change in performance between graphics settings on low end hardware.
So fucking true ^^ I just got a new custom gaming PC a few months ago and chose the parts for it ($1,800), but before that I had a crappy old Toshiba laptop; War Thunder was literally the only game that could run on my potato on low settings and still look beautiful.
I'd rather have an older game maxed out than a modern game at minimum. An unrealistic dream of mine is that modern games would offer an option to use a different graphics engine to increase performance. IMO, Forza Horizon 3 on minimum settings looks far worse than something like NFS ProStreet on max settings.
True
Turning off the hair fx in the witcher 3 gives me +40fps or something
Yeah, most games have graphical settings that really fall in the "experimental" category. If you can run HairWorks in Witcher 3 without sacrificing performance, then do that by all means, but it's not like regular hair in Witcher 3 looks like a big blotch with no detail. Actually, regular hair looks awesome, so the option of making it even better is a welcome bonus, but it's far from necessary and doesn't really add to the sense of immersion and fidelity, yet it annihilates the frame rate.
FPS>Graphics
Check out Mr. Revolutionary over here!
It's not really revolutionary, it's common fucking sense.
Better frames = better performance in MP
Crowder not always.
Gameplay in general>FPS>Graphics
A lil bit of both never hurt.
I played through hl alyx thinking it was on high from just how good everything looked, turned out it was actually defaulting to low
I have to disagree on one point: the reason you stated for why reviewers should not use higher settings is the exact reason why they SHOULD. It's intended to be a benchmark, testing the limits of the card, not just what's playable. Doing benchmarks at absurdly high framerates (induced by graphically un-intensive settings) creates a CPU bottleneck and skews the results. Ultra settings are there to make sure the card is always at 100%.
Want the truth? If I'm playing twitch shooters or war games, you want a 1 ms, 1080p, 240 Hz monitor and don't care about HDR; that's what I would use for multiplayer. But I mostly play story modes. I'm not a multiplayer type of person, due to the amount of kids these days plus the hackers and cheaters; it's an unfair world now. I'd rather have 8K at 30-40 fps. 60 fps would be nice, but I want my story mode to be the best-looking experience possible.
jonathan oxlade Huh, you chose a 5-15% gain in video quality over 20-70% in playability.
I call bullshit on Arma 3 running that well.
Try playing multiplayer then and see how it turns out. It's an unoptimized game that doesn't fully utilize graphics cards.
And by playing, I mean actually PLAYING. Doing a mission or something, not just standing in one spot turning and looking.
Because it's almost all physics...
I have a GTX 1080 and i7-4790k, both generously overclocked, and Arma 3(multiplayer) runs like dog shit unless I make it look like dog shit.
Arma 3 is just a terrible multiplayer game. Regardless of settings it always seems to run like ass on any PC setup.
What's wrong with potatoes?? >:-(
people who compare potatoes to pc's hardware and whatever must have the mentality of a potato.
Jack Le I get none
Well, they're not very pretty.
They're very easy to simulate in computer graphics.
I grew up with the genesis, playstation, N64 where slowdowns and 16 fps was common, and pc gamers cobbling together crazy builds just to PLAY games. The modern pc gaming community just seems like it's, for a large majority, people with well paying jobs or kids with wealthy parents running 3080's and 90's and complaining on not hitting 144 fps on ultra with RTX on. The world feels different man.
whats the point in 4k quality?
1360x768 is the real deal
1440x900 get on my level
Matheusk0 ha, dirty filthy peasant! I play 1389x1.
Matheusk0 800x600 scrubs
Matheusk0 640by480 ftw
1680x1050 fite me irl
1360 x 768 . everything at ultra.
Akash Jaiswal that's actually a dumb idea, because at that resolution you won't get all the effects your rig processes on ultra settings, like high-resolution textures and stuff
abrar khan
seems you've never heard of supersampling
L
MikaHR100 supersampling looks better than rendering at your display's actual resolution, but it isn't as good as a display with that higher native resolution. So why spend the same performance and get smaller gains?
mosquitor !! It's exactly what he said in the video, which is what I said, so why the fuck are you watching the video if you disagree? Seems you are the dumb one, not me. Fucking noob
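Since supersampling keeps coming up here, a minimal sketch of what it actually does may help: render at a higher resolution than the display, then average back down to native. The snippet below shows only the downsampling/resolve half, assuming NumPy arrays and a plain 2x box filter; real engines use fancier filters, so treat the names and numbers as illustrative only.

```python
import numpy as np

def downsample_2x(supersampled):
    """Box-filter a frame rendered at 2x the display resolution in each
    axis back down to native resolution: average each 2x2 block of
    pixels into one output pixel (the 'resolve' half of supersampling)."""
    h, w, c = supersampled.shape
    assert h % 2 == 0 and w % 2 == 0, "expects even dimensions"
    return supersampled.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Toy stand-in for a frame rendered at 2x in each axis (a real 4K -> 1080p
# resolve would be (2160, 3840, 3) -> (1080, 1920, 3)).
frame = np.random.rand(216, 384, 3)
print(downsample_2x(frame).shape)  # (108, 192, 3)
```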
this video is truly insightful, especially when you mentioned how low settings looked then compared to now.
Damn, we have come a long way since then
Even though developers are making games runnable on low settings, hardware is still too weak to run the latest games: now every game has to be larger than 100 GB for some reason.
Considering the speed of hardware evolution, you'll be able to play those games at 4K with ultra-high-def textures and RTX in 2-3 years, no problem. The reason they do it is the same reason you get remakes and remasters of old games. Or should I say the exact opposite: so they stay viable over a longer period of time. It is good, necessary and smart. People who haven't played some games, and people who revisit their favorite games a few years later, will enjoy them even more then. Like, I just bought Titanfall 2 and play it literally maxed out, and it's mindblowing how good the game looks and how good it actually is. And it's a 2016 title. Most underrated game of the decade, and the multiplayer is so good I have no clue why anyone even plays CoD anymore. Now answer me this - why can a Bugatti Chiron go 450 km/h if the speed limit is 150?
@@Z4J3B4NT Now if only salaries increased as rapidly as hardware improvement, what a wonderful world that would be :)
@@DerinTheErkan Well I cant argue with you on that one... I sell my rig every 2 years and buy a new one. But there are used computers all over the place for 70-80% of the price just because they've been used a few months. There is nothing wrong with them and most sellers would even let you test them. But the smartest thing is to set up your own rig part by part. You can also save about 10-20% of the price just by doing it yourself and not buying a preset computer. There are guides on the internet how to do it and which components to use. All in all, just a little time and effort can save you hundreds of $$$.
Thing is, those sizes are pretty inflated by all the MTX content like skins and character/weapon variants and other useless stuff like that, but also by all the extra LODs (levels of detail) generated for ever-growing assets. Have you ever wondered how games manage to have so many graphics options that let you scale down to potato levels? It's all based on pre-generated LODs for everything in the game. You might make a 4K texture (that is, 4096x4096 pixels), but the game engine also generates "mip-maps", smaller versions of the texture that can be streamed in and out as needed based on both view distance and quality settings. And all those extra textures add up.
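To put a rough number on that: a full mip chain halves both texture dimensions per level, which adds about one third on top of the base texture. A small sketch, assuming an uncompressed RGBA texture and ignoring real-world block compression and alignment, so the figures are ballpark only.

```python
def texture_with_mips_bytes(width, height, bytes_per_pixel=4):
    """Memory for a texture plus its full mip chain.
    Each mip level halves both dimensions (rounding down, min 1) until 1x1."""
    total, w, h = 0, width, height
    while True:
        total += w * h * bytes_per_pixel
        if w == 1 and h == 1:
            break
        w, h = max(1, w // 2), max(1, h // 2)
    return total

base = 4096 * 4096 * 4                      # ~64 MiB for the base level alone
full = texture_with_mips_bytes(4096, 4096)  # ~85 MiB with the whole mip chain
print(f"{base / 2**20:.0f} MiB base, {full / 2**20:.0f} MiB with mip chain")
```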
Preach. Ultra settings is an excuse to buy overly expensive hardware. Thank you for finally pointing that out.
Come on, investing in a hobby is not an "excuse" for anything.
And you're thanking him for "finally" pointing out what, exactly? That gaming is viable on mid-range hardware without cranking up the settings to the max? What a shocking revelation.
Kris Molinari Filthy plebs
Kris Molinari Well, ok. He is saying that he feels ultra isn't meant to be the standard or something to be sought after. But personally, I use my own custom settings to add the extra bonuses (AA or other cool effects) while staying at 'high'. Still, ultra is nice. I mean, I recently got Skyrim (not the Special Edition, because that doesn't mean much) and set it to ultra and it looks like a modern game! The HD bonus pack helped, but still. I feel ultra settings are for games that are going to last and last, not games that rotate constantly.
Not overly expensive.
Unless you are like me, and can easily see the differences between high and ultra high in games. It honestly depends on what game I'm playing, I can happily play modded Skyrim at 30 fps, if it's pretty. If it's a competitive game, I'll try to hit 60 minimum. Ultra settings are definitely not excuses.
Nah, people will still complain even if they run their games on Extra Ultra and get over 150 fps.
They want more no matter what.
Me, if I can run the game and get a decent 25-40 fps, I'm happy. Even if it means playing potato games.
SunBro's \[T]/ Same...
SunBro's \[T]/ ONLY 200 FPS ?! FUCKING TRASH MOM I WANT A NEW COMPUTER 😂😂😂
I'm happy with 50 fps.. on tf2.. with ultra low config... the game still lags.. tho usually because my ram is 2gb and tf2 almost takes all of it.. sometimes tf2 crashes because of not enough ram if i run google chrome too.
SunBro's \[T]/ 30 at least, 25 is painful.
Niels Vantilburg LOL!! No. 15 is painful
You have a very strong point, although I disagree with the part where you say benchmarks should not be shown at the highest settings. The point of a benchmark is to push the system to its limits and see how cards compare under the kind of stress they might see over the 2-4 year lifespan someone owns the card. The framerates of a 1050 and a 2080 Ti at the lowest settings will be much closer, because it is not hard to run most games; there the CPU bottleneck comes in, which exists no matter what. Therefore benchmarks should always be run at the highest settings.
Video games make their graphics beautiful because people care about how the game looks. I play almost all games I own on lowest settings to get that sweet 144fps. Many people prefer quality over frames, not to mention streamers. Also, not everyone plays first-person shooters where you get an advantage on lowest settings. Games that are slower, like you showed in the video, The Witcher 3, they were built to be beautiful graphics and story that immerses you. Why would anyone want to be running around a low poly world if they could make it look beautiful by changing the graphical settings? (Assuming their hardware can handle medium to high settings.)
Do I even need to mention the virtual reality games coming in the near future, where ultra won't just look good? They may be so believable that you forget you're in a game.
These are my two cents, I do agree higher fps is king. But there are some criticisms still here! Have a nice day!
our standards have slowly Ryzen over the years
Rex i see what you did there
Vega seems pretty dim lately.
Battlestar Frakyrie I wouldn't go for Vega, since Volta is coming out next year. Vega is a nice card, but personally I wouldn't go for it; you do you, I guess.
Vega is like a 1080 card... 1 year too late. Volta is gonna slaughter AMD
@Desertcamel @Cheeki Breeki For 2018 AMD has Vega 2.0 to compete with Volta, and it will not have xx70 and xx80 performance, but higher: imgur.com/a/4s3nV. Volta is not gonna slaughter AMD until the GPUs are out.
This video aged like fine wine.
Auto generated captions be like
*RYZEN*
I think this is more of a misunderstanding of the purpose of those benchmarking graphs. Usually the priority is **not** to represent a "realistic" use case, but rather to make the differences between the products being compared as big as possible, so that you can fairly compare each product's maximum potential. This means that benchmarkers deliberately create GPU-limited scenarios when comparing GPUs (turning up the resolution and graphics settings) and CPU-limited scenarios when reviewing CPUs. I feel like people often misunderstand this, which can mislead them. Benchmarking graphs like these are primarily meant for relative comparison, not for absolute numbers; as said in the video, there may often be better performing options that look similar, that's just not the point of the benchmark. Furthermore, in reality the differences between these products may in many scenarios be less pronounced than in the benchmarks, because the benchmarks are deliberately designed to make those differences apparent. I really hope more people keep these things in mind when interpreting benchmarking graphs and making buying decisions.
Also, some ultra effects are actually annoying to play with - for example overused DOF, or especially motion blur. Dear developers: stop putting motion blur in your games! It's annoying af.
fuck off motion blur is good
Stuntkoala i played rust with motion blur and it was unplayable and when i turned it off it was great
Stuntkoala WTF NO. Developers don't listen to "Stunykoala" who thinks that just because he hates motion blur that it must be true for everyone else! I love motion blur. As a matter of fact, I look for that setting in any game I play. It makes the game more cinematic and more fluid looking. And the motion just looks badass. I can't understand how people hate it so much. I also read motion blur is supposed to be a realism setting. Uh because in real life when you move your eyes, there's a blur? Duh.
I despise motion blur, looks fucking awful and it usually makes me lowkey motion sick
Clayton Hernandez
it isn't a realism setting, it's used to replicate camera motion blur
When you're so early that no one has finished the video.
TheKingOfArmadillos I have xd
I did!
that's what 2x speed is for in the youtube player. My speed run record of all of youtube is 7 days 23 hours 8 minutes 40 seconds
beaver-tails64 69 , I watched the video 2 minutes after release.
TheKingOfArmadillos how??Xd
At 6:55, is it just me or does the water on medium look much better than on very high and ultra?
El Libertoso Like he said, it looks more artistic; I guess you wouldn't be able to see ripples from there, and the look on ultra is more realistic. Don't get me wrong, I think the water looks better on normal.
Medium is nicer, but high and ultra are more realistic
Artistic difference. Sometimes more realistic stuff looks worse than drawn stuff.
Take that fps he was playing here. Would it look better if it was actual footage of some guy? No.
Our 3d artistic creations are now more desirable than what is real.
Have you seen real people on 4k? They look horrible
I think that's down to SSR (Screen Space Reflections). See, the brownish rocks on the right are reflected in the water, resulting in a muddy-looking river, while on the "normal" / "high" setting it's static water with some specular highlights. Honestly, I've always preferred not using SSR, since its artifacts and performance cost are immensely annoying and not worth the questionable increase in visual quality.
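For context on why SSR misbehaves: it can only reflect what is already on screen, because it marches the reflection ray through the depth buffer in screen space. A rough sketch of that idea, assuming normalized UVs, a depth buffer in [0, 1], and a reflection direction already projected into screen space; the function and parameter names here are made up for illustration, not any engine's API.

```python
import numpy as np

def ssr_trace(depth_buf, origin_uvz, dir_uvz, steps=64, thickness=0.02):
    """March a reflection ray against the depth buffer in screen space.
    Returns the (u, v) whose color would be reflected, or None when the
    ray leaves the screen or never hits -- the classic SSR failure case
    that shows up as missing or smeared reflections near screen edges."""
    p = np.array(origin_uvz, dtype=float)
    step = np.array(dir_uvz, dtype=float) / steps
    h, w = depth_buf.shape
    for _ in range(steps):
        p += step
        u, v, ray_z = p
        if not (0.0 <= u < 1.0 and 0.0 <= v < 1.0):
            return None                      # off-screen: no data to reflect
        scene_z = depth_buf[int(v * h), int(u * w)]
        if scene_z <= ray_z <= scene_z + thickness:
            return (u, v)                    # reflect whatever is drawn here
    return None
```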
That's because the one on the right is closer. Watch a bit further and the very high and ultra ones look pretty much the same. Or seemingly closer anyway, not sure what that is.
Absolutely well said. Additional graphical fidelity starts to become a negative at the point where the drop in frame rate is more noticeable than the supposed "better" visuals.
Still rocking a 7950; it still runs great after all these years.
Gary Harrison cringe
Maraudience mate, fuck off.
6970 here. I envy your driver updates.
780ti here, sure it was a beast, but now is below average
7670m here, still can run new games.. if it's struggling, just play windowed with smaller resolution. once you get used to it, no problem at all :D
This video just woke me up. For years I haven't bothered with anything but ultra because that's how you play like a "badass". I am going to try turning down some settings and get better fps.
Ultra settings are overrated as hell! Thank you! The settings one step below ultra are the best, or even two steps down.
justifying bad PCs
m1ksu justifying being a yuppie
+Matt Frankman yup
Resolution vs FPS is very subjective. I own a 144 hz monitor and I kind of regret it. I think a 75 hz one would be better for me.
foreign's gaming tech buy a 75 hz and overclock it
Before: High settings were the only ones that looked good, so High was the best.
Today: everything Normal or above looks the same, except that everything above Normal runs slower, so Normal is the best despite High and Ultra also existing.
I feel like as a consumer you just need to be aware of the purpose of a benchmark. It's to show the difference and see what each GPU is capable of. What you're requesting is optimization guides on what mix of settings are the best for a balance between image quality and performance. They're two different things.
I'm still content with 1080p 60 fps gaming
yet here are consoles trying to go for this BS semi-4K res and sometimes they can't even manage 30 fps
hell walker Why would anyone want to play at 4K 30 fps, especially in an action game? This is really nothing new though; people who don't choose to educate themselves usually prioritize resolution over fps.
But higher fps at a decent resolution is way better than higher resolution at 30 fps
Fps is always the most important factor in action games, especially ones such as CS:GO. It's all good and nice being able to run a game at 4K on ultra, ya da ya da, but if you're only running 30fps you are not going to do well.
1024x768 all low rocks
yea bitche fuck 1080p and 4k silvers
I'd rather have a benchmarking "standard" (like the ultra presets) that every media outlet uses, even if it doesn't reflect the way you actually want to play a game, because:
1. I can compare benchmarks between different magazines, websites, users etc.
2. I just use benchmarks to compare the performance of a GPU, not the performance of a game. If I want to know how to get the most fps at good quality, I can look at plenty of guides online.
2b. I (and certainly 95% of the community) would buy a game even if I can't run it on ultra, because I don't want to miss out on a great game and because I can certainly run it if I am within the recommended system requirements.
I think standardizing the benchmarks is more important than getting a super accurate indication of real-world performance off the bat. Imagine the struggle of getting a good comparison between hardware when everyone uses slightly different settings, oh boy.
Should reviewers show how hardware runs on a number of settings? E.g., a new AMD card gets 70% of its rival's performance; if you want to match the rival in terms of fps, you need to lower settings to, say, medium (figures made up for the example).
I would honestly like that in a reviewer as a PC enthusiast.
Or they could put a little more effort in and benchmark BOTH ultra and medium/high settings. People who buy a 1060 aren't looking to play on ultra settings so a REVIEW shouldn't show only ultra. It is retarded and misleading.
Krytern UK I have the RX 480, gaming on 1080p Ultra and getting good framerates, so I'm sure plenty of GTX 1060 users care about Ultra.
Yeah you've just convinced me to go with an ultrawide (which I've always wanted to game on) when I get my RX 5700XT. I've been hearing so many mixed opinions on whether that card will be enough for a decent 3440x1440 experience and the benchmarks are okay but not amazing. It's so true though, if it can run the most demanding AAAs of today on ultra at 40-60 then I'll be able to achieve 60+ at high/medium in future AAAs for a few years yet.
@NelyL I run RTX 3070 on 3440x1440 and most of the time I can easily get >80fps. The exception being demanding RTX games like Cyberpunk or Control.
It has come to the point where I barely care enough about the settings to even fiddle with them. I'll usually go in and see what I can turn down just to get more fps. Some games look fine on low, medium and high, so I have no reason to play it on ultra.
Same here, but boy, do I get a hardon for Ambient Occlusion.
recently learned to love AO, for me it was worth the Frames lost.
I think my priorities are:
Performance/resolution (1080p)
V-sync off
Anti-aliasing/textures that aren't god awful
Lighting
Everything else
DenniTheDude ;) fiddle
My only regret from buying an R9 285 when it came out is that I got one with only 2GB of VRAM. That severely limits the one quality setting I care the most about, and that's texture quality/resolution.
Same with the GTX 770 - 4GB is still relevant; 2GB is sad.
3:00
yea!!! if my Radeon RX240 can't run GTA I'm suing Rockstar
it probably can
even my gts250 ran gta v okay
chen howard My integrated GPU in the A8 7650k runs GTA 5 fine.
There's no RX240.
bestiebest bro, my PC BlueScreens after 2hours of that.
Such a good video. Way ahead of its time compared to everyone else who made this same video.
Turn down the settings from Ultra? What do you think I am, some kind of PEASANT?
there should be a paid dlc for ultra extra settings , i bet you would buy it insta :P
you're a peasant if you're gaming at 30fps
You're a peasant if you're still gaming at 1080p@60 in 2017.
You're a peasant if you really think your graphics settings being higher than someone else's makes you better than them.
Wes Tolson it's a joke man.
So basically you're saying "High settings for gaming, Ultra settings for screenshots"
i can confirm that Doom does look really freaking good on lowest settings.
I accidentally put it on low and didn't even notice for like 3 levels. The optimization on Doom is insane. I built a $300 steam machine with used parts that can run it at above 90fps (I think on high but I don't want to lie)
I have an RX 480 and I run DOOM on ultra at about 150 FPS at 1920x1200
Sticking to 1080p is always good with the hardware increasing
1280x1024 masterrace!
NiTROXx_Gaming stop
NiTROXx_Gaming
true 1360x768 is the best cost effective
I had to endure it, just no. Just no.
5:4 RULEZ
If you are still using a monitor with that resolution I feel sorry for you
Couldn't agree more, Phil. I am still using a GTX 480 and people can't even believe that I can run games like GTA V, Witcher 3 and Crysis 3 at 1080p. They think that I am a liar. I don't see a reason to upgrade; I don't need more power (for now). The freaking marketing made everyone think that you can't play AAA games with older, or just lower-end, hardware.
so true
Exactly. I plan on getting a 1440p monitor soon, and I've been told that I absolutely need to upgrade my GTX 970 because there's no way I can run games at 1440p, but of course I can, a 970 is still a fantastic card. I hope to try out my old GTX 750 and HD 6670 at 1440p just to see what's possible.
Gou Well, I had a 480 and upgraded to a 1060 this year just because all of the games you listed didn't run properly even on the lowest settings (except Crysis, because I haven't played it) and it got obnoxiously loud and hot. But it was a pretty nice card, since I had it for about 5 years and it never caused serious problems.
Jan Schulze Didn't run? I am running GTA V on high at 720p and I get 50 fps. My 480 is OCed though.
***** That could be why it runs better. But I guess the card just gives up after 3 years of excessive use every day (~4h on workdays, 8-9h on weekends) and 2 years of irregular use. And the cooling things on the side of the card are just brown or light black now.
I thought the purpose of ultra settings was for when a game is 14 years old and our PCs are monstrously powerful compared to the PCs we had at the time. My computer's pretty old now, my hardware's failing, and the components are old and underpowered. But I can run old games like Half-Life 2 (2004), or something along those lines, on ultra with 32x anti-aliasing perfectly fine. It's honestly awesome for that; it's just games post-2012 where I start having issues, as my computer just can't do it anymore and I have to reduce the settings to high or medium, and for even more recent games I have to go low or very low, and even then I can't get 60.
I mean, that's a good point. I had an RX 550 2GB, which is a cheap, budget-oriented graphics card from 2017, and while playing Need for Speed Underground 2 I could put it on max settings at the max resolution it supports, hit the fps cap (140 fps) with a stable framerate, while still using at most 70% of the GPU in the worst case.
Still, the game looks pretty good to me even though it's a game from 2004.
Back then, however, you needed a high-end graphics card to run the game at max settings, max resolution and a playable framerate. Nowadays you can do it even on integrated Intel HD graphics (you'd be getting about 30 fps in that case, though that is definitely *playable*).
Motion Blur Off.
Anti-Aliasing Off.
Depth of Field Off.
V-Sync Off.
Shadows Down.
Physics Down.
View/Render (LODs) distance Medium to High.
Textures and similar settings rarely change drastically between even just High to Very High, and Very High to Ultra.
FOV, High.
Borderless Windowed Mode.
Bloom/HDR, Depends on the Game.
We Good Boys.
That 3 way GTA V bit made me nauseous, holy shit.
Well if I can run a game on Ultra on 60fps I set it to Ultra but if I can't I don't.
I feel a lot more secure in my GTX 780 now that I've realised that I agree with this video.
Therid Li i would upgrade....
5 years on and I'm here feeling like an old man talking about vinyl. "Why did you take 25% more of my GPU and an extra 75 flipping WATTS to make the game look WORSE?" I mean, having been a kid, I tend to defer to kids with regards to "taste" and "culture" as it's very rare for anything I liked 20 years ago to be relevant today... but do high schoolers actually prefer to have motion blur turned on, for example? I personally hate it.
Nah, most people don't like motion blur for action-oriented games
@@antt2228 Simulating motion blur in a high-FPS game is strange to me, since in this context it's a camera artifact caused by long shutter speeds, to my understanding. Our eyes do have something akin to motion blur (roughly equivalent to a 1/50 s shutter, and you can think of the eye as a top-quality, noise-free camera with ultra-high resolution near the focal point, even in low light), but we'd perceive that kind of blur anyway in a game or video playing at a high enough frame rate, even without a motion blur effect.
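For anyone curious what the effect simulates: camera motion blur is the average of everything the sensor saw while the shutter was open, which games commonly approximate by blending several sub-frames or smearing along per-pixel velocity vectors. Below is a toy accumulation version, assuming a hypothetical render_at(t) that returns an HxWx3 float image; it is a sketch of the idea, not any engine's implementation.

```python
import numpy as np

def shutter_blur(render_at, t, shutter=1 / 50, samples=8):
    """Approximate camera motion blur by averaging sub-frames spread
    across the shutter interval. A longer shutter means more smear;
    shutter=0 (or samples=1) gives the perfectly sharp frame."""
    offsets = np.linspace(0.0, shutter, samples)
    return np.mean([render_at(t + dt) for dt in offsets], axis=0)

# Toy scene: a single bright column sweeping across a 64x64 image.
def render_at(t):
    img = np.zeros((64, 64, 3))
    img[:, int((t * 200) % 64)] = 1.0
    return img

blurred = shutter_blur(render_at, t=0.3)
print(blurred.max())  # < 1.0: the column's brightness is smeared over several pixels
```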
You would be surprised how big the difference is on large screens.
Especially with FPS, cause everything has to move further along the screen.
I agree with Chatos. Framerate is even more important on a large screen. I left 60 Hz with the original playstation in the 90s, even DOS was 75. No way am I ever going back to 60 Hz. 4k 120 Hz needs to come way down before it will make sense.
Who is watching at 144p 15fps?
me
lol noob I've got 360p at 17fps get on my level
Filib931 Kek... Lucky you I watch it at 80p 5 Fps
I always tweak every setting of every game to the minimum level that keeps a given graphical element visible in gameplay. Siege, for example: shadows on low turns them off, on medium they turn on, and from then on they just get higher res, so there's no reason to go beyond medium for that one. It's how I get the best visual/performance ratio; works every time.
I could actually see a case where that might give you an edge as a gamer even if you can run ultra at 60+ FPS, since no shadows means you can see things more clearly. Shadows make things dark and offer less visual/light information to our eyes. Only time I think a shadow helps us functionally (not aesthetically) is to improve spatial awareness, but not all games require that (ex: first-person games don't require that nearly as much as third-person since you're already directly seeing through the character's eyes).
YouTube automatically translated this video's title to Spanish for me, and due to Spanish grammar the meaning is left ambiguous and could be interpreted as "Settings ultra-suck", which I find amusing heheh.
Hahaha the captions say RYZEN for the word risen
I completely agree. Ever since the games industry overtook the movie industry, the target audience has slowly changed over the years to the point where your loyal, true gamers have been completely shunned. Now is the age of the casual gamer, where games are made to look and feel like the games you used to play. Everything is about graphics rather than immersion, when they should go hand in hand. The most visually impressive games today are just dead inside.
no need to cough, bro!!!!
EVERY CALL OF DUTY SINCE BLOPS 2 HAS BEEN WATERED DOWN DOG SPOOG FOR SUB 100 IQ COUCH ANGELS
:D
I saw in some reviews that WW2 looks the same as MW1, but uses over 10x the VRAM lmao
scam detected
Just turn AA off if you have a 4k monitor.
Agreed, I don't even find I need any AA at 1440 let alone 4K.
Even at 1080p I turn off AA. OK, SOMETIMES I turn it on when the game itself has "ugly edges" (dunno how exactly to describe what I mean... bad English skills, native German, sry). Witcher 3, for example: the world is so overflowing with detail that I barely even noticed whether AA was on or not (except at its highest setting, which dropped me to 10-15 fps). Turning off AA hasn't decreased my whole (awesome) experience with Geralt at all. I got 40-55 fps at customized max settings with only 8 gigs of RAM, a Radeon HD 7970 and an i5-2500k @ 1080p.
The word we usually use for the lines that tend to get smoothed by AA, is "jaggies" :)
What the fuck are customised max settings even supposed to be? They're either maxed out (in which case you're not getting 55 fps with a 7970) or they're customised, in which case they're not max settings.