Thank you. Someone talking sense about smooth gameplay being more important than very small details being improved on. Especially if players are trying to have a good experience on the latest games with three, four or even five year old hardware.
Ultra settings are designed to push people into buying new GPUs. Everything is business, so companies give us these ultra settings so that we keep buying over time, and this becomes a vicious circle that never ends.
at 3:15 check the bricks on the wall of the building on the right, that's actually some lovely detail in there, you can see separation between layers of bricks and even individual ones (even though it's all just a single texture with a bump map applied to it)
Yeah i have a 1080Ti and i usually turn shadows down to high or very high (whatever is 1 less than ultra) and maybe some AA down so i can get a good 4K 60.
1080ti owner here, with an ultrawide with 75 hz. I run every game on ultra everything with one exception - the shadows on some games I also turn down 1 notch. With my monitor, the 1080ti is perfect for even the most demanding new games, running them at 60-75 hz and looking/playing great.
I prefer high over ultra for its consistent performance, and I agree that while showing ultra benchmark results is nice for window shopping, most consumers would get more use out of the realistic high settings in 1080P / 1440P @60FPS / 120FPS benchmarking.
My problem with PC gaming is that I am a spoiled brat and always want ultra settings. If I don’t have them on, I feel like I’m missing out and I obsess over it, even if there is little visual difference. Whereas on console, the experience is set and I don’t even think about the visuals because there is no changing them.
haha I played Resident Evil 6 and Outlast and Black Ops on a Pentium 4 and GeForce 210 at between 3 and 11 fps, so 40 or 50 fps are like a dream that I can't reach ... worth salutations ?! ;)
I wonder how many graphics enthusiasts would actually get a better visual quality increase by investing in some glasses as the first thing. And no, I'm not joking. People who look at screens a lot tend to be in need of some vision correction as a result. Very often, actually.
Me at 0:30 Most Game Devs optimize for High not Ultra: "What??!!" Thx for that info m8
I miss the times when I was really young and when someone would install a game on my PC, all I'd want to do was start and play it without tinkering with the options. Now, I can't start a game without checking out the settings menu first. When a game starts with a forced intro/tutorial which doesn't let me access the menu until I finish it or reach a certain point in the game, I feel a bit frustrated.
walter z the Ryzen 5 1600 has way better value than the i5. Intel fucked up. Instead of reacting to AMD's Ryzen 5s, they introduced the X299 i5 and i7 that are exactly the same as the normal chipset ones. Epic fail.
You're probably right, let me rephrase that: i5 or the Ryzen $200-ish equivalent CPU. My point still stands, the best bang for the buck is still high-ish mid-range. BTW "Epic Fail"? For who? I hold no vested interest in any company, I'm platform agnostic, I only care about the best bang for the buck. I think these companies competing is an epic win for the customer.. to choose sides because you need to justify your purchase is just lame ~no offense
walter z Epic fail for Intel. Haven't you seen all the criticism that the new X299 boards with the i9s got? Intel did some stupid anti-consumer shit. They sell the i9 on a new chipset that's only on fancy, expensive, at least $250 boards, ok. But what the fuck is this i5 and i7 that have a fancy X in their name and fancy boxes and just very slightly outperform the normal i5 and i7 on the normal cheap boards for the same price ($250 for the i5, $350 for the i7)? My $100 B350 mobo + $220 R5 1600 combo outperforms even the i7 for way less money. I mean c'mon, the i5 doesn't even have hyperthreading? Quad cores on a $250 mainboard in 2017? They don't even support all the PCIe lanes. And what the fuck are these RAID keys? DLCs for hardware? AMD is just looking way better and more honest with their offerings with Ryzen, Threadripper and Epyc. Bang for the buck and honesty. Intel looks like it's about to collapse under the pressure of the competition that they haven't felt for many years. I don't want a monopoly. I want good and honest competition and good products.
The Ryzen 5 1600 stands no ground against the i5 7500. In overall value the Ryzen 5 beats the i5, but we are talking about gaming here, not content creation. If you are exclusively gaming on your PC and not working creatively with it, buying an i5 is a no-brainer. Intel still has better single-core performance. If you don't believe me, go look at some benchmarks. The i5 7500 beats the Ryzen 5 1600 in every game.
Really depends when and where you buy, as the cost of a 1070 is shooting up so far and so fast that the 1070 and the 1080 cost the same at some online retailers, so getting a 1080 would be better if that is still the case, but the prices change day by day.
tbh i just want my games to run at high for the modern titles so that when future titles come out i will still be able to play them at 60fps on lower settings instead of needing an upgrade to play them on 60.
Nicknack only pro players care about being competitive. For me, I play CS:GO but I don't care about losing or winning, I just have fun. If you are just a normal person who has nothing to do with pro gaming, streaming or any other field of gaming, why stress myself for nothing?
I play CSGO and Overwatch on low since they are competitive games and I don't want to get distracted with VERY AMAZING FLASHY LIGHTS GRAPHICS and that shit, I just want to play with simple graphics and a solid fps
They never were, they will never be. They are always meant to tank the FPS as much as possible just because it's possible and to force people who have this "pride"/requirement to invest in hardware. There is a reason the console versions do not use the ultra preset. The FPS/image quality ratio is not there, it's a big premium you have to pay for almost nothing in return.
Totally agree with you about the high setting. As a casual gamer, I also can't easily tell which one is the Ultra setting and which is the High setting during gameplay.
Unless you can afford a top tier graphics card. I don't need imaginary framerates higher than my monitor's refresh rate, I'll take ultra settings instead.
In most cases the Ultra preset is basically just turning the resolution of certain effects up beyond the maximum the developer intended so they can check a box. i.e. we are talking about what amounts to a brute strength approach to image quality which normally doesn't make sense from a performance perspective (especially if you like to play at above 60fps). Many games basically look nigh on identical at high, with the exception of shadows which are a bit jaggier (though this is something that will only really matter in close-ups in cutscenes). Textures I'm not factoring in because they are more of a VRAM issue than a speed issue (provided we aren't talking about a really old card). The difference between High and Ultra might not be very appreciable, but the difference between 60fps and 90fps sure is.
Watching settings on youtube vs actually seeing them on a PC is a completely different story, just as listening to a high quality FLAC or WAV uncompressed music file is far different than an mp3 or youtube file rip.
Right on man. I have a Core i5 7400 with a stock cpu fan, 8GB of ram, and a 3GB GTX 1060. I can play most games at 1440p on high (even one or two ultra settings) on Killing Floor 2 for example, and this machine crushes it, with the lowest I've seen on a custom map being roughly ~65 fps. Many reviewers on TH-cam have the "Linus" mindset that if it's not 4K at 144 fps, you're not gonna have a good gaming experience, which is false. Nice video man. You earned my sub.
+Joker Productions, thank you, thank you, THANK YOU for this video. I consider myself a PC enthusiast, chasing after the best performance in ULTRA settings, and feeling ashamed of my rig whenever it wasn't getting above 60fps on the latest title in the highest settings. In fact, I recently upgraded my video card, unnecessarily, from a GTX 980 to a GTX 1080, which yielded better benchmarks, but didn't really change my gaming experience all that much.
I love your point. While I may game with different settings (between medium to max), I hardly ever game with everything on ultra (unless its an older game or well optimized game, like doom or overwatch). So watching a benchmark of a card with everything on ultra for everything doesn't really help me in a real world sense. At least everything on high gives more of a realistic, "in the middle" approach (that I'm probably going to end up using). One big case of the higher settings not mattering is tessellation. Most of the time there's hardly a difference between medium and high tessellation settings, much less high and ultra. And as for lighting (and by extension, shadows), I sometimes think the highest settings can be a little jarring, due to the light and/or shadows having an "uncanny valley" effect.
Tom Clancy's The Division's ultra settings were blatantly obvious, it's sharper and crisper with better contrast. The high settings look like Ultra but with a fine layer of silicone grease smeared over the screen; in some of the later games, not so much difference. I'm a photographer so maybe having a trained eye helps me see the differences.
1:32 I 100, 200% agree with you there. I play Overwatch at 720p low with 50% render scale on my laptop, and although it looks pretty ass the 45-60 fps gameplay I play at is infinitely more playable and enjoyable than the 15-30 I get with a higher res.
Performance = casuals. How about you think about your wording next time? I am a hardcore gamer, but I cannot invest thousands upon thousands of ad revenue/patreon + job money on computers. I am a hardcore gamer, but I have to be realistic. Unlike you, not everyone is living the mile high club. Just because people prefer playing on High/Medium it doesn't make them "ew filthy casuals".
I am after the visuals. If I play a game like Witcher 3 or BF1 I would go Ultra because it looks somewhat better, you immerse yourself with the lighting, and I at least want a solid 60fps. If my GPU can't take it, I'll just buy a stronger one. You don't have money? Get yourself a job, done.
fuck no it doesn't! Been a console gamer all my life, then I finally fell into the whole pcmasterrace bullshit. Got a good job, started buying the most expensive components, and I concluded that other than the vast amount of game discounts that exist on PC, there is literally no reason to go from a console to a PC. I literally notice no difference other than a small visual increase on PC when compared to consoles. Pisses me off
although this is somewhat true, at the same time the visual increase can be pretty damn nice when coupled with 60fps, which consoles just can't do......... until Xbox One X of course, and from what I've seen it seems the One X will perform pretty much the same as my GTX 1070 rig, which cost me more than the price of the One X lol
Mmmmm thanks for the reply. Never heard of ReShade and similar software. Been doing research after you mentioned it and have to say that it sounds like a really cool way to customize the gaming experience. Do have to say though, I'm seeing a lot of posts saying that such tools can be a pain due to them being buggy. Thanks :)
Depends on the hardware you're using. When I play the witcher 3 at ULTRA settings, 4K, 60fps, 4:4:4 chroma on my LG OLED TV, it's the best visual experience I've ever had in gaming. Mass Effect Andromeda, Arkham Knight, and other games look fantastic maxed out at 4K/60fps.
I did that PC gaming thing in the early 2000's, tried it again in 2011. Never again, total waste of money and the experience is still the same. These idiots that think that just because you can PAUSE AND ZOOM IN a video to see the differences makes it worth it lol. I remember when they used to port arcade games to console in the 80's and early 90's you'd have sprites half the size, half the animation, way lower sound quality, and you'd be missing the ENTIRE FUCKING BACKGROUND!!! Now people jizz over an object in the background that looks a little clearer if you squint, lol!!!
DigitalHaze65536 it's about the FPS. I can play on very high settings at 144fps giving me a super smooth experience as well as better visuals compared to consoles 30fps. It works out cheaper pc gaming over 5 years than it does on consoles too. No need to pay for online services. Games are generally cheaper especially if you wait for sales.
Thanks for the video. It's been a while, but I'm getting back into PC gaming (vs. console). Mostly saving and playing around with PCPartsPicker to see what I can (pre)build. This helps out a lot.
For anyone trying to spot differences, look at any part that uses shaders (reflections, water, close-up shots of lighting/shadows). You should be seeing a slight difference in the accuracy of the shader there.
Very good video thank you. I did not realise the difference was so insignificant.. I'm one of those that tries to play everything on Ultra... I will now review that :-) Subbed!
Recently got a gaming PC, admittedly it usually hits above 60 fps on ultra settings for most games, with very few occasional drops. But on the 2016 Doom, I noticed heaps of particle and lighting effects in some levels, particularly the "VEGA" level - for those who've played it. It's the only snow level in the game and I experience huge fps drops in ultra settings. My two GTX 760 GPUs just couldn't handle the snow, even when working together. So, I turned it down to "high". The game still looked gorgeous, and the snowy VEGA Mars lab was portrayed on my screen in beautiful 1080p at over 100 frames per second. Huge difference it makes sometimes.
Totally agree with you. I run my games normally on High settings to get the better performance over visuals, unless Ultra settings do not affect performance that much.
I use a mixture of both high and ultra, this is what I do:
Textures - ultra
LODs, i.e. loading distance - ultra (I hate pop-in)
Post-processing effects like motion blur, DoF, lens flare - off
Everything else - high
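A minimal sketch of that kind of mixed profile, written as a hypothetical Python dictionary; the setting names here are made up for illustration, real option names differ from game to game:

```python
# Hypothetical mixed high/ultra profile -- keys and values are illustrative only,
# not any particular game's actual option names.
custom_profile = {
    "texture_quality": "ultra",   # cheap as long as the card has enough VRAM
    "lod_distance": "ultra",      # draw/loading distance turned up to avoid pop-in
    "motion_blur": "off",         # post-processing effects disabled
    "depth_of_field": "off",
    "lens_flare": "off",
    "shadows": "high",            # everything else stays on the High preset
    "lighting": "high",
    "anti_aliasing": "high",
}

for setting, value in custom_profile.items():
    print(f"{setting}: {value}")
```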
In any multiplayer game, competitive or not, I'd rather rock low settings with higher res + AA simply to have an edge over players who don't. Ultra quality in the end only gives you visual stimulus and excitement for a short while. Realistically, in any game, be it a shooter, MMORPG, etc., after a certain point when you're no longer playing for fun but rather to win, you start ignoring details in the game world and focusing only on what is necessary. When you hit this phase, winning gives much more excitement than visuals, and to do so, most of the time you'll need to turn off distractions like crazy amounts of lighting effects, motion blur, etc... to maintain higher frame rates and reduce visual burnout.
use high settings for future benchmarks!
Joker, that's the right way :)
Joker Productions that's reasonable. Great job man! I'm a huge fan of yours. You don't talk b.s., you just jump to the point.
good because that's why i subbed way back when. Before Gimpworks crossed your lips
TonyPSR dirty casual
Benchmark games at all the presets available I would say
Biggest and most noticeable difference between high and ultra is that they are spelled differently.
Yo! You're right! I'll think about using ultra settings thx to you! XD
A Random Lantern lol no problem
PowerUpTo360 holy shit great eye I'll notice that everywhere now u broke my glass ceiling
Oh but there is one more. Fps difference. Other than that, nope.
FPS difference and thats about it.
They're playing on the pride and competitiveness of gamers, so they have to upgrade to newer cards.
High vs Ultra , biggest "spot the difference" contest ever.
Well good for you.
Francois thats it buddy roast'em good
lul i fucking died
I Hate Humans It took you a month to recover from that burn.
If he could just tell me exactly what difference he could clearly see, instead of being sarcastic, I would try to look more in depth to see for myself. But in my opinion, if there is a difference, it is so small that it isn't worth the FPS drop.
Ultra settings are just for marketing to sell more video cards.
Shhhh not so loud :D
Nick Gaydos Winner Winner Chicken Dinner!
i agree at most cases
I agree, but the manufacturers don't have any say when the benchmarks are being done by reviewers. And I would think that AMD or Nvidia would want to showcase a lot of their cards at medium or high settings if they had the choice. They would sell more. I just hate how the whole industry chases this "ultra" dream. Every review, every benchmark, every TH-cam channel, every website, everyone showing ultra. No one showing what the cards can do at lower settings, when most cards struggle with frame rate at ultra. In the future there should be a graph showing ultra, and one showing high and medium as well. That would give us way more info.
That only works when there are incentives. I fail to see it from the developer's point of view. Perhaps you can clarify?
Modern games even at medium look good.
Edward Teach Ghost Recon Wildlands does not
Ghost Warrior 3 does even at lowest settings.
i disagree
Well it's all about optimization.
Rainbow six siege is easy to run and looks good on low settings
Ultra makes great screen shots, High gives smooth frame rate (therefore better gameplay). It's icing vs. cake again. #TeamCake
Totally agree. love your channel!
you better answer to your own comments
ok?
Gopher 4k makes a big difference with ultra settings. It stands out the most in 4k.
Not really no. In fact one of the best ways to make things stand out in gaming is to reduce details, texture quality, shading etc. Oh and remove anti-aliasing. Aliasing is horrible to look at but actually draws the eyes to movement which is usually a benefit.
Of course it does depend on the game and what you are trying to achieve. But for PvP type games a good trick is to set textures low (especially non character ones) and massively simplify lighting options. An example of that is in Left4Dead where lowering those settings makes it easier to see after you have been 'boomed'.
I love that analogy, and you're right, essentially Ultra is there for marketing AAA games, top-flight GPUs and other hardware. They make the appeal of "You either game at the maximum settings, or you absolutely suck," which unfortunately works for most people, and you can read a lot of comments on this video defending that fallacious argument.
While gaming I think isn't very fun if you play at LowSpecGamer settings (unless you love tweaking stuff), it's not at all bad to try and find a good balance between sustainable frame-rates and good visuals.
Ultra means nothing like it did 10 years ago. The Ultra meaning lost its bark I would say about halfway through the last generations cycle. All it is now is to dump a lot of unoptimized garbage into the game and call it "gpu killing"
CaptToilet some years ago graphics settings REALLY made a difference. Just look at the first Crysis and compare it to Battlefield 1. Crysis goes from potato to high-end game (for its time) while Battlefield goes from high-end game to high-end game. Even on low there's not really much difference to higher settings.
Pubg 4k 144hz 😂
60 FPS > rather than ultra
144 FPS > rather than ultra
Ichigo Kurosaki 144fps on ultra >>> 4k
Lol ultra most of the time is just a penalty to the fps rather than any visual upgrade.
1080 Ti. Both at 4K
These are your tastes, not everyone's.
Ahhh a 570 for 200$ good old days :(
and the good 6 gig GTX 1060 for under $250, now $320+ even $800 lol
1st Acc even the mighty 1050 ti got hit
Just copped an asus strix gtx 1080 08g for $435 lel
October 2018, RX 580 for $180
Just got a 570 USED for $160 locally and I consider myself *LUCKY*
Barely any games have a visual difference between High and Ultra. Fallout 4 has legit no difference, lol.
Chrono™ to be perfectly honest? Fallout 4 looks almost the same at LOW vs Ultra.
Trust me, it does not. This is coming from someone who upgraded his GPU and went from playing on low-med to ultra. Huge fkn difference.
Chrono™ I've tested every setting in the game and the only thing I noticed was godrays.
deathbat6916 Did you restart the game? Certain settings will not change during run-time.
Mark Jacobs Fallout 4 can only have settings changed on startup in the launcher. And he must be playing at 480p or something to not see the difference. There are huge differences, especially the textures. Look like cancer on low.
textures on ultra if you have the VRAM, and everything else on high. looks nearly identical and gets much better fps in many games.
agreed ^^
josh Subet 👌🏻👌🏻
This is me too, except for Rise of the Tomb Raider the difference between high and very high on texture quality is about 10fps for me on a GTX 1060.. don't know why..
i have a 970 4 GB, can i run textures on ultra?
david milla That depends on the game & resolution. They usually have recommended video memory in the settings :)
There will always be someone who will say "There is a BIG and significant difference with ultra vs. High settings." I think those guys are the sort that put a microscope up to their eyes and then look at their monitors to actually see the "real" difference.
Joseph Acosta
Stomps Madara is shit, Kaguya is better.
Japan loves Kaguya, because Kaguya is based on Princess Kaguya (a princess from the moon, an alien) from a 10th-century Japanese folktale. (Search Google.)
So your word won't reach Japan.
Japan likes Kaguya Otsutsuki as a final villain because of her mythology; her history is legend.
Lol
Definitely, playing on high is the best bang for the buck you get for the money you spend on a gaming rig, because going into the high-end zone you pay a premium and get fucked by Nvidia anyway.
I watched this video on a 27" Acer Predator 1440p monitor (same as Joker's main monitor) and I can see the difference if I pause it and check it carefully; the ultra one has better texture colors and smoother edges, but you might not see it on a 1080p monitor.
Some games you can see it. Others not so much.
Jealous..... 😂 Can't buy a real high-end rig?
When it comes to graphics, if it's a multiplayer shooter I'll turn down all the fancy settings to ultra-low, no reason for the flim flam when I enjoy winning more than immersion. If it's single player, say fallout 4, I'll jack up what I'm capable of running without
abrupt and noticeable frame divots. I think you'll find that when you die out of nowhere, it's most likely from someone playing with foliage on ultra-low. Word to the lesser wise.
Keatononame like most players in PUBG
interesting. never thought of that
Keatononame whats your specs?
Amen to that.
ultra for single player games
high for multiplayer games
For me it's the other way around.
Singleplayer games tend to consume more of the machine's resources, especially the CPU.
Well, for Battlefield games I play on low settings to spot people easier. Ultra with ray tracing in the campaign.
We really needed this video Joker.
Ive been trying to make these points for ages.
Cards like that 570 and the 1050ti really do make our high end cards look like dumb purchases when compared like this.
Not going to stop me buying big though. ;)
not dumb when going for 4k 60+ on high to ultra settings .
Darkice Presents obviously, you're not going to buy a 470/570 if you plan to play at 4K
Darkice Presents those cards are still good. I think you missed the point of the video then. Hardly any portion of gamers game at that resolution, which would force them to upgrade each year just because of that resolution.
Evelyn X 1050ti would have to be replaced in a year or 2 as it's not futureproof at all. I want my parts to last longer.
You won't see the real difference unless you play the game. Low to medium on The Witcher is huge in game, and high even more so; ultra isn't that much over high. But if you watch a TH-cam video, low looks like ultra, when in game medium is already day 'n night.
i only put textures on ultra, the others are on high for the best performance and visuals
Answer:
Neither. CUSTOM settings are worth it. Real PCMR don't do Ultra presets, they either go above Ultra or use custom settings where there is zero visual difference between them and pure Ultra, for higher FPS than the High preset.
I agree with Alexander on this -- custom is much better than a preset, especially if you dislike Depth of Field and Motion Blur (like I do). While it's "cinematic" (to some), I feel as though it's unnecessarily taxing your system even more than you already are. [Given that in most games it's not reducing the load/power needed for LoD or far-off sights.]
This. I set every individual setting myself how I prefer. Custom settings where it's at. Ultra presets are a waste of GPU power.
I agree Joker, but I mean to say that the "Press Ultra preset" guys aren't the real enthusiasts either :)
I run Witcher 3 for example at higher settings than Ultra. I can also use custom settings that look like Ultra but perform around 15% better. That is real enthusiast level :D !
I beg to differ. In the 90s, when I was a poor gamer, I lived in the options to get the best possible visuals vs framerate. There are always poor gamers. Not knowing what card you have has nothing to do with fiddling with the options.
I started knowing about settings BECAUSE I was poor. I wanted to get the MOST out of my terrible hardware.
Textures are always worth it on ultra. The rest is fine @ high. Many years ago, however, this was not the case and every setting had huge differences.
The truth has been spoken!
Potato settings are for those like me who have computers that are also potato. Seriously tho, I get ~15 fps in Firewatch at 640x480 with the lowest settings. idk how I can stand playing it, but I do
My stance: go ultra if you play solo games or non-competitive games. Also, some games don't seem that well optimized for ultra settings, hence you can see the performance difference between high and ultra. Other factors can be taken into account; high FPS is also a go-to for many gamers!
Best however is CUSTOM... that way you can either enable/disable anything you want! That's the power of PC gaming!
Retarded advice.
You can't tell the difference between high and ultra and you prefer to lose 20-30fps for nothing?
Playing solo games is not a reason to play with crippled framerate.
@@bobmarl6722 Or unstable ones.
Once you go beyond 'Medium' settings in most games: Framerate > High/Ultra Settings
yes but medium is shit
NARWHAL_ DEDSEC - Not as shit as 30fps.
I'd rather play on High@100fps than on Ultra@60fps, and I'd rather play on Medium@60fps than High@30fps.
Well if you have 30fps at medium settings, then your setup isn't the best
Not at all actually, I've been playing BF4 on Medium for a long time and recently i switched to high and noticed no difference.
Then you need to go get your eyes checked
They should make GPUs for mining, so the "normal" gaming GPUs would drop in price a lot...
Wrong obviously
They test GPUs at Ultra for the same reason they test at 4K. Because the idea is to compare GPUs that are being pushed as far as they go
Tuchulu while I agree with you about testing at ultra to compare results apples to apples, the point of this is that testing mid-range gpus on high shows a more real world scenario for the consumer.
Jordan Koehn Yes, but if you test high end cards at medium/high settings you just run into a CPU bottleneck.
I think this is a topic for Digital foundry
The Horror Sony foundry???? What about Microsoft sponsored foundry.
weechord
I guess I'll use high from now on. I always use ultra on my 1060 and rarely encounter any noticeable frame dips. But if ultra is putting more strain on my card for a very little change, to hell with it.
Some people just want to be able to say that they play a game on ultra....
Why though?
They can instead say their machine runs games at ultra, but despite that they run them at high for the stability.
Can't stand all these selfish Minecraft players, all their mining has caused inflation, shame on them !
Rumor has it there exists a hidden feature called 720p, and the one who finds it shall rule over the 60fps/ultra gamers.
Exactly this. I've been thinking about this for a while since I replaced my 1080p monitor with a 1440p one, and my good old GTX 980 started struggling to give me 60 frames per second at that resolution. I'm a frame rate maniac and I had to lower all my settings from ultra to high; 59 FPS is not acceptable for me. And you know what, bam, I noticed no major difference in image quality. The only difference that mattered was the lack of AA on objects in the far distance, something that I can live with. I also have a 6700K so I could upgrade to any graphics card on the market w/o having to upgrade any other component. After testing high presets I decided not to, fuck it, there's almost no difference between high and ultra. Thanks for the video, approval of my thoughts on this by someone like you was a relief. Keep up the good work!
You won't actually notice 59 tbh. The problem is probably the sudden drops to 40s or low 50s at times. It's the 1% low that's the experience killer. When you are average 59, your 1% low will actually reach 40s or low 50s. Whereas if you are averaging 90, the 1% low usually won't even enter the 50s, which means on a 60Hz monitor you will not notice any dips.
For slower paced games, sometimes capping refresh rate to avoid sudden dips (always low as opposed to sudden lows) might provide a better experience. Consistency can be more important for the experience.
So what you really want is frame rate "headroom" so that you won't see any dips.
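As a rough illustration of why an average can look fine while the lows ruin the experience, here is a minimal sketch. It assumes frame times are logged in milliseconds (for example exported from an overlay tool) and uses one common definition of "1% low": the average fps over the slowest 1% of frames.

```python
# Sketch only: average fps and "1% low" fps from per-frame times in milliseconds.
def fps_stats(frame_times_ms):
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    # Slowest 1% of frames, averaged (one common definition of "1% low").
    slowest = sorted(frame_times_ms, reverse=True)[:max(1, len(frame_times_ms) // 100)]
    low_1pct_fps = 1000.0 / (sum(slowest) / len(slowest))
    return avg_fps, low_1pct_fps

# Mostly ~16.7 ms frames (60 fps) with a handful of 25 ms spikes (40 fps):
frames = [16.7] * 990 + [25.0] * 10
avg, low = fps_stats(frames)
print(f"average: {avg:.1f} fps, 1% low: {low:.1f} fps")  # average ~59.6, 1% low 40.0
```

The average barely moves, but the 1% low lands right where the dips are felt, which is the "headroom" point made above.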
Young Trey Just stop using FPS counters while in game FFS.. I was also an FPS maniac before, and disabling any fps counter while in game helps a lot with the FPS stress... 50-60 fps is decent at most, unless you're playing in a fcking esport or what.. It's because you're always monitoring fps that you're so paranoid about fps drops. If you know you have a capable PC, just play the game and enjoy..
I can even notice frametime variations dear console peasant, get yourself a pair of eyes.
I upgraded to 1070 sadly.
Stop replying you mad kid
I'm a casual gamer. My favorite GPUs are the GeForce cards with 7 in the series, for example GTX 570, 970. If my GPU can handle ultra, I will choose it, but if it can't, I will drop some graphics settings until I get an acceptable frame rate. I don't upgrade my GPU annually because I still have to buy gear for my other hobbies. Basically, if a game has nice gameplay, turning down some graphics settings is a worthy trade-off.
Rendy Andrian Go for the 6 series (gtx 960, or gtx 1060) they are cheaper, and can handle modern games at an acceptable framerate and good graphics
Lucky, I have a GT 710 PC but I don't play on it since I get a much better experience on console.
Only one thing I will say and that is:
PCRichPeopleRace. I could afford it, but then I couldn't buy any games, so I'd have a beast PC and not be playing many games on it.
Maybe some good free games, but paid ones are where the deal is.
I don't really care, if I can play at 120~144fps in shooters at 1080p, I'm happy!
My rig: i7 2700K (4.9 GHz OC with AIO), SLI GTX 970, coupled with 16 GB of RAM. Above 140fps in BF1 (high-medium settings).
When I play RPGs or non-shooting games, I put the graphics on Ultra.
Yeah i don't care about ultra, as long as i can have at least medium to high settings 4K 60fps i'm fine.
depends on the game yes. Online FPS = MORE FPS = BETTER
For solo games like The Witcher or Far Cry, 50 fps at ultra is OK
It's in fact in developers' hands how their games are tested. AotS named their highest possible preset "crazy" and that's exactly what it is - crazy! It's not intended to be played at, but the benchmarkers somehow feel the need to test at the maximum possible settings. Let me explain why it's silly.
Back in the day, around 2006-2007, I was goofing around in Serious Sam: TSE. The game looked dated by that time and my hardware by far surpassed the requirements for the maximum settings. Then I found an "Unlock Insane Textures" option (I don't remember exactly what it was called) and decided "hey, I will turn it on, the game will look better! yay!", but it didn't. The FPS dropped from over 100 down to 15 WITH NO VISUAL IMPROVEMENT!
That's the point where devs make the difference: either they unlock these crazily stupid options as presets for benchmarkers to use (minimal visual fidelity improvement at best) or they hide them in configs, because they're nothing but testing grounds. It's not that games are "optimised for HIGH, not for ULTRA", no, games are optimised. It's simply that the HIGH settings very often are the best performance/quality ratio the devs chose. And Ultra is meant for the future, _just in case_ you have a lot of unused horsepower.
Another thing about graphics settings: with rising resolutions, anti-aliasing makes less and less difference. It's unnecessary to crank it up to 16x on a 4K display. Say if you needed 16x for 800x600, you'd only need 8x for 1280x1024, then only 4x for 1920x1080, and maybe just 2x for 4K. Everything else is performance that is better spent on other graphics options rather than excessively removing jagged corners. The figures are not exact, but you get the idea.
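A rough sketch of the arithmetic behind that intuition (the commenter himself says the figures are approximate): assuming the same physical screen size, higher resolutions pack more pixels per inch, so each jagged step gets smaller and fewer AA samples tend to be needed for a comparable result. The 24-inch screen size below is just an example.

```python
# Illustration only: pixel density for a few resolutions on a hypothetical 24-inch screen.
def pixels_per_inch(width, height, diagonal_inches):
    diagonal_px = (width ** 2 + height ** 2) ** 0.5
    return diagonal_px / diagonal_inches

for w, h in [(800, 600), (1280, 1024), (1920, 1080), (3840, 2160)]:
    print(f"{w}x{h}: about {pixels_per_inch(w, h, 24):.0f} ppi")
# 800x600 ~ 42 ppi, 1280x1024 ~ 68 ppi, 1920x1080 ~ 92 ppi, 3840x2160 ~ 184 ppi:
# every pixel (and every aliased "stair step") at 4K is a fraction of the size it
# was at 800x600, which is why lower AA levels usually look good enough there.
```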
PS: I'd be happy if you left a notice that you read it. That's quite a text and I find it not very usual for people to read comments long after a video was released.
Short : NO ! Good video Joker ! Keep up the good work !
Happy with my 1050 ti gaming X 😘
1050 ti is not that good.
Omenify Yes it is.
Got 1050 2gb :D
If you are going to buy a new graphics card, at least buy a 1060 6GB
Arzex considering there’s no point in a 3gb one obviously
The only setting to always have at Ultra is texture quality, because as long as you have enough VRAM it will not affect performance that much but will affect image quality greatly.
Nice to see the discussion you and I had a while back had an impact.
The reason why there's not much difference between High and Ultra is consoles, which have a set visual target locked in (mostly in the range of mid-high settings). The devs then make the PC version of that game with the console version in mind, do minor touch-ups to the visuals and call it "Ultra" settings. And the result is what Joker has portrayed in his video: little to no discernible difference, but at a cost to FPS.
Bottom line is, consoles are holding back higher visual fidelity in games. Devs are catering to games on weaker consoles and not utilizing the full potential of PC hardware power for better visuals.
Fahim Doula why would a developer spend huge sums of money just so a handful of PC enthusiasts can stress their graphics cards? The idea that console gaming is holding back the potential of 3D graphics is unreasonable, given that developers couldn't afford to reach the quality we see in games now if it weren't for consoles providing affordable systems for mass consumption.
I would say it's our rapid increase in screen resolution. First it was 720p/1080p, then 1080p/1440p, and now game engines need to optimise for 4K. That's so many more pixels from 1080p (which was what consoles were shooting for at the start of this gen) to 4K, and it's not even the end of a generation. If we can all settle on 4K and not push 5K or 8K, then maybe we can start having console games run at med/high settings, 4K 60fps next gen and we can all be happy.
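For a sense of scale, the raw pixel counts behind that jump, nothing more than simple multiplication:

```python
# Pixels per frame at common resolutions, relative to 1080p.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080),
               "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.2f} Mpixels ({px / base:.2f}x 1080p)")
# 4K is 4x the pixels of 1080p, so very roughly 4x the per-frame shading work.
```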
And as a man who has most of his cost-of-living expenses going towards other things, I say thank God for consoles doing that. Even if there's a big discernible difference in visuals, why take the hit on fps? So you can impress some rich 16-year-olds and adults with limitless budgets for games in a comments section? Seems kind of ridiculous to me in all honesty. But if you're talking CPU-wise then you actually have an argument, especially when it comes to wanting better physics and AI in a game. Yes, of course consoles are holding parts of the industry back on that, the thing that actually affects gameplay, versus having over-the-top visual fidelity.
Usually on Ultra settings, the details appear on close up, for example you can see great details on the faces of the characters (yes i'm talking about you Witcher3). So when that cinematic pops up, ultra has better feel.
This is mostly a subjective thing. Some people will be bothered with even the slightest reduction in image quality, some (like me) can dial them down to medium-ish-high (without FSAA or with very slight FSAA) without even thinking about it. And some don't care at all. I guess that also depends on the hardware you get, I have noticed that people tend to set ultra details simply to "justify" their purchase, in spite of the fact that many of them can't actually tell a difference from ultra to high.
So, I call this a purely personal preference.
I think the point with reviewers using ultra settings is just to show the maximum the cards can do. It would actually be more detrimental if they didn't do this. What they CAN do however is include high settings as well to show how it would do on that setting and you can then figure out your bang for the buck etc. But seriously though products should be tested at their "best" capability. Especially high end ones. It is logical. I do agree that high settings is usually enough and if there is something missing then one can raise one or two settings to ultra to remove an obvious detrimental change.
I honestly don't see any difference between high settings and ultra settings
I do if I sit there with my eyes 2 centimeters away from the screen playing spot the difference. It does help to tell the difference at higher resolutions though; I can tell slight differences because I'm playing at 4K. If you tried to find the exact same differences at 1080p you wouldn't be able to tell one tiny bit of difference, because 1080p is too blurry to show the nitty-gritty details anyway.
6:01 there is clearly a difference there
Yes, the ultra has blue lights... those things are easily noticeable. I mean the tinier things that aren't so obvious, and honestly that part looks way better without those blue lights anyway.
The only setting where I can usually tell the difference is textures...
Except for Prey, of course.
What I've noticed between High vs. Ultra settings is ambience: the lighting quality, fog and particle effects and fullness, but that seems to really be it. I have noticed minimal difference in textures, just a small amount of sharpening around the edges. So basically what you are getting when it comes to High vs. Ultra is better lighting and particle/fog/cloud effects (as taken from Crysis 3 testing and studying), from what I could personally tell. Someone with a better trained enthusiast's eye may be able to spot even more differences, but those were the major ones that immediately popped out to me.
I agree on the visuals part. Been playing Overwatch at high settings with my 580 and the gameplay is smooth as heck on my 144hz panel. :)
I play overwatch on lowest settings 1080p 144hz with my i7-930 @ 3.7ghz and gtx1070 lol.
Lowest fps is around 110, but that's because of CPU bottleneck.
uhhh try turning up your settings some. Get the weight off of your cpu some and you should see some more fps. Hopefully anyway XD
This is so informative! Other reviews and benchmarks for cards have led to me to be skeptical of GPU performance for cards based on FPS for the Ultra setting, when really, the difference in visuals between high and ultra is SO marginal, but the change in the performance is SO drastic! Makes me wonder what all of the countless cards I've looked at do at High settings compared to every Ultra benchmark ever that I've looked at! I genuinely have a new way of looking at benchmarks. Thanks a ton!
I'm about to watch this video, but my conclusion beforehand is no. I played Fallout 4 at medium/low all the way through on a shitty pc. When I went back at it on ultra, I only noticed godrays as all the textures looked about the same.
And I see at the end of the video that Prey follows suit with the biggest difference being those godray type lighting effects.
I actually prefer playing without those godray type lighting effects.
This might actually be one of the most useful videos I have ever watched on TH-cam
If your computer can run it... then yes. It is worth it to have settings on Ultra. 🤣
Edit: Here's a graphical tip...
If you want ultra settings and high FPS, turn down or turn off bloom, Anti-Aliasing and shadows.
These 3 options have the highest strain on framerate.
If you want a bit more performance, some optional settings to change are turning off motion blur and depth of field.
:)
AA also has a huge effect on graphics though, so I wouldn’t sacrifice that too much
@@noir371
Yes, but it's really only noticeable up close and really far away. Turning it off could improve framerate a ton, but if you still want the game to look decent, turning it down instead may be a better option.
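To make that kind of performance-first tuning concrete, here's a minimal sketch. The option names and values below are hypothetical placeholders made up for illustration, not any particular game's real config keys:

```python
# Hypothetical example: option names are invented to illustrate the advice
# in this thread, not taken from an actual game's settings file.
performance_profile = {
    "texture_quality": "ultra",   # cheap if you have the VRAM, big visual payoff
    "shadow_quality":  "medium",  # shadows are usually among the heaviest settings
    "bloom":           False,     # post-processing glow; easy frames to reclaim
    "anti_aliasing":   "FXAA",    # keep a light AA mode rather than turning it off entirely
    "motion_blur":     False,     # small cost, largely cosmetic
    "depth_of_field":  False,     # same idea: off rather than maxed out
}

def apply_profile(set_option, profile):
    """Apply a settings profile through a (hypothetical) set_option callback."""
    for option, value in profile.items():
        set_option(option, value)

# Usage sketch: pretend set_option just reports what it would change.
apply_profile(lambda key, value: print(f"{key} -> {value}"), performance_profile)
```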
Thank you. Someone talking sense about smooth gameplay being more important than very small details being improved on. Especially if players are trying to have a good experience on the latest games with three, four or even five year old hardware.
I at least make sure textures are on ultra. Then if I need more fps I'll start by turning shadows down.
Ultra settings are designed to push people into buying new GPUs.
Everything is business, so companies give us these ultra settings so that we keep buying more over time, and this becomes a vicious circle that never ends.
it's either ultra high or ultra low.
if i wanted ultra low i would have gone with ps4
khaled nwilati BURN!!!
khaled nwilati Oh but good luck getting exclusives like God of War, the uncharted series and....oh wait nvm, I have a Pc too...go on.
nobody cares about god of war or uncharted...
Ekim Acromer good luck playing half life on console! oh wait...
Idk what all the fuss is about. I'm saying I own a PC. I guess everyone read the first half without seeing the sarcasm.
At 3:15 check the bricks on the wall of the building on the right; that's actually some lovely detail in there. You can see separation between layers of bricks and even individual ones (even though it's all just a single texture with a bump map applied to it).
FPS>GRAPHICS
ikr, some people just don't get it! I don't care about graphics, FPS is what counts because it gives you a better experience.
@@paoloh885 so, go and play call of duty 2 then if the graphics do not matter :D
May I reference this video, including the links in the description, to use a quick part of it? Thanks!
absolutely NO
devs put ultra in because they can
Even if I had 2 Titan Xps I would turn those flashy, distracting graphics down and prefer frame rate
Yeah, I have a 1080Ti and I usually turn shadows down to high or very high (whatever is one less than ultra) and maybe some AA down so I can get a good 4K 60.
Exactly. Optimization on PC is already really bad as it is; you can't expect all game devs to put effort into optimizing for the best performance on ultra.
1080ti owner here, with an ultrawide with 75 hz. I run every game on ultra everything with one exception - the shadows on some games I also turn down 1 notch. With my monitor, the 1080ti is perfect for even the most demanding new games, running them at 60-75 hz and looking/playing great.
Wait what, you can achieve 60 FPS with a single 1080Ti? I thought you would need SLI for that. I haven't used a 1080Ti forgive me if i'm being dumb.
I prefer high over ultra for its consistent performance, and I agree that while showing ultra benchmark results is nice for window shopping, most consumers would get more use out of the realistic high settings in 1080P / 1440P @60FPS / 120FPS benchmarking.
Does anyone play baseball anymore? It was like 300fps+ real haptic feedback vr
I don't even play it on pc or console.
My problem with PC gaming is that I am a spoiled brat and always want ultra settings. If I don’t have them on, I feel like I’m missing out and I obsess over it, even if there is little visual difference. Whereas on console, the experience is set and I don’t even think about the visuals because there is no changing them.
The only thing I run ultra no matter what is textures and anisotropic filtering. everything else is a balancing act of whether I care or not
FeTi Productions I do the same; now that I'm playing at 1440p, I lowered the AA filtering
I'm so happy you're getting lots of views man, you deserve it!!
OK I WILL WATCH THIS FOR THE THIRD TIME. *HAPPY TH-cam?*
One of the most honest and best vids I've seen about gaming. Thanks.
It would be cool if you put up both ultra and high, so people can compare and see how much they gain by lowering settings just a little bit. Good video.
Loved the video. U give opinions, info, data, direct experiences. Very informative. Have fun take care
Haha, I played Resident Evil 6 and Outlast and Black Ops on a Pentium 4 and a GeForce 210 at between 3 and 11 fps, so 40 or 50 fps is like a dream I can't reach... warm salutations?! ;)
3-11 can't even be called playing lol
That’s sad
I wonder how many graphics enthusiasts would actually get a bigger visual quality increase by investing in some glasses first. And no, I'm not joking. People who look at screens a lot often end up needing some vision correction as a result. Very often, actually.
Me at 0:30 Most Game Devs optimize for High not Ultra: "What??!!" Thx for that info m8
Consoles usually run a mix of high textures and medium settings, according to Digital Foundry
So it makes sense
Alfonso Cariñena of course, you think they're going to go the extra mile for ultra settings? lol, I think not
The best you can hope for is for PC-exclusive (or first-on-PC) games to be well optimized for it on Ultra.
I miss the times when I was really young and when someone would install a game on my PC, all I'd want to do was start and play it without tinkering with the options. Now, I can't start a game without checking out the settings menu first. When a game starts with a forced intro/tutorial which doesn't let me access the menu until I finish it or reach a certain point in the game, I feel a bit frustrated.
The best bang for the buck is still an i5 + 1070 build. You get ultra at 1440p 60fps without spending ridiculous amounts of money.
walter z the Ryzen 5 1600 has way better value than the i5. Intel fucked up. Instead of reacting to AMD's Ryzen 5s, they introduced the X299 i5 and i7 that are exactly the same as the normal chipset ones. Epic fail.
You're probably right, let me rephrase that: an i5 or the ~$200 Ryzen equivalent CPU. My point still stands that the best bang for the buck is high-ish mid-range. BTW, "Epic Fail"? For who? I hold no vested interest in any company; I'm platform agnostic and only care about the best bang for the buck. I think these companies competing is an epic win for the customer. Choosing sides because you need to justify your purchase is just lame, no offense.
walter z Epic fail for Intel. Haven't you seen all the criticism that the new X299 boards with the i9s got? Intel did some stupid anti-consumer shit. They sell the i9 on a new chipset that's only on fancy, expensive, at-least-$250 boards, OK. But what the fuck is this i5 and i7 that have a fancy X in their name and fancy boxes and just very slightly outperform the normal i5 and i7 on the normal cheap boards for the same price ($250 for the i5, $350 for the i7)? My $100 B350 mobo + $220 R5 1600 combo outperforms even the i7 for way less money. I mean c'mon, the i5 doesn't even have hyperthreading? Quad cores on a $250 mainboard in 2017? They don't even support all the PCIe lanes. And what the fuck are these RAID keys? DLC for hardware? AMD just looks way better and more honest with their offerings with Ryzen, Threadripper and Epyc. Bang for the buck and honesty. Intel looks like it's about to collapse under the pressure of competition that it hasn't felt for many years. I don't want a monopoly. I want good and honest competition and good products.
The Ryzen 5 1600 stands no chance against the i5 7500. In overall value the Ryzen 5 beats the i5, but we are talking about gaming here, not content creation. If you are exclusively gaming on your PC and not working creatively with it, buying an i5 is a no-brainer. Intel still has better single-core performance. If you don't believe me, go look at some benchmarks. The i5 7500 beats the Ryzen 5 1600 in every game.
It really depends when and where you buy, as the cost of a 1070 is shooting up so far, so fast, that the 1070 and the 1080 cost the same at some online retailers, so getting a 1080 would be better if that is still the case, but the prices change day by day.
tbh I just want my games to run at high for modern titles, so that when future titles come out I will still be able to play them at 60fps on lower settings instead of needing an upgrade to hit 60.
I play just csgo and Overwatch on ultra..
Other games (High)
No defense
GPU: GTX 1060 3GB
HU55IEN those are the games you shouldn't play at ultra lmao. Because they are competitive.
Nicknack only pro players care about being competitive. For me, I play CS:GO but I don't care about losing or winning, I just have fun. If you are just a normal person who has nothing to do with pro gaming, streaming or any other field of gaming, why stress myself for nothing?
I play CSGO and Overwatch on low since they are competitive games and I don't want to get distracted with
VERY AMAZING FLASHY LIGHTS GRAPHICS and that shit. I just want to play with simple graphics and a solid fps.
I'm also no pro player, but i just play better at low settings :D
They never were, they will never be. They are always meant to tank the FPS as much as possible just because it's possible and to force people who have this "pride"/requirement to invest in hardware. There is a reason the console versions do not use the ultra preset. The FPS/image quality ratio is not there, it's a big premium you have to pay for almost nothing in return.
Any idea when this mining craze will end? I need a GPU but they're so overpriced
hmm
Ok, who's gonna tell him >.>
@@michaelgabrielgarcia1004 I'll tell him
Me: dude...GPU prices are still up
Me: I know
Me: okay cool
@@OfficialRedTeamReview Man, things couldn't be worse lol. Hope you upgraded before the rona came through with a silicon-seeking missile.
The fact that this comment could have been written far more recently than it was makes it obvious that some things just cycle. Cheers dude.
Totally agree with you about the high setting. As a casual gamer, I also can't easily tell which one is the Ultra setting and which is High during gameplay.
Unless you can afford a top tier graphics card.
I don't need imaginary framerates higher than my monitor's refresh rate, I'll take ultra settings instead.
In most cases the Ultra preset is basically just turning the resolution of certain effects up beyond the maximum the developer intended so they can check a box. i.e. we are talking about what amounts to a brute strength approach to image quality which normally doesn't make sense from a performance perspective (especially if you like to play at above 60fps).
Many games basically look nigh on identical at high, with the exception of shadows which are a bit jaggier (though this is something that will only really matter in close-ups in cutscenes). Textures I'm not factoring in because they are more of a VRAM issue than a speed issue (provided we aren't talking about a really old card).
The difference between High and Ultra might not be very appreciable, but the difference between 60fps and 90fps sure is.
Watching settings on youtube vs actually seeing them on a PC is a completely different story, just as listening to a high quality FLAC or WAV uncompressed music file is far different than an mp3 or youtube file rip.
Johnny Boy Strikes Back
Very true
Depends on the mp3 file; try doing an ABX test and see how you fare. Lossless audio files generally aren't worth the space they consume.
neither are lossless audiophiles.
FLAC isn't lossless?
Right on man. I have a Core i5 7400 with a stock CPU fan, 8GB of RAM, and a 3GB GTX 1060. I can play most games at 1440p on high (even one or two ultra settings); on Killing Floor 2 for example this machine crushes it, and the lowest I've seen it drop on a custom map is roughly ~65 fps. Many reviewers on TH-cam have the "Linus" mindset that if it's not 4K at 144 fps you're not gonna have a good gaming experience, which is false.
Nice video man. You earned my sub.
Well it depends on the damn game. Obviously.
+Joker Productions, thank you, thank you, THANK YOU for this video. I consider myself a PC enthusiast, chasing after the best performance in ULTRA settings, and feeling ashamed of my rig whenever it wasn't getting above 60fps on the latest title in the highest settings. In fact, I recently upgraded my video card, unnecessarily, from a GTX 980 to a GTX 1080, which yielded better benchmarks, but didn't really change my gaming experience all that much.
Haven't seen the video yet, but the answer is NO!
I love your point. While I may game with different settings (between medium to max), I hardly ever game with everything on ultra (unless its an older game or well optimized game, like doom or overwatch). So watching a benchmark of a card with everything on ultra for everything doesn't really help me in a real world sense. At least everything on high gives more of a realistic, "in the middle" approach (that I'm probably going to end up using).
One big case of the higher settings not mattering is tessellation. Most of the time there's hardly a difference between medium and high tessellation settings, much less high and ultra. And as for lighting (and by extension, shadows), I sometimes think the highest settings can be a little jarring, due to the light and/or shadows having an "uncanny valley" effect.
Tom Clancy's The Division's ultra settings were blatantly obvious: it's sharper and crisper with better contrast. The high settings look like Ultra but with a fine layer of silicone grease smeared over the screen. Some of the later games, not so much difference. I'm a photographer, so maybe having a trained eye helps me see the differences.
wow you get cool shadows, ill take my 20fps ty
1:32 I 100, 200% agree with you there. I play Overwatch at 720p low with 50% render scale on my laptop, and although it looks pretty ass the 45-60 fps gameplay I play at is infinitely more playable and enjoyable than the 15-30 I get with a higher res.
If you can run it, then yes, ultra is worth it. If not, then it isn't. Didn't require a 7 minute video to explain that.
Awesome video. Exactly what i needed to see. Subbed
Performance = casuals? How about you think about your wording next time? I am a hardcore gamer, but I cannot invest thousands upon thousands of ad revenue/Patreon + job money in computers.
I am a hardcore gamer, but I have to be realistic. Unlike you, not everyone is living the high life. Just because people prefer playing on High/Medium, it doesn't make them "ew, filthy casuals".
Hey bro, I need some help. I'm gonna buy a laptop with a 15-inch display and a GTX 1070. What should I get, a 1080p IPS display or a 4K IPS display?
I am after the visuals. If I play a game like Witcher 3 or BF1 I go Ultra because it looks somewhat better, you immerse yourself in the lighting, and I at least want a solid 60fps. If my GPU can't take it, I'll just buy a stronger one. You don't have money? Get yourself a job, done.
Not everyone can just get a nice job and spend a ton of cash on PC components. Also, prices are different all over the world.
Johan Naudé please, it's so fucking easy to get 400 bucks together
Not in places where the minimum wage is $2 or $3 an hour and then you pay a quarter of that in rent. Not everyone lives in America you know.
Johan Naudé then you shouldn't game at all, I guess ^^
People need to escape from reality somehow xD
After watching this vid I took your advice and now play all my games at High settings rather than Ultra. PC specs: i7 6700, 12GB RAM, GTX 980, Windows 10.
Fuck no it doesn't! Been a console gamer all my life, then I finally fell into the whole pcmasterrace bullshit. Got a good job, started buying the most expensive components, and I concluded that other than the vast amount of game discounts that exist on PC, there is literally no reason to go from a console to a PC. I literally notice no difference other than a small visual increase on PC when compared to consoles. Pisses me off
Although this is somewhat true, at the same time the visual increase can be pretty damn nice when coupled with 60fps, which consoles just can't do... until the Xbox One X of course, and from what I've seen the One X will perform pretty much the same as my GTX 1070 rig, which cost me well over the price of the One X lol
Mmmmm thanks for the reply. Never heard of ReShade and similar software. Been doing research after you mentioned it and have to say it sounds like a really cool way to customize the gaming experience. I do have to say though, I'm seeing a lot of posts saying that such tools can be a pain due to them being buggy. Thanks :)
Depends on the hardware you're using. When I play the witcher 3 at ULTRA settings, 4K, 60fps, 4:4:4 chroma on my LG OLED TV, it's the best visual experience I've ever had in gaming. Mass Effect Andromeda, Arkham Knight, and other games look fantastic maxed out at 4K/60fps.
I did that PC gaming thing in the early 2000's, tried it again in 2011. Never again, total waste of money and the experience is still the same. These idiots think that just because you can PAUSE AND ZOOM IN on a video to see the differences, it makes it worth it lol. I remember when they used to port arcade games to console in the 80's and early 90's you'd have sprites half the size, half the animation, way lower sound quality, and you'd be missing the ENTIRE FUCKING BACKGROUND!!! Now people jizz over an object in the background that looks a little clearer if you squint, lol!!!
DigitalHaze65536 it's about the FPS. I can play on very high settings at 144fps, giving me a super smooth experience as well as better visuals compared to a console's 30fps. PC gaming also works out cheaper over 5 years than consoles do. No need to pay for online services, and games are generally cheaper, especially if you wait for sales.
Thanks for the video. It's been a while, but I'm getting back into PC gaming (vs. console). Mostly saving and playing around with PCPartsPicker to see what I can (pre)build. This helps out a lot.
Real men go 4K, Ultra.
And lose every match
nah real men go 1080p low settings cuz it gives more fps ^_^
Joker, don't forget that in some games ULTRA can mess with your input lag too... frametime is always more important.
For anyone trying to spot differences, look at any part that uses shaders (reflections, water, close-up shots of lighting/shadows). You should be seeing a slight difference in the accuracy of the shader there.
Very good video thank you. I did not realise the difference was so insignificant.. I'm one of those that tries to play everything on Ultra... I will now review that :-) Subbed!
Recently got a gaming PC, admittedly it usually hits above 60 fps on ultra settings for most games, with very few occasional drops. But on the 2016 Doom, I noticed heaps of particle and lighting effects in some levels, particularly the "VEGA" level - for those who've played it. It's the only snow level in the game and I experience huge fps drops in ultra settings. My two GTX 760 GPUs just couldn't handle the snow, even when working together. So, I turned it down to "high". The game still looked gorgeous, and the snowy VEGA Mars lab was portrayed on my screen in beautiful 1080p at over 100 frames per second. Huge difference it makes sometimes.
Totally agree with you. I normally run my games on High settings to get better performance over visuals, unless Ultra settings don't affect performance that much.
I use a mixture of both high and ultra; this is what I do:
Textures - ultra
LODs, i.e. draw distance - ultra (I hate pop-in)
Post-processing effects like motion blur, DoF, lens flare - off
Everything else - high
In any multiplayer game, competitive or not, I'd rather rock low settings with higher res + AA, simply to have an edge over players who don't. Ultra quality in the end only gives you visual stimulus and excitement for a short while. Realistically, in any game, be it a shooter, MMORPG, etc., after a certain point you're no longer playing for fun but rather to win; you start ignoring details in the game world and focusing only on what is necessary. When you hit this phase, winning gives much more excitement than visuals, and to get there, most of the time you'll need to turn off distractions like crazy amounts of lighting effects, motion blur, etc. to maintain higher frame rates and reduce visual burnout.