The differences in this video can be a bit hard to spot, I recommend watching in 4K (when TH-cam gets around to processing it) or pausing the video at times to take a closer look. A high bitrate version for our Patreons will be available soon.
Tim.. Great review as per usual.. I have something juicy for ya :D translate.google.com/translate?hl=en&sl=auto&tl=en&u=https%3A%2F%2Fwww.tomshw.de%2F2019%2F07%2F11%2Fungefesselt-radeon-rx-5700-xt-auf-ueber-22-ghz-uebertaktet-break-the-limits-mit-den-neuen-softpowerplaytables-fuer-die-rx-5700-und-rx-5700-xt%2F3%2F
I think it'd be nice if you circled the differences since I don't really want to spend time freezing frames and busting out the magnifier to spot them myself... I think I noticed at some point part of a tree that was being sharpened while the rest was being motion blurred but every other differences I may have noticed didn't feel significant; that suggests these features aren't very important.
So Navi sharpening has better game support, image quality, and performance on release than DLSS does after nearly a year in the wild. You could almost say of AMD's implementation... It just works?
To be fare it does take more effort as a user to take advantage of, testing res scales back and forth in every game to find the perfect spot. But it is a nice extra feature to be sure.
@@everythingfeline7367 While I do agree in general, I was more talking about when being used in up-scaling (as this is what interests me the most), he clearly didn't use a standard res scale per game and stated settling for certain scales therefore I assume he had to test many different ones per game to find the optimal one. In this case it does require a per game implementation just by the user not the company. which makes me wonder 2 things, time investment vs a 1 click option. and quality variation between finding the perfect sweet spot vs as long as its within x% res scale it doesn't matter.
@@ps2232 If I was buying a Navi card and running it I would just set it to 1800p across the board. It was the highest resolution he used, so no lack of quality, whilst still providing a performance uplift
@@ps2232 I've seen a video before with different resolution scaling in games with 4k. Generally there is a range where you won't spot difference (or hardly notice) while down scaling 4k which is different between games. This feature simply needs a couple of tries with the scaler to get the best of it. Don't forget, when using scaling you generally (afaik) don't need to restart the game. But if we're talking about getting that '60fps' spot with 4k, I'd say some tweaks are worth it.
Agreed though I do wonder at how much time/effort it will take to find the perfect res scale for every game to get a good effect, each game clearly requires something different for optimal settings kinda wish he elaborated on how many he tested before settling on the ones shown, how much difference there is if you don't spend hours tweaking to find the perfect one.
@@backupplan6058 xD you got me with this meme, but actually, it is working, sadly i will not be able to use that on vega 56 ;( atleast i will have freesync
But but AMD bumped the prices too or rather seconded Nvidia. They don't even have the excuse of having to add tensor cores or any extra hardware but that is ok?
@@Code-n-Flame it's called marketing. You want to position yourself at or near your competitors price to not seem like that bargain brand. It's also why Samsung prices their device similar to Apple, because they want to be seen as the premium Android device, even though Samsung's devices may be cheaper to produce or whatnot. AMD just followed Nvidia in their pricing, because even if Radeon 5700xt were superior, people would still buy way more Nvidia cards. Happened in the 400 series era. The Radeon cards were superior at every price point, but people didn't buy them, kept buying Nvidia. So price is what it is because of marketing, and that people will pay whatever for Nvidia card anyways, why not price yours the same, while beating the equivalently priced card
sharp images are ALWAYS brighter! This is because blurriness merges nearby colors into the edges of what youre looking at. This isnt a flaw, its physics. Its even present in real life if you need glasses. I noticed it a couple nights ago and then to see it in this video -- gotta love that timing
@@railshot888 yes there are some that just believe because Jensen Says 'It Just Works', I prefer to believe because Steve & Tim who work so hard to get the facts to Show me that 'It Just Works' or in the case of DLSS 'It Just Doesn't'
Not really. There are a LOT of dx11 games out there and none work with this. Something tells me by the time they finally get their act straight Nvidia will have improved DLSS enough to compensate.
@@FcoEnriquePerez True but they are always playing catch up which is not a good thing. As the market share between the two clearly reflects that. As good as this video tries to make it sound, it will not reflect in sales. AMD needs to beat Nvidia to the punch & until that happens, Nvidia is not going to lose any ground.
@@Keith2XS If you really watch the video you will know why it's not working in DX11😂😂 As TIM said, currently it only supports DX9 and DX12, other API follows soon.. The beauty of RDNA architecture which is seldom mention by reviewers is that FidelityFX ( Anti-Lag, RIS ) are "backwards" compatibles with older games that runs base on GCN architecture and handles other feature forward such as "Raytracing". This is why PS5 will be the best console, adopting RDNA as graphics ensuring old playstation games playablein PS5.
It definitely shows that all they want to do is sell gimmicks, not anything that actually adds worth to your gaming experience. That is also seen in the fact that all of Nvidia's gimmicks are closed source and proprietary. That means that only Nvidia can access the source code, and so game developers implementing those features can't optimize them for their game. AMD, however, releases all of their features open source ( gpuopen.com/ ), for free. And being open source, developers can optimize them to work equally well on Nvidia GPUs too. But Nvidia is the one with the money, sinking their claws into various game studios, forcing them to bog their games down with GameWorks gimmicks when these free and open source solutions are right there, free for the taking. Nvidia does not offer value to the gaming ecosystem, despite their marketing prowess. They rig the system and milk it for all the money they can.
We'll see long-term but Image Sharpening causes its own issues. I downloaded the 4K version of this video and at 12:07 if you PAUSE the sharpened image was very grainy and in motion it caused an obvious flickering (may not be obvious if video is compressed so may need to see it at 4K)... DLSS has a lot of potential but I agree it's not so great right now. However, I also have a good, basic understanding of how DLSS works and it's actually quite fascinating. It's not a "gimmick" though it may or may not succeed. And DLSS may be proprietary in a sense, however NVidia will develop a DLSS profile for free so that's not an issue... I'm not going to call AMD or NVIdia's stuff B.S. until it's been given some time to improve.
@@photonboy999 Yeah, stop shilling dude. DLSS like all of Nvidia specific black box software are a blight on the industry. They spend far too much time trying to make it so that they are the "only" company capable of this and that and then are always outed for their shenanigans. Just what are you defending, DLSS is rubbish, it had limited resolutions it can be used with, higher than would be tolerable performance hits and is only implemented in a handful of titles because developer have to go back in to implement it. What's to like about this rubbish? You Nvidia fans try too hard to defend even the most indefensible positions. High performance hit, limited scope of use by resolution and even limited titles can use it at all, while RIS negligible performance hit, works in all DX9, DX12 and Vulcan titles, not perfect but better than DLSS at nearly even metric they matters and you still find a way to brown nose. Disgusting.
@@photonboy999 you are not going to pause the games and notice that couple different pixels in real life scenario, this technology is actually better than nvidia's dlss for players that wanna have 4k experience but in a budget
@@evilformerlys4704 well in metro exodus its outstanding it works super duper amazing however in battlefield V isnt cause that game hasnt fixed it. But why not this ray tracing for example is something only nvidia has now and hey for me shadows and reflections aint a big deal but global illumination is a big change for me like Quake II and Minecraft (it doest use the RT cores at least yet) and Metro exodus this realistic lighting is outstanding and its the most demanding feature that even the GTX1080Ti falls behind from the RTX2060 when using RT cores. Nvidia isnt trash they try to make money like any company would do so i love the fact that amd gets better so they can force nvidia to release better cards cause Nvidia is far ahead in GPUs technology even these RX5700 with 7nm actually have higher consumption than the RTX cards that have 12nm.
@@serfillustrated4018 its not ai based at all, its a post process filter, 3rd party software was able to do that already so its not amd revolutionary tech thing going here, it looks quite bad, some ppl love oversharp stuff ik, but its more like reshade, specially that bf V clip it WAS reshade footage i swear, that oversharpened texture, ughh.. but yes, reshade is meant to LOOK better so, who actually cares of how, or if its more like image "faking" as long as it looks better and performs as promised, but this scalers and things barely do much, if they do something its barely noticeable or u go oversharp, ofc mhe vs absolute trash (dlss) is fairly easy to decide, but this kind of sharpenning is NOT new while dlss IS new, does that matter for the final consumer, no, but just like AA implementations, first AA implementations murdered perfm, first "low cost" AA destroyed image quality..... nowdays good AA looks better and costs little to enable, dlss has potential, for now its worse, just like raytracing, is it useless or bad, no in fact its the future, is RTX 2000 series good... thats completely diferent but no, and yes, if u never push a new tech u never get it, so, i like the raytracing, this is just the first gen of both dlss wich btw mens deep learning (and since its new, not much... learning done) and rtx, also new... give these more years and we'll see, that being said as of now ANY post process filtering (sweet fx, reshade, or this amd sharpen>dlss)
@@arencorparencorp2189 Yes it is... He even says it in the video..lol But like you said, the consumer only cares about the final result. Thats why ps4"s checkerboard upscaling is awesome even if its Faux-K... www.amd.com/en/technologies/radeon-software-image-sharpening
Currently playing Monster Hunter World, and constantly being dissatisfied with the AA options. Image is either too aliased or blurry. Really hope image sharpening provides some decent stuff on that front.
The only AA option that doesnt suck really hard on MHW is FXAA, the other options makes it seems like we are with motion blur, but without motion... rofl
If you are running an AMD card, you might try VSR. Virtual super resolution is a setting in the driver that allows you to Run at a resolution above typical 1080p monitor such as 1440p and 1800p and then downscale it back to your monitor. It basically has the same impact as SSAA and can be combined with other in game AA. I use it often for well optimized games like DOOM or older games where the performance hit is negligible to clean up the image such as Boarderlands 2 with the new HD texture pack.
@@hitmarkler honestly this does change things. "4k"with Ray tracing resolution could just be 1440p with image sharpening. That would save performance and probably enable 60fps ray tracing at good resolution.
RT is basically a proprietary version of DXR which is a feature on DX12. If I have a thing to complain about AMD right now: Their video encoder sucks. Also OpenCL is kinda flop compare to CUDA support from developers.
@@Archmage1809 Ray Tracing (DXR) is not "just" a feature of DX12, it's an API that allows your Ray-tracing hardware (whether be it Nvidia, or AMD in the future) be compatible with games. The RTX cards just have proprietary *hardware* that allows them to more effectively ray-trace through an API like DXR. Hence why even GTX cards can ray-trace, but at crawling frame rates.
@@Archmage1809 only for certain encoding works... There is a video from Wendell from level1tech and eposvox regarding the massive encoding performance of the 5700xt...
@Dex4Sure Again, that depends on the GPU they have. Most use CUDA because most have Nvidia GPUs. But if they have Radeon GPUs, which don't have CUDA (it's a proprietary Nvidia feature), they can use AMD's open source ROCm.
@@VoldoronGaming I would go checkerboard over native for consoles regardless, only because a stable framerate is more important than detail when you're chilling out on a sofa
@@abirneji I think they will be able to do 4k60 just fine. It will be a custom navi chip. And we didn't hit the top of navi performance here. There is no way they are going to give up on the opportunity.
>some people still think the super sampling ai is better because oversharpening like this does literally nothing unless you stare at the tree that's 300 yards away
@@zade6828 They can't fix the oil paint effect, it's AI logic, they don't understand what's "good" and what's "bad" the AI tries to recreate the image and does it but it costs the "realism" our productions have, until someone sets the parameters for "realist quality" DLSS won't improve at all.
@@Rhino123freak i am fan of the underdog here as well (at least for now), more because of the intel oem buyout they did in late 2006 when they couldnt compete. that cant be talked right in my book. same with nvidia and GPP last year. i wish for risc-v and arm to lift off into maigstream, as x86 is a powerhungry isa.
@Alistair not even talking about Intel here. I belive the super series cards launch was pretty damn good but even before Navi launch the overall opinion on the super cards by HU was more or less 'eh' while other reviewers agreed it's pretty solid. Again, nothing substantial, just a hint of a love for team red that edges out others and that's fine by all means.
@@Rhino123freak You would have never got that hint of 'love for AMD' during the FX years. If we seem biased towards AMD today, that's probably because they're offering good products, the second that changes so will our bias ;)
It is also really funny to see them trying to pretend like they are impartial. Honestly, my only grip with the NAVI cards are their cooler, but at the same time, I don't really have any intention to overclock them, so I don't know if the base cooler will be enough.
Well it is tech that has been out there for years such as reshade which has sharpening filters and other things like aa which I have been using for years I recommend using it
@@marstondu67 the base cooler is perfect if you're not going to overclock. Just under volt it and set a fan curve under 2000 rpm and you'll not hear it and will game perfectly fine
@@toonnut1 I just saw a video from the channel Not an appel fan, where he was talking about undervolting for the RX 5700XT and how quiet and performant it is. I didn't realize it was that simple to undervolt the card. I've also an very old PC with an i5 2400 and a GT 545, so I'm pretty much used to a very loud PC overall ( to the point where I thought my PC would launch to space ), so the noise is not a problem for me. I will still just be waiting for 3rd parties solutions's prices and if it is significantly higher, I will go for the stock version, especially since I am not that interested in overcloking the card. The fact that I could get 6 months worth of Game Pass by buying a Ryzen 3700x and a 5700XT is also something worth mentioning I think.
Like when Nvidia introduced GSync and to be countered by freesync and now DLSS with Radeon Image Sharpening. I will laugh if in the near future if Nvidia will force to support the image sharpening like what happen to free sync.
nvidia already support image sharpening with their Freestyle. and unlike AMD where the support is only exclusive to Navi nvidia image sharpening work with all of their GPU.
They will, right now RIS just destroys DLSS, they had to do the same in monitors cause the % of monitors using Freesync was far higher than the ones using G-Sync, here the same happens, you can literally use RIS in anything (also DX11 if you are using the Reshade version that just got implemented using the Open source RIS code)
Yea I'm unsubbing from this channel after seeing this. Clear AMD pandering at it's worst. I have BFV. I play it every damn day and in no situation with DLSS turned on have I noticed textures get THAT mucked up. Medium? Try low.
I tried it but honestly, not great results. It's good for sharpening native 1080p or 1440p but if you're downsampling the pixel count gets a bit low to resolve enough detail for sharpening to 'fix'
@@Hardwareunboxed Thanks for the reply! You've cleared up everything that I wanted to know about this. Edit: Seems like there is no easy way to achieve 1440p 144fps without throwing money at it. There are now more affordable 1440p 144Hz Monitors available but not so many affordable ways to feed them :/
Hardware Unboxed so .. 1080P sharpened, on a 1080P Monitor of course. Would still provide some clarity improvement? Not 1440P level but better than 1080P.
I also got the RTX Super ad. It seems Nvidia are indeed worried if they now are putting ads on AMD Radeon videos. Nvidia: "Just tryna keep that mindshare, fam! It just works!"
The shitty reference cooler is a valid reason. Sure, the best thing to do would be to wait a month for the custom cards, but some people are going to want or need a card right now, and once again AMD's dreadful attempt at a cooling solution is letting the side down.
@@DeosPraetorian It's mostly an internal decision. AMD is also embracing open-source on their technology and on their drivers while Nvidia only play the open source game with their Jetson ARM SBC and nothing else.
@@defeqel6537 i'm dying to see a Morpheus 2 review on this, i'm surprised no youtuber did it. It could give a approximation of the results of a custom board, you could literally have the only review of a custom board online.
The guys over at Lvl1Tech were already super enthusiastic about AMD CAS, and watching this, I can see why. I'm curious to see what a 1080p sharpened image can become to get those higher FPS on 144Hz refresh 1440p monitors.
@@erejnion Shit-but-works is a quality class all of its own lol. Russian chemistry is often the same way, or was - it's all high pressure, high heat, bake and pray, but works amazingly well with heavy duty machinery lol
There are reasons why XBOX and Playstation stick with AMD all these years Just wait for Aftermarket versions and driver update 👌 RTX is Overprice RT is Overrated RTXS ??
@@TheConst4nt well rdna will be used in the ps5 in some way soo games brought out to consoles are easily optimised for navi cards raytracing is a nice concept i welcome any day but i wont pay a fortune for a feature thats barely being used i havent had a amd card in years and i dont plan to get one too soon either because in the end all we want is good performance for a affordable price glad competition is back
@@NajCarnage Way too weak for proper consoles, also nVidia hasn't been a great partner to either Sony or MS, so no surprise they don't want anything to do with them
Thanks for the review. 1 thing that made me curious after watching the video is the possible impact of this feature for the upcoming APUs after their update. That could have some interesting effect on playability and could turn out to be an interesting way to counter possible future intel's igpu improvements. But we shall see it at some point in the future. Anyway, good luck in the future and have fun.
LMFAO, so Nvidia strapped on expensive and data center focused Tensor cores to their already overpriced GPU's, spend a ton on R&D to develop machine learning algorithms to reconstruct images...and AMD got better results with post processing.
I don't think DLSS is really training anyways. If it was it must be really slow. And the initial product is significantly worse. Or Jensen can say "we do not have enough sample size, for our AI to train effectively, so everyone should get their friends to buy 2 RTX cards and help pushing this along". I still think Jensen just threw in "Deep Learning" to make it sound cooler and as a reason to justify why they look like Vaseline smeared all over monitor. Nvidia can just say "doh nubs it's deep learning, and eventually it will be just as good as real 4k with no specific time frame." It could be tomorrow, or in 2025 or it could be 2030 when their "AI" completes the learning process or it could be never.
it's not like Nvidia developed tensor cores and stuff like that for gaming exclusively. It's really useful in machine learning in general, science, industries, ML/DL/AI is the future and Nvidia is driving it. Of course, it can be used in games and as AMD provided no competition for quite some time, Nvidia can charge whatever they want for their cards to cover the R&D costs a bit. But I guess that doesn't really interest mere gamers. Not saying Nvidia is "saint", but we can't judge them just from the gaming perspective. Well done AMD.
@@LilljehookThe One X already achieved games in native 4K, like RDR 2 and Forza Horizon 4, so yes there will be games in native 4K on both PS5 and Scarlett. Plus, if we take new technologies from AMD and optimization methods specifics to consoles, 4K 60 FPS is totally feasible.
@@gamernation1400 Honestly, much more details and 60 FPS for the game. Open World games will also have many improvements in terms of framerates and loading times.
* Watching 4k comparison in crappy YT 720p * Another triumphant victory for AMD! Seriously, you can see improvement even in compressed 1080p and it's almost free(hello, RTX)
The Battlefield V one is insane, AMD is even beating native 4k in texture detail while NVIDIA looks like it has 480p textures, just look at the tank it looks like a motherfucking PS2 game.
@@defeqel6537 After looking at it for a while there is definitely something weird going on, not only the textures look like if they belong to GTA San Andreas but even the 3D model has less polygons on the NVIDIA side.
@@stargazer162 yeah, someone on r/AMD on a post about this video, posted a 4K picture from a 2080Ti, and the textures were way better, but it still clearly had fewer polygons
@@mix3k818 Sounds more like it. I am super curious what they are going to do about power though. And how well Navi will scale with more CUs. Vega 56 and 64 were more or less the same at the same clocks after all.
Nice to see a company implementing things that actually make sense for today's games. Absolutely love the fact that, "it just works", we don't have to rely on special implementations, and we get a real perceivable improvement to our gaming experience without having to sacrifice on performance or quality to get it.
1080p sharpening vs. 1440p both XT and 1800p 5700XT sharpening vs 2080 and 2080 Ti 4K please please please! This would change the value proposition incredibly for the XT
You could extrapolate the data from the info provided in the video. He mentioned almost 20-25% higher performance to get roughly the same quality. So it would probably perform in between a 2080 and 2080 ti at 1800p sharpened compared to native 4k on the other cards... The 1080p results he has mentioned in another comment that the 1080p native and 1440p native see good quality improvement but undersampling the 1440p didn't work as well.
Its still just a sharpen filter, you cannot compare it to another gpu running at native. Here if you want to try it www.reddit.com/r/Amd/comments/cc0575/i_ported_fidelityfx_cas_to_reshade_so_anyone_can/
That's really impressive. I was very hyped for DLSS until we saw how big a hit it took in image quality relative to native resolution, but this looks a lot more promising. Thanks for this video. You earned a sub.
It's a little harder on video because of the compression artefacts (Radeon's technique needs a clean image to work with for good results), but you should check out new video codecs like VVC and AV1. Eventually TH-cam will be able to halve bitrate while slightly improving quality through intelligent codecs.
Have you tried MPC-BE + madVR + youtube-dl? Though, it only works up to 720p, sadly, even if you open a higher resolution video it will load at 720p, but for upscaling, denoising and debanding low resolution videos it is very impressive.
@@claritoresdiano1021 If you download the youtube-dl file and copy-paste it into MPC-BE's folder, it works and lets you open TH-cam links on MPC-BE (as well as other supported sites). If on top of that you have installed and configured madVR you also could use high quality real time upscaling, denoising debanding and more.
Great job Tim! However, I feel these cards are more suitable for 1440p gaming. Could you please test these with 1440p 144Hz monitors, and down sample from there? For example: if the 5700XT hits 90fps @ 1440p, how much uplift can we gain by downsampling and using the sharpening filter and at what visual cost? Thanks for all your hard work!!
Yo HU, in the battlefield 5 one, is the Nvidia DLSS one running at the same detail/texture settings? The model and textures look almost completely different, is DLSS really that bad?
I've started using RIS for the first time with Cyberpunk 2077 and I'm impressed. Traditional sharpening doesn't usually look great, like the setting on a monitor or TV but this RIS contrast adaptive technique is really really good to my eyes.
Same here! I overlooked this feature and never gave it a chance until now while trying to find the best performance/quality settings for Cyberpunk 2077. I even took screenshots side-to-side comparsion, because I couldn't believe the difference and thought it was placebo... but man, does it make a difference.
Technically yes, but the thing is that the lower the resolution the less effective the sharpening becomes. So going from 1080p to 900p and sharepning it will look significantly worse than going from 4k to 1700p and sharpening. Purely on a lack of pixel information for the sharpener to work with.
Yes to textures, no to detail/aliasing. Of course, still much better, especially when we get mobile Navi APUs, where gaming resolution and texture quality will still be low - so it will benefit greatly from this. But on PC- I think we should aim for at least 1440p to 70%, which would be great for textures, and already offer ~enough detail.
I have done that before using Reshade and while it gets kinda close to native 1080p, the difference is still obvious. I don't know how it would work in AMD RIS case, but I don't expect much.
How about getting 1440p quality with a lower render scale? There are definitely games that don't hit high refresh rates at 1440p on the 5700xt, but if it could run 1080p or 1200p could really give the 5700xt the edge. This would also be very helpful for people running 1440p ultrawide or double ultrawide, where the pixel count starts hitting 4k'ish levels.
Ive seen the review of anti lag and guess what it works. Yet some shill youtubers who will not be named still push raytracing as a thing. Anti lag and image sharpening is more useful than what nvidia is pushing.
Nostra-Tim calls it! LOL! Great work Tim and Steve. I know you guys have been working overtime lately! Steve and Tim are Cylons, confirmed! Can the new NAVI cards be run in CrossFire? How does it handle VR? I would be interested to see how NAVI's image sharpening works in Elite: Dangerous. The game has such massive contrasts, some extremely thin lines that look like staircases, and quite a bit of text. Thanks again for all your hard work! o7
I have Samsung Magic Upscale in my monitor, basically the same thing? It has 2 levels of sharpening to choose from. It looks pretty good in my opinion but I haven't been gaming "real" after I bought this monitor so haven't used it, as it looks bad in normal apps. Nice video from you as always!
This is the first time I've seen DLSS. It's awful. It's the opposite of a "feature". As for Radeon Image Sharpening, to quote a certain GPU CEO, "It just works". At least for the most part it works as advertised.
I hate that blur in modern games. Recently played Mafia 1 and COD1 and was shocked how crisp they are: I didn't need to squeeze at my 27" screen to see smaller objects.
I know TH-cam compresses the video here, but even then, the DLSS images looks so soft in low contrast areas, like really muddy. Whereas the RIS really boost up those areas and makes them look crispy with no ringing... I am impressed.
Update (in case, you were wondering): Radeon Image Sharpening is now supported in DX11. It is also supported in Vega GPU's AS WELL AS RX 570, 580, 590 and 470, 480 Source: www.amd.com/en/technologies/radeon-software-image-sharpening Even though it might not be apparent in this video, Image Sharpening does in fact make things look a lot better. Crisper and brighter. I have seen the effects in Witcher 3, and also even in Benchmark softwares like Unigine's Superposition. I have noticed absolutely no fps drops whatsoever. I own an RX 570.
Hey Tim. Amazing content. I wasn't expecting such results. This is great news and suddenly want to go a notch higher non my upcoming GPU upgrade. Thank you.
I've used Reshade for quite along time now, and it can make games look alot better aswell, but of course it does have a performance hit, and not always guaranteed to work in all games.
I discovered Reshade relatively recently and used some of its stock shaders in a couple of games. For me, the combination of adaptive sharpening and "colorfulness" filters provided a huge improvement in image quality in Skyrim and Transformers war for the cybertron, both of which I've been playing lately. Naturally, sharpening produces some artifacting, but it's minimal in the way I configured it, while makes the overall picture look a lot clearer and in my experience highlights some of the fine details, sometimes even making textures look better than they actually are. Considering that those Radeon image enchancing features provide better results and are easier to use, then... I see it as an absolute win!
You can also do the same in the Nvidia Geforce Experience with the built in game filters. 4K to 1440P, use the sharpening and clarity sliders. Looks good and works in DX 11 as well.
Radeon Image Sharpening is at the moment the one feature that gives me buyers remorse for my Vega56, even though I would have paid 160€ more. I could probably put my r9 285 cooler on the Navi card. Hmm.
Another great video. Could you guys try make a series of benchmark for real life situations? 1. Running youtube video/twitch.tv while gaming. shows which Cpus is bedst at that because most people who game while watching listing to video do care which cpu can do that.
This is much more advanced than that (Navi has dedicated CAS algorithm hardware; said algorithm being MUCH more complex than anything used by ReShade [hence needing that hardware]), though both are of roughly similar "means" (post-process sharpening solutions).
I'd rather use this if I can, since it's better, but if you don't have a Navi GPU or you're playing a DX11 game that's a decent alternative, and it's still better than using DLSS.
Awesome review! With a future AIB card there should be a fairly large chance of a +5% overclock without to much effort. Add that to the 27% performance boost gained with 70-80% GPU scaling & RIS and you have a total performance rating of +166% (compared to a GTX 1080) whilst the 2080TI comes in at 169%? This would mean that 70-80% GPU Scaling with no significant quality loss with a decent overclock gives you a 10k card for just around 4,5k? Request: I would love to see a direct comparison with a 5700 XT 80% GPU Scaling + RIS with a 2080TI card in like 10 games.
Even at 1080p (4k isn't available yet) it looks considerably better. I've always turned AA off in games because I hate that blurry look but this fixes that perfectly and even allows you to upscale intensive games with minimal quality loss. Bravo AMD. This plus anti lag might be enough to get me to come back to team red (with an aftermarket cooler of course). Does anyone know if AMD will be releasing anything faster than the 5700XT in the near future though? I've already got a 1070Ti and a 30% performance bump isn't REALLY worth it to me.
Look like this has already been ported to ReShade, and it works with Nvidia cards (even with DX11). Just the CAS sharpening though, the scaling will have to be done in-game or in-driver though. Be interested to see how it looks and performs compared to the real thing.
@@Elixylon Well, that's my point, the 2070 Super is almost at the level of the 2080. It certainly doesn't make sense to buy the 2080 now that the 2070 Super is out. And because it certainly doesn't make sense to buy the 2070 Super now that the 5700 XT is out, I'm thinking it doesn't make sense to buy the 2080 now that the 5700 XT is out.
I watched this at 720p and often couldn't tell the difference. Really love the work HUB does, I can never get enough time to test this all for myself. Would love if you added some very zoomed in "pixel peeping" side by side comparisons that are very obvious to just illustrate some of the differences for those of us with poop internet.
Does it work for vr gaming? Can someone please check?! That would obviously be an awesome advantage for vr gamers who currently normally avoid amd cards as it seems.
It will work with anything using either one of the DirectX 9/12 or Vulkan graphics .api's. So as long as said VR game/software was coded for one of those (instead of say DirectX 11 or OpenGL), then yeah, it'll work just fine through a VR headset, just like it would any "monitor".
Looking forward to your ANTI LAG investigation! Ive got a rx580 and its hard to notice on or off in bf5. That said, if it does indeed improve the feel of gameplay im all for it
NVidia "How do we create a feature that can latch on to the recent AI craze?" AMD "How can we deliver nicer graphics for lower cost?". Think that says it all.
The shirt should have an image of a waffle with a line dividing it in half. On one side, it says DLSS and the image is smeared, and on the other side, it should say Radeon Image Sharpening, and it should be crisp.
The differences in this video can be a bit hard to spot, I recommend watching in 4K (when TH-cam gets around to processing it) or pausing the video at times to take a closer look. A high bitrate version for our Patreons will be available soon.
Video about image quality, TH-cam autos to 480p...........
Shows up clearly enough to me on my phone.
Tim.. Great review as per usual.. I have something juicy for ya :D
translate.google.com/translate?hl=en&sl=auto&tl=en&u=https%3A%2F%2Fwww.tomshw.de%2F2019%2F07%2F11%2Fungefesselt-radeon-rx-5700-xt-auf-ueber-22-ghz-uebertaktet-break-the-limits-mit-den-neuen-softpowerplaytables-fuer-die-rx-5700-und-rx-5700-xt%2F3%2F
Watching at 4k hmm... here i am with my samsung 720p phone 😃😃
I think it'd be nice if you circled the differences since I don't really want to spend time freezing frames and busting out the magnifier to spot them myself... I think I noticed at some point part of a tree that was being sharpened while the rest was being motion blurred but every other differences I may have noticed didn't feel significant; that suggests these features aren't very important.
So Navi sharpening has better game support, image quality, and performance on release than DLSS does after nearly a year in the wild. You could almost say of AMD's implementation... It just works?
To be fare it does take more effort as a user to take advantage of, testing res scales back and forth in every game to find the perfect spot. But it is a nice extra feature to be sure.
@@ps2232 the fact that it doesn't require per game implementation means it has a much bigger spread
@@everythingfeline7367 While I do agree in general, I was more talking about when being used in up-scaling (as this is what interests me the most), he clearly didn't use a standard res scale per game and stated settling for certain scales therefore I assume he had to test many different ones per game to find the optimal one.
In this case it does require a per game implementation just by the user not the company. which makes me wonder 2 things, time investment vs a 1 click option. and quality variation between finding the perfect sweet spot vs as long as its within x% res scale it doesn't matter.
@@ps2232 If I was buying a Navi card and running it I would just set it to 1800p across the board. It was the highest resolution he used, so no lack of quality, whilst still providing a performance uplift
@@ps2232 I've seen a video before with different resolution scaling in games with 4k. Generally there is a range where you won't spot difference (or hardly notice) while down scaling 4k which is different between games. This feature simply needs a couple of tries with the scaler to get the best of it. Don't forget, when using scaling you generally (afaik) don't need to restart the game. But if we're talking about getting that '60fps' spot with 4k, I'd say some tweaks are worth it.
Radeon Image Sharpening = Perfect for new Playstation and Xbox
I'm hyped for it. Major 4k frames around 80% density.
So true
That’s what I was thinking , it will definitely help with games that are checkerboarded or use no aa
I suspect this is one of the reasons it's so well done already: Sony and Microsoft might have paid for a lot of RnD
I suspect that's how the PS5 will pull off "4K" gaming at an acceptable price point
Watching this at 360p, as intended
*144p
Did you enable Radeon Image Sharpening? It makes a world of difference.
Me too.
@@tim3172 LOL
TH-cam goes higher than 360p? What?!
unlike NVIDIA Dlss game developer doesnt require to implement just go settings and turn it on and works.
well you have my money AMD
moreover, AMD's implementation is much better without that annoying oil painting effect.
Are you telling me AMD “ it just works”
@@backupplan6058 LMAO!
Agreed though I do wonder at how much time/effort it will take to find the perfect res scale for every game to get a good effect, each game clearly requires something different for optimal settings kinda wish he elaborated on how many he tested before settling on the ones shown, how much difference there is if you don't spend hours tweaking to find the perfect one.
@@backupplan6058 xD you got me with this meme, but actually, it is working, sadly i will not be able to use that on vega 56 ;( atleast i will have freesync
Holy damn, such a simple solution kills nvidias DLSS which took soo many tensor cores and machine learning, bumping the price noticeably... Bravo AMD
You know... Sometimes, simplicity is just better :)
Very well put; Nvidia got well and truly rekt on this front.
But but AMD bumped the prices too or rather seconded Nvidia. They don't even have the excuse of having to add tensor cores or any extra hardware but that is ok?
@@NatrajChaturvedi AMD can do no wrong with these shmucks.
@@Code-n-Flame it's called marketing. You want to position yourself at or near your competitors price to not seem like that bargain brand. It's also why Samsung prices their device similar to Apple, because they want to be seen as the premium Android device, even though Samsung's devices may be cheaper to produce or whatnot.
AMD just followed Nvidia in their pricing, because even if Radeon 5700xt were superior, people would still buy way more Nvidia cards. Happened in the 400 series era. The Radeon cards were superior at every price point, but people didn't buy them, kept buying Nvidia. So price is what it is because of marketing, and that people will pay whatever for Nvidia card anyways, why not price yours the same, while beating the equivalently priced card
"Deep learning" done right. But by the Radeon developers who learned things from experience. Kudos to team red.
Yes, they learned on nvidia deep failure with RTX&DLSS crap.
DLSS 1440p to 4K has less aliasing than native 4K with 4X MSAA. Why Nvidia doesn't put a shapening filter on top of that is beyond me.
@@allansh828 less aliasing cuz its a blurry mess
Maybe in metro its better but only a lil probably
@@branchprediction9923 It is definitely not blurry like some bad TAA implementation. Edges are well defined without shimmering.
By experience I think you mean the consoles' 4K upscaling techniques. Wouldn't be surprised if AMD drew from those and made their own sharpening tool.
sharp images are ALWAYS brighter! This is because blurriness merges nearby colors into the edges of what youre looking at. This isnt a flaw, its physics. Its even present in real life if you need glasses. I noticed it a couple nights ago and then to see it in this video -- gotta love that timing
10:32 wow look at that difference vs DLSS, crazy. And DLSS still gets hyped by some...
Because Jensen said DLSS is like magic wand free 4K that's better than 4K. And kids jumped on that.
I thought this might be a texture issue for DLSS but I captured the footage twice and it looked the same both times
@@railshot888 yes there are some that just believe because Jensen Says 'It Just Works',
I prefer to believe because Steve & Tim who work so hard to get the facts to Show me that 'It Just Works' or in the case of DLSS 'It Just Doesn't'
This is image sharpening not dlss. Nvidia has had image sharpening in freestyle since 2018.
Marketing is a strong thing lol
RIP DLSS
You certainly will not be missed..
Ahahahahah
# legend
@Salt Maker calm down and go pay that rtx tax
@Salt Maker ironically Hybrid tracing makes the 2060 crawl down to 970 lvl
RIP you idiots who dont know how to PC.. you could play at lower res + sharpend it up with reshade for years now lol "amazing"
I mean all I can say about Radeon Image Sharpening is that '' It just works '' ?
Lol a good one thumbs up.
Nvidia DLSS = Nvidia Did Lose (to) Simple Sharpening
Most underrated comment.. and epic..😊😁😂
NVIDIA got played again in their own game.
That's what I like about AMD.... They do come slower, but they do it better.
Not really. There are a LOT of dx11 games out there and none work with this. Something tells me by the time they finally get their act straight Nvidia will have improved DLSS enough to compensate.
@@FcoEnriquePerez True but they are always playing catch up which is not a good thing. As the market share between the two clearly reflects that. As good as this video tries to make it sound, it will not reflect in sales. AMD needs to beat Nvidia to the punch & until that happens, Nvidia is not going to lose any ground.
@@Keith2XS If you really watch the video you will know why it's not working in DX11😂😂
As TIM said, currently it only supports DX9 and DX12, other API follows soon.. The beauty of RDNA architecture which is seldom mention by reviewers is that FidelityFX ( Anti-Lag, RIS ) are "backwards" compatibles with older games that runs base on GCN architecture and handles other feature forward such as "Raytracing". This is why PS5 will be the best console, adopting RDNA as graphics ensuring old playstation games playablein PS5.
@@Keith2XS LOL
You guys never fail to deliver :-)
Been searching for this since reviews dropped.
DLSS, but better, and it's essentially just a Reshade profile that takes no performance. That shows how bad Nvidia was with DLSS.
It definitely shows that all they want to do is sell gimmicks, not anything that actually adds worth to your gaming experience.
That is also seen in the fact that all of Nvidia's gimmicks are closed source and proprietary. That means that only Nvidia can access the source code, and so game developers implementing those features can't optimize them for their game.
AMD, however, releases all of their features open source ( gpuopen.com/ ), for free. And being open source, developers can optimize them to work equally well on Nvidia GPUs too.
But Nvidia is the one with the money, sinking their claws into various game studios, forcing them to bog their games down with GameWorks gimmicks when these free and open source solutions are right there, free for the taking.
Nvidia does not offer value to the gaming ecosystem, despite their marketing prowess. They rig the system and milk it for all the money they can.
We'll see long-term but Image Sharpening causes its own issues. I downloaded the 4K version of this video and at 12:07 if you PAUSE the sharpened image was very grainy and in motion it caused an obvious flickering (may not be obvious if video is compressed so may need to see it at 4K)... DLSS has a lot of potential but I agree it's not so great right now. However, I also have a good, basic understanding of how DLSS works and it's actually quite fascinating. It's not a "gimmick" though it may or may not succeed. And DLSS may be proprietary in a sense, however NVidia will develop a DLSS profile for free so that's not an issue... I'm not going to call AMD or NVIdia's stuff B.S. until it's been given some time to improve.
@@photonboy999 Yeah, stop shilling dude. DLSS like all of Nvidia specific black box software are a blight on the industry. They spend far too much time trying to make it so that they are the "only" company capable of this and that and then are always outed for their shenanigans.
Just what are you defending, DLSS is rubbish, it had limited resolutions it can be used with, higher than would be tolerable performance hits and is only implemented in a handful of titles because developer have to go back in to implement it. What's to like about this rubbish? You Nvidia fans try too hard to defend even the most indefensible positions. High performance hit, limited scope of use by resolution and even limited titles can use it at all, while RIS negligible performance hit, works in all DX9, DX12 and Vulcan titles, not perfect but better than DLSS at nearly even metric they matters and you still find a way to brown nose. Disgusting.
@@photonboy999 you are not going to pause the games and notice that couple different pixels in real life scenario, this technology is actually better than nvidia's dlss for players that wanna have 4k experience but in a budget
@@evilformerlys4704 well in metro exodus its outstanding it works super duper amazing however in battlefield V isnt cause that game hasnt fixed it. But why not this ray tracing for example is something only nvidia has now and hey for me shadows and reflections aint a big deal but global illumination is a big change for me like Quake II and Minecraft (it doest use the RT cores at least yet) and Metro exodus this realistic lighting is outstanding and its the most demanding feature that even the GTX1080Ti falls behind from the RTX2060 when using RT cores. Nvidia isnt trash they try to make money like any company would do so i love the fact that amd gets better so they can force nvidia to release better cards cause Nvidia is far ahead in GPUs technology even these RX5700 with 7nm actually have higher consumption than the RTX cards that have 12nm.
Upsampling with AI calculation losing against a sharpening filter with contrast control... Hey Jensen, it just works.
Nvidia likes to make things automated. Like ray tracing. AMD uses more traditional ways
Su wins this time. Poor uncle.
This is A.I. based as well.. But its done right.
@@serfillustrated4018 its not ai based at all, its a post process filter, 3rd party software was able to do that already so its not amd revolutionary tech thing going here, it looks quite bad, some ppl love oversharp stuff ik, but its more like reshade, specially that bf V clip it WAS reshade footage i swear, that oversharpened texture, ughh.. but yes, reshade is meant to LOOK better so, who actually cares of how, or if its more like image "faking" as long as it looks better and performs as promised, but this scalers and things barely do much, if they do something its barely noticeable or u go oversharp, ofc mhe vs absolute trash (dlss) is fairly easy to decide, but this kind of sharpenning is NOT new while dlss IS new, does that matter for the final consumer, no, but just like AA implementations, first AA implementations murdered perfm, first "low cost" AA destroyed image quality..... nowdays good AA looks better and costs little to enable, dlss has potential, for now its worse, just like raytracing, is it useless or bad, no in fact its the future, is RTX 2000 series good... thats completely diferent but no, and yes, if u never push a new tech u never get it, so, i like the raytracing, this is just the first gen of both dlss wich btw mens deep learning (and since its new, not much... learning done) and rtx, also new... give these more years and we'll see, that being said as of now ANY post process filtering (sweet fx, reshade, or this amd sharpen>dlss)
@@arencorparencorp2189 Yes it is... He even says it in the video..lol But like you said, the consumer only cares about the final result. Thats why ps4"s checkerboard upscaling is awesome even if its Faux-K... www.amd.com/en/technologies/radeon-software-image-sharpening
Currently playing Monster Hunter World, and constantly being dissatisfied with the AA options. Image is either too aliased or blurry. Really hope image sharpening provides some decent stuff on that front.
The only AA option that doesnt suck really hard on MHW is FXAA, the other options makes it seems like we are with motion blur, but without motion... rofl
@@dsm828 I tend to just turn it all off. I may just force msaa through control panel if it doesn't kill my performance.
If you are running an AMD card, you might try VSR. Virtual super resolution is a setting in the driver that allows you to Run at a resolution above typical 1080p monitor such as 1440p and 1800p and then downscale it back to your monitor. It basically has the same impact as SSAA and can be combined with other in game AA. I use it often for well optimized games like DOOM or older games where the performance hit is negligible to clean up the image such as Boarderlands 2 with the new HD texture pack.
Lmfao its because they have the shit volumetric lighting, if you dont want to have blurriness turn it off
Does reshade work with mhw? You could try that
Another reason to buy Navi card. Ray tracing using image sharpening will be PS5 and Xbox scarlett's secret sauce.
@@hitmarkler honestly this does change things. "4k"with Ray tracing resolution could just be 1440p with image sharpening. That would save performance and probably enable 60fps ray tracing at good resolution.
Or like, not-at-all-secret-sauce given its open sourced
@@mduckernz www.reddit.com/r/Amd/comments/cc0575/i_ported_fidelityfx_cas_to_reshade_so_anyone_can/
It has already been ported to reshade...
@Matt D "Secret sauce" is an idiom, it doesn't mean something is literally secret.
So now when someone says "but Radeon doesn't have RTX features", I can say "well RTX doesn't have Radeon features"
RT is basically a proprietary version of DXR which is a feature on DX12.
If I have a thing to complain about AMD right now: Their video encoder sucks. Also OpenCL is kinda flop compare to CUDA support from developers.
@@Archmage1809 Ray Tracing (DXR) is not "just" a feature of DX12, it's an API that allows your Ray-tracing hardware (whether be it Nvidia, or AMD in the future) be compatible with games.
The RTX cards just have proprietary *hardware* that allows them to more effectively ray-trace through an API like DXR.
Hence why even GTX cards can ray-trace, but at crawling frame rates.
@@Archmage1809 Radeon has ROCm, allowing Radeon processors to implement Tensorflow. So you don't really need CUDA.
@@Archmage1809 only for certain encoding works... There is a video from Wendell from level1tech and eposvox regarding the massive encoding performance of the 5700xt...
@Dex4Sure Again, that depends on the GPU they have. Most use CUDA because most have Nvidia GPUs. But if they have Radeon GPUs, which don't have CUDA (it's a proprietary Nvidia feature), they can use AMD's open source ROCm.
Well, looks like all future PS5/Xbox Scarlet Games will be sharpened :)
exactly what i was thinking. thats the 4k for consoles .
if a 5700XT can almost match a 1080Ti for 400
I imagine consoles could pull off 4k30 next generation without much issue
@@abirneji they could already do it but with less eye candy than PC games. Baked textures checkerboard rendering etc are conole friendly stuff.
@@VoldoronGaming I would go checkerboard over native for consoles regardless, only because a stable framerate is more important than detail when you're chilling out on a sofa
@@abirneji I think they will be able to do 4k60 just fine. It will be a custom navi chip. And we didn't hit the top of navi performance here. There is no way they are going to give up on the opportunity.
>when your free reshade filter is superior to deep learning super sampling ai
iT jUsT wOrKs!
>some people still think the super sampling ai is better because oversharpening like this does literally nothing unless you stare at the tree that's 300 yards away
I'm hearing stuff like since dlss is AI based, it'll improve eventually. But nvidia needs to do something with that oil paint effect asap
@@zade6828 They can't fix the oil paint effect, it's AI logic, they don't understand what's "good" and what's "bad" the AI tries to recreate the image and does it but it costs the "realism" our productions have, until someone sets the parameters for "realist quality" DLSS won't improve at all.
AMD Image Sharpening, "It Just Works". ;-)
"It Actually Works" ;) did you see the tank textures at 10:35 it was interesting
Lmao
"It Just Works Better"
@@markosz22 yep, I guess I should have added a " ..." And "better" at the end. ☺
steve&tim are a rare breed of unbiased techreviewers, unlike most muppets out there. keep it up! (happy to be patron too)
Hear hear!
I do agree but I still think they're like a slight bit in favour of AMD and for good reason though, we all love the underdogs.
@@Rhino123freak i am fan of the underdog here as well (at least for now), more because of the intel oem buyout they did in late 2006 when they couldnt compete. that cant be talked right in my book. same with nvidia and GPP last year. i wish for risc-v and arm to lift off into maigstream, as x86 is a powerhungry isa.
@Alistair not even talking about Intel here. I belive the super series cards launch was pretty damn good but even before Navi launch the overall opinion on the super cards by HU was more or less 'eh' while other reviewers agreed it's pretty solid. Again, nothing substantial, just a hint of a love for team red that edges out others and that's fine by all means.
@@Rhino123freak You would have never got that hint of 'love for AMD' during the FX years. If we seem biased towards AMD today, that's probably because they're offering good products, the second that changes so will our bias ;)
nvidia's shill tech tubers didnt even bother to talk about these things good job hardware unboxed ty for the video
It is also really funny to see them trying to pretend like they are impartial. Honestly, my only grip with the NAVI cards are their cooler, but at the same time, I don't really have any intention to overclock them, so I don't know if the base cooler will be enough.
Well it is tech that has been out there for years such as reshade which has sharpening filters and other things like aa which I have been using for years I recommend using it
@@marstondu67 the base cooler is perfect if you're not going to overclock. Just under volt it and set a fan curve under 2000 rpm and you'll not hear it and will game perfectly fine
@@toonnut1 I just saw a video from the channel Not an appel fan, where he was talking about undervolting for the RX 5700XT and how quiet and performant it is. I didn't realize it was that simple to undervolt the card. I've also an very old PC with an i5 2400 and a GT 545, so I'm pretty much used to a very loud PC overall ( to the point where I thought my PC would launch to space ), so the noise is not a problem for me. I will still just be waiting for 3rd parties solutions's prices and if it is significantly higher, I will go for the stock version, especially since I am not that interested in overcloking the card. The fact that I could get 6 months worth of Game Pass by buying a Ryzen 3700x and a 5700XT is also something worth mentioning I think.
Like when Nvidia introduced GSync and to be countered by freesync and now DLSS with Radeon Image Sharpening.
I will laugh if in the near future if Nvidia will force to support the image sharpening like what happen to free sync.
nvidia already support image sharpening with their Freestyle. and unlike AMD where the support is only exclusive to Navi nvidia image sharpening work with all of their GPU.
They will, right now RIS just destroys DLSS, they had to do the same in monitors cause the % of monitors using Freesync was far higher than the ones using G-Sync, here the same happens, you can literally use RIS in anything (also DX11 if you are using the Reshade version that just got implemented using the Open source RIS code)
Yeah, I fell for it. Got a G-sync monitor but want to get the 5700XT but obviously Freesync is not going to work on the g-sync monitor, feelsbadman
Nvidia: Worse looks for better FPS.
AMD: Better looks with actual FPS.
now Nvidia DLSS 2.0: Better looks and better FPS
@@ThemasasterAdolfo No depends on the game take BF5 Nvidia's DLSS is broken and performs horribly and looks worse
DLSS 2.0 is easier to implement and update, if you want DICE can update the DLSS 2.0 implementation in BT5, you don't have to send much information.
At 10:35 look at the tanks texture on DLSS (BLUR) compared to Sharpening
Almost as if the texture was set to medium in comparison.
yep noticed that as well
"4K" DLSS hahaha lol
It looks like console
Yea I'm unsubbing from this channel after seeing this. Clear AMD pandering at it's worst. I have BFV. I play it every damn day and in no situation with DLSS turned on have I noticed textures get THAT mucked up. Medium? Try low.
@@Code-n-Flame HAHA lol wow, a dlss user got triggered that he found out he was scammed, but he doesn't own up to it XD
How does it look when sharpening a 1080p image vs. native 1440p? This might be a way to achieve 144fps for some games while keeping the image quality.
I tried it but honestly, not great results. It's good for sharpening native 1080p or 1440p but if you're downsampling the pixel count gets a bit low to resolve enough detail for sharpening to 'fix'
Hardware Unboxed are there any image results available?
@@adas2051 watch AMD @ E3, they have some images of that, and they actually came clean about it in matter of image quality.
@@Hardwareunboxed Thanks for the reply! You've cleared up everything that I wanted to know about this.
Edit: Seems like there is no easy way to achieve 1440p 144fps without throwing money at it. There are now more affordable 1440p 144Hz Monitors available but not so many affordable ways to feed them :/
Hardware Unboxed so .. 1080P sharpened, on a 1080P Monitor of course. Would still provide some clarity improvement? Not 1440P level but better than 1080P.
And I get AMD ad about Image Sharpening Technology 😂
I got it also
I got an rtx super ad. Lmao thats gonna be a no from me jenson
LoL got a ryzen ad
I got TWO RTX Super ads Watching thjs video alone! Making about 4 or 5 total today - I wonder if Nvidia are worried now?
I also got the RTX Super ad. It seems Nvidia are indeed worried if they now are putting ads on AMD Radeon videos.
Nvidia: "Just tryna keep that mindshare, fam! It just works!"
Right now I can't justify why someone would buy a "Super" card over AMD's Navi cards.
Because they are loud and hot as hell. Once AIB's come out, I agree completely
Because they're faster?
The shitty reference cooler is a valid reason. Sure, the best thing to do would be to wait a month for the custom cards, but some people are going to want or need a card right now, and once again AMD's dreadful attempt at a cooling solution is letting the side down.
Lighting and Ray Tracing related workloads?
someone who already owns a gsync-monitor
finally not a vaseline texture cough DLSS cough
Radeon Image Sharpening - It Just Works..
....Better than DLSS
Great video. Completely forgot about radeon image sharpening but it looks promising, way better so far compared to Dlss
Just install reshade, been using it for years on games by adding in sharpening filters and aa best add on program for games
I hope AMD (as a company) only gets better from here on out.
Yeah, they still have a lot to learn from Nvidia in terms of power efficiency and driver updates...
R&D spending at Nvidia and Intel is a fraud. AMD has better results with 15% of competitors budget.
Well it's the fact that they don't have lots of money to throw around that helps them come up with stuff like this
@@DeosPraetorian It's mostly an internal decision. AMD is also embracing open-source on their technology and on their drivers while Nvidia only play the open source game with their Jetson ARM SBC and nothing else.
slow down mates i might just have to go and get a 5700xt if you keep this up
@Yuck Foutube good idea
@Yuck Foutube I have a Morpheus II lying around, so I'm getting really tempted here..
@@defeqel6537 i'm dying to see a Morpheus 2 review on this, i'm surprised no youtuber did it. It could give a approximation of the results of a custom board, you could literally have the only review of a custom board online.
You did an amazing job here, Tim!
_Really impressive_ , will buy the 5800XT !
5800XT?
You mean 5700XT😂
@@lukegrundlingh6070 think there was a leak somewhere about a 5800xt which if its even a little bit better than 5700xt then im down to buy it
@@lukegrundlingh6070 no I really mean 5800xt
I could be buying RX5950XTX or whatever is the most high end model that comes out.
@Billy With that many X's it's got to be porn!
The guys over at Lvl1Tech were already super enthusiastic about AMD CAS, and watching this, I can see why.
I'm curious to see what a 1080p sharpened image can become to get those higher FPS on 144Hz refresh 1440p monitors.
Rest in peace DLSS, along with PhysX and another Nvidia gimmick
Greg Goodenough if nvidia bought it then it’s nvidia’s. Lol
@@miyagiryota9238 they bought it and make it work bad on CPU's by disabling newer command extensions (SSE).
Greg Goodenough I don’t need to know answers that I already know thank you.
Greg Goodenough thanks dr psychiatrist.
Greg Goodenough not lol
How is that train even staying on the tracks ? That must be the worst rail track in the world...
Russian technology. Both crap, and sturdy enough to withstand the crap it has to face.
I was thinking that the whole time!
I kept getting distracted by that & not looking at the detail comparison too 😁
@@erejnion Shit-but-works is a quality class all of its own lol. Russian chemistry is often the same way, or was - it's all high pressure, high heat, bake and pray, but works amazingly well with heavy duty machinery lol
Wow, that's actually very impressive! DLSS reduced to a meme.
This is actually useful.
There are reasons why XBOX and Playstation stick with AMD all these years
Just wait for Aftermarket versions and driver update
👌
RTX is Overprice
RT is Overrated
RTXS ??
yes, you absolute mongoloid, they stick with amd because they use APU's in consoles, and nvidia doesn't do APU's. They aren't even an option.
@@TheConst4nt well rdna will be used in the ps5 in some way soo games brought out to consoles are easily optimised for navi cards
raytracing is a nice concept i welcome any day
but i wont pay a fortune for a feature thats barely being used
i havent had a amd card in years and i dont plan to get one too soon either because in the end all we want is good performance for a affordable price
glad competition is back
@@TheConst4nt The Nintendo Switch uses Nvidia.
@@TheConst4nt Tegra anyone?
@@NajCarnage Way too weak for proper consoles, also nVidia hasn't been a great partner to either Sony or MS, so no surprise they don't want anything to do with them
They should have called this feature AMD Fine Sharpend Wine.
Thanks for the review. 1 thing that made me curious after watching the video is the possible impact of this feature for the upcoming APUs after their update. That could have some interesting effect on playability and could turn out to be an interesting way to counter possible future intel's igpu improvements. But we shall see it at some point in the future.
Anyway, good luck in the future and have fun.
Wow, great job! Never knew what Radeon image sharpening did, but seems extremely useful now that I understand!
LMFAO, so Nvidia strapped on expensive and data center focused Tensor cores to their already overpriced GPU's, spend a ton on R&D to develop machine learning algorithms to reconstruct images...and AMD got better results with post processing.
DrearierSpider1 DLSS was a mistake.
I don't think DLSS is really training anyways. If it was it must be really slow. And the initial product is significantly worse.
Or Jensen can say "we do not have enough sample size, for our AI to train effectively, so everyone should get their friends to buy 2 RTX cards and help pushing this along".
I still think Jensen just threw in "Deep Learning" to make it sound cooler and as a reason to justify why they look like Vaseline smeared all over monitor.
Nvidia can just say "doh nubs it's deep learning, and eventually it will be just as good as real 4k with no specific time frame." It could be tomorrow, or in 2025 or it could be 2030 when their "AI" completes the learning process or it could be never.
Its almost like using ReShade and sharpening the image
it's not like Nvidia developed tensor cores and stuff like that for gaming exclusively. It's really useful in machine learning in general, science, industries, ML/DL/AI is the future and Nvidia is driving it. Of course, it can be used in games and as AMD provided no competition for quite some time, Nvidia can charge whatever they want for their cards to cover the R&D costs a bit.
But I guess that doesn't really interest mere gamers. Not saying Nvidia is "saint", but we can't judge them just from the gaming perspective. Well done AMD.
DLSS -- just the vaseline nvidia used to lube up their fanboys before FUCKING THEM.
5700 XT is looking better and better. Very tempted to replace my gtx 1070 with it.
wow, don't be dumb :D
I think Sony and Microsoft had a hand in AMDs R&D for some upcoming products of theirs :)
yes, because there wont be any 4k games in native resolution.
@@LilljehookThe One X already achieved games in native 4K, like RDR 2 and Forza Horizon 4, so yes there will be games in native 4K on both PS5 and Scarlett. Plus, if we take new technologies from AMD and optimization methods specifics to consoles, 4K 60 FPS is totally feasible.
John Kenobi What could this do for a game like Bloodborne?
@@gamernation1400 Honestly, much more details and 60 FPS for the game. Open World games will also have many improvements in terms of framerates and loading times.
John Kenobi Wouldn’t the game need a patch to run at 60 FPS?
WOW, got a Vega64 here, didn't have any plans to change but this might be worth it alone,
* Watching 4k comparison in crappy YT 720p * Another triumphant victory for AMD!
Seriously, you can see improvement even in compressed 1080p and it's almost free(hello, RTX)
*watching 4k comparison in crappy TY 360p on a mobile phone* what am I doing?
The Battlefield V one is insane, AMD is even beating native 4k in texture detail while NVIDIA looks like it has 480p textures, just look at the tank it looks like a motherfucking PS2 game.
@@stargazer162 must be something wrong with that, not sure if it's HWU, or nVidia's bug / "optimization"
@@defeqel6537 After looking at it for a while there is definitely something weird going on, not only the textures look like if they belong to GTA San Andreas but even the 3D model has less polygons on the NVIDIA side.
@@stargazer162 yeah, someone on r/AMD on a post about this video, posted a 4K picture from a 2080Ti, and the textures were way better, but it still clearly had fewer polygons
I think we need a reshade (lumasharpen/adaptive sharpness) comparison against Radeon image sharpening.
gist.github.com/martymcmodding/30304c4bffa6e2bd2eb59ff8bb09d135
Speaking of AMD, if they launched the rumored RX 5950 XT, what will it compete against? The TITAN series? In that case, they'd win in a landslide.
I doubt we are getting a big 7nm die anytime soon.
@@Tjuri10 Certainly not this year. The leaks suggest 2020, but I feel like it is closer to the end of 2020 or just 2021.
@@mix3k818 Sounds more like it. I am super curious what they are going to do about power though. And how well Navi will scale with more CUs. Vega 56 and 64 were more or less the same at the same clocks after all.
@@Tjuri10 who cares about power in high end?
Nice to see a company implementing things that actually make sense for today's games. Absolutely love the fact that, "it just works", we don't have to rely on special implementations, and we get a real perceivable improvement to our gaming experience without having to sacrifice on performance or quality to get it.
1080p sharpening vs. 1440p both XT and 1800p 5700XT sharpening vs 2080 and 2080 Ti 4K please please please! This would change the value proposition incredibly for the XT
You could extrapolate the data from the info provided in the video. He mentioned almost 20-25% higher performance to get roughly the same quality. So it would probably perform in between a 2080 and 2080 ti at 1800p sharpened compared to native 4k on the other cards...
The 1080p results he has mentioned in another comment that the 1080p native and 1440p native see good quality improvement but undersampling the 1440p didn't work as well.
Its still just a sharpen filter, you cannot compare it to another gpu running at native. Here if you want to try it www.reddit.com/r/Amd/comments/cc0575/i_ported_fidelityfx_cas_to_reshade_so_anyone_can/
That's really impressive. I was very hyped for DLSS until we saw how big a hit it took in image quality relative to native resolution, but this looks a lot more promising. Thanks for this video. You earned a sub.
could this feature implementing on youtube ?
144p + image sharpening ?
🤔
Finally internet data could be saved if it happens
It's a little harder on video because of the compression artefacts (Radeon's technique needs a clean image to work with for good results), but you should check out new video codecs like VVC and AV1. Eventually TH-cam will be able to halve bitrate while slightly improving quality through intelligent codecs.
Have you tried MPC-BE + madVR + youtube-dl? Though, it only works up to 720p, sadly, even if you open a higher resolution video it will load at 720p, but for upscaling, denoising and debanding low resolution videos it is very impressive.
@@stargazer162 i've never use madvr + youtube-dl.
just use youtube-dl on SVP tube2
@@claritoresdiano1021 If you download the youtube-dl file and copy-paste it into MPC-BE's folder, it works and lets you open TH-cam links on MPC-BE (as well as other supported sites).
If on top of that you have installed and configured madVR you also could use high quality real time upscaling, denoising debanding and more.
Great job Tim! However, I feel these cards are more suitable for 1440p gaming. Could you please test these with 1440p 144Hz monitors, and down sample from there? For example: if the 5700XT hits 90fps @ 1440p, how much uplift can we gain by downsampling and using the sharpening filter and at what visual cost? Thanks for all your hard work!!
Yo HU, in the battlefield 5 one, is the Nvidia DLSS one running at the same detail/texture settings? The model and textures look almost completely different, is DLSS really that bad?
Yes.
Have you ever seen a positive review about dlss? Other than Jensen's
I've started using RIS for the first time with Cyberpunk 2077 and I'm impressed. Traditional sharpening doesn't usually look great, like the setting on a monitor or TV but this RIS contrast adaptive technique is really really good to my eyes.
Same here! I overlooked this feature and never gave it a chance until now while trying to find the best performance/quality settings for Cyberpunk 2077. I even took screenshots side-to-side comparsion, because I couldn't believe the difference and thought it was placebo... but man, does it make a difference.
So could you downscale 1080p to 70% and then apply this image sharpening and get close to the original quality in order to boost fps in fps games?
Technically yes, but the thing is that the lower the resolution the less effective the sharpening becomes. So going from 1080p to 900p and sharepning it will look significantly worse than going from 4k to 1700p and sharpening. Purely on a lack of pixel information for the sharpener to work with.
Yes to textures, no to detail/aliasing. Of course, still much better, especially when we get mobile Navi APUs, where gaming resolution and texture quality will still be low - so it will benefit greatly from this.
But on PC- I think we should aim for at least 1440p to 70%, which would be great for textures, and already offer ~enough detail.
May not be great but I'll definitely still do it with whatever low end 75w card they come out with.
Yes. Radeon Image Sharpening works at every resolution. :)
I have done that before using Reshade and while it gets kinda close to native 1080p, the difference is still obvious. I don't know how it would work in AMD RIS case, but I don't expect much.
How about getting 1440p quality with a lower render scale? There are definitely games that don't hit high refresh rates at 1440p on the 5700xt, but if it could run 1080p or 1200p could really give the 5700xt the edge. This would also be very helpful for people running 1440p ultrawide or double ultrawide, where the pixel count starts hitting 4k'ish levels.
Ive seen the review of anti lag and guess what it works. Yet some shill youtubers who will not be named still push raytracing as a thing. Anti lag and image sharpening is more useful than what nvidia is pushing.
Nostra-Tim calls it! LOL! Great work Tim and Steve. I know you guys have been working overtime lately! Steve and Tim are Cylons, confirmed! Can the new NAVI cards be run in CrossFire? How does it handle VR?
I would be interested to see how NAVI's image sharpening works in Elite: Dangerous. The game has such massive contrasts, some extremely thin lines that look like staircases, and quite a bit of text. Thanks again for all your hard work! o7
Who would win.
Deep-learning upscaling algorithm vs. One post-process sharp boye
I really like the fact that it has DirectX 9 support as there are older games that have a quite small UI in 4k
Thank god it doesn't apply Vaseline to my screen.
I have Samsung Magic Upscale in my monitor, basically the same thing? It has 2 levels of sharpening to choose from. It looks pretty good in my opinion but I haven't been gaming "real" after I bought this monitor so haven't used it, as it looks bad in normal apps. Nice video from you as always!
This is the first time I've seen DLSS. It's awful. It's the opposite of a "feature".
As for Radeon Image Sharpening, to quote a certain GPU CEO, "It just works". At least for the most part it works as advertised.
I hate that blur in modern games. Recently played Mafia 1 and COD1 and was shocked how crisp they are: I didn't need to squeeze at my 27" screen to see smaller objects.
DLSS is a gimmick. Change my mind.
gimmick is way to gentle
it's a total FAIL - those tensor cores are completely useless
DLSS is a total fail. AMD's Image sharpening is actually useful.
I know TH-cam compresses the video here, but even then, the DLSS images looks so soft in low contrast areas, like really muddy. Whereas the RIS really boost up those areas and makes them look crispy with no ringing... I am impressed.
Update (in case, you were wondering):
Radeon Image Sharpening is now supported in DX11.
It is also supported in Vega GPU's AS WELL AS RX 570, 580, 590 and 470, 480
Source: www.amd.com/en/technologies/radeon-software-image-sharpening
Even though it might not be apparent in this video, Image Sharpening does in fact make things look a lot better. Crisper and brighter. I have seen the effects in Witcher 3, and also even in Benchmark softwares like Unigine's Superposition. I have noticed absolutely no fps drops whatsoever.
I own an RX 570.
Hey Tim. Amazing content. I wasn't expecting such results. This is great news and suddenly want to go a notch higher non my upcoming GPU upgrade. Thank you.
So AMD, add DX11 support, and a per game toggle for it. Otherwise, awesome job!
There is a per game toggle in the Radeon settings
@@DeosPraetorian Sure that's not Anti-Lag?
compile DXVK for Windows ;)
Welcome to ReShade where this "Navi magic" has been ported to basically any GPU: gist.github.com/martymcmodding/30304c4bffa6e2bd2eb59ff8bb09d135
I've used Reshade for quite along time now, and it can make games look alot better aswell, but of course it does have a performance hit, and not always guaranteed to work in all games.
Ok this is the tenth 5700 video where I had a add for the super cards.
Right! This is the first time I've ever gotten a super ad and at the beginning of a positive Navi video.
Nvidia is trying really hard to not lose people to these far better value Navi cards, that even beat their DLSS vaseline!
Nvidia paying to advertise AMD products.
I discovered Reshade relatively recently and used some of its stock shaders in a couple of games. For me, the combination of adaptive sharpening and "colorfulness" filters provided a huge improvement in image quality in Skyrim and Transformers war for the cybertron, both of which I've been playing lately. Naturally, sharpening produces some artifacting, but it's minimal in the way I configured it, while makes the overall picture look a lot clearer and in my experience highlights some of the fine details, sometimes even making textures look better than they actually are. Considering that those Radeon image enchancing features provide better results and are easier to use, then... I see it as an absolute win!
Awesome! Thanks for the great comparison. DLSS really is perfect for making games slower and more ugly *LOL*
You can also do the same in the Nvidia Geforce Experience with the built in game filters.
4K to 1440P, use the sharpening and clarity sliders. Looks good and works in DX 11 as well.
IKR people are so dumb, they think its something new. :D
Me: Clicks on HWUB video about RIS,
TH-cam: *shows Nvidia Super ad*
Me: 😂
I think downsampling and upsampling got mixed up in this video (edit: I'm wrong, mixed that up with up/down-scaling).
You are describing upscaling and downscaling. Downsampling is when you reduce the sample count, which happens when you reduce the render resolution
@@Hardwareunboxed I see. Thanks! Never really gave a this thought, believed downscaling and downsampling were interchangeable.
Radeon Image Sharpening is at the moment the one feature that gives me buyers remorse for my Vega56, even though I would have paid 160€ more. I could probably put my r9 285 cooler on the Navi card. Hmm.
@@abeidiot Radeon Anti-Lag came to even GCN 1.0 cards, don't see why CAS wouldn't after some driver testing
It might. I am still waiting for the Navi whitepaper to see the differences to Vega. Hot Chips is too far away.
Another great video. Could you guys try make a series of benchmark for real life situations?
1. Running youtube video/twitch.tv while gaming. shows which Cpus is bedst at that because most people who game while watching listing to video do care which cpu can do that.
Install ReShade, turn ON Lumasharpen.
Vulkan/Dx12 support work is in progress.
This is much more advanced than that (Navi has dedicated CAS algorithm hardware; said algorithm being MUCH more complex than anything used by ReShade [hence needing that hardware]), though both are of roughly similar "means" (post-process sharpening solutions).
I'd rather use this if I can, since it's better, but if you don't have a Navi GPU or you're playing a DX11 game that's a decent alternative, and it's still better than using DLSS.
@@Cooe.
FidelityFX CAS shader is now available for ReShade.
www.reddit.com/r/Amd/comments/cc0575/i_ported_fidelityfx_cas_to_reshade_so_anyone_can/
Awesome review!
With a future AIB card there should be a fairly large chance of a +5% overclock without to much effort. Add that to the 27% performance boost gained with 70-80% GPU scaling & RIS and you have a total performance rating of +166% (compared to a GTX 1080) whilst the 2080TI comes in at 169%?
This would mean that 70-80% GPU Scaling with no significant quality loss with a decent overclock gives you a 10k card for just around 4,5k?
Request: I would love to see a direct comparison with a 5700 XT 80% GPU Scaling + RIS with a 2080TI card in like 10 games.
Looks like you just got new glasses and your eyes hadn't adjusted yet
Even at 1080p (4k isn't available yet) it looks considerably better. I've always turned AA off in games because I hate that blurry look but this fixes that perfectly and even allows you to upscale intensive games with minimal quality loss. Bravo AMD. This plus anti lag might be enough to get me to come back to team red (with an aftermarket cooler of course). Does anyone know if AMD will be releasing anything faster than the 5700XT in the near future though? I've already got a 1070Ti and a 30% performance bump isn't REALLY worth it to me.
This is good ! Let the companies fight for the consumer !
Look like this has already been ported to ReShade, and it works with Nvidia cards (even with DX11). Just the CAS sharpening though, the scaling will have to be done in-game or in-driver though. Be interested to see how it looks and performs compared to the real thing.
So basically the 5700XT is a better, cheaper version of the 2080?
2070, but yeah.
2070 Super, but yeah.
All we need now are better coolers and optimized drivers. And boom ultimate bang for your buck.
@@Elixylon Well, that's my point, the 2070 Super is almost at the level of the 2080. It certainly doesn't make sense to buy the 2080 now that the 2070 Super is out. And because it certainly doesn't make sense to buy the 2070 Super now that the 5700 XT is out, I'm thinking it doesn't make sense to buy the 2080 now that the 5700 XT is out.
It's going to be, but you'll have to wait until next month when the non blower cards release.
I watched this at 720p and often couldn't tell the difference. Really love the work HUB does, I can never get enough time to test this all for myself.
Would love if you added some very zoomed in "pixel peeping" side by side comparisons that are very obvious to just illustrate some of the differences for those of us with poop internet.
Does it work for vr gaming? Can someone please check?! That would obviously be an awesome advantage for vr gamers who currently normally avoid amd cards as it seems.
It will work with anything using either one of the DirectX 9/12 or Vulkan graphics .api's. So as long as said VR game/software was coded for one of those (instead of say DirectX 11 or OpenGL), then yeah, it'll work just fine through a VR headset, just like it would any "monitor".
This looks (!) great, any chance of testing mGPU in some titles that might support it when you've done the anti-lag?
That video. . .
Was amazing!
Looking forward to your ANTI LAG investigation! Ive got a rx580 and its hard to notice on or off in bf5. That said, if it does indeed improve the feel of gameplay im all for it
NVidia "How do we create a feature that can latch on to the recent AI craze?" AMD "How can we deliver nicer graphics for lower cost?". Think that says it all.
Can we have the same today with 6000 Series/Big Navi? And also... against DLSS 2.0? (or was that already done?)
You guys should sell a shirt that says "Enough waffling" and has a really well done illustration of a waffle.
With the syrup dripping off it lol
The shirt should have an image of a waffle with a line dividing it in half. On one side, it says DLSS and the image is smeared, and on the other side, it should say Radeon Image Sharpening, and it should be crisp.
Not using DLSS, otherwise it would look like a pile of 💩
Image sharpening vs Nvidia free style sharpening vs FXAA Injector Tool vs Reshade Sharpening would be great.