allowing FSR 3 frame gen to work with other upscaling methods is such a huge win for us RTX 20 and 30 series users out there
There is already a mod that does that, but maybe a native implementation is better
@@clinten3131 I played the entirety of Banishers and a bit of Alan Wake 2 with that mod and did not notice any issues with it on my RTX 3090.
I just sold my RTX 3080 😢
What do you expect from software upscaling?! Nvidia has the best upscaling thanks to dedicated hardware, namely the Tensor cores.
@@Kntl125 You realize that Nvidia's hardware does similar computations to AMD's software solution? Specialized circuitry/processors are always faster than software on general-purpose processors.
Great to see updates coming from AMD, especially incremental ones that reduce the flickering / shimmering.
That's the only reason why I haven't bought an amd card so far
They really need to improve the quality of the characters' hair in FSR. I hope it improves with the update
I'd like to imagine that last video made Amd realize they forgot something 🤣
"Hey Greg, did you push "send" on that update?"
"Ye- er..... now? 😅"
Lol literally days after HU tearing FSR a new one, part of me wants to think it's not a coincidence.
Gotta feel bad for hardware unboxed.
Honestly, given how rushed this presentation looks ("temporary" instead of "temporal", grainy GIFs instead of proper videos), I think it's entirely possible that AMD were like "oh shit, we weren't going to announce this yet but we *have* been working on it, so let's push something out".
😂😂
Decoupling Frame gen from upscaling is great news and I hope AMD delivers on their promise to upgrade FSR upscaling, but at least we'll be able to use frame gen with other upscalers if FSR keeps lagging behind. more options are always great.
Yeah, honestly I'm kinda fine with FSR sucking now. UE5's TSR is pretty great, I'll just turn that on and use it with frame gen
yeah framegen is just useless... upscaling can be decent if it doesn't look like cr@p
Huge win for gamers. We needed an update and more game support
Hopefully the Avatar game gets an update with the frame gen component decoupled so we can use DLSS with FSR frame gen, unless they just add DLSS 3 of course, but to me FSR 3 is very good, however it is strange Avatar STILL does not have DLSS frame gen 🤔
It's coming to more and more games which is great!
@@George-um2vc It's an AMD sponsored title and they often do not allow Nvidia tech to be put into the games. Probably because FSR Frame Gen is inferior to DLSS and they don't want to show it.
@@Kofferr 😂😂 homie what? In most scenarios AMD's frame gen boosts fps way more. Even on Nvidia hardware.
@@НААТ It has WAY more artifacts than DLSS. Plus you're forced into using FSR upscaling so that makes the image even worse.
Holy the decoupling is best of both worlds for 30 series owner. Better upscaling from DLSS and fsr3 frame gen. Thats pretty damn sick
For sure! I just sold my 3050 Ti laptop and now have 6750 XT but I was always mad that Nvidia was gatekeeping FSR for 40 series cards when it's probs more useful for 30 series owners
I am glad there is finally official support for that. Meanwhile, you can use the DLSSG-to-FSR3 mod for games that support DLSS frame gen, to still use DLSS for upscaling but FSR3 for frame gen on RTX 20 and 30 series.
@@iitzfizz Nvidia is gatekeeping frame generation not FSR since that's AMD's branding.
Been using that DLSSG to FSR3 mod that allows DLSS upscaling while injecting AMD's frame gen and it works/looks pretty good on my 2080. Good to see this is being allowed by official means.
The mod is ok. I use it in Cyberpunk on a 3080 to take my 50 to 70fps higher but it's still got a lot of issues
@@lilpain1997 If you weren't aware, there's an anti-ghosting mod for the Cyberpunk FSR3 mod and it really cleans up the image, especially behind moving cars. 100fps at 1440p path traced on a 3090 is liberating!
@@lilpain1997 it's highly game dependent and I don't think it's working great in Cyberpunk.
@@aaronriggs4430 it works but it's not perfect.
@@H3LLGHA5T any game you move fast in has issues. Witcher 3 also has the same issue. Cyberpunk has a mod that can help, but it doesn't fix everything, and it's a really hacky way of doing it according to the mod dev.
Just a reminder that XeSS was promised to be open source 2 years ago, but that promise hasn't been kept yet.
At least they didn't lock it down behind arc graphics cards like DLSS
[EDIT: AMD have fixed the typo on the blog post now. Original comment below...]
Am I wrong to think that "temporary stability" was just a typo? They refer to the more usual phrase "temporal stability" in a graphic lower down in the announcement. It's too late, press has all quoted it now, but I have to believe it's a typo...
Yeah it's def a screwup, should be temporal stability.
They fixed the typo.
Sounds like AMD did an oopsie there, but journalists shouldn't just copy-paste things either; that just amplifies the damage. They should spot such an obvious mistake and append a correction -- their duty is to spread _the truth_ after all (well, supposedly).
@@danieltoth9742 isn't the truth that AMD did indeed make a typo?
@@ThePurplePassage Well, yesn't. AMD did make a mistake, but if a journalist knows that a mistake was made (which they should, because at the bare minimum they must know what _they themselves_ are talking about, which is enough to notice such an obvious typo), they should put an asterisk, and say "we assume AMD meant to say 'temporal stability' instead". AMD isn't blameless, but a journalist's job isn't to spread the misinformation either. They should've noticed this.
If upscaling at the quality setting gives me +30% frame rates but looks like a dog's dinner, I'm never using it. If upscaling gives me +15% frame rates but looks (figuratively) pristine, I'm using it every time. AMD/Nvidia love to put the performance figures first, but I'd argue that image quality is the highest priority.
@@kadupse FSR nativeAA or with increased VSR is also available
FSR FG + DLSS Upscaling will be the best for RTX 3000 series users.
Modders are already doing it. FSR FG is decent but not perfect.
That ghosting reduction is pretty massive.
4:10 Yes, the image has less shimmering, but it is also much blurrier.
Hell yeah to FSR 3 frame gen decoupling!!
RTX 3000 & 2000 series owners like myself, Rejoice!
Cheers
This times 1000
indeed!!
Yes I’m very happy for you guys. Nvidia locked you out but you get back in the party. It’s awesome technology.
Shame Nvidia is the one who's sticking the fingers up at you, though.
My 1650 after my 3060ti fried:
"At last, we will have revenge."
Once again AMD being considerably more consumer friendly than Nvidia. Even supporting Nvidia owners
It's not puzzling why frame generation and upscaling were coupled; they literally explained it: FG uses the optical flow analysis from the upscaler, so it needed that input.
I guess the change now is that the optical flow analysis can run independently of the upscaler, but I'd expect that to increase the resource consumption of the frame generation, of course.
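To make the decoupling idea concrete, here is a minimal sketch of what such an interface could look like. This is purely illustrative; the types and function names are hypothetical, not the actual FidelityFX SDK API. The key change is that frame interpolation takes its motion/optical-flow inputs explicitly, so any upscaler (or native AA) can sit in front of it.

```cpp
// Hypothetical interface for illustration only; real FidelityFX SDK names differ.
struct FrameGenInputs {
    const void* upscaledColor;   // output of FSR, DLSS, XeSS, TSR, or native AA
    const void* motionVectors;   // game-provided motion vectors
    const void* depth;           // depth buffer, used for disocclusion handling
    float       frameTimeDelta;  // time between the two real frames
};

struct FrameGenContext { /* optical-flow + interpolation resources */ };

// Pre-3.1 (conceptually): frame gen reused the FSR upscaler's internal
// optical-flow data, so you had to run FSR upscaling to get frame gen.
// Decoupled: optical flow is computed as its own pass over whatever final
// image you provide, which costs a bit more but removes the dependency.
void GenerateInterpolatedFrame(FrameGenContext& ctx,
                               const FrameGenInputs& in,
                               void* outFrame)
{
    // 1. Run optical-flow analysis on in.upscaledColor.
    // 2. Combine with in.motionVectors and in.depth to reproject both real frames.
    // 3. Write the blended, interpolated result to outFrame for presentation.
    (void)ctx; (void)in; (void)outFrame;
}
```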
AMD is just unbelievable. Showcasing new visual improvements with grainy GIFs.
Grainy? Oh no, that's pretty accurate, actually. At 1080p, FSR2 upscaling looked like that, and has for way too long. Also, that's zoomed in. Of course it doesn't look the best, but the message is there.
@@akirajkr effing AMD... this is almost an insult to Sony and Microsoft, to fix it 3 years later after they are already implementing their own solutions...
It’s an appreciation for the beautiful technology that is gifs. AMD gets it. : )
😂😂
@@akirajkr no it isn't. Those examples provided are not accurate at all, even when compared to FSR 2 at 1080p/Performance mode/540p internal resolution. Aside from the usual upscaling artifacts, there are lots of GIF artifacts mixed in, which is baffling considering that they are showing off the improvements they've made to their own technology.
At this point, AMD is not even axing their own foot. It's more like they are purposely stepping on the axe themselves every time.
This is the update I've been waiting for. Recently switched from an Nvidia card to an AMD card and the only thing I miss is DLSS. FSR looks so poor in most games. I would just opt to lower settings rather than turning it on if I need higher FPS. I imagine even with this update, it still won't be quite as good as DLSS, but if it's close, that's good enough for me.
Buy a 4K monitor and problem solved.
@@opreax2145 "buy a 250-300 USD monitor because you decided to save 1-10% (between 10-100 USD across all current gen GPUs) by getting an AMD GPU instead of the Nvidia equivalent (excluding 4060 Ti)."
seems legit
LMAO
AMBug users: all coping 💨👾
@@Thejacketof-huang which bugs specifically are you referring to? Cause I haven't seen them.
@@rinsenpai135 the cheapest 16GB Nvidia GPU is the RTX 4070 Ti Super, excepting the 4060 Ti which is just too weak for 4K. AMD has the 6800 XT or 7800 XT for almost half the price. I'd pay 20% more for DLSS personally, but there are simply no sub-$500 Nvidia GPUs with enough VRAM for the future.
Hell yeah! FSR is finally moving its butt forward!
Part of me wants this to be influenced by PSSR or at least building up support for that switch over.
@retrowrath9374 Pretty much yeah. I'm just hopeful they'll be willing to share some of the improvements, at least in their own pc ports.
@DigitalJedi well there's the one that Insomniac made in Ratchet and Clank and it's absolutely horrible.
@@Jake-0011 That uses the existing FSR2.2 upscaler (3.0 is the same + frame gen). The developers likely didn't tune it well for the game, or the visuals of that game don't play nicely with whatever algorithm FSR2.2/3.0 uses.
I like how they announce it for Q2 for developers; wouldn't surprise me if they released a half-finished version in the next couple of weeks.
An additional positive aspect would be that, with a significant improvement in FSR 3.1, these heated, sometimes unobjective comparisons of the software features would hopefully become less frequent and this endless discussion would be reduced.
Tim's outro always sounds mildly threatening. Like "you got away this time, you rascal! I'll catch you in the next one"
I've tested FSR3 FG on LAD Gaiden and Infinite Wealth with my crappy 1650 Max-Q laptop, and it honestly blows me away. With that turned off, it barely runs stable at 30-40 fps, but when I turn it on, boom, 70-80 fps all day all night, and in congested areas as well. It's extremely rare that it dips below 60 fps, and the visual downgrade is negligible. It's magic, lol
I respect that you pronounce "gif" in both ways.
Although it's actually pronounced "gif" and not "gif", enough people say "gif" to make it socially acceptable.
lol
It's really funny that they talk about Temporal Stability in the upscaling in Ratchet and Clank, because in the game itself, Temporal Stability is also a thing, but it's something very different :P
Let me get this straight: In order to demonstrate barely noticeable differences in graphical quality someone thought it was a wonderful idea to use animated GIFs for the showcase, a heavily compressed, absurdly color-reduced, flickering low-frame "video" format for comparison. This has to be the most stupid use of a particular file format in digital history.
It's funny because I completed Ratchet & Clank on my PC, not knowing that the game enabled FSR on its own and I never noticed it. It's just super strange to me
Using ancient GIFs for presenting anything around image quality is just silly. Every browser can do video playback these days, so there is absolutely no excuse.
Nice to hear about these future improvements and not "fixes".
FSR 3.1 sounds like a win for all gamers. Since FSR is available on both PC and Consoles, we'll be able to get better looking games on all platforms.
You guys' videos are always so clean and well put together; I've really come to prefer you guys over LTT and J2C. And I TRUST what you say. That's a big one for me. Keep up the great work, guys!
HUB and GN are the endgame
@@LnDSuv GN is definitely good if you want extreme deep dives: the stuff that bores all but the most techy of us. Hardware Unboxed just has a way of packaging and delivering that I find very appealing and still gives me 99% of the info I needed.
I find it extremely funny how Tom was just on the Moore's Law Is Dead podcast talking about the quality of FSR. And not too long after that, this was announced.
Keep up the great work man
I love how even AMD admits that their FSR 2.0 was just a blurry mess.
Have they neglected it or just been working on it? I don't overlook that these are incredibly difficult things to solve.
And they did it all without AI, including FG, which Nvidia always maintained needed AI to work. I'd imagine the work on FSR is much more difficult given the hardware constraints, and they really are the peoples' hero for this stuff.
Make no mistake, if AMD could charge more for their GPUs then they would lol, they're not heroes. They're "better" than Nvidia currently but they're not our friend either and we have to remember that. Btw my rig is all AMD so I'm not a fanboy either. @@aaronriggs4430
@@aaronriggs4430 The only reason why it's widely available is because that's the only thing that goes in their favour. Intel, which is basically a newborn in the GPU market, already had a superior upscaler. Let's imagine a case where FSR is not the worst upscaler out of the 3: do you think they would be as open?
@@NamTran-xc2ip Intel just like Nvidia, will keep their stuff proprietary as usual.
@@aaronriggs4430 They are such heroes, that they have 10% market share lmao... pls
Welcome to Hard Ron box! Gotta love dem subtitles 😂
This comment section is about to go insane at the pronunciation of GIF... AND RIGHTLY SO BECAUSE IT SHOULD BE PRONOUNCED GIF (no matter what the creator of it says)
It should be pronounced Yiff
As an owner of a cheap RTX 3090 Ti who chose it over an overpriced RTX 4070 Ti, I am really happy that I will be able to enable DLSS + AMD Frame Generation. That’s the only thing I missed!!!
You can use the DLSSFG-to-FSR3 mod before the official implementation is ready.
I would really love to see better FSR as it really is tough to use in motion.
I would still like to see a video on building a rig to run modern games @1080p native ~60fps vs taking the same funds to build a rig for 1440p but then having to use upscaling (or running games at 30fps). Which of those 2 solutions delivers the overall best image quality and gameplay experience?
AMD making FSR3 frame gen to work with other upscalers to benefit non AMD GPUs is of no advantage to AMD. People using older NVIDIA GPUs will use this and when they upgrade, they will move to newer NVIDIA cards and there is "0" benefit to AMD.
Tim switching between pronouncing it as jif and gif with the ease of a 4090 running Pac-Man in 320x240. Truly a man of the people. (Although, to be clear, it's gif.)
I'm Team Green (for its proprietary shackles benefits), but I tip my hat to AMD on the upscaling department. They manage to upscale stuff using raw upscaling only. Impressive. I'm looking forward to seeing the result combined with AI.
4:10 did...did Tim just say JIF with a soft G?
That's how it's pronounced, according to its creator. I've been saying it that way for 30 years, myself.
Peanut butter encoder
There's still plenty of idiots out there that somehow think "Graphics" (ala "Graphics Interchange Format") is pronounced "Jiraphics". 🤷 Sad to see Tim join that group though. Lol you'd really think he'd know how to pronounce "graphics" as someone running a PC gaming hardware channel.
@@Osprey850 The creator didn't/doesn't fucking understand how acronyms work. If he wanted it to be pronounced like that, he shouldn't have made it an acronym... But he did. So it doesn't matter WHAT the creator thinks, as "Graphics" = hard G. No matter how much the creator wishes it was, "Graphics" is NOT pronounced "Jiraphics", end of story. 🤷
@@agent_galactic2856 As a matter of fact, the specification for the format includes the line "Choosy programmers choose GIF" in a nod to the JIF peanut butter slogan of the time, "Choosy moms choose JIF." 😀
Me, an RTX 2060 Super owner, watching this video with joy,
knowing my FSR frame gen will work with DLSS; I won't have to sacrifice image quality.
This is great news!
FSR should implement AI and advanced shader optimizations as a bare minimum, plus dual floating point as well as other advances in their more modern architectures, with the current methods as a fallback.
Mark Papermaster pretty much confirmed the AI part a week ago across all their current platforms; he said something to the effect that they got the hardware out and 2024 is mass deployment for AI upscaling.
So FSR 3.1 is probably a preview of the much better upscalers launching this year.
AMD showing improvements in freakin gifs in 2024, smh.
Tbh still better than Nvidia's prerendered fake footage of FG they "recorded" for the Cyberpunk FG showcase.
at least you can still see the difference
@@notenjoying666 right? I remember it was fishy. I wonder what's the deal with that.
Someone got a SERIOUS "talking to" about "managing expectations" after FSR3 was launched in a broken state in two games that sucked. They're making absolutely sure nobody can say "but it looked much better in AMD's presentation" and they're making sure it debuts in a game that has gotten generally favorable reviews and has a broad appeal.
Life with AMD has always been "I like their products, but nobody can botch AMD's image like AMD". It has slowly gotten better under Dr. Su, but it's still far from perfect. Back in the day there used to be two rules that always applied: "The harder they hype, the harder they belly flop!" and "When AMD is quiet is when AMD is dangerous".
@@notenjoying666 How is it better when FSR3 is literally worse?😂😂😂
As soon as it's open sourced, all the mods will be updated
And then just put a .ASI file in the game folder to update older FSR
That's really cool about frame gen, especially if it works well with the other upscalers. I was down on the need for frame gen at first, but now I see how beneficial it can be when you're going for a certain visual in a game you can run, but not super well, and frame gen can put you over the top and make the gameplay look so much smoother. It's awesome technology.
It's about time that Cyberpunk gets a good FSR implementation. I have no idea why it's taking them so long. In 1.60 they fixed a lot of issues plaguing FSR implementations: they had no ghosting behind cars anymore, cleaner resolve on neon signs, etc. But then those things were gone with the 1.62 DLSS3 patch. I hope they do a proper implementation this time, and preferably with FSR 3.1. I REALLY hope it's not that Nvidia sponsorship that is causing delays and subpar implementations of FSR.
I noticed the use of both 'gif' pronunciations
Could this be why Cyberpunk 2077 hasn't added FSR 3 frame generation yet? they're wanting to implement the newer 3.1 version so we can finally use it with DLSS?
They didn't implement it cause FSR 3 doesn't work 90% of the time, and when it does it's not really that worth it. So why waste money adding a feature that doesn't bring any benefit to the game?
@@vasilije94 Because it's been a while since they announced it's coming in a later update.
@@vasilije94 wow, so edgy
Hopefully that’s the reason as that would be an incredible update for the game.
@@vasilije94 How does FSR3 frame gen not work 90% of the time? The hell you talking about?
Reminder that FSR 2.0 and later 2.1 modded into games, replacing the DLSS motion vector inputs, has looked amazing in most games for close to 2 years now. Unfortunately it's most of the official implementations of FSR2 that have been a letdown.
Modded FSR 2.1 looks great even at 1080p Performance upscaling, with a temporally stable image, no shimmering and no disocclusion.
FSR 2.2 has been a downgrade from 2.1 somehow, even modded. Hopefully 3.1 brings improvements and makes it even better than 2.1.
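For anyone curious how those mods work under the hood, here is a very rough sketch of the idea. All names below are placeholders, not the real NGX or FidelityFX signatures: the game already hands DLSS the color, depth, motion vectors and jitter, so a drop-in shim can intercept that call and forward the same resources to an FSR 2 dispatch.

```cpp
// Conceptual shim, for illustration only. The game believes it is calling DLSS;
// the replacement DLL routes the call here and feeds FSR 2 instead.
struct UpscaleInputs {
    void*    color;           // low-resolution color buffer
    void*    depth;           // depth buffer
    void*    motionVectors;   // the same motion vectors prepared for DLSS
    float    jitterX, jitterY;
    unsigned renderWidth, renderHeight;   // internal resolution
    unsigned outputWidth, outputHeight;   // display resolution
};

// Placeholder standing in for the real FSR 2 dispatch entry point.
static void DispatchFsr2(const UpscaleInputs& in, void* output) { (void)in; (void)output; }

static void Shim_EvaluateUpscaler(const UpscaleInputs& in, void* output)
{
    // FSR 2 consumes essentially the same inputs DLSS does (color, depth,
    // motion vectors, jitter), which is why these drop-in mods can work
    // without any engine-side changes.
    DispatchFsr2(in, output);
}
```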
If AMD can match DLSS with their upscaling, there will be no other reason to buy Nvidia, for me. DLSS is the only thing that makes me still consider Nvidia.
Power efficiency, and ray tracing,
imagine watching this channel for years and only now finding out he says "JIF" instead of GIF :(
yiff
I'm not sure why you're putting the sad face - GIF is supposed to be pronounced with a soft G (as in gel) - the developers of GIFs confirmed this.
(Nb. That's not to say you have to say jif, I prefer to call it gif with a hard G (as in golf) too, but you can't exactly suggest HUB is wrong for saying jif)
They need to integrate frame generation with ray tracing, both Nvidia and AMD. In ray tracing you have the BVH of all objects, even those out of screen or occluded. If you use that data you can generate better frames when there is disocclusion. It would work only for ray tracing, but still something. Another possible improvement for frame generation is using accumulated data from upscaling to handle disocclusion for frame generation, but this likely requires machine learning. It would be something similar to what Nvidia does in ray reconstruction, but applied to frame generation instead. Instead of better denoising, better generation of specific parts of the image.
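A very rough sketch of that disocclusion idea, purely hypothetical (neither vendor's frame gen works this way today): pixels the interpolation pass flags as having no history to reproject from could be shaded by tracing primary rays into the existing RT scene (BVH), since occluded and off-screen geometry still exists there. The helpers below are stand-in stubs, not a real RT API.

```cpp
#include <vector>

struct Color { float r, g, b; };

// Hypothetical stubs standing in for the RT pipeline and the frame-gen mask.
static Color TracePrimaryRay(int x, int y) { (void)x; (void)y; return {0.f, 0.f, 0.f}; }
static bool  IsDisoccluded(int x, int y)   { (void)x; (void)y; return false; }

// Patch an interpolated frame: keep reprojected pixels, and ray trace only the
// (usually small) fraction that was hidden or off-screen in both real frames.
static void FillDisocclusions(std::vector<Color>& frame, int width, int height)
{
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            if (IsDisoccluded(x, y))
                frame[static_cast<size_t>(y) * width + x] = TracePrimaryRay(x, y);
}
```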
If AMD can improve FSR image stability and ghosting issues,
I may end up switching to team Red on my next upgrade.
Thank jeebus! I hadn't realized they hadn't updated it since 2022. Even at 4K, the thing had some serious deficiencies, whereas DLSS was just more fps for a barely, if at all, noticeable difference. I just hope we don't have to remind them once a year or something lol
Except that isn't entirely the case.
FSR 2.2.0: November 8, 2022
FSR 2.2.1: June 9, 2023
FSR 2.2.2: December 14, 2023
I really didn't have much of a problem with FSR2 at 4k quality settings. Not that it was as good as DLSS, but I never looked at it and thought "well this looks like shit"
I have a 4080, but played a few games that only shipped with FSR2 (Jedi Survivor most recently), and lemme tell you that was the least of my issues with that game lol. Looked totally fine, if I pixel peeped I could spot the difference (checked once they added DLSS), but it was like... fine. Didn't detract from my enjoyment at all. The crashing and the stuttering? That detracted from my enjoyment.
I am glad they're working on making it acceptable for people running at lower resolutions, though. Some of the comparisons there are jarring.
@@Skeames1214 FSR Quality at 4K is actually decent. But the game is not. It's a shittily optimized port.
@@NamTran-xc2ip Yeah, shockingly bad. I could have dealt with the stutters, even though it's bullshit. Those were isolated to certain areas, at least on my system, which is fairly high end. But the crashing issues that game had were the worst I've ever seen. Have had a gaming PC since 2014, never experienced anything like that. I literally stopped playing on the last level because it crashed 3 times after prolonged sequences. Probably crashed on me 50 times in total. Only got one in the refund window too lol, which I assumed was a one-off thing.
@@NaokiWatanabe 2.2.2 was just a build version change since they added it to their FFX SDK; they didn't update it internally, just changed the number...
Decoupling frame gen from FSR is HUGE news for RTX 2000/3000 owners. My 3060 Ti may last longer than I thought!
I still don't understand the point of FG, at higher frame rates it just seems pointless, but at lower frame rates where it'd make sense to use it, it doesn't have enough data to work with and just feels like shit. genuinely what is the point?
60-70fps native for FG is the sweet spot, that's the point LoL
@@kerokero3082 I feel like 60-70 FPS is high enough that it's just redundant to use it. sub 40 FPS seems like the place it'd make most sense to generate frames, but again there's not enough data to work with.
I still don't see the point of this tool
I'm always excited for FSR updates so I can use them on my Nvidia Titan XP that can't do DLSS. Thank you AMD, I'm glad you're not Nvidia.
Also thank you for FreeSync. And thank you for Mantle, which we now use through the Vulkan and DirectX 12 APIs.
This looks like a massive improvement over FSR 3.0, especially considering they were showing 1080p Performance mode; if you did that with FSR 3.0, it would look really bad, as the internal resolution would be something like 500p lol. That also bodes well for higher resolutions and quality settings: the differences between FSR and DLSS might be getting small enough not to matter.
But we'll have to wait and see it in the wild.
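For reference, the arithmetic behind that: FSR 2/3 quality modes use fixed per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x), so at a 1080p output Performance mode renders internally at 960x540. A quick sketch of the calculation (the code itself is just illustrative):

```cpp
#include <cstdio>

// Per-axis scale factors for FSR 2/3 quality modes.
struct Mode { const char* name; float scale; };

int main() {
    constexpr Mode modes[] = {
        {"Quality",           1.5f},
        {"Balanced",          1.7f},
        {"Performance",       2.0f},
        {"Ultra Performance", 3.0f},
    };
    const int outW = 1920, outH = 1080;  // 1080p output, as in AMD's comparison
    for (const Mode& m : modes) {
        const int inW = static_cast<int>(outW / m.scale);
        const int inH = static_cast<int>(outH / m.scale);
        // Performance mode prints 960x540, i.e. the "540p" internal resolution.
        std::printf("%-17s renders at %dx%d, upscaled to %dx%d\n",
                    m.name, inW, inH, outW, outH);
    }
    return 0;
}
```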
Jif... I can't even.
or the even more cursed "yiff"
Kills me how you need some nonsense to make an $800+ GPU do what you expect it should.
Now we just need the AMD developers to get it integrated directly into game engines like Godot for even easier one-click integration.
Finally! This looks much more usable than current FSR iterations. Let's hope that this can be updated on old games by the user because developers really don't care about updating it themselves.
This was a worst-case scenario with 1080p Performance mode (540p).
That's a win for amd.
Any improvements are welcome. I was surprised XeSS looked better than FSR when I tried it.
I have to imagine that the reason why they didn't try to improve their upscaler for the past 15 months is because they were focused entirely on frame generation and also making it a driver level feature.
Yeah, showing off visual improvements by using visually limited media doesn't make any sense unless the intention is to hide stuff.
I'm fully subscribing to the idea that the people in charge didn't even use FSR3 themselves until a handful of weeks ago and have now realized they put out rubbish. Wish them the best on the way to fixing it, it's about time...
Completely off topic and you probably won't notice this, but will you be doing a performance settings review for Dragon's Dogma 2?
I'm looking forward to seeing you test this.
Helldivers needs FSR
Is it laggy on pc?
@@GodisGood941 No, he has a potato
@@megamanx1291 No, Helldivers 2 is legitimately underperforming, not nearly as bad as some of last year's games, but still underperforming (especially on AMD cards), and while turning Helldivers' own version of upscaling *on* does give you more acceptable performance, it also looks really bad (even at *ultra quality*), so much so that I would probably not even use it at 4K.
I'd much rather have FSR, as that is already much better than whatever they implemented.
From what I can gather, they don't plan on implementing FSR or DLSS though (or really optimizing the game more, because I've looked at every patch note, yet there is nothing besides making the game crash less).
At the very least, the game doesn't stutter 99% of the time from what I saw in my 95 hours of gameplay.
@@TotallyCreativeName Btw that setting doesn't seem to be an upscaler. It just lowers your render resolution.
The way they phrased that setting made me think it was an upscaler too.
I play on a laptop with an RTX 3060 6GB graphics card, and sometimes, even on lower settings, the VRAM limit tanks performance so hard that it is very unstable. I really want them to implement some type of upscaler, because the visual sacrifices needed to make it playable are pretty big.
@@megamanx1291 I've got a 4090 and 13900K; Helldivers runs like garbage and desperately needs proper upscaling, be it FSR or DLSS.
Native with lower settings > Upscaled
But for real though, why is upscaling becoming such a big thing? You either just turn down the settings and compromise visual fidelity or upscale and also lose fidelity. I'm yet to find a scenario where I'd use upscaling. Genuinely curious what everyone uses it for
Because DLSS at 1440p and 4K quality setting is essentially free performance. It's nearly identical, and in many cases better, because you don't have to use TAA.
Still, this is good news if games do add the option of using FSR native AA instead of in-game bad implementations of TAA, which are aplenty. With these upgrades, FSR 3.1 AA could be a good way to improve visual fidelity at native resolution.
@@thelegendaryklobb2879 ahh, I see. That makes sense
I don't know about that, boss; I use DLDSR with DLSS and it's way better than native.
I mean... at 1440p, DLSS at quality basically looks like native, except I get more FPS.
On DLSS balanced at 1440p, the only time I can see the "worse" visuals is when I'm desperately looking for them. And even then I have to try hard, because I'm not pretending to be a pro gamer looking for visual artifacts I won't ever typically see in general gameplay
I hope more and more gamers keep sticking with Nvidia so AMD can get rid of their awful slow GPUs and focus on what really matters, CPUs and AI
RX7900xtx🥰
FSR is really hurting AMD, especially in the more expensive segment like the 7900 XT/XTX.
At 4K, using DLSS Quality is a no-brainer for me since I see barely any visual difference when using it and the fps boost is huge; with FSR the situation is much different. Though it's usable, of course, it's just not as good, and it's the same story with Frame Generation really.
And being almost as good as Nvidia in a $600+ GPU market is just not good enough in my opinion; they need to at least match the green team, and they are obviously aware of it or they wouldn't be working on AI upscaling.
I have an AMD GPU, and I'm planning to buy the 4080S in the near future. These fanboys/girls want me to spend $1000 USD on inferior technology. To me, the cutoff point for AMD is at the 7900 GRE. Anything above that, I want all the BEST features, not to wait well over a year for a decent update. Not sure how AMD wants us to take them seriously when they don't even take themselves seriously. I can't make this up.
@@Xilent1 I have the regular 4080 myself and it's a great card. It cost me $1000 so it was very expensive, but I don't regret the purchase; I've not encountered any issues since I got it a year ago.
AMD cards are fine as long as you're aware of the compromises and are fine with them. The 7900 XT is actually very good value on sale; I saw it at the price point of a 4070 Super, which is actually very compelling since it has +30% raster performance. But yeah, in the high-end segment there are a lot of holes in AMD cards - upscaling, FG, VR, power usage at idle and under load, RT and so on. It adds up and makes AMD a hard sell to lots of people.
Their selling point, which is more VRAM, doesn't work there since you don't need more than 16GB even at 4K/ultra. So they fight on value via pricing, but it's a lost cause in general; people who spend $1000 want a top product, not a product that's inferior in so many things.
It all comes down to the user in the end. Everyone has different needs and expectations, and I'm sure there are plenty of AMD users that are happy with those cards ;)
@@Xilent1💯
I love this tech giving my old 2070 mobile and 6-core i7 CPU new life. I'm excited to see the new improvements. Frame gen almost feels like when we went from HDD to SSD.
Still waiting on FSR3(+) to drop in pretty much any of the titles advertised from several months ago 😢
When I saw articles about this a couple days ago, I was actually excited. I mean, I know they added frame generation, but I was very confused when they didn't improve the upscaling quality along with it. Better late than never I guess 🤷♂️ The ghosting reduction is very welcome.
Honestly, I think this can be really good. Here at HU you're focusing on comparing the most noticeable aspects of upscaling, which are stability artifacts. But DLSS gets that stability while losing a lot of detail even if the motion is minor. I don't think that NV can actually improve this since AI upscaling is much less customizable. So maybe in the future, if AMD keeps improving their techniques, it will actually be better than DLSS.
Actually, AI upscaling can be delivered by Intel in their CPUs or GPUs, and it may help with the quality of upscaled games or even more, like upscaling movies from HD to 4K.
For me it is always important what the frame rate is since that is one of the biggest factors in quality. Higher framerate brings better quality.
Here's a question I have, what if the developer doesn't want to implement the frame generation part of FSR 3.1, and only wants to implement the upscaling aspect. Is it still considered FSR 3.1, or is it considered like FSR 2.3 something like that?
HUB bullied those frauds into improving FSR im crying 😭😭
Fix FSR and uncripple the 7900 GRE
I don't understand why it's not possible to set an FPS limit when FSR/upscaling is activated in AMD drivers - it's just utterly stupid. I want to be able to limit fps per game to save power and heat dissipation from my rig!!!!
I'll believe it when I see it. The fact that proof is being delivered via GIFs and not actual in-motion videos screams red flag to me. Happy to be proven wrong tho.
Hello, can you give me advice is it better to buy lg 27gp850p-b or dell G2724D for 1440p monitor?? I saw you didn't review dell but it has similar specs... Helpp 😂
Thank you so much for the amazingly accurate information you provide regarding newest upscaling technologies on the market! ☺👏👏🙏
I have a love-hate relationship with these technologies. The idea behind them was to allow gamers that don't have the top of the line 4090 and 7900XTX GPUs to sacrifice a bit of quality for a higher frame rate. But we're already seeing devs use it as a crutch. They set the same goals in terms of frame rates but with the assumption that everyone will be using frame generation and upscaling tech. And then we get unoptimized poo that's impossible to run at native resolutions at acceptable frame rates even if you do have the best of the best GPUs
Huge win for gamers who like looking at building edges
I think it's getting more and more obvious that Microsoft with DirectSR would be a very nice help for game developers to implement every new feature that DLSS, FSR, XeSS and hopefully others will have. Imagine PSSR on PC.
_JIF?_ My heart, Tim! 💔
yiff
Much needed update that should have been announced already and be releasing now. The pace of their updates is rather disappointing, but I'm glad it's on the way.
6:55 Steam Deck is better described as a handheld gaming PC than a console, due to the fact that the user has a similar level of freedom in the choice of software and configuration as you get on a desktop PC or a laptop.
As long as they can match TSR I don't think people will complain as much.
I can't wait to play the whole two games that support it! You know, the ones nobody ever heard of!