It's funny that they say getting 60fps should be the minimum before using frame generation, cuz if I'm already getting a steady 60+fps, I'm gonna be LESS LIKELY to use frame generation.
For me, anything above 80-90 fps and I'll upgrade the resolution instead. 4K Balanced or Quality looks so good with FSR plus the frame gen. Other than that I'll choose native.
> if I'm already getting a steady 60+fps, I'm gonna be LESS LIKELY to use frame generation Why though? 100fps is so much more enjoyable than 60fps. Why would you even hesitate to use FG?
@@intelligentdesignmorty8112 I've tried this on a PC in my living room. I maxed out the graphics and was at 33 FPS, turned on FG, and stayed at 60. On a controller, sat far enough away, it really doesn't feel terrible at all. However, I think improvements to Anti-Lag+ would be ultra beneficial to ALL users who turn on FG, from low framerates to high. It would allow a much larger userbase to get that 60 FPS experience with far fewer drawbacks. After that, it just needs to be added to as many games as possible, and hopefully modders can find a feasible solution similar to the way AFMF and RSR work.
Because AMD understands gamers: the more free stuff they give everyone, the more people will like them and be open to buying an AMD GPU. At the same time, Nvidia is being anti-consumer and competing with their own old cards, so fewer people overall will need a GPU upgrade. In the long run AMD will win, because we're now seeing that Moore's law is dead and each generation is not that big of an improvement. When everyone has 4K 120 fps, nobody will care about new GPUs. Go to any Best Buy and look at OLED 4K vs OLED 8K; nobody is gonna upgrade to 8K, it's like a 5% difference for triple the price.
Great to see AMD improve FSR. I wonder how common frame generation would be nowadays if it had shipped with Force Unleashed 2 back in the day. That game had quite an interesting, working frame-doubling technique.
And people don't understand how the technology works, or why it probably works with every graphics card. It could be because the CPU plays a big role in it, not just the GPU.
I can turn on my 4090 and play my games (it just works). When I had AMD (7900 XTX and 6950 XT) I had driver timeouts and random shutdowns... There's a lot more to these companies than just upscalers or RT. I had AMD for 15 years until 6 months ago, and after seeing the difference I'll never touch another AMD GPU; I will only buy Nvidia's flagships from now on.
@77gaza I've probably spent more money on AMD products than you, considering both the 6950 and 7900 XTX were over $1,000 apiece, and that's not including CPUs. You making that comment shows who's the fanboy: say anything about AMD and people get butt hurt, but you're probably running around with a mid-tier card because you're broke, talking about how "good" something is 🤣😂 Ray tracing and path tracing are a gimmick too, right? 🤣
Once FSR is up to par I'm switching immediately. Best case scenario, of course, would be these GPU suppliers stopping acting like 100% inflation compared to 6 years ago is normal.
As a Nvidia GPU customer (GTX 1080. _yes don't judge_), I'm not only rooting for AMD for their non-restrictive software solutions but I'm also strongly considering AMD Radeon for my next build. Nvidia has sadly disappointed me with their anti-consumer and anti-competitive (market-wise) practices. (Doesn't mean AMD is a saint either. I ain't no fanboy.)
The one game I've used FSR in on my GTX 1650 GDDR6 is Deep Rock Galactic, and maybe Enlisted depending on the map. They look really good, and on medium-ish settings I'm getting a good 90 to 107 fps on a 100 Hz monitor.
Genuinely, I think AMD has killed DLSS 3. Upscaling tech is very specific; it does require machine learning for a really good result, despite what AMD says. AMD's filtering approach does prove it's possible to get an okay result, especially on older GPUs, but nothing outstanding. However, fast frame generation is a tech we've had for a while now, with AI such as DAIN and the vastly superior RIFE, which can run even on a CPU. Sure, with frame generation you can get a better result with machine learning, as Nvidia's DLSS 3 uses, especially with a lower base FPS, but it's not entirely accurate to suggest it's the only way; AMD's filtering-with-context can easily get great results for frame generation as well (it's hard for me to put into words, so I can only hope I've made sense so far). Although... it should be noted, we'll probably still see some smearing. It's not going to be perfect by any means, but its imperfections will only be noticeable when going frame by frame in a side-by-side comparison, not during normal gameplay, and definitely not through YouTube compression.
You did a pretty good job putting this into terms for the majority, although a YouTuber only has to say, "but the DLSS 3 frame generation is, again, looking just a bit better than FSR 3.1: less moiré, shimmering, and jaggies, and better overall fidelity of the game in motion with frame generation turned on when comparing the two technologies side by side." That's all consumers need to hear to steer their spending and endorsement toward Nvidia GPUs nowadays.
Imagine buying something, not understanding how it works, and then complaining that your now technically obsolete product doesn't have the latest bells and whistles that come with the newer product.
You forgot one major thing: the example they're showing in Ratchet with FSR 3.1 is from 1080p Performance mode. FSR is very bad at 1080p regardless of the quality mode, and the lower quality modes are basically unusable. Yet FSR 3.1 has improved so much, judging just from an example of a game running in 1080p Performance mode. We may see even bigger improvements in Quality mode, to the point where the tiny ghosting and shimmering effects still visible in the FSR 3.1 Ratchet example become near nonexistent.
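(For anyone wondering why 1080p Performance mode is the worst case: assuming the standard FSR scale factors of 1.5x per axis for Quality and 2.0x for Performance, a 1920x1080 output is rendered internally at just 960x540 in Performance mode, a quarter of the pixels, versus 1280x720 in Quality mode. At 4K, Performance mode still renders a full 1920x1080 internally, which is why FSR has so much more to work with there.)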
@@alexturnbackthearmy1907 True, and those are still selling for too much on the used market, sellers need to drop prices since even used 5700XTs are going for a hundred less than them, but yeah 3090 isn't going anywhere.
Depends what you use the 3090 for. The RTX 4090 is already struggling in many AAA titles at 4K. Once the new 5000 series GPUs come, a 4090 will be a "high mid-range" card from Nvidia.
I game at high resolutions (5K-7K) at 45-70 fps on my overclocked RTX 3090 Trinity. With frame gen I can switch to AM5, go from the 5900X to the new 9900X3D, free up a ~15% bottleneck, and get the DDR5 upgrade, so I can wait for an RTX 5080 Ti no problem.
You can use a 3090 for a long time. I'm still using my Radeon VII (2019), playing at 1440p with games on high or max settings at around 60 fps atm, but I can definitely see it aging with newer games.
The 40 series wasn't much of an uplift over Nvidia's 30 series, so they made their software trickery part of the deal. It's understandable: why shoot yourself in the foot by giving your previous gen extra longevity? That's no way to milk your customers! Thanks AMD. I'll be sticking it out with my 3080 just a little while longer.
FSR is finally worth using under 1440p base resolution... it was about time, dang it! Seriously, an Nvidia GPU was worth it just because DLSS basically demolishes FSR below 1440p. Glad to finally see major improvements to FSR; it was much needed.
AMD fanboys before: "FG and DLSS fake frames are ridiculous, I'll never use upscaling, blah blah..." Now that AMD has FSR and FG: "omg that's amazing, that's a revolution, tHaNkS AmD omg"
The problem is that Nvidia uses the tech exclusivity to sell cards. FSR 3.1 is fantastic, but it won't sell AMD cards; in fact it will probably sell Nvidia cards, as you'll be able to mix the quality of DLSS upscaling with FSR frame generation.
This may be true, but Nvidia overprices everything to an insane level. AMD is taking a step back and making itself the choice for budget/normal people, meaning by and large they'll have the higher demand and supply, versus Nvidia with high demand but not many people able or willing to buy. If you're content with what you have, and it costs 1,000, 500, 300, heck even 100 dollars less while being the more future-proof option (VRAM etc.), then why go big unless it's needed for work?
@@Koki-qe7vz You missed the point. AMD really needs to stop being the "nice guy" and stop helping nvidia cards with extra features. Why would anyone buy an AMD card to use AMD features when they can buy an Nvidia card and use both Nvidia and AMD features? AMD needs to start being exclusive and make their features only work on AMD cards.
I promise, when AMD fixes their encoder issues and makes their GPUs as good as Nvidia's and Intel's for editing, I will go with RX all day every day, and I say this as an RTX 3070 user.
There isn't really an issue with 7000 series GPUs and encoding, particularly as they have AV1 encoding available. There is a big issue however with Nvidia encoding, NVENC at the moment, for which they recently released a hotfix. It sounds like you don't realize that you have things the wrong way around and that it's largely Nvidia cards that are having problems with encoding, not AMD GPUs.
@@Hash6624 Or just use some optimization mods. Minecraft is hilariously broken in so many ways, that some mods can double or triple your performance, while using higher settings.
@@Hash6624 Not yet. The biggest issue is that Minecraft uses OpenGL to render, which can't support FSR. However, there's a guy making a mod so that Minecraft uses Vulkan instead. Vulkan does support FSR, but the mod is still far from perfect.
the main thing they have to work on now is reducing the latency FG introduces. You can really feel it when using FG from about 60 fps specifically on shooters with mouse movement, feels like I'm dragging my mouse through jello. If they get it down to around 5ms response time that would be perfect
As a 4090 user and future 5090 buyer.. This is an A-level update with a potential of it being an A+ update. Buyers from the Big three need this competition to keep going. And you never know, one day Radeon could have a Ryzen moment if NVIDIA falls asleep at the wheel for any length of time. Keep it up 🎉
Great optimism BUT.... LOL. Nvidia can't possibly fall asleep. They'd only lose market share if they chose to focus more on other AI applications besides gaming and AMD went down that route. But that IS NOT going to happen, because Nvidia is not THAT stupid. They could say they just don't care as much about consumer GPUs, but that would be really silly. If AMD innovates more than Nvidia in the GPU space, it will hurt Nvidia's mindshare as well, which would interfere with their plans to literally take over the world with accidentally nefarious or subversive AI that lives in everybody's home. I'm only half kidding, of course! 🍻
@@ehenningsen What is your complete setup, by the way? I have one gaming PC with a 3080, a 5900X, and a Gigabyte 1440p monitor, and another with a 7900 XT, a 5900X, and a 3440x1440 monitor. Hard-fought 5900X purchases during COVID. I can't even begin to fathom spending so much on a 4090, though my semi-famous, multi-talented, frequently traveling family could possibly afford these things; we just aren't THAT into it to make it worth the difference right now. But we may start streaming...
@@TheModeRed nice! My PC is an Intel 13900kf, 96GB 6000Mts ram, 16TB SSD with an MSI Gaming Trio RTX4090. My monitor is a TV, a 65" QD-OLED Samsung S90C that can game at 144hz
It's quite amusing how, until recently, all the AMD fans were bashing Nvidia for their upscaling and frame generation capabilities. But now, they're the ones who seem most excited about these features. LOL, how the tables have turned!
I sincerely hope that Nukem will implement the FSR 3.1 tech in upcoming versions of his FSR2/FSR3 mod. One thing I don't understand is how you'll officially choose FG in games when selecting DLSS for upscaling instead of FSR: will it autodetect that there's an RTX card in the system, so that selecting FSR 3.1 shows the option to enable FG while using DLSS for the upscaling in the background? I mean for the RTX 3000 series, for example.
Why do reviewers slow down and even stop the game to show us some little imperfection? You do not play games like that. So if you "play" the game and don't just look for sh$$, it will look fine.
I'm using it for Enshrouded to good effect. I'm aware of some minor artifacts around my character, but it really doesn't bother me at all. The smoothness feels good.
Can you test it in Path of Exile (with its custom game engine)? How (bad) is the image quality in a fast-moving game with lots of stuff moving on the screen?
These upscalers aren't just about performance; they're also about making the image quality in these games more stable. One game where I really noticed it actually looking bad with upscaling turned off was MK1.
On any recent driver past 23.12, launching SteamVR gives an error 309, which indicates that part of SteamVR isn't loading correctly. Every recent driver, even the most recent one released on 3/20/24, still gives this error for many users, and we have to use outdated drivers just to run SteamVR. For obvious reasons this is a bad situation, as being stuck on outdated drivers from December heavily affects all games in terms of artifacts, driver crashes, incorrect/missing textures, and so on.
And effing up their drivers? No, say it ain't so. Who would have thought that the company with a track record of effed up drivers would make effed up drivers? They must be some sort of psychic.
@sweetroll_riley Interesting, I'm on version 24.2.1, which is the latest rn. What's your GPU? The issue seems to be affecting 7000 series, which isn't too surprising because it's a new architecture. Still a shame it has all the issues to begin with.
It's crazy how a lot of people still don't know that what AMD does is not frame generation but frame interpolation (the same tech TVs have used for years). Not that it's a bad thing; it works. And that's the real reason Nvidia can't put their FG features on other GPUs: there's dedicated hardware generating those frames. I hope Nvidia makes a software FG for older GPUs, but I don't think that's happening.
Erm, both AMD and Nvidia would be frame interpolation, because it's using differences between frame 1 and frame 2 to make frame 1.5. Whether using dedicated hardware or general hardware changes nothing about it being frame generation.
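To make the "frame 1.5" idea concrete, here is a minimal toy sketch of frame interpolation; the naive 50/50 blend below illustrates the concept only, not what FSR 3 or DLSS 3 actually do (both use motion vectors rather than a plain average), and all the names are made up for the example:

```python
import numpy as np

def midframe(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Synthesize a 'frame 1.5' between two RGB frames (H, W, 3), dtype uint8."""
    # Naive 50/50 blend of the two real frames. Averaging pixels instead of
    # tracking motion is exactly what causes ghosting on fast-moving objects.
    blended = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) * 0.5
    return blended.astype(np.uint8)

# Doubling a sequence: real, generated, real, generated, ..., real.
frames = [np.random.randint(0, 256, (540, 960, 3), dtype=np.uint8) for _ in range(3)]
doubled = []
for a, b in zip(frames, frames[1:]):
    doubled += [a, midframe(a, b)]
doubled.append(frames[-1])  # 3 real frames in, 5 frames out
```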
@@quarterpounderwithcheese3178 I am a freelance video editor who doesn't even make minimum wage; if I were buying an eGPU, I'd much rather just go for a new laptop instead lol
I have often heard that the human brain processes information in chunks based on changes in sensory input, and doesn't constantly process the newest inputs. It checks sensory inputs to get a general idea of what the surrounding is, then uses prediction and memory to draw the rest and save processing time. Maybe this really is the best path forward for GPUs if we want them to generate visual worlds based on lower resolution renders of those worlds. And if AI were to completely replace the GPU and become advanced enough, maybe it would even be possible to go back to using the CPU to generate a very low resolution image and piping it into an AI card that instantly converts it to a high resolution. Would likely require training for each individual game to make it work properly though.
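To illustrate that last idea, here is a tiny sketch of the non-AI baseline such a pipeline would have to beat: render low, stretch high. A learned upscaler replaces the resize step with a model trained to reconstruct detail; the numbers here are invented for the example:

```python
import numpy as np

# Stand-in for a cheap low-resolution render: 480x270, RGB.
low_res = np.random.randint(0, 256, (270, 480, 3), dtype=np.uint8)

# Naive 4x nearest-neighbour upscale to 1920x1080: every source pixel becomes
# a 4x4 block. Cheap but blocky; this lost detail is what an AI upscaler is
# trained to reconstruct instead of just copying pixels.
high_res = low_res.repeat(4, axis=0).repeat(4, axis=1)
assert high_res.shape == (1080, 1920, 3)
```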
PC game graphics quality is getting worse and worse... from SSAA to AI-generated fake crap. Oh, you can improve framerate, if your framerate is already high enough for it to work decently. Oh, you can improve resolution, if your resolution is already high enough to work decently. Oh, you can use ray tracing, if the near-invisible difference is worth dropping all the fps you could not improve with the aforementioned technologies, because you have no $600+ graphics card actually delivering the base fps and resolution needed to get any advantage from all these shenanigans. I can't see anything innovative in all the crap Nvidia and AMD have been trolling their customers with for years now! Including the exploded pricing (an average graphics card still costs at least twice an acceptable price), all these crap technologies are cynical at best!
Realism isn't what people play for either; just look at the sales of GTA IV vs GTA V. GTA IV was too realistic and too hard for most people to jump into. GTA V, however, went back to a more arcade-style gameplay and got a massive player base with GTA Online, even though GTA IV had online too before it.
With Blue Sky you can use frame generation on movies and it's INSANE. It's very interesting that the generated frames are mostly a mess of pixels, but in motion you won't see it.
Hopefully they can make it actually work now. I've seen multiple games lately drop FSR support right before launch because it was causing stability issues, and not just while in use, but even when it was merely integrated into the game and turned off.
My issue with frame generation is that it increases input latency, and unfortunately games are releasing with frame generation as a crutch for a "playable" state at launch. This should not be the case, and I hate it. Devs should have enough time to optimize without it (looking at you, publishers and management, pushing games out the door before they're actually optimized).
So happy you showed the RTX 3060 in the video, because I'm just about to build my first PC with an RTX 3060, and after seeing this video I'm sure the card will last me a good 4 years.
It's funny how AMD benefits from this in many ways: not only is it saying "look, we're not far from what you're used to, consider buying our GPU next time", it also kills sales of budget-range Nvidia 4000 series GPUs, which are worse value than the 3000 series and were worth getting ONLY for frame gen. Nvidia can either look dumb and claim their past-gen cores are too weak for the feature, or play even dumber and "figure out" a way to enable it on older GPUs in the coming months, despite having locked it intentionally with sales of their latest gen in mind.
Is DLSS with AI learning really worth more (now and in the future) than AMD FSR? Just got a Sapphire Pure 7900 GRE, but the alternative was a 4070 (as my old PSU doesn't support the 4070S).
I have a dumb question: if I want to use FSR in a game, do I have to lower my resolution below native? I'm using a 2560x1440 monitor, and this GPU is still somewhat new to me. My previous GPU was an RX 580, which didn't really have any special features; now I'm using a 6700 XT and I don't understand FreeSync, upscaling, or FSR.
I have a GTX 1070 with an i7-6700, which runs pretty smooth with every game I throw at it. In Warzone it's a bit lower, averaging like 50-70 fps, which isn't bad. I got all excited when I turned on frame generation and got a consistent 110 to 150 fps, but the game was choppy and frizzy as hell; hope they can fix this in FSR 3.1. Currently I use Intel XeSS, which has been extremely smooth for me, averaging 60-70 fps with little dips to 57 sometimes. But frame generation is gonna be crazy. I'm just imagining XeSS with frame generation; that's gonna be heavenly smooth for me.
AMD can save us. We need them to save our community. Their image sharpening IS AMAZING, AFMF is a nice feature, and there's more I won't mention. If they figure out the encoder and decoder, and FSR, they'll be RULING THE WORLD.
The stupid thing is that this should all be possible to enable from the drivers, without having to wait for games to support it; as it is, older games will be SOL.
Please help me understand: how is frame generation amazing if you cannot cap it? Turning it on pushes your GPU to 100% and your temps through the roof.
And it gave game devs a reason not to optimize their games. It's cool for handhelds and underpowered builds, but there's no reason to use frame generation with a $1,000 GPU.
DLSS runs off hardware = better picture. AMD runs off software = in time it might catch up or pass it. On my Ally I love FSR and use the beta driver for Fluid Motion Frames; Diablo 4 looks a little fuzzy, but I'm always around 150 fps and hit up to 170 in dungeons. On my desktop with a 3090 I run everything maxed out at 4K and sometimes need DLSS to hit the 120 mark for my LG C1, and it's still amazing lol. I love how far we have come in just a few years.
What happened to "we're implementing FSR3 into DirectX so you can use it on most games"? You mentioned the Steam Deck; Proton implements FSR2 into its compatibility layer, and that's how it upscales all games.
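(If I remember right, the Proton-level upscaling is FSR 1, the spatial version, not FSR 2/3: you enable it per game with a launch option along the lines of WINE_FULLSCREEN_FSR=1 %command%, pick an in-game resolution below native, and Proton stretches it to your display. That's also why it works on most games; unlike FSR 2/3 it doesn't need motion vectors from the engine.)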
There's a big mistake in this video: while AMD's and NV's goals are the same, their approaches are different, not just a copy. AMD's frame gen works from existing DX11/DX12 data, while NV's works from game data patched in for DLSS 3. They are two totally different technologies.
Just finished playing Cyberpunk with the FSR 3.0 mod created by LukeFZ: from 65-75 fps on high to 120-150 without any additional upscaling, just using the frame gen (RX 5700 XT).
Maybe after Intel created better resolution scaling, AMD got embarrassed and finally improved FSR. I just can't stand the shimmering. My son loves his 6950 XT though, so at least AMD impressed him.
Credit to everyone who stayed on AMD's ass about fixing FSR; you are the real heroes here
They would have to improve it anyway
@@diamonshade7484 as much as I prefer AMD, they probably wouldn't of fixed it.
These companies seem to only move when there's an uproar and they're losing money.
Wrong, they were working on it; people just can't wait for shit. An AI version is coming too @@patrickstar3820
@@patrickstar3820 FFS y'all stop saying "of" instead of "have", what is wrong with your grammar?
I only stuck with AMD because they are the cheaper option and their driver software doesn’t look like it’s from the early 2000s
Nvidia: "Screw our previous customers!"
AMD: "Here you go boys, we won't kick you to the curb."
Realest shit
AMD does that to get more customers and make more money. They would screw customers if they were dominant.
Nailed it!
If they want to surpass Nvidia, sacrifices should be made lol
@@itachi7976 Said the scorpion to the frog.
It's a race between AMD's software and NVIDIA's hardware. AMD is doing very well considering it requires no specific hardware.
Lol no. It's a race between Nvidia's software and AMD's hardware. Generally, even though AMD hardware is better value, they fall behind in terms of software. Frame pacing issues are still rampant with FSR frame gen. DLSS upscaling remains superior, and so does DLSS frame gen on the newer GPUs.
And of course there's the high end, where AMD hardware can't come close to Nvidia's.
He meant AMD upscaling does not require specific hardware; it's trained by AMD developers, but DLSS works because of AI cores and CUDA cores @RadialSeeker113
They're both software, technically; FSR 3 is just software-accelerated rather than CUDA-accelerated. And technically, DLSS 3.5 doesn't require specific hardware either, as both the 20 and 30 series have the same cores as the 40 series, just of different generations and in different amounts.
That's why I don't buy Jensen's BS that "Ampere cannot handle frame gen". Nvidia could either create software-accelerated frame gen for older cards or just allow cards with the needed hardware to run it. If FSR ever overtakes DLSS, Nvidia will have no choice but to capitulate and stop locking DLSS features.
@@aouyiu Of course you're not buying Jensen's "BS"; after all, he knows how frame gen works, while you assume it works exactly the same way upscaling does.
Nice way to put it
DLSS AND frame gen. As a 3000 series user, this pleases me immensely.
This is NOT the first time AMD has saved old Nvidia GPUs. GTX users can attest to this.
They could also support older gens like AMD does.
@@HotFreshBox The support for older cards is limited even with FSR 3 upscaling. FSR is designed for AMD cards only; no other GPU manufacturer uses FSR or optimizes their drivers for it. FSR is 100% optimized for AMD hardware. The issue with frame generation is that bad upscaling makes it look worse.
@@GodKitty677 FSR will run on Nvidia and Intel as well, and Intel XeSS runs on AMD and Nvidia too. DLSS is the only one that's hardware locked.
Because DLSS is hardware optimized, it makes sense to use it over FSR on Nvidia, so cards supporting DLSS should prefer it.
@@Rohndogg1 AMD has a tiny market share; FSR is dead on arrival. Soon FSR will use AI as well, once AMD cards support the feature in hardware.
@@GodKitty677 FSR is not dying any time soon 😊
I feel like AMD giving FSR 3 access to all GPUs is their way of flipping off NVIDIA for restricting DLSS 3 to 40 series as well as prompting people with older GPUs not to upgrade to 40 series and I'm here for it.
The fact that Nvidia wouldn't even make it available to RTX 30 series lol
I own a 3060 and this is amazing for me. I can already play any AAA game at 70+ fps in 1080p, but this is gonna be the cherry on top for VR lmao
Yup, and Nvidia fanbots still shit on AMD while using AMD's open-source tech on their old GPUs. "AmD hAs
nO TeCh"
They don't have to. It's not in their company's best interest as they're more focused on AI and making 40, 50, and 60 series cards relevant.
@@Blue-op6qv Except it won't; flatscreen frame gen simply wouldn't work in VR,
and VR in fact already has an established frame-gen tech called asynchronous spacewarp (the name differs between implementations, but the concept remains the same: halve your fps and interpolate up to the target framerate).
FSR 3.0: "He does exactly what I do." FSR 3.1: "But better."
497 likes, hearted, but no replies?? Also, nice joke.
I don't mind the fake frames AMD is doing. They are enabling fake frames for everyone, providing even some extra life for old cards. What I do mind is what Nvidia did, making fake frames the main exclusive feature for a new generation of GPUs.
Not just a new generation of GPUs but also new games, since DLSS doesn't work in all games the way FSR does.
Even in this video he says why though.
"Put shit in, get shit out."
It makes sense that they wouldn't want their new expensive technology to get a bad reputation from gamers who think it is shit, because they are trying to framegen up from 15fps.
Gatekeeping frame gen was the saving grace of the 40 series because, without it, it's not a substantial enough upgrade from the 30 series. It's the only reason the 4070 Ti "beat" the 3090 in non-4K gaming.
If you don't know how the tech Nvidia developed works, you should probably stop talking, or in this case writing. Seriously, every time AMD does something good, you mentally challenged RTX 2000 and 3000 owners come out of the woodwork complaining, even though Nvidia kept innovating both software- and hardware-wise.
@@Lem_On_Lime The entire point of trying to use frame gen at 15 fps is to hit near 60 fps for a better gaming experience. No matter how good the damn game is, playing at 15 fps is unbearable. The people trying to get more FPS from a starting point of 15 fps don't give a shit about how the graphics look. And people with high-end PCs aren't going to use frame gen to sacrifice the look of the game for more fps if they're already running the game at high framerates natively (which is the best quality you can get, short of modding it).
I want AMD to really make FSR be 1 to 1 with DLSS. That way Nvidia can maybe realize that paying hundreds of dollars extra for better software isn't gonna cut it anymore.
And as gamers, I don't think paying like 25% more for a GPU to get better ray tracing performance is gonna cut it anymore once the upscalers are even.
"Better software" is debatable; for example, I like AMD's software more. And FSR can't be as good as DLSS simply because Nvidia uses AI, and to be more effective they use CUDA cores.
Do you think that Nvidia, which just became the third biggest company on the planet, cares about a handful of moaning gamers?
@@shagohodds They aren't even a gaming company anymore.
@shagohodds maybe not, but they should, because the "moaning gamers" got them to that point
I don't understand why people think developing DLSS was easy or free. Somebody had to say "hey, AI upscaling might be the next big thing", then assign a team of people and pay them for a year IN HOPES that they produced something worth selling. I'm sure Nvidia has many projects like this that didn't pan out, and the premium on products like DLSS has to cover the costs of creating an entirely new product, including the failures. AMD, on the other hand, gets to play the white knight, criticizing Nvidia and offering similar tech at a lower cost. Why yes, it's much cheaper to copy something that works: you only have to pay one team to build the one thing, instead of maybe ten teams to figure out what's actually viable as a product.
Another improvement you did not mention: FSR 3.1 will be available for Vulkan and DX12. So that eases the usage on Linux based operating systems a lot if you don't want to rely on Proton or Wine.
FINALLY! SOMEONE THINKS OF THE PENGUIN!
@@Takintomori I'm someone who was working on making FSR2 easy to compile for Linux. AMD provided Vulkan support, but didn't check whether GCC or Clang on Linux have any issues with their build process.
So I'm already waiting on the Vulkan implementation to arrive so I can test and adjust the code to make it build on Linux.
I think Godot developers are also eagerly waiting for FSR 3.1. The engine already supports FSR 2.2 across all platforms but the plan is to support FSR frame generation as well.
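(For anyone wanting to follow along: the FidelityFX source is CMake-based, so once the Vulkan backend lands, a Linux build should in principle be the usual cmake -S . -B build && cmake --build build dance. The GCC/Clang quirks mentioned above are exactly the sort of thing that only surfaces when you actually try that on Linux instead of MSVC.)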
@@thejackimonster9689 God bless you and your work
Finally!
How Cyberpunk still doesn't have FSR 3.0 is wild. It was announced to get it when FSR 3 was announced.
They announced that they will stop updating the game; the recent patches were just bug fixes without adding anything. Maybe they'll finally add it soon, but it also took them a few months to implement FSR 2 in an update, even though modders did it shortly after the technology launched.
Cyberpunk is basically NVIDIA tech demo. Of course they are not supporting AMD. Besides, I don't understand why people praise it so much in terms of graphics. IMO the game is fugly and looks "flat". Even with path tracing enabled.
@@TheMahtimursu They don't have an exclusivity deal; it still supports XeSS, FSR 2.1, and AFMF, so I don't see how you came to the conclusion it's not supported. And it's one of the most detailed games out there, from the reflections to the shadows, lighting, and particles. There's a reason it's used as a heavy GPU benchmark. It's one of the best looking games out there.
@@TheMahtimursu Stop huffing that copium bro
@@kineticbongos No, it is definitely not one of the best looking games. I thought I was crazy when I started playing it and it just looked bad. I googled a bit, and it seems many others agree. It just looks flat; that is the best way of describing it.
Future of gaming is lower resolutions and fake frames. Nice.
Who cares if it looks nice and smooth
@@tuna1867 Just that it doesnt but yea ok 😅
@@Yoshi92 In time it will. Right now it doesn't
@@steadyclouds4614 I use it sometimes, but dude, he has a point. I don't want low res and fake frames; I want optimized games I can play at 2K native. FSR3 can look nice, but by definition it can never look as good as a native 2K image. Some people think FSR3 looks better, but they're normally the type of people who like bloom, blur, and all the fake GFX effects that are dog water, and it's those fake GFX effects that get used with FSR3. I paid $500 for a 2K Asus monitor with HDR and perfect color correction. If I didn't mind how my image looked, I would have gone with a $70 1080p "China number one" monitor...
Higher input lag, lower resolution, and blurry TAA, all with zero large graphical leaps in a decade (ray tracing is not a gamechanger lol) and Moore's law no longer holding. Hell yeah, the hecking future, dude. Time to spend 800 euro on a new card 😎
AMD is just straight up going "well...this might not convince everybody to buy our products even if we lock it to our hardware...so might as well give Nvidia the finger.".
"Oh, you want people to buy 40 series to increase your market cap? That's cute. Hey, everybody, here's the shit you need to make your gpu last another 5 years. Have fun"
And I'm here for it.
I wanted to thank you, because I ended up buying an AMD GPU thanks to you and Owen's videos. I ended up liking my AMD card so much that I built an entirely new PC with an AMD CPU to go along with it. Previously I had Intel/Nvidia. I don't intend on going back to Nvidia or Intel, assuming AMD continues to do right by their consumers. They have done right by me.
This is why I prefer AMD over NVIDIA. NVIDIA always acts entitled, restricting stuff that should be freely usable by everyone; that has been proven.
@EveryGameGuru Haha that's a great way of putting it over 😂😂
Thank god there is Nvidia.
It's they who R&D all this stuff that AMD is trying to copy.
@EveryGameGuru Nvidia fans are so dumb smh...
it wouldn't be so bad if nvidia decided to not be a scummy company, but here we are. they expect their bells and whistles to sell the product, not their performance. they still sell absolute garbage value cards, and get away with it because they are "the standard" when it comes to graphics cards.
Don't forget that NVIDIA is not a gaming company anymore; they call themselves an AI factory now.
You can look at their latest GDC for more details.
They are going the good old Apple route: less hardware and more software focus.
Sadly they left us gamers aside "cries in 1080 ti"
Got a 7800 XT this year and I'm pretty excited about this news. I'll have my eyes on this as it develops; can't wait to see, for example, how Cyberpunk will look with FSR 3.1!
You can always use XeSS till FSR 3/3.1 releases :) it should work just fine at 1080p.
I think even ray tracing would work well with FSR3, making Nvidia's selling gimmick obsolete.
@@DL-iy7obprobably not still tbh, but it's a win for AMD users for sure
@@DL-iy7ob Only people who can't play with ray tracing or path tracing say this...
I ran around screaming the same shit; then I went out and got a 4090, and it's a night and day difference at 4K.
I was all excited to use frame gen in Forbidden West today, and it was only giving me an extra 15% fps. It's odd how some games give you a massive uplift while in others it falls on its face.
Probably goes to whether you were limited by gpu or cpu.
I don't see the point in endless fake frames plus lag. Surely a small boost with no lag makes more sense: a boost if you like, rather than a doubling or tripling, so 45 to 60, 60 to 75, etc. makes far more sense to me.
@@audie-cashstack-uk4881 Well, I'm using it to go from 85-90 to 120-140 depending on the area so latency isn't exactly an issue.
@@audie-cashstack-uk4881 Nvidia reflex makes the lag problem a non issue
Doesn't Forbidden West use checkerboard rendering?
Simply put:
Nvidia employees work against their customers.
AMD employees work to support the whole gaming community.
Buying proprietary solutions is voting against yourself.
The worst move from Nvidia to date, for me, is the vendor lock-in of their adaptive sync solution; making a monitor's features work only with a specific GPU brand is really dirty.
This channel is so underrated fr. Your videos are so detailed, informative, to the point and understandable
Frame gen is awful for me. Whenever I enable it, whether it's FSR 3 or AMD's AFMF, it not only tanks my framerate but gives me micro stutters like hell. I go from 100 FPS to like 40-50 in MW3 with frame gen enabled, plus 50% micro stutters, and this is on an RX 7900 XTX. Even though it *says* it's 120 fps, the game looks awful and the latency is unplayable. If you want a higher framerate, just lower the graphics settings or use FSR/DLSS upscaling. Frame gen is pointless.
Yeah that was my experience in last of us pt 1. There’s games where it works great, so it’s possible but needs a little work with the implementation. We’ll see how it goes
This was one of the most sane, most clear, and most honest presentations of FSR and DLSS that I have encountered, and I thank you for it.
I just went through, and honestly am still going through, an internal battle about upgrading my gaming rig. I consider myself fortunate to have a 6800 XT GPU with 16 GB of VRAM. It has served me well. But game studios are really pushing the limits of graphics in new games, and it can be very easy to feel like you are being left behind. I do understand that a lot of this is market strategy to get consumers to constantly need to purchase new and "better" parts, but, DAMN, is it ever convincing!
I am an older gamer (74!) and my main focus is not so much on a lot of the new games as it is on being able to play the games I already love to the best of my ability. One of my BIG favorites is to heavily mod Skyrim with about 2000 or more mods and see how smoothly I can get it to work well! It's a real challenge, and a lot of fun, and is always teaching me a lot about optimizing my graphics!
I also love a lot of older RPG games like past Baldur's Gate, Neverwinter Nights, etc., and these games are usually not that heavy on GPUs.
When I do play more recent games my PC can sometimes struggle, but I have always put a good story before graphics, and usually find a way to make them playable with decent graphics if I really want to.
What AMD is exploring right now is very exciting to me, especially if people can find ways to take the new ideas and adapt them to lower-spec PCs. This has actually happened in the Skyrim modding world. Modders have found ways to make use of DLSS, upscaling, and some FSR techniques in heavily modded and very graphics-intense versions of the game.
I really think that if this new FSR from AMD actually works well, Skyrim modders will find a way to use it in such an old game, and that will help make it usable in other past games too!
So, I will try to wisely refrain from upgrading right now, and see what happens in the next couple of years.
Apologies for such a lengthy post. Old people do tend to go on . . . . and on . . . and . . . . !!!
I know the feeling haha. I'm on a 5700 XT, and since I mostly play CS and some Baldur's Gate, Civilization, and other games that aren't all that demanding, I've gotten along pretty well so far. Helldivers 2 got me thinking though 🙃 definitely want to upgrade now, but also kind of want to wait a little longer to see what's next!
Respect to a veteran gamer, wishing you good health 😊
I literally just put together a rig with the 6800 XT. I mainly play at 1080p/1440p and it works flawlessly. Not sure why yours is struggling; possibly a CPU bottleneck? Driver issue? Regardless of my curiosity, I just stopped by to say it is definitely a terrible time to upgrade.
1. GPU prices are outrageous (hence why I bought an older GPU, the 6800 XT).
2. In every benchmark I can find, the 6800 XT actually beats the 7600/7700 and the 4060 in rasterized (non-RT) performance, and holds its own against the 7800 (and XT)/4070 (and Ti), losing in some games, but by what I consider negligible amounts (10-20 FPS) for the cost. That leaves a reasonable upgrade path to a 7900 or a 4080, both of which are, IMO, overpriced for the performance gain.
I would look into possible performance issues, as I honestly don't think your 6800 XT is performing at optimal levels given the information at hand. Best of luck with whichever course you decide on!
@@ryanwilliams5705 I think you may be reading too much into my comment, or possibly just assuming too much.
The ONLY time I mentioned my GPU struggling is this comment:
"When i do play more recent games my pc can sometimes struggle"
If you read the whole sentence, I do mention being able to find a way to make them playable also, and with decent graphics and fps. This usually comes down to a few very easy tweaks, either within game settings or GPU settings.
You make it sound like I am experiencing serious problems that are making my games unplayable, when I NEVER SAID THAT!!!
What happens when we assume?????
What you said about 6800xt benchmarks compared to newer GPUs and the prices of new GPUs for the slight increase is spot on, and is exactly why I am NOT upgrading at this time.
What you didn't point out is that these new GPUs also need more power and run much hotter. So the hidden cost of moving up to these new cards can also mean upgrading your power supply and cooling system, and possibly a new CPU and motherboard as well. Add to that higher electric bills, and the few extra fps suddenly don't look that good any more!
If my response to your assumptions sounds too harsh, I apologize. But people are going to read your comment and get an impression that simply is NOT true! I DO keep up with new drivers and I DO optimize my settings for my PC in general, my GPU, and my games. What "struggles" I may experience are very brief and are usually resolved with a few easy tweaks.
The 6800 XT is an amazing GPU, and there are only 2 or 3 new GPUs that match or surpass its 16 GB of VRAM or its performance!
@@denniswade4998 "If you read the whole sentence, I do mention being able to find a way to make them playable also, and with decent graphics and fps. This usually comes down to a few very easy tweaks either within game settings or GPU settings.
You make it sound like I am experiencing serious problems that are making my games uplayable, when I NEVER SAID THAT!!!
What happens when we assume?????"
I read the sentence, or rather the paragraph, in its entirety. While I do agree my curiosity sparked some assumptions, context is also important. Per your comment:
“I just went through, and honestly am still going through, an internal battle about upgrading my gaming rig. I consider myself fortunate to have a 6800 XT GPU with 16 GB of VRAM. It has served me well. But game studios are really pushing the limits of graphics in new games, and it can be very easy to feel like you are being left behind.”
You stated you were, and are currently, considering upgrading your rig, and you specifically mention your GPU and the demands on GPUs in newer games today. Which leads to my next points.
“I do understand that a lot of this is market strategy to get consumers to constantly need to purchase new and "better" parts, but, DAMN, is it ever convincing!”
“When I do play more recent games my PC can sometimes struggle, but I have always put a good story before graphics, and usually find a way to make them playable with decent graphics if I really want to.”
“Modders have found ways to make use of DLSS, upscaling, and some FSR techniques in heavily modded and very graphics-intense versions of the game.”
After your statements about the conflict you were having about upgrading your rig, you state that purchasing new and "better" parts is convincing. You go on to say that yes, your PC sometimes struggles, but again, context is important. You then state you put a good story before graphics, implying your card is underperforming. You also state you can find a way to make games playable with decent graphics if you really want to, again implying your GPU is underperforming, and implying you have to heavily change your settings, given the "make them playable with decent graphics" and "if I really want to" bits. Yes, I may have made some assumptions, but the context you gave led me to those assumptions.
"What you said about 6800xt benchmarks compared to newer GPUs and the prices of new GPUs for the slight increase is spot on, and is exactly why I am NOT upgrading at this time."
That is great to hear. I honestly just can't stand the business practices at hand here with all GPU manufacturers, and I hate to see people give their hard-earned money to greedy corporations at such an outrageous markup. Demand seems to determine how they price at launch, as demonstrated by the GPU shortage during the mining boom. I believe the only way to rectify the current price gouging is to stop supporting it altogether (not upgrading) until pricing becomes more reasonable.
"What you didn't point out is that these new GPUs also need more power and run much hotter. So the hidden cost of moving up to these new cards can also mean upgrading your power supply and cooling system, and possibly a new cpu and motherboard as well Add to that higher electric bills and the few fps rates suddenly don't look that good any more!"
You are correct! Not only that but, in your case specifically, unless you went to, say, a 6900 or 6950 XT, you would indeed need not only a new mobo but also that fancy new DDR5 RAM. Another hidden cost, and another reason I went with the previous generation of cards.
"If my response to your assumptions sounds too harsh, I apologize. But people are going to read your comment and get an impression that simply is NOT true! I DO keep up with new drivers and I DO optimize my settings for my pc in general, my GPU, and my games. What 'struggles" i may experience are very brief and are usually resolved with a few easy tweaks."
Not harsh, but a bit condescending. It is ironic though, that condescending bit about making assumptions, now that the shoe is on the other foot. I never said your drivers were not up to date; I said driver issues, which could mean a corrupted driver from download, or something less than ideal AMD put into the software causing performance drops in certain games.
So what happens when we assume????? (sorry couldn’t help myself).
"The 6800xt is an amazing GPU, and there are only 2 or 3 new GPUs that match or surpass its 16gb of vram or its performance!"
I agree wholeheartedly! The price-to-performance was in a league of its own, and that's the main reason I went with it. I still think it was too expensive though 😂😅 ($500).
It's funny that they say getting 60 fps should be the minimum before using frame generation, cuz if I'm already getting a steady 60+ fps, I'm gonna be LESS LIKELY to use frame generation.
Use it with something at low fps and you will see why...
the latency can't be undone
For me, anything above 80-90 fps and I'll raise the resolution instead. 4K Balanced or Quality looks so good with FSR plus the frame gen. Other than that, I'll choose native.
60 fps feels like crap on high-refresh-rate monitors with a mouse and keyboard, so it really makes games feel smoother, at the cost of latency
> if I'm already getting a steady 60+fps, I'm gonna be LESS LIKELY to use frame generation
Why though? 100fps is so much more enjoyable than 60fps. Why would you even hesitate to use FG?
@@intelligentdesignmorty8112 I've tried this on a PC in my living room. I maxed out the graphics and was at 33 FPS, turned on FG, and stayed at 60. On a controller, sat far enough away, it really doesn't feel terrible at all. However, I think improvements to Anti-Lag+ would be ultra beneficial to ALL users who turn on FG, from low framerates to high. It would allow a much larger userbase to get that 60 FPS experience with far fewer drawbacks. After that, it just needs to be added to as many games as possible, and hopefully modders can find a feasible solution similar to the way AFMF and RSR work.
I don't understand why amd keeps helping the nvidia users. They are clearly not grateful.
Nvidia and its fans are the weights tied onto the legs of gaming right now. Bunch of pricks who think they are shareholders 😂
The input delay is so bad you can keep it
I'm grateful af and this is a big reason why my next gpu will probably be from amd. It's excellent marketing.
Because AMD understands gamers. The more free stuff they give everyone, the more people will like them and be open to buying an AMD GPU. Meanwhile Nvidia is being anti-consumer and competing with their own old cards, so fewer people overall will need a GPU upgrade. In the long run AMD will win, because we're now seeing that Moore's law is dead and each generation isn't that big of an improvement. When everyone has 4K 120 fps, nobody will care about new GPUs. Go to any Best Buy and compare 4K OLED vs 8K OLED: nobody is going to upgrade to 8K, it's like a 5% difference for triple the price.
@@Iisakki3000 Check the comment above you. You can understand where I'm coming from.
Great to see AMD improve FSR.
I wonder how common frame generation would be nowadays if it had shipped with The Force Unleashed 2 back in the day.
They had quite an interesting, working frame-doubling technique in it.
And people don't understand how the technology works, or why it probably works with every graphics card. Could be because the CPU plays a big role and not the GPU.
Everyone with nvidia gpus "FSR looks so bad why would you use it? im just gonna upgrade."
Everyone else "How about now?!"
So 80%+ of PC gamers on the Nvidia side vs the rest... minus the Apple and Intel gamers. Yeah...
They showed you 4 images of FSR 3.1 and now your dickriding is crazy
I can turn on my 4090 and play my games (it just works). When I had AMD (7900 XTX and 6950 XT) I had driver timeouts and random shutdowns...
There's a lot more to these companies than just upscalers or RT...
I had AMD for 15 years until 6 months ago, and I'll never touch another AMD GPU. I will only buy Nvidia's flagships from now on, after having seen the difference.
@@alkine3011 You're an Nvidia fanboy and a liar 😅
@77gaza I've probably spent more money on AMD products than you, considering both the 6950 and 7900 XTX were over $1,000 apiece, and that's not including CPUs... You making that comment shows who the fanboy is. When something is said about AMD, people get butthurt, but you're probably running around with a mid-tier card because you're broke, talking about how "good" something is 🤣😂
Ray tracing and Path Tracing are a gimmick also right? 🤣
Apparently AMD cares more about my 3060 12GB than Nvidia does.
Once FSR is up to par I'm switching immediately. Best case scenario, of course, would be if these GPU suppliers stopped acting like 100% inflation compared to 6 years ago is normal.
As an Nvidia GPU customer (GTX 1080, _yes, don't judge_), I'm not only rooting for AMD for their non-restrictive software solutions, but I'm also strongly considering AMD Radeon for my next build.
Nvidia has sadly disappointed me with their anti-consumer and anti-competitive (market-wise) practices.
(Doesn't mean AMD is a saint either. I ain't no fanboy.)
The one game I've used FSR in on my GTX 1650 GDDR6 is Deep Rock Galactic, and maybe Enlisted depending on the map. It looks really good, and on medium(-ish) settings I'm getting a good 90 to 107 fps on a 100 Hz monitor.
I used FSR 2 on my 1050 Ti in RDR2, but it looked too bad and barely had an impact on my FPS, like 3-5 FPS at most.
@@domseyboi you're on a 1050 Ti. Frame gen is great, but it can't revive a corpse.
but he wasn't talking about frame gen... @@RadialSeeker113
@@RadialSeeker113 Yeah I know, but the point is that it makes the graphics muddy.
@@domseyboi it really doesn't with dlss frame gen
Genuinely, I think AMD has killed DLSS 3. Upscaling tech is very specific; it does require machine learning for a really good result, despite what AMD says. AMD's filtering approach does prove it's possible to get an okay result, especially on older GPUs, but nothing outstanding. However, fast frame interpolation is tech we've had for a while now, with AI models such as DAIN and the vastly superior RIFE, which can run even on a CPU. Sure, with frame generation you can get a better result with machine learning, as Nvidia's DLSS 3 uses, especially with a lower base FPS. But it's not entirely accurate to suggest it's the only way, and AMD's filtering-with-context can easily get great results for frame generation as well (it's hard for me to put into words, so I can only hope I've made sense so far).
Although... it should be noted, we'll probably still see some smearing. It's not going to be perfect by any means, but its imperfections will only be noticeable frame by frame in a side-by-side comparison, not during normal gameplay, and definitely not through YouTube compression.
You did a pretty good job putting this into layman's terms, although a YouTuber only has to say, "but DLSS 3 frame generation is, again, looking just a bit better than FSR 3.1, with less moiré, shimmering, and jaggies, and better overall fidelity in motion with frame generation turned on when comparing the two technologies side by side." That's all consumers need to hear to steer their spending and endorsement toward Nvidia GPUs nowadays.
imagine buying an nvidia gpu just to get more support from the rival company
Imagine buying something, not understanding how it works, and then complaining that your now technically obsolete product doesn't have the latest bells and whistles that come with the newer product.
@@AlucardNoir I have honestly never used a product that wasn't "obsolete". I have only ever bought 2nd hand graphics cards, currently using an rx 570
😂
@@jacob-y8s🤣 🤣🤣
@@AlucardNoir my friend you just called the rtx 30 series obsolete. Are you ok in the head?
You forgot one major thing: the example they're showing in Ratchet with FSR 3.1 is from 1080p Performance mode. FSR is very bad at 1080p regardless of quality mode, and the lower quality modes are basically unusable there. Yet FSR 3.1 shows a huge improvement even in an example running at 1080p Performance mode. We may see an even bigger improvement in Quality mode, to the point where the tiny ghosting and shimmering still visible in the 1080p Performance example become near non-existent.
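For context, the per-axis scale factors AMD publishes for FSR 2/3 quality modes show just how little resolution 1080p Performance mode actually works with. A minimal sketch (mode names and factors per AMD's public FSR documentation):

```python
# Per-axis render-scale factors for FSR quality modes (per AMD's docs).
FSR_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the game actually renders at before FSR upscales it."""
    scale = FSR_MODES[mode]
    return round(out_w / scale), round(out_h / scale)

for mode in FSR_MODES:
    w, h = internal_resolution(1920, 1080, mode)
    print(f"1080p {mode}: renders at {w}x{h}")
# 1080p Performance renders at only 960x540, which is why artifacts are far
# more visible there than at 4K Performance (internally 1920x1080).
```

So an upscaler that already looks decent from a 960x540 input has far more headroom at the resolutions most people actually use.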
The fact that AMD is even willing to do this is awesome, and they're still trying to fine-tune it.
*_Any GPU?_* 🙄
*_GCN GPU's:_* ☠️
My next card is gonna be AMD, but only once my 3090 is irrelevant
That would be a looong wait, as even a 2060 is still a good pick.
@@alexturnbackthearmy1907 True, and those still sell for too much on the used market. Sellers need to drop prices, since even used 5700 XTs are going for a hundred less. But yeah, the 3090 isn't going anywhere.
Depends what you use the 3090 for. The RTX 4090 is already struggling in many AAA titles at 4K. Once the new 5000-series GPUs come, the 4090 will be a "high mid-range" card from Nvidia.
I game at high resolutions, 5K-7K at 45-70 fps, on my overclocked RTX 3090 Trinity. With frame gen I can switch to AM5 and the new 9900X3D from my 5900X, free up a ~15% bottleneck, and get a DDR5 upgrade, so I can wait for an RTX 5080 Ti no problem.
you can use the 3090 for a long time. I'm still using my Radeon VII (2019), playing at 1440p with games on high or max settings at around 60 fps atm, but I can definitely see it aging with newer games.
The 40 series wasn't much of an uplift over the 30 series, so Nvidia made their software trickery part of the deal. It's understandable. Why shoot yourself in the foot by giving your previous gen extra longevity? That's no way to milk your customers!
Thanks AMD. I'll be sticking it out with my 3080 just a little while longer.
FSR is finally worth using below 1440p base resolution... it was about time, dang it!!!!!!
seriously, an Nvidia GPU was worth it just because DLSS basically dominates FSR below 1440p. Glad to finally see major improvements to FSR; it was much needed.
Nvidia is better
I have a 7900 XTX and a 7800X3D, and Nvidia is BETTER IN EVERY WAY
AMD fanboys before: "FG and DLSS fake frames are ridiculous, I'll never use upscaling blah blah blah..."
Now, with AMD's FSR and FG: "omg that's amazing, that's a revolution, tHaNkS AmD omg"
The problem is that Nvidia uses the tech exclusivity to sell cards. FSR 3.1 is fantastic, but it won't sell AMD cards; in fact it will probably sell Nvidia cards, as you'll be able to mix the quality of DLSS upscaling with FSR frame generation.
This may be true, but Nvidia overprices everything to an insane level. AMD is taking a step back and making themselves the choice for budget/normal people, meaning by and large they'll have the higher demand and supply, vs Nvidia with high demand but not many people able or willing to buy. If you're content with what you have, and it costs 1,000, 500, 300, heck even 100 dollars less and you get the more future-proof card (VRAM etc.), then why go big unless it's needed for work?
@@Koki-qe7vz You missed the point. AMD really needs to stop being the "nice guy" and stop helping nvidia cards with extra features. Why would anyone buy an AMD card to use AMD features when they can buy an Nvidia card and use both Nvidia and AMD features? AMD needs to start being exclusive and make their features only work on AMD cards.
@@brutlern their features are not good enough for that, you and I both know that.
I promise, when AMD fixes their encoder issues and makes their GPUs as good as Nvidia's and Intel's for editing, I will go with RX all day every day, and I say this as an RTX 3070 user.
There isn't really an issue with 7000-series GPUs and encoding, particularly as they have AV1 encoding available. There is a big issue right now with Nvidia's NVENC encoding, however, for which they recently released a hotfix. It sounds like you have things the wrong way around: it's largely Nvidia cards that are having encoding problems at the moment, not AMD GPUs.
Now they need to invent fake friends
I'm sure there's some AI friends out there
Well, Nvidia is gonna release an RTX chatbot, so...
"AMD nailed it from the jump, FSR 3 always looked very good"
Well now I smell something and it ain't good
They did the same with G-Sync too, until AMD gave us FreeSync
TLOU running at 45fps on a 3060 at 1440p is laughable. This is a PS3 game.
Man, I need to get some FSR in Minecraft
I'm pretty sure you can make it work on bedrock and java with some mods
@@Hash6624 Or just use some optimization mods. Minecraft is hilariously broken in so many ways, that some mods can double or triple your performance, while using higher settings.
@@Hash6624 not yet; the biggest issue is that Minecraft uses OpenGL to render, which can't support FSR. However, there's a guy making a mod so that Minecraft uses Vulkan instead. Vulkan does support FSR, but the mod is still far from perfect.
Use lossless scaling with LSFG enabled
@@MicahDaRhuler there are already 3 "fps boost" mods.
the main thing they have to work on now is reducing the latency FG introduces. You can really feel it when using FG from around 60 fps, especially in shooters with mouse movement; it feels like I'm dragging my mouse through jello. If they got it down to around 5 ms of added response time, that would be perfect.
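A rough back-of-the-envelope for that latency, as a sketch: assuming the interpolator has to hold back one real frame before it can generate the in-between one, the added lag is about one real frame time plus the generation cost (the 3 ms figure below is invented):

```python
# Toy estimate of the extra input lag interpolation-style frame gen adds,
# assuming it must buffer one real frame before it can interpolate,
# plus a per-frame generation cost (3 ms here is a made-up figure).
def added_latency_ms(base_fps: float, gen_cost_ms: float = 3.0) -> float:
    real_frame_time = 1000.0 / base_fps   # time between two real frames
    return real_frame_time + gen_cost_ms

for fps in (30, 60, 90, 120):
    print(f"base {fps:>3} fps -> ~{added_latency_ms(fps):.1f} ms extra lag")
# base  30 fps -> ~36.3 ms extra lag
# base  60 fps -> ~19.7 ms extra lag
# base  90 fps -> ~14.1 ms extra lag
# base 120 fps -> ~11.3 ms extra lag
```

This is also why a ~60 fps base is usually recommended: the buffered-frame penalty shrinks as the real framerate rises, and getting anywhere near a 5 ms target would require either a very high base framerate or a fundamentally different (extrapolation-style) approach.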
My next GPU is going to be AMD just to support AMD.
As a 4090 user and future 5090 buyer..
This is an A-level update, with the potential to be an A+ update.
We buyers need this competition among the big three to keep going. And you never know; one day Radeon could have a Ryzen moment, if NVIDIA falls asleep at the wheel for any length of time.
Keep it up 🎉
Great optimism, BUT... LOL. Nvidia can't possibly fall asleep; they are super woke lol. They'd only lose market share if they chose to focus more on other AI applications besides gaming and AMD went down that route. But that is NOT going to happen, because Nvidia is not THAT stupid. They could say they just don't care as much about consumer GPUs, but that would be really silly. If AMD innovates more than Nvidia in the GPU space, it will hurt Nvidia's mindshare as well, which would interfere with their plans to literally take over the world with accidentally nefarious or subversive AI that lives in everybody's home. I'm only half kidding, of course! 🍻
@@TheModeRed well cool! NVIDIA should focus on doing the right thing. Good for them 👏
@@ehenningsen what is your complete setup, by the way? I have one gaming PC with a 3080, a 5900X, and a Gigabyte 1440p monitor, and another with a 7900 XT, a 5900X, and a 3440x1440 monitor. Hard-fought 5900X purchases during COVID, and I can't even begin to fathom spending so much on a 4090. My semi-famous, multi-talented, frequently traveling family could possibly afford these things; we just aren't THAT into it to make it worth the difference right now. But we may start streaming...
@@TheModeRed nice! My PC is an Intel 13900kf, 96GB 6000Mts ram, 16TB SSD with an MSI Gaming Trio RTX4090.
My monitor is a TV, a 65" QD-OLED Samsung S90C that can game at 144hz
@@ehenningsen lol, is that a PC or a beast?
Are you secretly a huge Hollow Knight fan? I seem to hear a lot of music from that game in your videos :D Keep up the great work.
The ghosting, shimmering, and temporal instability are even worse in Minecraft.
It's quite amusing how, until recently, all the AMD fans were bashing Nvidia for their upscaling and frame generation capabilities. But now, they're the ones who seem most excited about these features. LOL, how the tables have turned!
Nope, they're 30-series, 20-series, and GTX users in disguise (me), because Nvidia's policy is stupid
I sincerely hope that Nukem, with his FSR2FSR3 mod, will implement the FSR 3.1 tech in his upcoming versions. One thing I don't understand is how you'll officially choose FG in games when selecting DLSS for upscaling and not FSR. Will it autodetect that there's an RTX card in the system, and by selecting FSR 3.1 show the option to enable FG while using DLSS for upscaling in the background? I mean for the RTX 3xxx series, for example.
so, XeSS with FSR 3.1 is the superior option now? (For non-RTX users)
Why do reviewers slow down and even stop the game to show us some little imperfection? You don't play games like that. If you actually play the game and don't just look for sh$$, it'll look fine.
Fake frames and fake thumbnails. Stop with the clickbait garbage.
AMD is still dog shit; the Windows Update issue hasn't been fixed for years.
have you tried Lossless Scaling? It's amazing; you can even scale YouTube videos / any window to double the frames
I'm using it for Enshrouded to good effect. I'm aware of some minor artifacts around my character, but it really doesn't bother me at all. The smoothness feels good.
@@jeremyengel7332 I agree
Even movies in VLC, at 48 fps @@hupe5836
*Decoupling FSR upscaling and FG is a massive win for NVIDIA 30 & 20 series holders*
Facts, DLSS and FSR3 FG will be huge for those people
@@iitzfizz Yep free upgrade thanks to AMD lol
It's even good for AMD owners: you could use XeSS if you prefer it, or native + frame gen.
This'll really help mini PCs with iGPUs, handhelds, and slim iGPU laptops. It's just good.
Fake frame nice😂
Everything is fake
fake frames, fake resolution, thank you nvidia for moving the focus away from real performance.
U can thank nvidia for that, amd just gives it to you for free. Learn.
games themselves are fake "reality". If the tech is good and available to everyone, I don't see why you'd be so negative.
Frame generation, money generation😂
Bro, upscaling shouldn't have to exist, tbh... Devs are getting lazy and not optimizing their games properly...
Can you test it in Path of Exile (with its custom game engine)? How (bad) is the image quality in a fast-moving game with lots of stuff moving on the screen?
These upscalers aren't just about performance; they also make the image quality in these games more stable. One game I noticed actually looked bad with upscaling turned off was MK1.
Still can't use SteamVR with AMD drivers after 3 months. Gotta love it.
May I know which game? I have an RX 570 + Pico 4. Just playing Beat Saber at the moment; planning to upgrade the GPU to play more demanding titles.
I'd like to know which game as well
Running 6700xt + Quest 3, had no issues with No Man's Sky or VRchat so far.
On any recent driver past 23.12, launching SteamVR gives an error 309, which indicates that part of SteamVR isn't loading correctly. Every recent driver, even the latest one released 3/20/24, still gives this error for many users, so we have to use outdated drivers just to run SteamVR. This is obviously bad, since being stuck on December drivers affects all games in terms of artifacts, driver crashes, incorrect/missing textures, and so on.
And effing up their drivers? No, say it ain't so. Who would have thought that the company with a track record of effed up drivers would make effed up drivers? They must be some sort of psychic.
@sweetroll_riley Interesting, I'm on version 24.2.1, which is the latest rn. What's your GPU?
The issue seems to affect the 7000 series, which isn't too surprising since it's a new architecture. Still a shame it has all these issues to begin with.
It's crazy how a lot of people still don't know that what AMD does is not frame generation but frame interpolation (the same tech TVs have used for years). Not that it's a bad thing; it works. And that's the real reason Nvidia can't put their FG features on other GPUs: there's dedicated hardware generating those frames. I hope Nvidia makes a software FG for older GPUs, but I don't think that's happening.
like NIS but for interpolation, nice idea
Erm, both AMD's and Nvidia's are frame interpolation, because both use the differences between frame 1 and frame 2 to make frame 1.5. Whether it runs on dedicated hardware or general-purpose hardware changes nothing about it being frame generation.
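A minimal sketch of that "frame 1.5" idea, using naive per-pixel blending. Real implementations such as DLSS FG and FSR 3 warp pixels along motion vectors or optical flow rather than blending in place, which is what keeps fast-moving objects from ghosting; this shows only the core concept:

```python
import numpy as np

def interpolate_frame(frame1: np.ndarray, frame2: np.ndarray,
                      t: float = 0.5) -> np.ndarray:
    """Naive in-between frame: a per-pixel blend of two rendered frames."""
    blended = (1.0 - t) * frame1.astype(np.float32) + t * frame2.astype(np.float32)
    return blended.astype(frame1.dtype)

# Two fake 1080p RGB frames, then the generated in-between frame:
f1 = np.zeros((1080, 1920, 3), dtype=np.uint8)      # all black
f2 = np.full((1080, 1920, 3), 255, dtype=np.uint8)  # all white
mid = interpolate_frame(f1, f2)                      # uniform gray
print(mid[0, 0])  # [127 127 127]
```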
FSR FG really smooths out my Cyberpunk gameplay at 1440p. It's like way better motion blur.
FSR isn't as stable as XeSS or DLSS. Yes, because FSR doesn't utilize AI acceleration. Just wait; AMD is just getting started, bro.
Nvidia is a scam nowadays
Their gaming revenue is less than their new AI business now.
@@vmafarah9473 Gamers don't care about that.
@@morgueblack Nvidia doesn't care about gamers.
@@morgueblack True, Nvidia doesn't care about gamers, but they do care about their new AI business.
@@MrHacking7 That's why more gamers should buy AMD or Intel. They have the price-to-performance advantage.
Student with a 1650 laptop here, and man... this makes me wish laptops were upgradeable...
...eGPUs exist jsyk.
-rx6600 user on a gtx970m laptop
@@quarterpounderwithcheese3178 I'm a freelance video editor who doesn't even make minimum wage. If I were buying an eGPU, I'd much rather just go for a new laptop instead lol
Wake up, Babe! Vex uploaded a new video
I have often heard that the human brain processes information in chunks based on changes in sensory input, and doesn't constantly process the newest inputs. It checks sensory inputs to get a general idea of what the surrounding is, then uses prediction and memory to draw the rest and save processing time. Maybe this really is the best path forward for GPUs if we want them to generate visual worlds based on lower resolution renders of those worlds. And if AI were to completely replace the GPU and become advanced enough, maybe it would even be possible to go back to using the CPU to generate a very low resolution image and piping it into an AI card that instantly converts it to a high resolution. Would likely require training for each individual game to make it work properly though.
I will switch to AMD just out of principle!
PC game graphics quality is getting worse and worse... from SSAA to AI-generated fake crap. Oh, you can improve framerate, if your framerate is already high enough to make it work decently. Oh, you can improve resolution, if your resolution is already high enough to work decently. Oh, you can use ray tracing, if the nearly invisible difference is worth dropping all the fps you couldn't improve with the aforementioned technologies, because you have no $600+ graphics card actually delivering the base fps and resolution needed to get any advantage from all these shenanigans.
I can't see anything innovative in all the crap Nvidia and AMD have been trolling their customers with for years now! Including the exploded pricing (the average graphics card still costs at least twice an acceptable price), all these crap technologies are cynical at best!
Realism isn't what people play for either; just look at the sales of GTA IV vs GTA V. GTA IV was too realistic and too hard for most people to jump into. GTA V went back to more arcade-style gameplay and got a massive player base with online, even though GTA IV had online before it too.
10:30 How was that bad English? I understood what it meant perfectly.
AMD is for the people. Nvidia is for their pockets.
With Blue Sky you can use frame generation on movies, and it's INSANE.
It's very interesting that the generated frames are mostly a mess of pixels, but in motion you won't see this.
The part of me that plays on Steam Deck when I'm not on my PC thinks the idea that I could potentially see BG3 at 60 FPS on it is insane.
Hopefully they can make it actually work now. I've seen multiple games lately drop FSR support right before launch because it was causing stability issues. And not just while in use; even having it integrated into the game but turned off caused problems.
Eventually more people will realize that Nvidia is anti-gamer, and AMD is clearly in our corner.
My issue with frame generation is that it increases input latency, and unfortunately games are shipping with frame generation used as a crutch for a "playable" state at release. This should not be the case, and I hate it. Devs should have enough time to optimize without it (looking at you, publishers and management, pushing games out the door before they're actually optimized).
Why.... why do I even watch this with a 4070 xD)
Good video, I think))
so happy you showed the RTX 3060 in the video, because I'm just about to build my first PC with an RTX 3060, and after seeing this video I'm sure this card will last me a good 4 years
It's funny how AMD benefits from this in many ways. Not only does it say "look, we're not far from what you're used to, consider buying our GPU next time," it also kills sales of budget-range Nvidia 40-series GPUs, which are worse than the 30 series and were worth getting ONLY for frame gen. Nvidia can either look dumb and claim their past-gen cores are too weak for the feature, or play even dumber and "figure out" a way to enable it on older GPUs in the coming months, despite having locked it to the latest gen purely with sales in mind.
Bottom Line: FSR 3.1 still sucks.
Is DLSS with AI learning really worth more (now and in the future) than AMD's FSR?
Just got a Sapphire Pure 7900 GRE, but the alternative was a 4070 (as my old PSU doesn't support the 4070s).
YES.
I just realized DLSS, FSR, and XeSS are all technically just very advanced TAA that could easily be remade on a smaller scale by anyone.
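In spirit the claim holds up: the loop shared by TAA and the temporal upscalers is an exponential blend of the current (jittered) frame into an accumulated history. A minimal sketch of just that accumulation step; everything that makes DLSS/FSR/XeSS actually good, such as sub-pixel jitter, motion-vector reprojection, history rejection, and (for DLSS/XeSS) a trained network, is omitted here:

```python
import numpy as np

def taa_accumulate(history: np.ndarray, current: np.ndarray,
                   alpha: float = 0.1) -> np.ndarray:
    """Core of TAA: blend a little of the current frame into the history.

    Averaging jittered samples over time is what smooths out aliasing;
    temporal upscalers build their extra resolution on this same loop.
    """
    return (1.0 - alpha) * history + alpha * current

# Accumulating a noisy signal converges toward its mean over many frames:
rng = np.random.default_rng(0)
history = np.zeros((4, 4))
for _ in range(100):
    noisy_frame = 0.5 + 0.1 * rng.standard_normal((4, 4))
    history = taa_accumulate(history, noisy_frame)
print(history.round(2))  # values settle near 0.5
```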
I have a dumb question: if I want to use FSR in a game, do I have to lower my resolution below native? I'm using a 2560x1440 monitor, and this GPU is still somewhat new to me. My previous GPU was an RX 580, which didn't really have any special features. Now I'm on a 6700 XT and I don't understand FreeSync, upscaling, or FSR.
I have a GTX 1070 with an i7-6700, which runs pretty smooth with every game I throw at it. In Warzone it's a bit less, averaging like 50-70 fps, which isn't bad. I got all excited when I turned on frame generation: a consistent 110 to 150 fps, but the game was choppy and frizzy as hell; hope they can fix this in FSR 3.1. Currently I use Intel XeSS, which has been extremely smooth for me, averaging 60-70 fps with little dips to 57 fps sometimes. But frame generation is gonna be crazy. I'm just imagining XeSS with frame generation, that's gonna be heavenly smooth for me.
no noticeable cut in responsiveness? I beg to differ my friend. I beg to differ....
AMD can save us. We need them to save our community.
Their image sharpening IS AMAZING, AFMF is a nice feature, and there's more stuff I won't mention.
If they figure out the encoder and decoder, and FSR, they'll be RULING THE WORLD.
The stupid thing is that all of this should be possible to enable from the drivers, without having to wait for games to support it. Otherwise older games will be SOL.
it already is on AMD
Please help me understand: how is frame generation amazing if you cannot cap it? Turning it on pushes your GPU to 100% and your temps through the roof.
It's because the higher your framerate is, the harder your GPU has to work. Lock your framerate to 30 and see how much less strain it puts on your GPU.
And why exactly can't you cap it? Do you not know how, or is something actually preventing it?
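For what it's worth, a frame cap is conceptually just idle time inserted into the render loop, and driver-level limiters (Radeon Chill or a frame-rate target, for instance) do work alongside frame generation. A minimal sketch of the idea, where render_frame is a hypothetical stand-in for a game's draw-and-present step:

```python
import time

def run_capped(render_frame, target_fps: float = 60.0) -> None:
    """Crude frame limiter: sleep away whatever is left of each frame's
    time budget so the GPU isn't asked to run flat out at 100% load."""
    budget = 1.0 / target_fps
    while True:
        start = time.perf_counter()
        render_frame()                       # the game's draw + present
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)     # idle instead of rendering more
```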
And it gave game devs a reason not to optimize their games. It's cool for handhelds and underpowered builds, but there's no reason to need frame generation on a $1,000 GPU.
DLSS runs on hardware = better picture. AMD runs on software = in time it might catch up or pass it. On my Ally I love FSR and use the beta driver for Fluid Motion Frames; Diablo 4 looks a little fuzzy, but I'm always around 150 fps and hit up to 170 in dungeons. On my desktop with a 3090 I run everything maxed out at 4K, and sometimes I need DLSS to hit the 120 mark for my LG C1, and it's still amazing lol. I love how far we've come in just a few years.
Finally... Does this apply to all games, or only new releases?
The ones that give you the option.
What happened to "we're implementing FSR 3 into DirectX so you can use it in most games"?
You mentioned the Steam Deck: Proton implements FSR 2 in its compatibility layer; that's how it upscales all games.
There's a big mistake in this video: while both AMD's and NV's goals are the same, their approaches are different, not just a copy. AMD's frame gen works from existing DX11/12 frame data, while NV's works from data in games patched for DLSS 3. They are two totally different technologies.
Just finished playing Cyberpunk with the FSR 3.0 mod created by Lukefz: from 65-75 fps on High to 120-150, without any additional upscaling, just using the frame gen (RX 5700 XT).
Is the FSR 3 frame gen enjoyable? Any mouse input issues?
@@rizkijunir23 no input delay, just a little bit of ghosting on car bumpers that could be fixed easily
Maybe after Intel created better resolution scaling, AMD got embarrassed and finally improved FSR. I just can't stand the shimmering. My son loves his 6950 XT though, so at least AMD impressed him.