of course everyone prefers genuine high frame rate to frame gen silly! ^^ thanks for the video as always. random question, when you say you tested GTA V with very high settings, how does that pertain to things like MSAA and the advanced settings? i'm woefully unfamiliar with GTA V's pc settings and whenever i go to play it on my ryzen 5 5600 + rtx 3060 system i can't seem to settle on a good setting for my 4k monitor lol. thanks for everything you do!
A frame rate drop from frame gen in GPU-bound scenarios is to be expected, since it requires GPU processing and VRAM, but you'll still see a net positive in your presented frame rate. You'll especially notice FPS drops if you're already close to the limit of your VRAM capacity before enabling frame generation. Doesn't matter if it's DLSS 3 FG, Lossless Scaling or AFMF: all of them need 200-500MB of VRAM, which means your card will need to dip more into system RAM, and that can degrade framerate, cause texture pop-in, or even crash the game (e.g. Hogwarts Legacy becomes unstable with frame generation at high VRAM usage). IMO the best case for frame gen is boosting the FPS floor in CPU-limited titles, for example X4 or Total War: Warhammer 3. FPS games are some of the worst use cases for FG, since latency is so important in those and it won't improve with FG unless you buy into overblown marketing hype.
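To make that VRAM point concrete, here's a hypothetical back-of-the-envelope check. The 200-500MB figure comes from the comment above; the ~400MB midpoint default and the helper name are my own assumptions, not anything AMD or Nvidia documents:

```python
def fg_headroom_ok(vram_total_mb: int, vram_in_use_mb: int, fg_cost_mb: int = 400) -> bool:
    """Rough rule of thumb: frame generation itself needs roughly
    200-500 MB of VRAM (a ~400 MB midpoint is assumed here). If that
    pushes usage past the card's capacity, data spills into system RAM
    and frame rate, texture streaming, and stability all suffer."""
    return vram_in_use_mb + fg_cost_mb <= vram_total_mb

# An 8 GB card with ~7 GB in use still has headroom;
# a 4 GB card already near its limit does not.
print(fg_headroom_ok(8192, 7000))  # True
print(fg_headroom_ok(4096, 3900))  # False
```

By this rough logic a 4GB card like the 6500 XT has very little margin for the frame gen overhead itself.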
From what I know, the more stable the base FPS is, the better for AFMF 2. So test where the base FPS settles with AFMF 2 on, then set the max FPS for each game at that constant base FPS, and you'll probably have an even better experience. Setting a max FPS per game is easiest with RivaTuner, as you can even bind a button to toggle it whenever you want. From what I know you can set it per game in the driver as well (at least on Nvidia, and AMD should have something similar), but for me RivaTuner is more convenient since I use MSI Afterburner anyway.
If Microsoft can bring this to Xbox Series S, it's going to make most of the games run at 60 fps. Of course, a system level implementation like on PC would be better rather than depending on game devs to release updates to remove frame cap and use this. Fingers crossed!
This is really a help for low end builds! However, it seems like it can't yet take a game from unplayable to playable, but rather from playable to high refresh rate, which is still very nice!
This might sound stupid, but I currently use AFMF 2 with RSR to play Minecraft with raytracing... on my RX6400😁 It quite literally is unplayable without both enabled, but with them I can actually play in a friend's creative realm at about 50 fps.
Question: When you do these entry-level card tests, do you always use the 4gb versions because of the price? I have a 5500xt 8gb and I feel it handled a lot of the games I threw at it on account of the higher memory. Would a 6500xt 8gb improve performance (with or without frame gen) by a significant amount, or at the increased price would it be more worth it to get a higher tier of gpu?
I think the best thing about AFMF is simply that you can force it on nearly every goddamn game on DX11 or 12, which includes titles that suffer from CPU bottlenecks due to being simulation-heavy, or just plain demanding but not affected much by input lag. DLSS framegen on the other hand is only supported where it is supported, and most of those games are fast-paced stuff where the input lag would be a detriment. Sadly I only have a 60 hz display, so the usefulness is fairly limited for me, but I can see a few titles where I could totally see it smoothing things over a little.
@@t.gchannel4067 So far I just haven't found many games where it's worthwhile, as it absolutely has to be CPU-limited and slow-paced. Cities Skylines (the first one) is such a candidate, since even on my system it'll drop to 20-30fps with larger cities just from the CPU load. Hearts of Iron 4 is also CPU-heavy and even more handicapped due to being single-threaded, but AFMF barely helps because most issues come from the game stalling out and briefly freezing, and AFMF can't do much about that. Most GPU-heavy stuff I play is sadly also too fast-paced for AFMF to not cause latency issues, so I'd rather not.
Hey! Are you able to do a comparison of the 4GB and the 8GB version of the RX 6500 XT? Especially in titles where it usually caps, or nearly caps out on the 4GB? Would be extremely interesting to see, how this compares.
I've been using AFMF 1 for the last few weeks and it's smooth; I didn't even feel any latency. I'm using it on modded Skyrim SE and FO4 btw, because ENB eats half of my fps on the 6600.
Does this work for you in borderless windowed gaming? I only play my games in borderless windowed, and this AFMF (1 or 2) [yes, I got the preview driver] has never worked for me.
AFMF 2 high/quality mode is the way to go. Just a better experience: no turning itself off, and just smoother. But I'm using a 7800xt; maybe the AI accelerators are also helping the 7000 series.
@@shepardpolska Interesting, but I only see people with lower tier GPUs using the auto mode, which I don't really understand when you have a better mode to choose from. The main issue people had with AFMF 1 was that it would disable itself during fast camera movement, so why continue to use a mode where that's still the case, especially when that issue is solved in high/quality mode?.. It's confusing.
Any chance you try to use that GPU as the displaying GPU for the dual gpu capability of AFMF2 ?! I would love to see the result paired with a 7800XT or something higher up
Pretty sure you said, in a previous comparison video, that there wasn't much difference between the 6500XT and the older 5500XT, and that the 5500XT won significantly in a PCIe 3.0 system. So it would be interesting to see how AFMF2 works with a card like the 5500XT.
Soo, I don't know if this is a question or stating a theory at this point: what it's actually doing is just generating frames after they were "normally" rendered by the gpu at a base level, right? Like the game IS running at, let's say, 60 at the game engine level, but the driver is making it seem like a higher frame rate by the time you see it. Which may explain why GTA V doesn't have any of the weird stutters it gets when you "natively" run the game at high fps, since the game engine itself isn't running at 170+ fps with frame gen, right? If this is true, I would love to see how fps-based physics games (i.e. games in which the fps affects the in-game physics) react to frame gen. For example, the older GTAs are notorious for their janky physics at anything higher than 24-30 fps (since those were the fps of the console ports).
Hey. Wondering re low-end content: I have a red NVIDIA GF9500GT, 1gb of VRAM, DDR2, with CRT, DVI, and HDMI in, and it is a 50W card that I have little to no use for. So I have been wondering if you would want to have it to test out. Even if it crashes out after 30 seconds...
Only gripe with this feature is the lack of support for old AMD cards like the 400/500/5000 series, though that is totally understandable from a business standpoint I guess. Still can't believe my RX 470 is like 8 years old!
I think AMD recommends you have 50-60fps for frame gen to work effectively, but it looks to work well with AMD's RX 6500 XT. It's supposed to work with Nvidia & Intel cards too, so could you test this on, say, a GTX 1660 or RTX 2060/70 and an Intel card, to see if AMD is helping gamers on a budget (and the planet) by saving cards from landfill?
@@adriankoch964 I had two capture cards before, a cheaper one and an expensive one. When I plugged them into my PC they reported unsupported things I'd never even heard of, so I sold both of them.
I don't know about others, but if my game is at 120FPS while it still has the input latency of 60fps (plus the slight extra input lag caused by frame gen itself), it's super jarring! I could not get used to it no matter the game. I only see potential for this tech on consoles, since people there are far less sensitive to oddities such as input lag and frame pacing.
This piques my curiosity. I wonder how an RX 6400 low profile would do with AFMF? Or how an Arc A380 does in games with FSR 3 frame gen? Probably badly, since these might not hit 60 fps to start with, but it would be interesting.
That is pretty good, AFMF 2 boosting a weak gpu like the 6500xt, but I wouldn't ever get a 4gb graphics card like the 6500xt. The 5500xt 8gb model is the best budget gpu on the market.
For people who have older hardware and can't use it: It's shit. Unless you're running the game well above 60 Fps, it's terrible. Even after the AFMF2 update.
How is everyone utilizing almost 100 percent in cs2? I have a 7800x3d and a 7900xtx. It's always 20-30 percent util on both cpu and gpu on my end, and it's causing stutters and frame drops. And gpu temps are always at 30-50c depending on my current room temp.
I don't think this is useless, but it feels much more like a tiny little extra than a key feature, as Nvidia claimed when marketing DLSS frame gen to pretend the 40 series wasn't overpriced crap. I mean, you reeeally need ultra-high FPS for competitive games, and this doesn't help in competitive games, since there's always a delay on the extra frames and you don't get quicker information. And you reeeeeally need more frames when you have 20FPS and want the game to be playable, and this doesn't work well in that case either. Basically, this gives you extra frames except when you actually need them. They're only there when they're a nice-to-have, like going from 60 to 100 (and the higher you go, the less relevant it is). We should never pretend those are real frames, as Nvidia did.
I would like to see a video about frame gen with very low framerates. Like how does it feel when you normally only get around 20 fps and with frame gen maybe 35? Does it stutter or feel sluggish?
It almost doubles your input lag, so it can feel like going from playing on a PC monitor to playing on a TV with its game mode off. Also, the lower the native frame rate and the more things change from one frame to another, the more noticeable artifacts become, since the FG needs to interpolate more ("aka make shit up"), and each fake frame persists much longer on screen. In FPS games, you don't want to enable any sort of FG at all.
The strange thing with the 6500 card is that AFMF does not work on PCIe 3.0 systems as far as I know. (I run an AMD 3200 Pro and a 6500 card right now. An upgrade to a 3600 is imminent!)
It doesn't come across on YouTube, but can you FEEL the extra frame rate when AMD's own counter isn't visible? Not that AMD would lie about that sort of thing, of course, paragons of honesty that they are. After all, I'm sure there are all kinds of valid technical reasons why no other standard benchmarking tool can see the extra frames, unlike with other framegen methods... edit: The lack of GTA5's massive, unplayable stuttering when you pass 180fps is rather suspicious, too.
Because the extra frames don't happen in the game engine like with DLSS 3 and FSR 3, this type of framegen works purely between the GPU driver and the display. The GTA stutter happens because the game engine breaks down when rendering high FPS, but AFMF doesn't make the game engine render more; it generates frames in between real ones.
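As a toy sketch of that idea (purely illustrative: real AFMF uses motion-estimation in the driver, not a naive pixel average, and these function names are made up):

```python
def naive_midframe(prev_frame, next_frame):
    """Illustrative stand-in for driver-side interpolation: average the
    corresponding pixel values of two frames the GPU already rendered.
    Frames are modeled here as flat lists of 0-255 intensity values."""
    return [(a + b) // 2 for a, b in zip(prev_frame, next_frame)]

def present_with_framegen(real_frames):
    """Interleave a generated frame between each pair of real frames.
    The game engine only ever produced `real_frames`, which is why
    engine-side FPS caps, stutter, and physics are untouched."""
    shown = [real_frames[0]]
    for prev, nxt in zip(real_frames, real_frames[1:]):
        shown.append(naive_midframe(prev, nxt))  # "fake" in-between frame
        shown.append(nxt)                        # next real frame
    return shown
```

Three real frames come out as five presented ones: the display sees nearly double the frame rate while the engine still only rendered three.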
Thanks for making low-end content and making it entertaining. Have a wonderful day!
Thanks for watching :)
Why are you testing old games? Test games like Alan Wake 2 and Forbidden West @@RandomGaminginHD
his first three or four years were all about 2 core pentiums and a GTX 750 ti. lol
@@Haywood-Jablomie cant forget the g3258 and g4560s. legends
@@TheRealBleach Of course!!! Those were AMAZING value. Even those Athlon X4 7xx/8xx chips were excellent value. I sometimes miss those days... although I'm 51 years old now, so I remember when computers were 8MHz. lol
I tried out AFMF2 on my 7900XTX and it is literally insane! This is the birthday gift that keeps on giving!
why would you need a frame gen on a gpu that is $1k?
@@curie1420 Because it's fun to see 300fps at 4k 😁 Starfield is a *completely* different game at a steady 120fps/4k/Ultra, as is Cyberpunk. Throw in an OLED, and talk about immersion.
@@curie1420 For higher resolutions? Same purpose as why you need frame generation on low end cards like the RX 7600 or RTX 4060.
@curie1420 Same as weaker cards to boost framerates for better motion fluidity.
@@curie1420 The higher the framerate, the more believable it is; so when you already have a good gpu, your frames and latency become utterly ridiculous.
Yes, the answer is yes. I noticed a massive improvement on my potato 6400. I really do love all of AMDs features.
Yes. I always used Nvidia, but I bought a 6500 XT and I'll never go back to Nvidia. I love AMD's software.
@@hansvogel4335 That's what I'm saying. I've never had driver problems with AMD, and I don't like how much stuff I have to download for Nvidia.
Fingers crossed, this will be huge when it's available for other entry level or mid level gpus.
So far I’ve been using it on my 6750xt and my RogAlly. It works amazing. You can even double the frames of emulated games. Been playing Mario kart at 120fps locked 1080p on my Ally
@@fresco_dinero5155 How does the 6750xt treat you? I'm considering getting one.
@@fresco_dinero5155 Would you say it's better than Lossless Scaling?
If the 6500 XT supports AFMF 2 then the 5700 XT for sure can. I'd love to see AFMF 2 make it down to some RDNA 1 cards.
sorry to break it to you, but the 5000 series isn't getting it :)
Use LSFG. Same thing as AFMF but runs on everything, even on Intel HD Graphics 4000 😅
AFMF 2 isn't supported on RDNA 1 for the same reason XeSS isn't supported on RDNA 1.
@@musouisshin As an RX 5700 owner, this is kind of depressing. But I do own Lossless Scaling, so I guess I can try that instead.
No it can not. They are different architectures.
Watching this on a Zenbook 14X and some of your speech is hitting the resonant frequency of this laptop and sending a little "shiver" through it. What an experience
As far as I know, this feature seems to be available only for the 6000 series and onwards so for the people with older cards I think lossless scaling is still the go to option.
Maybe AMD will reconsider and make it available for older cards but I'm not sure if it will be possible
Lossless Scaling is garbage. I have it, and it starts warping and glitching out if the input is under 40fps. I was going to use this to max out settings in single-player games on my 480, but motion blur is unironically the better choice. It probably works well with 60fps input, but I don't have a high refresh rate monitor so I can't test it out.
LS is pretty expensive performance wise on RDNA1 and below tho, sadly... But still a great option
People who have 5000 series or 500 series gpus from AMD really need to upgrade lol. 6000 is great, way better than the 5000 or 500 series; I'm not even going to mention the likes of the RX 470-450. People, it's time to upgrade if you are running hardware from 2015-2016 lol
@@justaguy5791 we're in second world countries where new hardware is like 3x the price it should be and tariffs on imports are insane
@@fv101 Exactly. If I had the means, I wouldn't be using a laptop with a 3rd Gen Intel CPU and an Nvidia GT 645m GPU
Can't wait for these drivers to fully release. I'm not using this technology yet since the drivers are still being tested, but I can't wait to use this on my 6700 xt and bump up that 1440p performance.
You could try out AFMF1. It's plenty good enough to use, 2 just improves on it.
Interesting feature and I'm so curious to see where these technologies go. Love your vids too.
literally thought about this card after your first video given i have one sitting around. lovely as always
When I first started watching this channel I was using an HD6450 1gb gpu. Crap card, but it played MW, 2 and 3 just fine. How about a trip down memory lane, Steve? Let's see what it can do nowadays. Go on mate, you know you want to ;)
I live in the low end. I just dont play a lot of newer games anymore. Always looking for low end content.
AFMF were my favourite post industrial punk dubstep act
to me its ZLUDA
But then they changed their name to The Glorified Frame Gens of Mumu, and I went off them then
“…but the game does feel a lot smoother” is a good way to put it 5:40. You’re still getting 60fps latency, but you’re “getting” 120fps smoothness. Some tech tubers are saying it really should be called a “frame smoother” rather than “frame gen”.
I'm an RX 6500 XT user anyway, and I'm going to test it as well for a benchmark. Thank you so much for this!
Cyberpunk devs r really taking their time with the official fsr3 frame gen update. I'm just really waiting for it to drop.
Only Nvidia card owners have to wait for it.
Ryzen 4070 still smokes the RX 6500 XT while being cheaper
Bro why does my Ryzen 4070 overheat tho 😭😭😭
Is there a joke here I don't get?
ztt XD
Ryzen 4070 saved my family and my dog
@@itsuadman watch ztt shorts you'll know
This makes my 6600 so new again
I can't wait till they make more oled monitors with BFI. That way you can have a game running at 60fps frame-genned to 120; then on a 240hz screen with BFI, that 60fps would look as clear as 240fps with the fluidity of 120.
The 780m does the same, I mean dropping frames. This is because of the power limit. Enabling frame generation makes some parts of the gpu actually work, taking power away from the gpu core itself. Therefore cards that are power limited from the beginning will see a drop in actual real fps after enabling frame gen.
From what I know AFMF is meant to help with a cpu bottleneck and FSR is meant to help alleviate a gpu bottleneck
Bingo, which is why AFMF is absolutely perfect for the 20 and 24gb cards that they're putting out.
Might sound weird, but Forza Horizon 5 is a damn good game to do tests with. It's a good showcase of what's actually happening, as it'll use *everything* you allow it to use, and you can easily see in real-time what each of your changes are doing. That's where I learned that with Nvidia cards, I vastly prefer CAS+MSAA over DLSS, as that combo distributes the load a lot better between the CPU and GPU than DLSS does.
Do you own an RTX GPU?
@christophermullins7163 I've used practically all of the 20xx and 30xx RTX cards minus the non-Super 2070.
2060 laptop, 2060, 2060S, 2070 laptop, 2070S, 2080, 2080S, 2080ti, 3050 laptop, 3050ti laptop, 3050 8gb, 3060 8gb, 3060 12gb, 3060ti, 3070 laptop, 3070, 3070ti, 3080, 3080ti, 3090, and 3090ti.
I'm currently running a 7900XTX, 7900GRE, and 6900XT. Nvidia has DLSS, but they've started to rely on that too much. My 6900XT gets *better* rasterization than the 3090ti that I had, outside of ray tracing.
For the first time in a very long time, I have to say that AMD's 6xxx and 7xxx are going to age much better than the 30xx and 40xx.
@@bonniscootor I upgraded from a 6950 XT to a 4070 ti super because quality over quantity. I have used fsr and dlss for hundreds of hours each (probably 1000hrs on fsr). I can confidently say that dlss balanced looks at least as good and runs much better than fsr quality. It is irrefutable that dlss has better image quality. Ray tracing sucks rn. I would definitely rather have Nvidia, but I would rather give my money to AMD. I decided to be selfish and bought Nvidia.
@christophermullins7163 That highly depends on the game. Out of the 40 or 50 that I rotate through, only Cyberpunk looked "better" with DLSS vs FSR.
Even when running Nvidia cards, I prefer CAS+MSAA over DLSS/FSR. I don't use FSR very often at all. I play on a 4k/120 tv, and DLSS is only "comparable" on monitors that you're sitting closer to (in my experience).
If only the RX 6500 XT had 6GB of VRAM.
Yeah, I think there is an 8gb model out now. I'll have to find one!
VRAM is the least of its problems. It's the PCIe 4.0 x4 interface that hurts it the most. It performs horribly on PCIe 3.0, which is the kind of pc that would rock a 6500xt.
@@duchuydoanphan7267 I actually owned the 6500XT, and VRAM capacity was its biggest problem, since more vram would reduce how much texture swapping would need to be done in vram.
Yep an RX 6550 XT is way overdue
@@BillyBoy444 a 16GB one to compete with the 4060Ti
Insane what this tech can do for low end hardware.
Huge latency due to 30 fps?
30 fps doesn't have enough info to generate frames @@semyonzhigunov671
@@semyonzhigunov671 it's playable if you disable vsync
@@semyonzhigunov671 Have you seen the video? There's typically less than 1 frame of added latency.
@@semyonzhigunov671 if it can turn 30fps into 60fps I wouldn't really care about latency tbh
Chill + afmf is the best option.
This was what I wanted to see. Many thanks.
Alright, you've twisted my arm. I'm going to risk the beta drivers to give AFMF2 a try. I'm not playing on low end hardware, however I do use AFMF1 on my single player games with the eye candy cranked, just to see the 144+ mark.
Occasionally the ghosting is noticeable on AFMF1 (even when pushing 100+ 'real' rendered frames) in high speed/active scenarios but it's barely noticeable at all in the video with V2. This will be particularly cool for games that still have their physics tied to framerate and completely wig out over 60fps.
With frame gen especially software/driver enabled always cap your frame rate to the best base your hardware can achieve even if it's not 60fps (or half your monitors refresh rate) it will just help massively with the smoothness
This also helps with frame time consistency.
Great test! Would love to see 2.5k tests, I'm sure it'll do quite well in Forza 4. ^_^
Funnily enough... at the end, in The Witcher III, the frametime was about 14 to 15 ms before AFMF2 and 16 to 17 ms with the combined frametime + framegen lag. So in that instance we got 2 to 3 ms of extra latency but double the frames.
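Plugging the numbers from this comment into a quick sketch (the 14.5/16.5 ms values are the rough midpoints quoted above; the variable names are mine):

```python
def frametime_ms(fps: float) -> float:
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

base_ms = 14.5      # ~14-15 ms rendered frametime before AFMF2
with_fg_ms = 16.5   # ~16-17 ms real-frame latency with AFMF2 on

added_latency_ms = with_fg_ms - base_ms  # ~2 ms penalty
presented_ms = base_ms / 2               # doubled presented fps halves shown frametime

print(added_latency_ms)  # 2.0
print(presented_ms)      # 7.25
```

So the screen updates roughly every 7 ms while input latency only grows by about 2 ms, which is why the trade feels worth it in that situation.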
Me trying to look for the difference between AFMF2 turned off and turned on using my 60hz monitor:
There should be artifacts now if not any perceived difference in smoothness.
Something this video reminded me of was how good RDR2 looks on low! What a stunning game; I imagine it'll age very nicely given how it somehow turns 6 this year!
AFMF is a godsend. I play with it on in games like The Finals and Valorant. Trust me, if you've got a decent pc that can push fps consistently at at least half your screen's refresh rate, go for it.
bruh what kind of texas instrument you rockin that you need framegen on valorant
Good coverage of the tech, I agree completely. Might I add this though: when my buddy RandomHD starts saying the 6500xt is a low end card, you might be a bit spoiled nowadays. The monitor too; if 120hz is low end "something I picked up", you might be getting too much free stuff.
You can get a 6500xt for about £150 and a 120hz monitor for around £100 or slightly less; they are comfortably low end in today's market.
If you feel awkward about generated frames as "fps", one option would be to use the term "presented frames (per second)" and "rendered frames (per second)" to make the distinction.
Just got my first AMD card, the Sapphire Pure 7900GRE. Don't know why people keep on complaining about their drivers and lack of features. Atm I'm just loving the amd software, there's so much to do here. My two previous cards were the GTX 770 Lightning and GTX 1660 Super.
Agreed, the AMD driver is next level compared to Nvidia's. Word of advice to other AMD users: uninstall and reinstall the AMD driver after every Windows update so you don't have driver issues to complain about.
Happy weekend, Steve!
When will it be in the official driver?
Not sure tbh hopefully soon
They mention Q4 2024 in the community blogpost
Look at the low power draw on that 6500 XT. The VRAM is the only issue. They make an 8GB version, but you're better off upgrading for a little more. This card was released during the GPU shortage and the MSRP was a little high, but it quickly dropped to 125-150 USD. Back then Nvidia didn't offer any comparable alternative, and Intel's Arc hadn't come out yet. It would have been more appreciated if it had come out 6-8 months earlier.
It's the Xbox Series S GPU just clocked higher, so no wonder the power usage is low.
@@Radek494 It's good if you're the one paying the electricity bill.
I was initially skeptical of frame generation after dealing with the terrible soap-opera-effect interpolation on HDTVs, but DLSS 3 has been a godsend for my gaming PC. I have a Chinese X99 Xeon rig with a 4070, and despite the obvious bottleneck, Cyberpunk looks amazing and runs at 120 FPS with very few noticeable artifacts.
This could be huge for any game with capped frames. For instance, every update to DBZ Kakarot breaks the mods that unlock FPS; if AFMF can bypass this (even if they're technically not real frames), then damn, that's awesome 😁
It's great on my RX 7600! I love AMD GPUs and I'm never going back to Nvidia!
5:35 fakes per second would be perfect
ZTT memes are now contagious; we need more Ryzen 4070.
This is great, I would love to see you use it on a more powerful card but with games that have engine caps or are badly optimised.
Starfield on your best Radeon card maybe?
I hope this gets added to the custom drivers; it would greatly benefit my Vega 64.
That would be awesome if it worked with older cards too
Have you tried the Lossless Scaling app? That's probably the best option you have. Don't know how well it runs on a Vega though.
of course everyone prefers genuine high frame rate to frame gen silly! ^^
thanks for the video as always.
Random question: when you say you tested GTA V with very high settings, how does that pertain to things like MSAA and the advanced settings?
I'm woefully unfamiliar with GTA V's PC settings, and whenever I go to play it on my Ryzen 5 5600 + RTX 3060 system I can't seem to settle on good settings for my 4K monitor lol.
thanks for everything you do!
A frame rate drop from frame gen in GPU-bound scenarios is to be expected, since frame gen itself requires GPU processing and VRAM, but you'll still see a net positive in your presented frame rate.
When using frame generation, you can especially notice FPS drops if you are close to the limit of your VRAM capacity before enabling it. Doesn't matter if it's DLSS 3 FG, Lossless Scaling or AFMF: all of them need 200-500MB of VRAM, which means your card will have to dip further into system RAM, and that can degrade framerate, cause texture pop-in, or even crash the game (e.g. Hogwarts Legacy becomes unstable with frame generation at high VRAM usage).
IMO the best use case for frame gen is raising the FPS floor in CPU-limited titles, for example X4 or Total War: Warhammer 3. FPS games are some of the worst use cases for FG, since latency is so important in those, and latency won't improve with FG unless you buy the overblown marketing hype.
From what I know, the more stable the base FPS, the better for AFMF 2. So test where the base FPS settles with AFMF 2 on, then cap each game at that stable base FPS, and you'll probably have an even better experience. Setting an FPS cap is easiest with RivaTuner, as you can even bind a button to toggle it whenever you want. From what I know you can also set a cap per game in the driver (at least on Nvidia, and AMD should have something similar), but for me RivaTuner is more convenient since I use MSI Afterburner as well.
I've got a 5600xt and I need to try this out. Trying to hold out on an upgrade as long as I can haha
Sadly this needs series 6000 and newer
sorry to break it to u but 5000 series isnt getting it :)
If Microsoft can bring this to Xbox Series S, it's going to make most of the games run at 60 fps. Of course, a system level implementation like on PC would be better rather than depending on game devs to release updates to remove frame cap and use this. Fingers crossed!
This is really a help for low-end builds! However, it seems it can't yet take a game from unplayable to playable, but rather from playable to high refresh rate, which is still very nice!
This might sound stupid, but I currently use AFMF 2 with RSR to play Minecraft with ray tracing... on my RX 6400 😁 It quite literally is unplayable without both enabled, but with them I can actually play in a friend's creative realm at about 50 fps.
Question:
When you do this entry-level card testing, do you always use the 4GB versions because of the price? I have a 5500 XT 8GB and I feel it tended to handle a lot of the games I threw at it on account of the higher memory.
Would using an 8GB 6500 XT improve performance (with or without frame gen) by a significant amount, or at the increased price would it be more worthwhile to get a higher tier of GPU?
my 6700xt is gonna last till the end of the world. thanks amd.
I need to remember to try this with my rx 6650 xt
I think the best thing about AFMF is simply that you can force it on nearly every goddamn game on DX11 or 12, which includes titles that suffer from CPU bottlenecks due to being simulation-heavy, or just plain demanding but not affected much by input lag. DLSS framegen on the other hand is only supported where it is supported, and most of those games are fast-paced stuff where the input lag would be a detriment.
Sadly I only have a 60 hz display, so the usefulness is fairly limited for me, but I can see a few titles where I could totally see it smoothing things over a little.
AFMF can improve your gaming experience even if you are using a 60hz monitor, believe me, I have tried it.
@@t.gchannel4067 So far I just haven't found many games where it's worthwhile, as it absolutely has to be CPU-limited and slow-paced. Cities: Skylines (the first one) is such a candidate, since even on my system it'll drop to 20-30fps with larger cities just from the CPU load. Hearts of Iron 4 is also CPU-heavy and even more handicapped by being single-threaded, but AFMF barely helps because most issues come from the game stalling out and briefly freezing, and AFMF can't do much about that.
Most GPU-heavy stuff I play is sadly also too fast-paced for AFMF to not cause latency issues, so I'd rather not.
Hey dude, what program are you using to capture metrics on the side? (FPS, frametime etc.)
Thank you.
Hey!
Are you able to do a comparison of the 4GB and the 8GB version of the RX 6500 XT? Especially in titles where it usually caps, or nearly caps out on the 4GB?
Would be extremely interesting to see, how this compares.
I've been using AFMF 1 for the last few weeks and it's smooth; I didn't even feel any latency. I'm using it on modded Skyrim SE and FO4 btw, because ENB eats half of my fps on the 6600.
Now try AFMF 2 and get half the added latency and even more fluid frames. ^^
@@bns6288 will do! thank you.
@@racherim 😊
Planning to use AFMF 2 on my 7800 XT new build that's gonna get here in a few days. 4K 120Hz VRR with Playnite is so nice.
Does this work for you in borderless windowed mode?
I only play my games in borderless windowed, and AFMF (1 or 2) [yes, I got the preview driver] has never worked for me.
AFMF 2 high/quality mode is the way to go. Just a better experience: no turning itself off, and just smoother. But I'm using a 7800 XT; maybe the AI accelerators are also helping the 7000 series.
I believe the extra AI acceleration, as well as more compute per CU, gives RDNA 3 less latency. Quality should be the same on RDNA 2.
@@shepardpolska Interesting, but I only see people with lower-tier GPUs using the auto mode, which I don't really understand when there's a better mode to choose from.
The main issue people had with AFMF 1 was that it would disable itself during fast camera movement, so why keep using a mode where that's still the case, especially when that issue is solved in the high/quality mode? It's confusing.
6:41 - Just some good old frames, never meaning no harm....
Any chance you could try using that GPU as the display GPU for AFMF 2's dual-GPU capability? I'd love to see the result paired with a 7800 XT or something higher up.
I use it on my rx 6400 and it feels like a new gpu
As long as you're getting 50fps minimum, it's OK to play with AFMF or FG.
Yeah, pretty much. Personally it feels OK in a lot of single-player games at 50+.
On a 3050 I'm trying to get 60-72 for comfortable FSR3 gaming.
Pretty sure you said, in a previous comparison video, that there wasn't much difference between the 6500 XT and the older 5500 XT, and that the 5500 XT won significantly in a PCIe 3.0 system. So it would be interesting to see how AFMF 2 works with a card like the 5500 XT.
sorry to break it to u but 5000 series isnt getting it :)
Their performance is similar most of the time, but RDNA 1 as an architecture has a lot of features missing.
Make more and more videos like that 😮 For low-end gaming, RX 6600.
Soo, I don't know if this is a question or a theory at this point: what it's actually doing is just generating frames after they were "normally" rendered by the GPU, right? Like, the game IS running at, say, 60fps at the engine level, but the driver makes it look like a higher frame rate by inserting generated frames before you see them. That may explain why GTA V doesn't have any of the weird stutters it gets when you "natively" run the game at high fps, since the game engine itself isn't running at 170+ fps with frame gen, right?
If this is true, I would love to see how FPS-based physics games (i.e. games in which the frame rate affects the in-game physics) react to frame gen. For example, the older GTAs are notorious for their janky physics at anything higher than 24-30 fps (since those were the frame rates of the console ports).
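That theory matches how driver-level interpolation is generally understood to work: the engine only ever produces the "real" frames, and the driver presents a generated frame between each pair, so engine ticks (and any physics tied to them) stay at the native rate. A minimal sketch of the idea, not AMD's actual algorithm; `interpolate` is a hypothetical stand-in for the driver's motion-estimation blend:

```python
def interpolate(a, b):
    """Hypothetical stand-in for the driver's motion-estimated blend."""
    return (a + b) / 2

def present_stream(rendered_frames):
    """Interleave a generated frame between each pair of rendered frames.

    The game engine never sees the generated frames; only the display does.
    """
    out = [rendered_frames[0]]
    for prev, nxt in zip(rendered_frames, rendered_frames[1:]):
        out.append(interpolate(prev, nxt))  # fake frame, driver-side only
        out.append(nxt)                     # real frame from the engine
    return out

# Engine renders 4 frames; the display receives 7 presented frames,
# i.e. roughly double the frame rate, with engine ticks unchanged.
print(present_stream([0.0, 1.0, 2.0, 3.0]))
# -> [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
```

Since the engine tick rate never changes, frame-rate-dependent physics (like in the older GTAs) should behave exactly as they do at the native frame rate.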
I'm using a 6700 XT and... I don't get that FMF 2 feature. I think it's still in beta?
I hope that one day this will work with my Vega 3 iGPU :))
💀💀💀💀bro is hoping for the sun to rise from the west
It uses AI, so no, that dream is dead. Even the 5700 XT, which is more powerful than the RX 6500, can't support AFMF.
Hey.
Regarding low-end content: I have a red NVIDIA GF 9500 GT, 1GB of VRAM, DDR2, with CRT, DVI, and HDMI in, and it's a 50W card that I have little to no use for. Thus I have been wondering if you would want to have it to test out. Even if it crashes after 30 seconds...
My only gripe with this feature is the lack of support for older AMD cards like the 400/500/5000 series, though that's totally understandable from a business standpoint, I guess. Still can't believe my RX 470 is like 8 years old!
Hey Steve, I think it would be interesting to see how well AFMF works at a base framerate of 30fps or even lower.
I think AMD recommends you need a 50-60fps base for frame gen to work effectively, but it looks to work well with AMD's RX 6500 XT. It's supposed to work with Nvidia and Intel cards too, so could you test this on, say, a GTX 1660 or RTX 2060/2070 and an Intel card, to see if AMD is helping gamers on a budget (and the planet, by saving cards from landfill)?
How do you capture the metrics overlay? I recorded using OBS, but it doesn't appear in the video.
Prolly HDMI capture card.
@@adriankoch964 I had two capture cards before, a cheap one and an expensive one. When I plugged them into my PC, it said they didn't support things I'd never even heard of, so I sold both of them.
This could be useful for when GTA VI comes out and the current amd gpus becoming cheaper.
I don't know about others, but if my game shows 120FPS while still having the input latency of 60fps (plus the slight extra input lag caused by frame gen itself), it's super jarring! I could not get used to it no matter the game. I only see potential for this tech on consoles, since people there are far less sensitive to oddities such as input lag, frame pacing, etc.
Is it safe to use with anticheats? I guess cs2 is fine but what about EAC for elden ring?
This piques my curiosity. I wonder how an RX 6400 low profile would do with AFMF? Or how an Arc A380 does in games with FSR 3 frame gen? Probably badly, since these might not hit 60 fps to start with, but it would be interesting.
AFMF 2 boosting a weak GPU like the 6500 XT is pretty good, but I wouldn't ever get a 4GB graphics card like the 6500 XT; the 5500 XT 8GB model is the best budget GPU on the market.
Nice, now devs can rely even more on not optimizing their games and just slap FSR and AFMF on them.
Yep :(
I have an RX 6600 paired with an old i5-4590; will this help with the bottleneck?
For people who have older hardware and can't use it:
It's shit.
Unless you're running the game well above 60 FPS, it's terrible, even after the AFMF 2 update.
It's crap even with new hardware, because you can't use VSync with this frame generator, so there's a lot of screen tearing.
How is everyone utilizing almost 100 percent in CS2? I have a 7800X3D and a 7900 XTX; it's always 20-30 percent utilization on both CPU and GPU on my end, and it's causing stutters and frame drops. AND GPU TEMPS ARE ALWAYS AT 30-50°C depending on my current room temp.
I think you should've tested it without anti lag. Having it enabled does more harm than good in my experience.
If in the future AMD can make it work properly at 30-40fps, gaming will be changed.
How do I enable the AMD FPS counter overlay?
Could you test the Ryzen 7000 iGPU or the Radeon 610M in some laptop CPU with AFMF?
such a shame this amazing card doesn't have GPU encoder.
Yeah it is lacking a bit :(
"Amazing" card that was released in 2022 with only 6GB of video memory. 🙄
I have a 6950 XT and I don't have the option for Fluid Motion Frames 2; I only have Fluid Motion Frames. What gives?
I don't think this is useless, but it feels much more like a tiny little extra than a key feature, contrary to what Nvidia claimed when marketing DLSS frame gen to pretend the 40 series wasn't overpriced crap.
I mean, you reeeally need ultra-high FPS for competitive games. And this doesn't help in competitive games since there's always a delay to the extra frames and you don't get quicker information.
And you reeeeeally need more frames when you have 20FPS and want the game to be playable. And this doesn't work well in that case either.
Basically, this gives you extra frames, except when you actually need them. They're only there when they're a nice-to-have, like going 60 to 100 (and the higher you go, the less it's relevant).
We should never pretend those are real frames, as Nvidia did.
I want to test this on my single slot RX 6400 in my cheap Optiplex build.
I would like to see a video about frame gen at very low framerates. Like, how does it feel when you normally only get around 20 fps and with frame gen maybe 35? Does it stutter or feel sluggish?
It almost doubles your input lag, so it can feel like going from playing on a PC monitor to playing on a TV with its game mode off.
Also the lower the native frame rate and the more things change from one frame to another, the more noticeable artifacts will become, since the FG needs to interpolate more ("aka make shit up"), and each fake frame will persist much longer on screen.
In FPS games, you don't want to enable any sort of FG at all.
Hi, how did you get Fluid Motion Frames 2? I'm still on 1.
The strange thing with the 6500 card is that AFMF does not work on PCIe 3.0 systems, as far as I know. (I'm running an AMD 3200 Pro and a 6500 card right now. An upgrade to a 3600 is imminent!)
I've found that if you're getting 60fps+, you might as well just cap at 60 and turn frame gen off completely.
It doesn't come across on YouTube, but can you FEEL the extra frame rate when AMD's own counter isn't visible? Not that AMD would lie about that sort of thing, of course, paragons of honesty that they are. After all, I'm sure there are all kinds of valid technical reasons why no other standard benchmarking tool can see the extra frames, unlike with other framegen methods...
Edit: the lack of GTA 5's massive, unplayable stuttering when you pass 180fps is rather suspicious, too.
Because the extra frames don't happen in the game engine like with DLSS 3 and FSR 3; this type of framegen works purely between the GPU driver and the display.
The GTA stutter happens because the game engine breaks down when rendering at high FPS, but AFMF doesn't make the engine render more; it generates frames in between the real ones.
The reason for the FPS drop when turning on AFMF, from what I hear, is the extra load put on the GPU from running AFMF itself.
AMD did a great job