Great video! I just wanted to let you know that a spike in the frametime without a spike in the GPU Busy metric doesn't necessarily mean it's a RAM or CPU bottleneck. It could be a bunch of things, such as the game itself hanging, the hard drive pulling assets, or another program tying up the CPU with draw calls for another purpose.
It is a good metric to see if it's your GPU that's causing stutters (flushing VRAM or not having enough, transient power spikes, unstable OCs or UVs) or holding your framerate back. But unfortunately, beyond that, it doesn't tell you what exactly is causing the stutter, just that it wasn't your GPU (see the sketch at the end of this thread). LatencyMon can tell you in most cases, I believe.
I've watched you for a while and I'm certain you know this information because you're smart and well versed in computer rendering, but I just wanted to clarify, just in case, and for anyone in the comments who may have gotten the wrong idea about what the metric is actually recording.
Yes, exactly. I said that it could be some other things, even when I stated that the CPU (and consequently RAM, HDD) were loading assets as well 💪
Basically we know when it's not the GPU, which is a good thing
@@AncientGameplays Oh whoops! I def missed that.
LatencyMon is good for finding bad programs/drivers causing stutters.
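Putting the thread's point into code, as a minimal sketch: this is not PresentMon's actual API; the function name and the tolerance are illustrative assumptions. It just shows how a per-frame comparison of the two metrics reads.

```python
# Hypothetical helper, not part of PresentMon: classify one frame from the
# two metrics the thread discusses. The threshold is an assumed tolerance.
def classify_frame(frame_time_ms: float, gpu_busy_ms: float,
                   tolerance_ms: float = 0.5) -> str:
    """Rough read of where a frame's time went."""
    wait = frame_time_ms - gpu_busy_ms  # time the GPU sat idle this frame
    if wait <= tolerance_ms:
        # GPU was working for (almost) the whole frame: GPU-limited.
        return "GPU-limited"
    # The GPU idled, so the cause is elsewhere: CPU, RAM, storage, the game
    # engine, or another process. The metric alone cannot say which.
    return f"not GPU-limited (GPU idle ~{wait:.2f} ms)"

print(classify_frame(16.7, 16.2))  # -> GPU-limited
print(classify_frame(16.7, 8.0))   # -> not GPU-limited (GPU idle ~8.70 ms)
```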
Draw Rate is basically the FPS of the overlay; it is limited by the Sampling Period (in Settings > Data Processing). So by default a 10 FPS draw rate is the max with the 100 ms sampling period (how often it gets the data): 1 sec = 1000 ms, so 10 frames per second = 100 ms per frame. If you want the graph smoother, you raise the draw rate and shorten the sampling period to match (see the quick math below).
Thanks for the heads-up!
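The sampling-period arithmetic above as a tiny helper, just to make the relationship explicit. The function is hypothetical; in PresentMon these are overlay settings, not an API:

```python
def max_draw_rate(sampling_period_ms: float) -> float:
    """Max overlay draw rate (FPS) given how often new data arrives."""
    return 1000.0 / sampling_period_ms  # 1 second = 1000 ms

print(max_draw_rate(100))  # default 100 ms sampling -> 10.0 FPS overlay
print(max_draw_rate(50))   # shorter period -> 20.0 FPS, smoother graph
```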
Since this is open source, MSI Afterburner should include this in its graphs.
Two videos in a row today, DAMN son xD
You're on fire!!!
3 videos in a row when? :)
have to admit shes pretty good at it
I assumed this was only for Intel Arc cards. It's cool that everyone can use it
Indeed
GPU: help, I'm being bottlenecked!
CPU: shut up, I'm busy. 😆😆😆
hahahah
Just installed the Intel PresentMon Beta v0.5 and configured it as per your video in 2 min!
I am happy to report my RX6800 and 5800X are "Balanced" in DCS World Multithread. Very Nice!
Nice indeed!
Intel software works on AMD?
@@LexxDesign3D it has similar instruction sets, so it works for both
haven't you watched the video? @@LexxDesign3D
@@AncientGameplays I considered not watching it until reading this comment since my system is AMD, I was assuming this intel software is for intel cpu's
0:09 This happens right now in Starfield. 180 W on a 3060 Ti at 99% GPU usage; other games 240 W.
I love this. I usually understand bottlenecks, as I know my system and know a ton about games and how they work, but this is actually amazing and makes it much easier for people to find out what the bottleneck is. Heck, I will use it just to confirm if I am correct or not. Also, allowing it on a second monitor is great and will let me just slap it on my portrait monitor and keep an eye on things. Also... OMG... Bruccius's "thank you kind sir" ahahahaha, I have not seen that in a while now.
haha, You're welcome kind sir
Just gonna make sure to double-check, given that it's still a work in progress!
GPU Bussy lmao.
Very good video man appreciate this info, didn't even know this app existed, thanks!!!
Also, AMD WHERE IS FSR3.
Thanks as well! FSR3 is supposed to be announced on the 25th
every other channel: it's good software, yeah
ancient gameplays: even talks about scaling slider. :Gigachad:
Haha
I downloaded this when the Gamers Nexus video came out but hadn't had an opportunity to try it out. 7900 XTX and 7800X3D
lay an eye on this then :D
@@AncientGameplays So I was using it last night and it's pretty cool although a lot of the custom things don't work on my AMD card.
@@TyGamer125 what custom things?
@@AncientGameplays The preset section shown at 3:23 where there's basic, gpu focus, and then custom + the edit button. Also just figured out you can customize the default presets.
Yes, I made some of them myself as well @@TyGamer125
Which games are good for testing?
5900X and RX 7900 XTX.
Getting double the frame time versus GPU Busy in Valheim and Dishonored 1. Battlebit seems more stable. Could this also be the game engine? The CPU doesn't seem to go up in usage, so I don't understand why there is such a big difference.
Yeah, that's a classic and BIG CPU bottleneck you got there, man. The usage doesn't mean anything at all; that CPU has low IPC and is most likely paired with a low-end RAM kit.
watch this: th-cam.com/video/hAVlzEW8qgM/w-d-xo.html
@@AncientGameplays Thank you! Yeah, got 64 GB at 2880 MHz, because I couldn't get the frequencies higher.
@@MusicForHourss that's it then
@@AncientGameplays Any tips on which CPU and RAM to go for?
I have had memory frequency issues with every AMD CPU I have bought recently, the 3900X and the 5900X..
@@MusicForHourss get the 5800X3D and some 3600 MHz RAM and the difference will be huge
Now the next step is adding CPU busy/waiting stats, i.e. whether it's held back by RAM bandwidth/latency or whether it has reached its own max ST/MT potential.
Maybe
I have to say, my gaming friends, I have an i9-10900K and I still rock the frames in most top titles. Perhaps a wee bottleneck in my CPU, but overall my performance with this 7900 XTX is fantastic! A big step up from my 3080 TUF
Can we expect a video on the new AMD chipset driver 5.08 ??
There's really not much to show there
@@AncientGameplays I totally understand. You're always on top of the new drivers, so I was just wondering.
Would like to see this in all new videos. This is great. Wonder how Ratchet & Clank with DirectStorage 1.2 will perform?
Downloaded the Intel PresentMon yesterday. Was trying to find a way to make the background transparent like MSI's one. Guess not yet, but the OSD with Intel's GPU Busy is wholesome! thx AG❤
It would be nice to see how much frame time is lost on a 5600X system in not-very-optimized (or main-thread-dependent) games, but also to show how good it really is at higher resolutions, where not much time is lost to the game driver, DDR5 latency, etc... Frame Time - GPU Busy Time = Rest-of-System Time. We may find DDR4 and the 5800X3D (or the 32 MB cache of the 5600X) not wasting time, thanks to latency being so good versus much more expensive "newer" parts; or validate which GPU upgrades are good for keeping old system parts; or see which games just lose too much time and available resources... A 6% load on a 13900K is a too-often-seen scenario...
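The identity that comment leans on (Frame Time - GPU Busy = rest-of-system time), applied to a short trace. A minimal sketch with invented sample numbers, all in milliseconds:

```python
frames = [  # (frame_time_ms, gpu_busy_ms) per frame, illustrative only
    (16.9, 16.1),
    (17.2, 16.3),
    (33.5, 16.0),  # a hitch the GPU did not cause
]

for ft, busy in frames:
    rest_of_system = ft - busy  # CPU + RAM + driver + engine + everything else
    print(f"frame {ft:5.1f} ms | GPU busy {busy:5.1f} ms | "
          f"rest of system {rest_of_system:5.1f} ms")
```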
This is cool and everything, but all you really need to look at is GPU usage. If it is 98% or higher, you are GPU-limited. Anything below means you are either CPU-limited or limited by the game engine.
Exactly, but that's not the point. Look at the Counter-Strike 2 benchmark: even though the CPU was bottlenecking, the GPU and CPU had around the same times, meaning that there was nothing loading in the background, while it was different with Jedi: Survivor, which is interesting to see
Even single-player games use some sort of connection to a server, but things become more interesting when you play multiplayer online competitive games. Even if the game itself didn't depend on data from the other side of the Earth, you still want some of your own input to draw the game scene.
Playing a movie is almost the optimal state of things, but you know there are stutters even there.
I've got quite a responsive connection: 15 ms typically in my DX9 game, 30 ms in another DX11 game.
If your PC doesn't wait for incoming data, the image becomes disconnected from reality; if it waits, you already have at least 30 ms delays. Idk how devs solved the lack of continuous positioning and other data during compute; it looks like they use either the last known position or an approximation given the last position, speed and heading.
Anyway, I think we are doomed until we become psychically and physically fully connected all together with no lag.
I think I love having some lag 😅.
Thanks for doing this software overview! This seems like a highly useful app so I just downloaded it.
💪💪💪
If Intel really wants to impress the penguin crowd (penguinistas), they will port this to Linux, which desperately needs a GPU monitoring solution. There is GreenWithEnvy for Nvidia and CoreCtrl for AMD, but neither utility allows for real-time monitoring of CPU and GPU stats.
MangoHud?
@@XxXTMillzXxX Mango is great IF you can install it, configure it and modify your Steam launcher to support it. Not exactly a plug-and-play option like this, is it?
Nice now I'm going to see if I can find my bottleneck and overclock it out!!!
Check it out :D
Hey new AMD Chipset drivers dropped, I've been checking your channel for updates
Huge thanks for this incredible tip !!
God Bless you !!
Thank you as well!
I just use one of those 3.5" system monitors from AliExpress. Works great 👌
Just to have things showing, interesting!
A 3.5 inch monitor??
Just took a look at Intel PresentMon, and my frame time = GPU Busy (I mean, they are nearly the same value all the time). There is no gap at all. 10900K and RTX 3080 Ti, tested in Cyberpunk 2077 at 1080p Ultra settings and Diablo 4. Reduced my overclock to 5.1/5.0 GHz as there is zero increase in FPS in games; 5.2/5.1 GHz is just for benchmarks now. Love this program.
In the video he is CPU-limited in CS2. That's why he is getting less than 99% GPU load, around ~80%, which is clear CPU bottlenecking (expected since we see high FPS, but even so). Even in Jedi he is seeing CPU bottlenecks for periods; there can be lots of reasons for this. With me it's 99% GPU load and both frame time and GPU Busy are the same all the time, in the games I have tested. This is bad, I guess, because a faster card likely won't do me much good.
I would say the CPU is not paired with the correct GPU in this video, but it is still up to the task and fine. The GPU is a little strong for the build, but it's acceptable. The likely upgrade is a 7800X3D; it will likely push more frames in CS2 at lower resolutions.
Even so, this is a don't-touch build if there is no hitching or stuttering. He's still getting most of the GPU's performance. It's balanced enough.
Glasses 😳. I mean, it would be very interesting to see this on a 7800X3D with an RX 7900 XTX, also with ray tracing
My man coming with the scoops!!!
hahaha
What's the difference from MSI Afterburner?
Afterburner is much more complete as explained
I see, thank you
Just to let you know, AMD launched this month's first driver, which fixed power draw issues for 1440p + 1080p high-refresh monitor setups, which is my combo. 7 W idle, 25 W on a YouTube video; previously it was 95 W+. As far as I've read, they fixed it for a whole bunch of other configurations too. Finally!
Not only that, but other combos as well :D
Check my community section, just posted about that
What driver? Graphics or chipset? Can you post a link?
To my surprise the newer chipset driver did drop performance, sooo I'm gonna go back to the previous one
I didn't really do many tests, but it seems okay here
@12:36 this is the part where he shows the different frame time graphs of when the CPU is bottlenecking the GPU.
imho he seems to be mumbling through most of the video, or I'm just too dumb to figure out what he's saying. But the 12:36 mark illustrates CPU bottlenecking really well.
mumbling? lol
CPU was just loading..... SHIT....LOL
🤣🤣💪
"I know my system, and I know my shit." 💪😎👊
hahah
When I hovered the cursor over your video in the YouTube feed on mute, it looked like you started the video saying "boa tarde" (good afternoon). lol
Hahah, coincidence
Great. Time to dust off the 486DX2-66.
I cannot get Intel's GPU Busy to work for me at all. Can you show why?
I have two RTX 2080 Tis; are they too old?
Installed, but can't get it as an overlay; I only have it outside of my games lol!
Maybe it's conflicting with Adrenalin (I'm on a 7900 XTX)?
Choose the app manually then
@@AncientGameplays I launch the app and set the hotkey, but when in-game I get no overlay; it's like another window (and I didn't select it as a window)
Okay, so in this regard I have a strange issue here, and I hope you can see my comment and help me with it.
I have a Core i7-12700F paired with an RX 6700 XT and 32 gigs of memory.
Every time I go into the BIOS (Gigabyte Z690 UD) and activate the Gaming profile or enable Enhanced Multi-Core Performance, I witness a drop in GPU usage in games, which results in much lower FPS.
Also, when I enable either of these options in the BIOS, I notice that the CPU usage inside games increases to 30%~40% vs 3%~7% when they are off.
So I am not sure what's going on here
the low FPS doesn't result from the low usage; the low usage appears due to those gimmicks you're trying to use. Leave the CPU at stock and use the RAM's XMP, period
@@AncientGameplays Thank you for your help
The Gaming profile disables the E-cores. E-cores smooth stuff out in general; I recommend keeping them on. I have the same CPU but a B660 Gigabyte mobo. The CPU is great on stock settings and XMP. If you're on DDR4, set the RAM command rate to 1T.
@@epeksergastis hmmm I see, Thank you for sharing knowledge man!
Great video. You are bottlenecked on the first CS:GO bench. A 7800X3D helps with those fluctuations
there are no "fluctiations" and a CPU bottleneck is there of course, we're running mostly 540P native res xD
Wow, hell froze over! Intel doing something right, and for free 😮.
Exactly haha
How long will it take for RivaTuner Statistics Server to add this feature?
who knows haha
Let's hope this will get AMD and Nvidia to do more to smooth out driver overhead; mostly Nvidia leaves a lot on the table on their "older" architectures
AMD has very little driver overhead; actually, they're up there in front when we're talking about that
@@AncientGameplays Indeed. I switched from a GTX 1660 to a 5700 XT and I still see the same (or lower) CPU usage in some games with a Ryzen 5 3600XT.
@@AncientGameplays the only thing I can remember is that there is some AMD overhead regarding DX11 in their official drivers, which modded drivers fix on older architectures like Polaris
So if I'm seeing Frame time of 60.3ms and GPU Busy of 20.1ms, what does that tell me?
I have a 7800X3D, and when I use Full HD resolution with DLSS Ultra Performance, my GPU is not at max... When you see the GPU not working at max, there is a CPU bottleneck in GPU-demanding games.
Depends. What you have there happens because you're trying to render at like 480p resolution; obviously the CPU would be the bottleneck
In the case of Jedi survivor i think it is safe to say the game is the bottleneck not the pc 😂
It's a good example though haha
Hi! Could you please make a video about the new AMD Chipset driver 5.08.02.027 and its differences vs the oldest driver? Thanx!!!
They're fine; most differences are for X3D CPUs
Thank you, Geeklord.
💪💪
Does anyone know why after I downloaded presentmon, I had high latency and audio issues that went away as soon as i deleted it?
Maybe a bug? It's still in beta
Developers: clearly you all have weak CPUs, nothing wrong with our optimization.
It's a mix in most scenarios xD
Can someone tell me why the app can't read my CPU temperature? HWiNFO64 and Afterburner can
It's still in beta and it's mostly for the GPU right now; it will get better
I just got a 12700K, 1080 Ti, 64 GB RAM, Samsung 990 SSD, and my FPS in GTA V was 150ish at 1080p. I just installed a 4080 and now I'm at 100 FPS or less. Any ideas?
It's an obvious CPU and RAM bottleneck, yes; the game is also shit and broken. The 4080 has bigger CPU overhead, that's why, I guess
I get the stutters you showed in Jedi Survivor in all games, running my 7600 with a 7800 XT and 32 GB of 6000 CL30 RAM. What is causing this exactly, and how can I fix it?
I don't get the use of this; it's effectively GPU utilization crammed inside a frametime graph. If you look at the GPU utilization graph and GPU Busy, they basically run parallel to each other. If it gets implemented in RTSS I might try it; for now, meh.
you're not really getting the point here, at all. The GPU Busy feature can effectively tell you whether stutters, frame drops and so on are because of your GPU or not, something that GPU usage can't show at all. That's the point
@@AncientGameplays well, at least with the current PresentMon configuration you can't see stutters in GPU Busy either, because it's the **AVG value** that is shown, not realtime per frame. As of now, you can only see big GPU bottlenecks and dips, which leads to the exact same value as GPU utilization. I get what you are saying, but for now it does not show what you are suggesting; even in your own video there are literally no stutters shown, like in the RTSS default frametime graph, for example.
It is already integrated into RTSS OverlayEditor
@@unwinder Really? I will look into Guru3D then. As always, thx for developing RTSS :)
@@h1tzzYT it is not publicly released yet, but PresentMon integration details are available in the MSI AB development thread in the forum.
I took it for a quick spin on my 7600/6750 XT rig and it can't read GPU temp or memory. Latest drivers/BIOS etc.
It did here, but not the power draw. Needs updates
On mine it read power draw just fine. I really hope Intel keeps working on this.
Is this software free, and where do we download it? How do we get LatencyMon?
Free yes. Google my friend
I'm gonna try it, because FedEx in Portugal destroyed my PC and Fnac sent my PC back with insurance paying for a 4070 Ti TUF OC and a new case, but with a broken mobo and other parts damaged in the accident. I've got single-channel 3800 MHz CL17, so yes, I gotta check my stats. Thank you.
I'm sorry, but how is this monitoring utility solving the bottleneck?? BTW, thx for the review!
Didn't say it solved the bottleneck, said it solved the dilemma, since now you can literally watch it on graphs
What I love about Fabio: he can explain how a system's GPU is bottlenecked by the CPU, how the graphics card is sleeping and not pulling its weight, running at 70% or far less instead of 98% to 100%. But I'm guilty as sin of doing that with most of my retro AGP builds. Why? They are for retro gaming only, at lower FPS, as in 60 FPS max in some games, on 60 Hz to 85 Hz monitors, because the graphics are the crispiest they will ever get while the cards run at 50% or way lower in some games, with the builds running ultra silent. That way the graphics card is bottlenecked in bed, sleeping, just dreaming; the games are on easy street, and it will last forever as the capacitors hold up and look after the silicon to stay as good as new, with no fans blasting and sucking in dust to clog the card. That is what it's all about: purely ornamental builds that can run PC games on DirectX 10 to the max if need be, as these AGP builds are the crown jewels, antiques that hold a premium and should be looked after with the utmost respect... But not high-end like what Fabio is bench-testing with live gaming. All his viewers know he is one of the best techs, with zero need for liquid-nitrogen overclocking to find the sweet spot in his systems, as 99.9999% of gamers will never use liquid-nitrogen cooling; that is only for perfectionists like GN's Steve and JayzTwoCents chasing world records with golden-sample silicon. Anyhow, hats off to Fabio, amazing as always, nothing but total respect.
Well, I guess for the CS:GO test, 1 ms GPU time and 1.32 ms CPU time is not a match. 1 vs 1.32 is a big difference. So, a huge CPU bottleneck, right? The GPU can produce 1000 frames but the CPU only ~758, huuuuuuge bottleneck.
It will never be a perfect match though, not even in a non-bottleneck scenario
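Sanity-checking the numbers in that comment: converting per-frame times into theoretical FPS ceilings is pure arithmetic, no PresentMon API involved.

```python
def fps_ceiling(frame_time_ms: float) -> float:
    """How many frames per second a component could deliver on its own."""
    return 1000.0 / frame_time_ms

gpu_fps = fps_ceiling(1.00)  # 1.00 ms GPU busy -> 1000 FPS
cpu_fps = fps_ceiling(1.32)  # 1.32 ms CPU time -> ~758 FPS
print(f"GPU ceiling: {gpu_fps:.0f} FPS, CPU ceiling: {cpu_fps:.0f} FPS")
# The slower side (~758 FPS here) sets the actual framerate: CPU-bound.
```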
With Lossless Scaling around, I don't mind a bottleneck. Saves me tons of power.
That makes no sense as you can simply lock frames
The very first example he showed was absolutely a CPU bottleneck… the GPU couldn't be fully utilized because it could output more frames than the CPU could handle.
That's exactly what a bottleneck is called... a CPU bottlenecking the GPU... you're "special", aren't you?
What is that frame rate graph you use in your testing?
The MSI Afterburner one?
@@AncientGameplays the FPS graph you show in the Jedi Survivor footage
@@Ghostlynotme445 dude, I am literally making a video about it...
I will wait until AMD adds this to Adrenalin... I have too much monitoring software already: HWiNFO64 running 24/7, plus the AMD and Gigabyte ones, which I have to keep
7800X3D + 7900 XT here. I'm having an issue where it is polling wattage and VRAM data from the integrated GPU, "Radeon TM", instead of the 7900 XT. Going into settings, it only says default adapter as the driver source.
same for me
There are still some issues as this is beta, but you can choose the source on the settings edit
It's a tool to identify bottlenecks but it doesn't solve anything. It's pretty cool to have metrics though. We'll have to see the effect on CPU load when games start using the Direct Storage API. You would have to think that AMD, Nvidia, and game developers have access to this information already. How does this compare to AMD's System Monitor?
It solves the dilemma, as said in the title, because people can now actually "SEE" if they need a better CPU; even the most basic user can usually learn how to see that in 2 minutes
@@AncientGameplays I'm not too sure the most basic user will be able to handle it. It points out CPU load, but it doesn't tell you what or why. I mean, the issue could be that your Windows has degraded, you have insufficient RAM or RAM frequency/throughput, or you're still using a mechanical drive. This could be an interesting benchmarking segment where you pick a game(s) and GPU(s) and find the cheapest CPU that gives optimum performance results.
@@AncientGameplays you can already SEE this in MSI Afterburner; just look at GPU usage. I don't understand what's different
Will there always be a CPU bottleneck for 1080p in new games?
Most likely, as GPUs keep getting stronger
I know I have a CPU bottleneck; I had to downclock and undervolt my R5 3600 xD. It's because I don't know what the "safe voltage" is, and it was running at high temps. Right now 4 GHz @ 1.144 V. I've heard of these CPUs killing themselves because of the 1.5 V boost they do at stock
Hey, unrelated, but I was trying to static-overclock my 7700X to 5.4 GHz. I set the voltage to 1.205 V, and when I tried to run Prime95 the computer instantly reboots. Is 1.205 V too low?! I know you run at 1.18 V and you're fine, and I know every system is different. Could you maybe help me?
Yeap, your CPU is not that great. Try 5.3 GHz
@@AncientGameplays now I'm just using PBO, but I noticed in the AMD software the CPU had a preset of "overclock CPU"; could that have messed up the voltages I put in the BIOS?
This might be a stupid suggestion, but why don't GPU makers make a graphics card that has not only a GPU and VRAM on it, but also a special "CPU"-type chip that can handle everything in-game that the system's CPU handles now, just more efficiently and effectively? That way the system's CPU wouldn't be burdened with the task of running certain aspects of a game, since that would be handled by the "CPU" on the graphics board, and it could instead focus on system stability and processes.
What CPUs need are Data Processing Units that can offload and speed up memory bandwidth
You always have a bottleneck, else FPS would never stop rising. The first bottleneck (CPU/GPU/bus speed) will be the one that determines your FPS. The ideal condition is that the GPU is at 100% utilization; then you know your system is not limiting it from reaching its full potential.
Exactly
Can you maybe test a lower GPU and CPU for CS2 with this method? I'd be interested in it. I've got a 10600KF and a 5600 XT, but I want at least a constant 360 FPS and don't know if I need a GPU upgrade or a CPU upgrade to reach a smoother experience there.
That GPU would be the bottleneck there unless you're using really low settings
My old ass body is the bottleneck 😂
Hahaha
The overlay won't show up for me?
There's an easy way to check if you have a CPU bottleneck: the CPU load. If your game can't load your CPU, you can't have a CPU bottleneck; it's potentially the software being poorly optimized.
Bottlenecks don't happen just because the CPU isn't at 100% lol. They can happen due to lack of IPC, instructions, and of course the game engine
@@AncientGameplays That's my point: if the game engine can't make use of the CPU, it's a software bottleneck and not a CPU one. Of course, 20% utilisation on a 7800X3D will run faster than 20% utilisation on a 5800X3D, because the first is a faster CPU.
@@gustavoalmeida1579 it's still a CPU bottleneck, an IPC or cache one. No game engine will ever use a new CPU at 100%; that's never what you want
@@AncientGameplays CPUs shouldn't be used at 100%, but they shouldn't be under-utilised either. Windows needs very little processing power from modern CPUs; a game could go to 80% utilisation with no issues.
Should I disable MSI Afterburner while running this?
Don't run both at the same time
@AncientGameplays Thank you my friend
Where do we download Intel PresentMon?
Great video btw!! Always beyond informative
A simple google search :D
So Intel just got off the 32-bit L1 cache feed? Finally. 64-bit came out over 10 years ago; Moore's law dictates 128-bit L1 should have come about over 5 years ago, and 32-bit should have gone the way of Windows 95.
Further, by Moore's law, GPU memory buses should be at 1024-2048 bits, as a 512-bit memory bus was soooooo 2008 with the Radeon 6800 and Nvidia GTX 290/5 GPUs
You have 2 top GPUs, the 7900 XT and 7900 XTX? Damn!! :P
XT was sent for testing only
The biggest problem that causes stutter is Windows 😂
I just tried it and it doesn't show my CPU's temperature... and I have a 13600K lol
It's still in beta; it will get there
What we need is bottleneck-for-dummies software to identify where the weakness is in our systems; most of us either don't have the time to dig deep into finding out or are not capable of doing so.
This Intel performance software could do the trick if it breaks things down in easy-to-understand language. That could be a real help for many in seeing what part of their PC needs upgrading, because let's be honest, a lot of us are not that good at building a balanced system. Many of us just throw in what we think is a good match of CPU, GPU and memory, or throw in higher-end parts, but a lot of us are not good at seeing what the right balance of hardware is. This could be helpful in reducing cost while also showing us what needs upgrading to get a bigger bang for the buck.
This Intel performance software could do just that once it's mature.
On another note, Intel has released the source code as open source. That's good news, because AMD, Nvidia and Intel could integrate this into their drivers to monitor performance and give us tips on where the bottlenecks are and what hardware upgrades would help.
Try turning ReBAR on and off; it worked for Hitman 3
Not even the point here
@@AncientGameplays in Hitman 3 ReBAR causes a CPU bottleneck. If you disable it, the GPU runs at 100%. Maybe it will work for Star Wars too
no, ReBAR never, ever, in the history of ever caused a CPU bottleneck... at most it eliminates one... you have some odd issues there...
@@AncientGameplays maybe you are right, but there is a Reddit post about it in Hitman. And I just tried to turn ReBAR on for Spider-Man Remastered, and the CPU load jumped to 100% while GPU load became 50%. I had GPU 80-90 and CPU 50-60 before this. I have a weak CPU, an i3-12100, and I can notice when it can't use the GPU's full potential. I have a 4060
No, sorry, I wasn't right. There is just a slight difference. @@AncientGameplays
So whatever frame time is higher is the bottleneck?
Usually, yes; it tells you it isn't the GPU that's causing the stutters or FPS drops
I'm encountering an unusual problem while gaming. Despite disabling notifications, I'm hearing the notification sound randomly. Strangely, when this happens, if I press any movement key like W, A, S, or D and then remove my finger from the button, my character continues to move on its own. How can I troubleshoot and resolve this perplexing issue?
You mean Windows notifications?
@@Red1Napoleon Notifications don't arrive, only the notification sound plays.
@@gamer_489 did you try disabling 'Windows Push Notifications System Service" ?
@@Red1Napoleon I will try
Thank You!
Congratz
What does the presented fps graph (99%) mean?
Which one? The 99% is usually the 1% lows; the closer to the averages, the better
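For anyone wondering how that number is produced: this is a common way "1% lows" are computed from frame times. A rough sketch of the general technique, not necessarily PresentMon's exact method:

```python
def one_percent_low(frame_times_ms: list[float]) -> float:
    """FPS of the frame at the 99th percentile of slowness (the '1% low')."""
    worst = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]
    return 1000.0 / worst

times = [16.7] * 99 + [40.0]  # mostly ~60 FPS with one 40 ms hitch
print(f"1% low: {one_percent_low(times):.1f} FPS")  # -> 25.0 FPS
```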
Don't you want a GPU bottleneck, not a CPU bottleneck? Just asking.
you want to NOT be limited by the CPU and RAM, yes that's the point usually
I tested this; it just shows GPU time, and it's very good! But not CPU time. I've only seen COD Warzone show GPU time and CPU time in-game. It just shows the frame time, not CPU time! If they add CPU time, it's gonna be the greatest benchmark app!
CPU time = Frame time...
but in Call of Duty Warzone the value is not the same as the in-game CPU time @@AncientGameplays
Draw rate is just the framerate of the graphs
Oh, I supposed it was that lol
Noob "GPU busy" graph user vs Chad "I just know my CPU suck" enjoyer
So this just shows you there is a bottleneck, but not WHERE; it could be software-sided, the game engine, driver overhead, slow RAM, not just the CPU.
How is this different from low GPU usage?
Yes, it is much different, because you can see that it is not GPU-sided. With GPU usage it is relative, as sometimes even at 99% you're getting bottlenecked. A good example was TLOU before Smart Access Memory
Maaan, Jedi Survivor is still a broken, stuttering mess, even on a high-end system. I will probably get it in 2 years when it's free lol
It's a really good tool and I like it, but it doesn't have the customizable aspect of RTSS (RivaTuner Statistics Server) yet.
So if someone knows how to import the GPU Busy data to add it in RTSS, that would be amazing for me. Thanks.
Remember guys, the best type of bottleneck is GPU core performance, as it is least likely to cause stuttering
I have a weird question. Do you do this sign 👌 because you want to, or do you not really think about it? 👀 (The question is because I know some really big famous ppl do it because they are told to.) ❤
What the hell is that? I use that sometimes when talking, to say some things are like "top". I've been doing that since I was a child; it's a common thing here
@@AncientGameplays that's why I asked my guy. Top tier actors and basically all huge content creators are striking deals with bad ppl to do the 666 symbol.
Can you be banned for using it in some games? Cuz Forza Horizon bans ppl for using overlays.
Then the FH5 team are a bunch of idiots...
@@AncientGameplays not the 1st to say that. But I can't complain, cuz I can get banned as well for complaints hehehe! :X
I've used the rivatuner overlay with no issues and no ban.
@@megadeth8592 I have a friend that was banned exactly for using it. But they don't do it to everyone. Just to ppl using it too much. Idk why.
@@AncientGameplays FH5 (and previously FH4 and FH3) are my primary overlay stability stress tests, which I use during RTSS development and testing. They never banned anyone for overlays; it is false info.