:0. a cpu test instead of a gpu test!! I’d love to see a lot more of these too
yeah i wish he tries intel i5 750
I had an i3 6100 (2C/4T) all the way until 2023. It was truly awful. PSA: never neglect your CPU. I learned the hard way. Night and day difference with the Ryzen 5 5600X ❤❤❤
Its amazing right what a new powerful cpu can do... The difference is amazing, in gaming and everything...
@Adamgreen735 Some games my FPS literally doubled or even tripled.
@ThunderTheBlackShadowKitty Right... crazy, ain't it... A few years ago I swapped my old first-gen Intel Xeon for a 9900K... and it literally went from 60-100 fps to a constant 140 fps... just from a CPU change... As long as you have a CPU with hyper-threading, on Intel at least, it gives you double like you said...
@@Adamgreen735 Always better to be GPU bound rather than CPU bound. I learned that lesson the hard way.
I had i3 4430 and now im on i5 8400 😂
LOL. 2 cores and 2 threads is such a waste of silicon in today's market. So glad you gave it a taste of the good life.
Not a waste. A GT 210 can't even compete with it (and not just because one is a GPU and the other a CPU). I bet that Celeron software-renders faster than a GT 210
Not really a waste. It's probably a higher-model CPU that had defective cores, which were then disabled and sold as a lower-end model. If not repurposed, it would actually be a waste of silicon.
@ right. Non-flagship products are often cut-down chips
@@dieselgeezer18 Well, that is true. All CPUs are binned by how well they come out of the manufacturing process
@@muskelin But the GT 210 is old and was somewhat viable for extra video-out ports in 2011. This CPU is 3 years old and can't do anything.
damn this CPU is so powerful that it made you see the developing PROCESS of UE5 in Fortnite
💀💀
The cpu is too trash to joke about. It's painful to use
You: "I just bought RX 7900 XTX"
Your friend: "That's great, I can only afford the RX 7800 XT"
Their CPU: Ryzen 7 9800X3D
Your CPU:
"costs only 500$"
I love these videos, and I do want to see more CPU tests in performance mode! Not just GPUs being tested is pretty refreshing
Bro, i love these videos, they're just so fun to watch
"How much bottleneck do you want?"
"Yes"
Fun fact: the reflection appears because it uses voxels (small cubes) and billboards, which are easier to process and usually handled by the GPU. Nanite, on the other hand, uses real 3D geometry, and part of its load is on the CPU (this latest chapter, by the way, is at least 20% heavier on the CPU).
I NOTICED THAT!!! While my son locks his fps to 75 because of his display, he usually buzzes around the map at about 40 to 70% CPU utilization on his Ryzen 1600. After the new season his CPU is hammered at 80% to 100% constantly. I had to adjust his settings to get him back to the frame times he's used to playing with. It seems like each time they add a new feature to the game, it breaks the performance.
@@HardWhereHero i was cataloging fortnite's performance for a video i was planning to make and noticed this increase in cpu usage. i used to get 144 fps on medium settings, and now it's impossible to maintain above 100 fps consistently, even on the low preset (indicating a cpu bottleneck). because of this, i decided to increase some effects and lock it at 72 fps on my 144hz monitor. the good news is that this is supposedly going to be fixed in a future patch. on consoles, it was already fixed within the first few weeks, and there it's possible to achieve 120 fps on a cpu similar to a ryzen 3600.
I'd love to see a lot more of these CPU tests in the future. However, try adding a part where you limit FPS to set caps like 60 fps (even 30?). Speaking as someone experiencing it on an i3 5005u (2C/4T, 2 GHz Broadwell), capping helps the CPU actually run the game: loading assets, textures, and sounds, and running the online client, instead of being stuck computing and sending draw calls because the FPS is uncapped.
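The idea in that comment — capping FPS so the CPU gets idle time back for other work — can be sketched as a simple frame limiter. This is a minimal illustration, not any game's actual loop; the frame-rate numbers are made up:

```python
import time

def run_frames(render_frame, target_fps=60, n_frames=3):
    """Cap the frame rate so leftover CPU time can go to other work
    (asset streaming, audio, networking) instead of extra draw calls."""
    frame_budget = 1.0 / target_fps  # seconds available per frame
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()  # stands in for game logic + submitting draw calls
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            # Sleep away the remainder: this is the CPU time an uncapped
            # loop would burn producing frames the display never shows.
            time.sleep(frame_budget - elapsed)
```

On a weak 2C/4T chip the uncapped loop leaves no slack at all, which is why capping can smooth frame times even though the average FPS number goes down.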
I have a laptop with that same cpu
You can carry the cute fluffy thingies and the blue jelly thingies. Fluffy ones make you fly and blue ones heal.
Nvm just finished the video that broken graphics jelly boy looks horrific 😭
Discovered your channel recently, and love watching the videos. Great showcase of performances and even more entertaining with you along. Awesome videos 🙏👌
best video creator thanks for this lovely video🎉
Thank you so much 😀
@@zWORMzGaming 6:39 You can actually carry it by just keeping it in the inventory slot
Such a wholesome benchmark. It was funny to watch through the entire thing. One of the best videos you’ve ever done
I was laughing throughout the entire video! I'm amazed that it didn't crash!
Get a life
@GrEeCe_MnKy OK
Nice video. I would like to see more monitor reviews in the near future from you please
new viewer here! been watching your gt 710 vids as i have the 2gb ddr5 one! your videos are very enjoyable
from england
Such an absurd, and fun test!!
Love these kind of hilarious tests XD
You should try running modern games with 4gb of system ram LOL or 8gb.
Maybe 8 in 2025!! Coz he already did 4gb
As a Software student I'm amazed that Kryzzp abused a CPU to reveal the forbidden 3D modeling of each and every object before the final touches.. I'm just glad the game didn't puke the coding lines as it loaded 😭😭
that would be Matrix confirmed 🤣🤣🤣
Man, this video was incredibly fun to watch!!!
An idea for another video like this would be comparing 2 cores with 3D cache versus 2 cores without it on your 7950X3D 😂
Greetings from Brazil!!
LOL you're the best Kryzzpo!! Funny video.
I love your videos! Greetings from Brazil!
Got hit by a missile in war thunder just before your video popped up in my notifs and i was clicking on it, it was worth it honestly
u are insane and i absolutely love it, keep up the good work man❤
I played this game (without Nanite) on I5-10400f and it was at 100% usage all of the time. Fortnite is very CPU intensive, but it at least used all of the cores.
That thing being on lga 1700 is crazy
Kryzp we need you to test the 4090 vs the 5070 and 5070 ti properly when they come out, in games at native resolution without any upscaling or fg, you're the only one that pays attention to those tests!!!!
It reminds me of the good ol days when I first started playing this game. I was using a Core 2 Duo and Gt 710 with 3gb of ram. The perfect recipe for disaster.
Never laughed this much on a CPU test video
If you are someone with a cpu like this, using fraps instead of afterburner could get you a bit more framerate and stop some stutters. Turning down the amount of times that the overlay refreshes could help too.
17 year old cpu: let me die in peace
Kryzp: not yet.
the cpu in the video is not 17 years old, it's only 3 years old.
@@kevin18992 Celeron... is a 3 year old CPU?
@@Abobus-xp8ch The g6900 is. Look on google
It's from the 12th generation, which is around 3 years old. He even said that at the start of the video himself
@@salvo27 Q1'22 says the intel website
Ok it looks playable though and good to see no stuttering 😮
Another banger Video keep up the good work
Thanks! Will do!
Fortnite is broken regardless of a celeron CPU
Funny enough, given recent releases, Fortnite is one of the better-optimized games when it comes to visuals-to-performance ratio
there is actually a hitching issue on DirectX 11
At least 28c on fortnite is there. Goat in this term.
Honestly not too bad, I was expecting worse performance, but if you tweaked the PC there's no telling how much higher it would go. Nice video :D
Now I want to see a 9700x3d with gt 710
Is it bad that this is honestly way better than I thought it would be? I was expecting sub-30 fps all the time, so the fact it can even reach 100 fps is impressive. Those stutters and freezes are exactly what I expected, though
I remember when I tried gaming on my Intel Pentium Silver laptop. Safe to say, it sucked
"PC's are complicated machines."
AI "solution" to every PC problem.
Love old games being played this way
2:31 it tends to stutter the first few matches while shaders compile; after that it's smooth, at least on higher-end PCs
I have this exact stutter issue in PUBG on the Sanhok map, and I'm running an 11th-gen i7 with 8C/16T clocked at 4.9 GHz. So this video looks normal to me when I think about that experience, yet the devs are not fixing the issue.
I used to play Fortnite with an i3 9100f and it was as stuttery and buggy as this LMAAAO I was so happy when I got a Ryzen 5 5500
That Rx 7900 xtx is just there watching the cpu sh*t itself on Fortnite
I really like these CPU tests. You should lock the fps to see if it breathes! Those are really 2 strong cores!
this has a bit better performance than my 2011 laptop (i5 2430m, gt 540m on a 1366x768 display) (performance mode with 50% 3D gives me around 25 fps (usually 30fps or more) with stutters)
Great video as usual, Kryzzp! One more suggestion I have is an RTX 4090 in Fortnite, which you surprisingly haven't done yet.
I remember seeing those missing textures when shaders had to re-compile on my old PC.
Fantastic test 😄. Pls try 2C/4T, the difference will be big
What a great video! Reminds me of when I was playing Fortnite on my laptop with an i7 5500u...and Nvidia GT840m graphics. 2c4t and a HDD 😂. The other players got so many free kills whilst I was frozen. And then there's my son playing it on the Switch with a consistent 30fps....
Peak kryzp content 🗣️
Looking forward to the Intel B580 benchmarks hope you get it soon :)
7900 xtx with a 65W power draw is cursed
The gray textures are objects that don't have their shaders compiled yet. In the past, the game would stutter while things on the screen got their shaders compiled. Now they try to compile the shaders before the game (which works most of the time), or during the game on the CPU cores that aren't being utilized. But this CPU is so slow and its two cores are being used to the fullest by Windows and the game itself, so it just doesn't have time to compile the necessary shaders.
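The scheme that comment describes — compiling pending shaders on whatever cores the game isn't using — can be sketched roughly like this. This is an illustration of the general idea, not Unreal Engine's actual implementation; the function names and the two-core reservation are assumptions for the example:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def precompile_shaders(shaders, compile_fn, reserved_cores=2):
    """Compile pending shaders on spare CPU cores.

    `reserved_cores` models the cores the game + OS keep busy. On a
    2-core CPU that is fully loaded, spare_workers drops to zero and
    the shaders stay uncompiled -- hence the gray, untextured objects.
    """
    spare_workers = max(0, (os.cpu_count() or 1) - reserved_cores)
    if spare_workers == 0:
        # No idle cores: nothing gets compiled in the background,
        # so objects render gray until the renderer needs them urgently.
        return []
    with ThreadPoolExecutor(max_workers=spare_workers) as pool:
        return list(pool.map(compile_fn, shaders))
```

The point of the sketch is the `spare_workers == 0` branch: background compilation is "free" only when there are cores left over, which a fully pegged 2C/2T Celeron never has.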
Damn, my first laptop was a Celeron laptop and it struggled to run Minecraft. Surprised to see how far Intel's shitty Celeron has come
wow! I would love to watch much more about CPU bottleneck please!
I would like to see fps lock in rtss to 30 or even 20 just to see if it can eliminate stutters
What kind of Frankenstein was that??? 😂😂😂
I love your vids man
Thanks for the Fortnite upload from after a while👍 hopefully more soon🤞
You should try the Athlon 3000G to compare. It's got 2 cores and 4 threads
man i really wish for more CPU tests!
It's kind of impressive actually (and hilarious) how it's completely pegged at 100% all the time. Not even going down to 99% or something
YES THIS IS THE SPIRIT THAT WE ALL WANT!!! NOT THE FASTEST AND THE GREATEST!!!!! Do more with shitty ones this is what we want to see!!!!!! Good job zworm have fun in 2025!
Way better than the crappy i3 that I had that would crash when I even attempted to open Fortnite
that's such a strong cpu.. :P
I think the stuttering in fortnite is often the shader caching. Even if it says its done doing shaders in the loading screen, I always find my game lagging for a while after it before it goes away.
more of these pls
It seems like we're gonna be getting CPU tests now! It was really fun watching this one so I'm excited to see what else you test Kryzzp! ♥
Great video man! Was wondering if you could test Arma Reforger with the 4080S. Been wanting to get the game but wanted to see how it would perform before getting it 😊
I played Fortnite for the first time on an Intel i3-2100 (2C/4T). The experience wasn't great, but at least I got to play with my brothers on the PS4 Slim. I remember when the one playing on console carried the one on PC. I've had so many great memories with my old PC. I still have it, and I'm considering making some sort of server or starting a homelab out of it. I've had it since 2011; it's my first ever PC and I want to make something great out of it.
This reminds me so much of what it was like playing on my laptop with an i3-8130U with integrated graphics. For some reason it used to run the game very well when Fortnite was released, then last year I tried it again and I couldn't even play with my boyfriend without it crashing constantly. It also didn't help that he was playing on the latest console and I was on that thing lol.
Cant wait for the 5090 vids
Fun video plz make more like this
The funny thing about GPU % utilization is that it's relative to whatever clock speed the card is at. So 20% at 1000 MHz would be (approximately) the same as 10% at 2000 MHz. So in this test, yeah, it's probably really only using 1-5% of the full capability of the card lol
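That clock-relative reading can be normalized with simple arithmetic. A quick sketch using the comment's own made-up numbers (real max clocks vary per card and boost state):

```python
def effective_utilization(reported_util_pct, current_clock_mhz, max_clock_mhz):
    """Scale the reported GPU busy % by how far the current clock is
    below maximum, approximating load relative to the card's full capability."""
    return reported_util_pct * current_clock_mhz / max_clock_mhz

# 20% busy at 1000 MHz is roughly 10% of what the card could do at 2000 MHz
print(effective_utilization(20, 1000, 2000))  # 10.0
```

This is why a downclocked, CPU-starved 7900 XTX can show double-digit "utilization" while drawing only ~65 W.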
Rtx 50 series is out and i am waiting for your video
We need your thoughts on the RTX 5090!
I used to have this CPU in my laptop and I tried to run Fortnite on it once, and I thought it could get 120 fps. If only I knew
I mean it can, but with 0 fps 1% low lmao
When are you getting an RTX 5090?
7900 XTX with a G6900, perfectly balanced. It reminds me of the time I bought a new Pentium G3240 at a local shop when my PC broke down, just to have something to edit doc files, and later paired it with a used GTX 980 Ti I got from my cousin. It was terrible, a truly awful experience.
Have you gotten your hands on the B580? Btw, try to limit the fps with these CPUs, it may have an impact.
Anyway, thank you for the video
for this rig a 12700k might be a bit better, since those 2 more cores and higher clocks can really make a difference in newer games!
Those freezes happen to me all the time on the Ryzen 3 3200G :/ I needed to watch this video to check if yours froze haha
What is that cpu cooler ur using
Please try RTX 4090 with GTA San Andreas
Next time, would you try testing the OG map? It's less intensive than the normal Chapter 6 map
I once had a G2010 (2nd-gen Intel Pentium) and a GT 710...
I ran Fortnite at a whopping 10 fps at 720p in performance mode.
So glad that I now have a Ryzen 7 laptop with an RTX 4060, which runs most games.
Very funny video! Can you test the Ryzen 7 1700 in 2024?
G7400 next? See if extra 2 threads fare better?
Wait when will you get the b580 for testing?
This was wild lol, aaaand it crashes again 😂 Has Fortnite been crashing more than usual, or is it just me?
The Celeron G6900 is an overclocking monster with any motherboard that has PCIe 5.0 and an external clock generator. You can reach up to 5.5 GHz with ease, practically +61% FPS guaranteed.
zWORMz, why havent you ever benchmarked the rx 7800 xt?
Intel b580 when?
you think you will actually be buying a 5090 to test considering the msrp?
Gotta do the 6800 XT! Was one of my favourite cards!
I went from an FX-6300 (in 2024, by the way) to a Ryzen 5 7600X and I love this CPU so much. Until I get my new GPU I'm bottlenecked by a good ole GTX 970, but I do get 300 fps in Fortnite on performance mode. What a beautiful bottleneck
is it possible for you to benchmark bottleneck builds in the near future? like the 2200g with the 6600
Can you do a test with I7 3770? It's been 3 years since your last video about it.