You really need to overclock the 3070 to get rid of the bottleneck.
Why? The 3070 already looks like it's bored out of its mind. It's got nothing to do.
@@nightbirdds It was a joke
🤣
😂
@@nightbirdds come on man
3:33 this CPU doesn't break the game, it makes you take a break while you game
This cpu makes the Celeron look like a quantum processor
That's mean, celerons are nowhere near as hard to code for as a quantum CPU
@@LiamNajor 💀
but nowhere near my other Biostar motherboard with an E1-6010 at 1.3 GHz
@@yeetus59 still not as slow as my Biostar board with an Athlon 64 X2 4600+
@@MixedVictor i just remembered someone who installed NetBSD on a freaking Pentium (i think it was from 1998 if i remember correctly)
4:18 - "Ah, here we go again." I see what you did there! 😂
5:09 - 1% usage on the 3070. YIKES!
ah shit, here we go again
Man you got the best timing! I literally thought about selling my i7 8700 and rx 580 for an RTX 3080 and athlon half an hour ago
Why 😭
@@cov_.g9702 woosh
@@cov_.g9702 an i7 8700 and RX 580 (4gb) with 24gb of ram cost the same as an RTX 3080 here in the Philippines, and i have an athlon, so i was wondering if i could pair them haha
Sell it for 12100f and a 3060 lol
@@Spark-pv9js well im planning to get a 13700k and i can hold on. im buying the gpu now because every gpu here disappears after .5 nanoseconds, even 3rd world countries arent safe
Just wanna say i love how you’ve not changed over the years, quality wise yes but you still have the same style as the old videos, been a fan for ages keep it up bro
Yep I like his down to earthness.
"Running around looking up or down doesn't exactly scream optimal gaming experience"
Goldeneye 007 speedrunners would probably disagree lol
You should try increasing the resolution until the gpu becomes the bottleneck, I'm guessing you could get up to 8k/16k
But then you got all round bottleneck? xD
How would you even know for this config lol I feel like you’d hit 1fps no matter what
Right? I was thinking 4k gaming so the GPU takes on the workload
Rip cpu😂
When I remember that my PC journey started with an Intel Core 2 Quad Q6600, a GTX 550 Ti and 4gb of DDR2 ram (I still have the full build)... oh man, not the best rig when I bought it for like 80€ in 2015, but what other options did I have as a student with no money? I still enjoyed every second, even if I had to run some games in 720p or low settings to even get 30fps
Thanks for the buyers guide! Just sold my i9. Hope my athlon 5150 arrives soon 🙏
The real game here is getting the luxury of being able to manually count the individual frames with your naked eye. A true gentleman's sport.
This gaming experience makes me feel good about my i5 2500k and GTX 960. So, Steve, if you thought this video was pointless, it isn't.
Still a nice combo :)
same but i have an i5 3570
had the same, then a i7 2600k with a 980ti and now ryzen 3 3100 with a 2060.
Try using an AMD card. Nvidia cards don't have a hardware scheduler, so that task is up to the CPU. On AMD cards it's handled in hardware and thus frees up some CPU performance. Hardware unboxed made a great video testing this.
Doesn’t matter if you use RX, RTX or even Upcoming arc gpus, it will get same fps regardless of any graphics cards, maximum pairing for 5350 is 5570 and for 5150 it’s 5450 or weaker, any more powerful gpu will get same fps because of huge cpu bottleneck, for proof, check out budget builds official am1 video
@@Annyumi_ I had watched that video already and the only GPU he added was a Fury, he didn't test it with an Nvidia card. The reason why Nvidia cards cause more issues for low end CPUs is, like I mentioned, the lack of a hardware scheduler. The CPU then has to do that task, which places an additional load on it. If you want proof, look at the video I linked.
@@Annyumi_ nah there's literally a hardware unboxed benchmark with the RX 580 outperforming the RTX 3080 in fortnite when paired with an i7 4790K
"but it can run?"
"yes"
"ill take your entire stock"
-schools
I remember thinking that these AM1 setups were pretty cool when they released... I definitely made the right decision keeping a used core2 quad system as my media center rather than building an AM1 system though. 😂
Do you still use the Quad?
Honestly I'm lost with what to do with mine.
Was thinking the Q6600 8gb system I have, media centre, file server, or something lol.
Was going to use the 4690k system, but I found an rx580 8gb (silly price, hoping it works lol), so think that's going to be the bedroom TV pc now lol
@@johnnycarrotheid Nope, that system is long gone, though mostly just because I wanted to build something new. I've got a passively cooled 5300g system for video streaming and light couch gaming and I'm not a data hoarder anymore so I don't need much storage space. I still have a weird fascination with that era of hardware, though I'll leave it to youtubers to make videos of them playing with it rather than cluttering up my house, since I get pretty unlucky with old parts failing on me.
@@guantanamobae530
The q6600 was my kids Roblox/Sims machine till the lockdowns, she got an r5 3600 upgrade and the old ones sat in a cupboard lol.
I've been the same, clear out of hdd's was years ago, thinking of going backwards as drives are cheap as chips.
The 4690 nearly got gone for a 3200g/3400g but from looking there wasn't much of a difference, so I kept it.
Had it paired with an rx580 before, so know it's old but very capable.
For a bedroom light gaming pc it should be a nice , cheap, time waster lol.
Old prehistoric 360 controllers + dongle, cheap n cheerful project.
I remember someone using an Athlon 5350 on an Asus mATX board who was able to overclock the chip to 2.8GHz, and the results were fairly good. Too bad it's the only board that allows OC'ing.
I haven't experienced this many stutters in my whole life as I did in this video. Thanks Bro 😁
"When you're not doing much the frame rate is at its best" 😂🤣😂
Really enjoy your channel - I like messing around with older hardware too, one thing I like to do is to see exactly how much performance you can get out of hardware... max RAM ... fastest CPU the board will support, SSD etc.
the 3070 is snoring while the athlon is fighting for it's life
Putting some RGB might help...
I actually still use this processor, but the AM1 5350 version overclocked (@2.1ghz) with 8 gigs of DDR3 1600mhz ram. The single channel ram is the "BIG" bottleneck. Basically a good low power video streaming device, and I might one day upgrade with a discrete graphics card like a GT 1030 (give the poor CPU a better chance at other tasks). Nice to see these old workhorses from 2014 still kicking. Interesting video RandomGamingHD.
I was able to play Doom 2016 on ultra settings at 1080p when paired with a 1050 Ti (without overclocking). It was unbelievable back then.
You could easily reach 2.5ghz with the 5350.
The Athlon 5150 has AMD Jaguar cores, the same ones used in the PS4 and Xbox One.
These CPUs were launched to compete against Intel's Atom; although this CPU has a 25W TDP, the cores were designed for a 10W or lower TDP
Nope. PS4 uses modified FX-83xx processors. You cannot just slap two processors together and call it a day. To prove my point, classic PS4 games like Witcher 3 run fine on FX-83xx architecture, i.e. you could get 30+ FPS all the time.
Being bottlenecked by the cpu is always quite annoying because there isn’t really anything to be done about it. No matter how much you lower or raise the graphical settings/resolution the gpu is the only thing that changes in usage. (speaking from experience with my i5 4310m and intel hd 4600 graphics)
It's arguably one of the only true forms of bottleneck because it's actually keeping you from using another part of your PC correctly. I'm currently running a 1050 Ti with an Athlon 3000G and obviously the only real "solution" is overclocking. Even then, it's obviously still impossible not to get stutters. I do plan on upgrading, obviously, but it's very irritating!
Not entirely true. It may be true that the GPU is the only thing that changes in usage, but if you can push your GPU usage high enough to reach ~95% then you will get out of the CPU bottleneck and the game will run much more stably. So raising the resolution does indeed help
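As a rough sketch of the logic in that comment (the 95% threshold and the utilisation samples are made-up illustrations, not readings from this video): when the GPU sits near full load it is the limiter, and when it idles at low FPS the CPU is.

```python
# Rough bottleneck heuristic along the lines of the comment above.
# The threshold and sample values are illustrative assumptions, not measurements.
def classify_bottleneck(gpu_util_samples, gpu_bound_threshold=95.0):
    """GPU pegged near 100% -> GPU is the limiter; GPU mostly idle at low FPS -> CPU is."""
    avg = sum(gpu_util_samples) / len(gpu_util_samples)
    return "GPU-bound" if avg >= gpu_bound_threshold else "likely CPU-bound"

print(classify_bottleneck([30, 35, 40, 28]))   # like the 3070 in this video -> likely CPU-bound
print(classify_bottleneck([97, 99, 98, 100]))  # a fully loaded card -> GPU-bound
```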
Gpu: why i am still here
Cpu: why i am still here just to suffer i feel like my cores are burning
Even though the 3070 reports 25-40%, i'm sure it's only clocked to 1000mhz or below. It has a lot of power saving features, so it only goes to full boost when it's actually being used 100%
In comparison, the energy saving on Ampere really sucks.
You're right, it will clock between 210 and 6xx MHz most of the time in this video, but you only need a bit more load and it clocks as high as it gets, which makes the Nvidia cards really inefficient at part load.
I’m still quite impressed the 3070 managed to utilise 30/40% in cyberpunk 😎🙏🏼
As the proud owner of a 3070 this brings me more joy than anything in life thus far… Thank you!
0:28 I've got a Celeron soldered on to an ASUS motherboard and the all important PCIE x16 GPU slot which would be interesting to find out if it can beat this Athlon.... hit me up if you want to borrow it.... I remember the 5350 Athlon being a slight bit more capable as it's just over 2GHz instead of 1.6GHz for the 5150.... can lend you one of those as well 😂
Would it overclock if you increased the multiplier (ratio, in that bios) rather than the frequency? Looks like the frequency also causes the ram to overclock (and other unseen components) which could be what stops it booting.
BCLK
I’m not sure if multiplier overclocking existed back then
He needs to lower the RAM speed to 667 or 800mhz and then increase the BCLK, that's how these things overclock. Very easy.
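A quick sketch of the arithmetic behind that advice (the multiplier and memory divider values are assumptions for illustration; only the 1.6 GHz stock clock is from the video): both the CPU and RAM clocks derive from the base clock, so raising BCLK drags the memory up with it unless you drop the RAM divider first.

```python
# Illustrative BCLK maths for an AM1 chip like the 5150 (1.6 GHz stock at 100 MHz BCLK).
# The multiplier/divider values are assumptions purely for the example.
def effective_clocks(bclk_mhz, cpu_multiplier=16, ram_multiplier=16):
    cpu_mhz = bclk_mhz * cpu_multiplier   # core clock
    ram_mts = bclk_mhz * ram_multiplier   # DDR3 effective speed
    return cpu_mhz, ram_mts

print(effective_clocks(100))                    # (1600, 1600) - stock
print(effective_clocks(130))                    # (2080, 2080) - RAM dragged far out of spec
print(effective_clocks(130, ram_multiplier=8))  # (2080, 1040) - drop the RAM divider first, then raise BCLK
```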
I'm still running a i7 2600 clocking between 3900 and 4200 in games, paired with a GTX 970.
that's a nice combo.
As a proud owner of an R9 5900X, 32GB of DDR4 3,200 and an RX 6700XT, I think it's a good experience to go back to my original build and set it up as my family house storage server.
My original build, which consisted of an Athlon 5350, 8GB of 1,600MHz DDR3 and a GTX 750 Ti, was actually fairly competent at getting 30 to 60 fps in most titles I played (including R6 Siege and CSGO!). It's the humble beginnings that bring you to where you are today that can really take you back!
I have the same CPU on a fun rig, mostly linux stuff, I've tried it with win10 and a HD7700 low profile and with some tweaks I can play RedOut 2 on 1024x768 (CRT monitor) no problem, I was pretty surprised about that
Looking at the CPU usage, it seems like the 2.0 x4 PCIe might have been the cause of some of the stutters, in the Witcher 3 at least.
Given that the 6500XT bottlenecks at 3.0 x4 (although dependent on game, resolution and textures) I would assume that something like a 750 or thereabouts might be at the limit of what wouldn't be bottlenecked at 2.0 speeds.
I also have my own experience of a card being bottlenecked to 1.0 x1 speed, and that exhibited some fairly strange behavior in games regardless of settings/resolution.
Ironically, I think to give this CPU the best chance it can get, it would be best paired with something like an RX 550 or R7 260/360 for minimal driver overhead and minimal PCIe bottlenecking (rough bandwidth numbers below).
They used to test these Athlons with a 750 Ti and it was "alright". The 750 and 750 Ti were the top GPUs for it.
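For context on the link speeds discussed above, here is a rough bandwidth calculation using the commonly quoted per-lane throughput of each PCIe generation (approximate figures after encoding overhead, so treat them as ballpark):

```python
# Approximate usable bandwidth per PCIe lane, in GB/s, after encoding overhead.
PER_LANE_GBPS = {"1.0": 0.25, "2.0": 0.5, "3.0": 0.985, "4.0": 1.969}

def link_bandwidth_gbps(gen, lanes):
    return PER_LANE_GBPS[gen] * lanes

print(link_bandwidth_gbps("2.0", 4))   # ~2.0 GB/s  - the 2.0 x4 slot mentioned above
print(link_bandwidth_gbps("3.0", 4))   # ~3.9 GB/s  - the 6500 XT's x4 link
print(link_bandwidth_gbps("3.0", 16))  # ~15.8 GB/s - a full Gen 3 x16 slot for comparison
```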
I hope some day you manage to get an RTX 3090 Ti and pair it with the lowest end Intel Atom in existence; it would be awesome if MSI Afterburner showed "GPU usage: 0,0000001%". That would be bottleneck god level
Haha nice, I'm subscribed to your channel for this kind of content 🤠
I'm not 100% sure. But, that looks just like my case. Except that the front panel has something covering all the mesh. It's somewhat of a tricky case to get everything to work nicely in.
an extra DIMM might still slightly boost performance by adding an extra rank for interleaving
there's only one slot on the board
Have you ever tried the E1-6010? You should
Do this again but with a single core celeron from the early 2000s.
most games won't even start.
This was like my setup back in 2011, a brand new and good-for-the-time GPU, but paired with an ancient dual core 2.2 GHz AMD Athlon.
Not that it would make the game playable, but you should try AMD GPUs for weak CPUs, as their drivers have less CPU overhead.
I am impressed the RTX 3070 didn't kill the poor AMD Athlon trying to pull so much from that cpu.
This video reminded me of my Pentium G2030 + GTX 1050 2GB. God, it hurts knowing that my CPU was preventing my GPU from running the games it was supposed to run. Then I changed to an i5 3470 and now I'm using an i5 10400; I only need to change the GPU.
perfect time to change the gpu, gpu prices are going down
get an rx 6600
A 10400 will bottleneck a 3070 at 1080p on CPU intensive games.
@@BALLOOROOM I know about that, but I'm not planning on buying such a powerful GPU.
First I have to change my PSU, by the way.
I found a terrible chip on ebay for just over £27 in February, I think you'd really "enjoy" it! It's an AM4 Carrizo APU that runs DDR4 2400mhz.
It's technically unsupported on most boards, but it seems to work if the bios was set up for the 7th generation chips. It definitely works on the motherboard I chanced my arm on, the ASRock A320M-HDV.
Look up the AM4 A10-8770E, the E is important. The onboard graphics are surprisingly capable but the actual CPU is *atrocious*
Motherboard manufacturers have been dropping support on AM4 boards for A-series APUs in favour of Ryzen APUs and CPUs, claiming lack of space for later BIOS revisions. So such parts may work, as long as the BIOS isn't too new. Rather the opposite of what you expect with BIOS updates.
@@MarkTheMorose yes, I made sure to check with ASRock themselves what bios was the cut-off point for the removal of the older APU support, and for my specific board it was bios 7.0 on the A320M-HDV rev 4.0.
Neither ASRock nor their contemporaries have ever officially supported the Carrizo generation APUs (like my 8770E) on their A320 or higher boards, with Bristol Ridge (9xxx numbered) being the first official chips listed as compatible. I took a gamble with the A10-8770E, which was an OEM-only APU, using the same interface as the later designs, as they were on paper extremely similar specification-wise, although the shift from Piledriver to Piledriver+ was a small concern
@@joannaatkins822 When I first moved from various Intel boards to socket AM4, my first APU was an A6-9500E, while I waited for Ryzen APUs. Being a hoarder, I kept hold of it once it was replaced, shifting it to a secondary PC. I still have it after replacing it with first an A8-9600, then an Athlon 200GE. I fear that I may actually be the reason for the global chip shortage. Sorry.
@@MarkTheMorose Obviously, such selfish behaviour! (joking)
I'm likely going to upgrade/downgrade to an Athlon 3000g, and weirdly they're less expensive right now than the A10 and A12 chips second hand. I want a snappier CPU and a less powerful GPU, as my A10-8770e actually plays Crysis at 720p
I remember the days when low frame rates were standard on the Atari ST and Amiga. We were pretty used to them back then. I couldn't go back to those low frame rates now.
The mad lad strikes yet again
3:25 the font on "block party" made me think it said "dfi lan party"
My system is even more bizarre right now. I have the cheapest PCIe 2.0 motherboard and 12 gb of single channel DDR3 ram, which can't fully run the 3570K processor I have on it for some reason, and my CPU (obviously) can't run my 6600 XT properly.
So it's different tiers of suck.
Did you attempt the CPU ratio or just the APU frequency? Probably doesn't matter, that processor is so crippled even a mild overclock wouldn't help it.
This reminds me of desperately trying to game on my Cyrix 6x86 Pr166 back in the 1990s.
Spoiler alert: it didn't end well.
It could game, but it could not run the latest titles. But there were many games that were designed for 386 and 486 processors, and they ran well.
I know this isn't really related to the video, but could you check out the "Lossless Scaling" program on Steam? It lets you use FSR on basically every game, and for me it made the difference between playing Elden Ring at 480p with anti-aliasing and playing it with FSR, even though I have an old RX 460 2gb OEM in my pc. I think it's a very interesting program and I've not seen many people talk about it honestly
Also sorry if my English is terrible, cheers from Italy :3
This made me feel good about my FX-4300. I never thought I'd see a day where I would be able to say that.
I had that same case, the thermals are terrible to say it's from a company called "Cooler Master"
I love your version of CJ meme " Ah, s***, here we go again."
😁
I had no idea Coolermaster started making "Safes" :P
Still rocking my Coolermaster case from like 2007 or 8, it's a fuckin unit still to this day.
Have you ever played Star Wars the old republic (swtor)? I've always found it performs really well.
Do you think the i7 7700K would bottleneck an RTX 3060 or even a 3070 ?
Partially, i.e. in some games that run over 100 FPS. But you should be able to get 60 FPS in any game without a bottleneck.
@@aleksazunjic9672 That's not bad, thanks.
Slightly. I'm running an i5 10400 with a 3060 and even in some games there is some loss of frames compared to the new i7's. But the main thing is there's no stuttering and my CPU isn't at 100% all the time, leaving room for my background stuff.
@@portman8909 It seems that I have to upgrade my CPU also, I thought the i7 7700K is enough for a 3060
@@Porsche996TT Yeah ideally aim for at least 10th gen intel or above. With 6 cores and 12 threads. But if you were on a i7 you'd probably want a new i7 which has 8 cores and 16 threads.
I HAD ONE!!!! SAME BOARD!!! I LOVED IT!
it played factorio and stardew valley fine!
Minesweeper runs just fine 🤣
Lol..I had an athlon 5350 overclocked in the guest room for ages as a media PC...I thought I was insane adding a GTX 750 so people could play old games like Race Driver Grid, fallout 3 and similar. Even that was terribly bottlenecked, even with the cpu overclocked.
I like how the CPU is really cool
What is the name of that CABINET??
This reminds me of the old days, playing (and finishing) Witcher 2 , FC 3 and AC brotherhood with 15-20 fps on a low end pentium and Radeon HD 5450.
Of course the FPS will not change at higher resolutions, this is all the CPU can give at 1080p and 1440p.
It's like pairing an FX chip with the same RTX 3070: at 1080p and higher resolutions you will get the same performance.
If you're looking for a channel that did similar tests with an FX but a different GPU, that's Hardware Unboxed.
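A minimal sketch of why the numbers stay flat (all the millisecond figures are invented for illustration): the frame rate follows whichever of the CPU or GPU needs longer per frame, and resolution only changes the GPU's share.

```python
# Toy frame-time model: FPS is limited by the slower of the CPU and GPU per frame.
# All timings below are made up to show the shape of the effect, not measured.
CPU_MS = 50.0                                        # a very slow CPU: ~50 ms of work per frame (~20 FPS cap)
GPU_MS = {"1080p": 6.0, "1440p": 10.0, "4K": 22.0}   # GPU cost grows with resolution

for res, gpu_ms in GPU_MS.items():
    fps = 1000.0 / max(CPU_MS, gpu_ms)
    print(f"{res}: {fps:.0f} FPS")   # identical ~20 FPS everywhere, until gpu_ms ever exceeds CPU_MS
```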
Oh the struggle is real! I had an HP pre-built with an AMD A10-9700 and integrated Radeon R7 series GPU. Then I swapped the psu so I could fit a GTX 970 in it. Well, what a disaster it was. The CPU was a huge bottleneck even with the GTX 970. I feel your pain.
Did you know 2 of these athlon cpus fused together was basically the processor inside the PS4 and original XBox One (not the S or X) ?
No, it was not :) Stop peddling nonsense. You cannot just fuse two processors, lots of things are needed to make cores run together. The processor in the PS4 was made from heavily modified FX-83xx processors. They even have similar FPS in games released for PS4, like Witcher 3.
@@aleksazunjic9672 maybe not literally but it is the same architecture as this Athlon CPU (and there are indeed two 4 core modules) , and no, the PS4 wasn't based on FX processors because these didn't have the same architecture (FX used Bulldozer and Piledriver, not Jaguar. Also FX is CPU only, while Jaguar is an APU) so the Athlon is closer to what's inside the PS4. What I wanted to say is if you imagine this Athlon with double the cores, it is equivalent in performance to what's inside the PS4. Should I mention this doesn't work for the graphics part that was custom made to be 9 times more powerful and with added support for GDDR5 or are you gonna make me copy-paste the entire Wikipedia page?
@@KuroAno Jaguar architecture is based on Bulldozer. Jaguar had simplified ALU and AGU units to save on costs. The PS4 CPU is based on the FX-83xx processor, like it or not, i.e. 8 cores 8 threads (at least that was AMD's marketing of the FX-83xx). You cannot just imagine things, the iGPU on the Athlon is far less capable than the GPU in the PS4 (a rough equivalent would be a Radeon 7850). This computer in the video is far less capable than the PS4, and it shows.
It would be interesting to see what the most powerful CPU/ram setup you can cram into that tiny little PC.
I had my volume at 0 but I could still hear "Hello everyone and welcome to another video"...
I have a similar setup with a 200ge and an rx 6600 and I'm usually fine for 60 fps on max on graphics but sometimes I have to turn everything down in msi afterburner to get stable fps with no stutter
Your single core performance isn't your weakness. I'd nab a 2600X or 1400 for a cheap upgrade, if you have the cash of course.
The 200GE is significantly faster than the 5150, even with half the number of cores.
the Athlon 5150 is an APU so why not see how the on board graphics do?
Use Nexus LiteOS for benchmarking, as it really strips out everything
This is below the level of a computer with a graphics card from a giveaway lol
I tried running the Final Fantasy XV benchmark with a 1gb GTX 560ti and not all of the textures would load. However, I was able to get the game to load textures and run at 720p with a 2gb GTX 650ti Boost edition.
I can't believe the cyberpunk experience was a lot smoother than fortnite 😵
CD Projekt Red had to optimize the game a lot for the weak ps4 and xbox one cpus. Now it is one of the better optimized games.
@@KneppaH yes they did improve the game performance and fixed a lot of bugs, but it's still one of the most demanding games on both cpu & gpu.
But the thing is, fortnite is just an optimization mess that even stutters on my i7 10700kf
Amazing temps on the athlon in afterburner lol definitely buying that beast asap
Can u test 4k gaming with this processor? I wonder if the game is gonna become a little more playable because 4k puts more on the gpu...
I don't know why, but it always surprises me when a CPU gives a bottleneck, as I have always been led to believe that the game was over 90% graphics card and the CPU was just there to keep everything aligned, so to speak. But with this video I am proved wrong, as I know the 3070 is a good GPU and so I know it would cause little to no bottleneck in itself. Thanks, Steve, for another enlightening video.
Apart from game logic, the CPU has to process every draw call for the gpu. In modern games there is an enormous amount of draw calls to be made. Fallout 4 was notorious for that. Decreasing view distance decreases draw calls, because there are fewer objects to be rendered. Also, in-game shadows are a cause for higher draw call counts.
For me it was impossible to hit a smooth 60fps without hiccups in fallout 4 with an Athlon II 760K @ 4.4GHz, and that was already a slow cpu from the start. Setting the view distance of objects, npcs and vegetation to minimum, and shadow distance to minimum, made it a lot better, but still not entirely smooth.
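As a rough illustration of that point (a generic, invented render loop, not Fallout 4's actual code): every visible object costs the CPU a draw-call submission, so pulling in the view distance culls objects before that per-object cost is ever paid.

```python
# Toy CPU-side submission loop: each draw call costs the CPU a fixed slice of time.
# Object counts and per-call costs are invented purely to show the scaling.
CPU_COST_PER_DRAW_MS = 0.02

def cpu_submit_time_ms(objects, view_distance):
    visible = [o for o in objects if o["distance"] <= view_distance]  # simple distance culling
    return len(visible) * CPU_COST_PER_DRAW_MS                        # ms spent just issuing draw calls

scene = [{"distance": d} for d in range(0, 5000, 2)]   # 2500 objects spread over 5 km
print(cpu_submit_time_ms(scene, view_distance=4000))   # ~40 ms of CPU time per frame
print(cpu_submit_time_ms(scene, view_distance=1000))   # ~10 ms after reducing view distance
```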
When you win a giveaway GPU:
yeah the ole AM1 system was more like an experiment than anything.. for a quad core it is 1/4 the die size of the FX-8000 series of cpus. that was its schtick.. it's amazing it ever ended up on shelves imo and didn't stay an obscure production test model not for sale.
theoretically, being 1/4 the size, i wonder if they ever experimented with making an opteron by joining 4 athlon x4 am1 cpus to make a 16 core slow boat. lol
You could use the word abysmal as a good alternative to disaster or terrible.
Will the 11600k bottleneck the 3080 ti @ 1440P?
Idk what’s worse, this combo or the fact that Fortnite and cyberpunk have almost the same fps
You really need to do a part 2 where you try to get a playable experience
Or what if I put an RTX 3070 in a Sony Vaio desktop PC that I have, which does have a PCI Express slot? However, the CPU is an Intel Pentium D 2.8 GHz from 2005.
Where do you live in the UK? Not exact location but general direction? If close to Leeds i could give you my ryzen 7 1700 for a video
hey mate could you tell me what camera are you using for your videos?
oh those "camilles" are some beautiful flowers
The AM1 platform is cool and it's always interesting to put them to the test, but they are certainly not gaming capable. Even a Celeron will better them, dependent on generation. I used to have an AM1 with the Sempron, which is obviously weaker than the Athlon due to fewer cores etc, and I tested Fortnite with a gtx 1050ti; while it ran ok on low settings there was a fair amount of stutter. It will be better as a retro gaming system.
i think you should get the 860k and try it in that system.
Performance-wise, is it similar to a dual core laptop i5 U?
The word "unplayable" gets thrown around quite a lot these days, as more and more the youth of the PC space consider sub-60fps an "unplayable" experience. However, this video demonstrates my understanding of "unplayable": even at 15 or 20fps, if the frame rate is consistent, there is much fun to be had.
This CPU cannot handle gaming because of the stutters and micro-freezes; it is far too weak to game on. That, then, is unplayable: when your gaming experience is inconsistent, stuttery, uncontrollable.
And I have lived this reality before. Thankfully, I am now somewhat inured to the worst of the frame dips.
my very first prebuilt had a sempron from this series, and down the road i threw in a 270x. it could game, but barely, i played titanfall 1 on like LOW 900p. that was years ago. ik you can overclock them, one dude pushed one super hard and i thought it was pretty funny. always wondered how it would game with a massive OC, still hindered for sure, but you could definitely play older games
i would love to have a quad core athlon for my nas, which currently runs on a Sempron 2650, which is the worse, dual core variant
Please do a video on i7 4790K with these high end cards especially rtx 2070 !
Wow, that Athlon is a total stinker. Brilliant 👍👍👍👍
I mean, if it was like 3 - 4 GHz Quad Core with no hyperthreading, like a Phenom x4, it could still run all those games even now with 60 FPS no issues, but 1.6Ghz is... yeah
I currently have a Ryzen 5 1600 AF paired with an rx 6700 xt... Kind of weird that ultra and medium settings have nearly identical fps...
i mean, even that is bottlenecking your 6700 pretty hard. of course not to this comical level, but still there. if you can spare the cash, you should get an r5 5600
Nice video. I have a 3070 with a 10400F, which on many forums and in the comments everyone says is a bad combo as the 3070 gets bottlenecked. I don't see this happening, she runs at 99% in many heavy AAA games.
What do you guys think?
Maybe a little bit in some scenarios in some games, but then you're already at very high framerates. For now 6 cores with 12 threads with high IPC and clockspeeds are a good match for your 3070. And if you are gaming @ 1440p or higher this becomes even more a non issue.
is it a cpu bottleneck if my cpu stands at 50-60% while my gpu is at 99-100%, or is it just fine? im running a 9400f stock and a rx 6750 xt oc
edit: i play on 1440p btw:)
btw i found a rx 6750 xt in my country at 499 dollars :D
Depends on the resolution/game you use. At 1080p in Assassins Creed Origins/Odyssey, MSFS 2020 and other CPU intensive games my 10400 does bottleneck my 3070 such that eventually I'll upgrade to a faster CPU. At 4K things even out but at lower fps.
How bad of a bottleneck is a GTX 970 to a 12100F and what should i upgrade to?
if we're going by golf scoring, this pc build is quite formidable