I went from a FX6300 to a 2700x, boy what a difference
I chose the 3550 back then, and I would do the same again. The extra heat and noise just wouldn’t be worth it to me personally. I do own an 8350 system today, but it’s more like a fun curiosity. And I have a huge modern cooler on it 😅
aaayyy
I have an i5 3550 with me right now hahaha
I was planning on using it for a Windows XP retro gaming PC, but I ended up buying a cheap i7 2600... just because lol
A Cooler Master Hyper 212 Evo (the go-to midrange aftermarket cooler back then) was just 25 bucks and great value
I switched from an FX-8350 to an i5 3570 back then and I regretted it. The FX provided a much smoother experience in games, but also in web browsing.
I chose the fx-8350, and always would.
The slightly lower gaming performance was well offset by real multithreaded performance - background tasks didn't crush the system to a halt. The GPU I could afford was holding any system back (as the main bottleneck) anyway.
I still use my fx-8350 as my backup machine to this day
(My main pc has gone through: Ryzen 1700X, 2700X, 3900X, to 5950X - all on the original MB.)
My FX-8300 & RX-470 combo has been in daily use since 11/2016 and performs as well as or better than ever. It's entirely possible that software updates have made better use of the FX-8300's multi-core capabilities over time.
Yes, that was the issue. FX offered more cores at a time when game developers did not use all that power. That's why in some games FX was beaten by an i3 that was only 2 cores/4 threads but had better single-core performance. But as expected, over time games asked for more cores and FX could show its true potential.
The key to FX is overclocking the northbridge; the performance increase is insane.
I've not seen much difference. Do you have detailed settings that would achieve something significant? I'm on 990FX.
@@Nick_R_ agreed. you may get like 5% more performance. 10% if you are lucky.
I saw the video from MDFGaming where he shows in detail how to do it right. Though I think you need the higher-end motherboard chipsets and RAM for best results, while most people are happy with midrange.
3570K OC vs 8320 OC! See how far you can push both of them then retest!
3570k wins
Just so we can see it😀
Even if the 3570K OC wins, the i5 3570K was very expensive, more expensive than the 3570. I think it's a good idea to keep comparing prices. The fact is, the 8320 cost about 400 in Brazil, while the i7 3770 was about 1000-1500, and for 500 you could have an 8350. The performance for the price of the 8350 is monstrous.
I'd love to see a Phenom II vs FX comparison
I upgraded from Phenom II to FX... The FX crushes it. Well, the Phenom II has slightly better single-core performance, but not by much.
@@Daniel-CoolTI I also moved from Phenom II to FX back in the day; however, mine was plagued with motherboard and ultimately CPU issues, so my experience was sadly short-lived.
My buddy used to go to thrift stores and buy broken desktop PCs for like $5 and strip out the parts. Most of the time either the motherboard or the power supply was broken. I did the same, and in the end I got a Core i5-4430 and 8GB of DDR3 RAM for $5. Mind you, this was years ago, when the value of a Haswell i5 could feed a person for a week.
haha yeah I used to do that back in the day too. Found some cool things. I remember once I found a Unisys server the size of a large end table sitting about 3 feet high. Couldn't wait to get it home and tear it down. Had its own wheels and it weighed at least 75 lbs. Turns out it had a mobile Pentium III soldered onto this huge daughter card. Nothing special other than the design.
No matter what others say, the FX was ahead of its time, maybe too far ahead, at a time when people used 2 cores for gaming because that was optimal with the GPUs of the same year. Now that we see results from testing with modern GPUs, it shows its weaknesses in memory and process size while having good potential in multi-core performance and value.
A good board with a good FSB and RAM OC plus low latency really makes FX pop.
First-gen FX IPC was behind the Phenom II's. By the time the FX 8350 came out, the i7 3930K had been out for a year; by the time the FX 8370 was released, both the i7 4790K and the i7 5960X were on the market. 2-core CPUs had been dead since 2010: Intel's first-gen i7 line had 6-core/12-thread extreme SKUs, and the Phenom II had affordable 6-core SKUs. 😂
The first sentence is true; the second is not. Its hybrid design, with two cores sharing a single FPU and cache, is a huge bottleneck. It makes so many threads pointlessly wait for others in the same task. Memory bandwidth and process size have nothing to do with that. It's simply bad design.
Nah, FX was a gamble on developers writing better multithreaded programs, and on games leaning less on one core and less on int, so it only shines with modern tasks and games, since it has more cores to work with, while 4c/4t has worse 1% lows in modern games due to the lack of cores. The problem with FX was that it wasn't Intel, came out too late, and wasn't fast in single-core, despite being a cheaper product overall.
Also, the Phenoms were more expensive ($1000 for the X6 1090T, jeez), making them a higher tier: previous gen versus cheap AM3+ starter CPUs.
@@veryevilpersonfromillumina5893 Awesome, thanks for the input
I built an FX 6300-based system that lasted many years. There was a huge difference in price between FX and Intel processors back then.
I upgraded from a Phenom 965 on an AM3 motherboard to an FX8350.
They were excellent value for the money for keeping your existing socket a good while longer.
There is another video that shows how to properly milk the most out of your fx by overclocking the bus and ram as a priority.
Please OC! Also amazing vid!
Good review. I am surprised to see that there is a significant difference between the FX 6300 and FX 8320. I upgraded from an FX 6300 to an FX 8350 on my third PC.
The AMD FX 8-cores were the best solution for an office PC or a simple homelab server. You could have 8 cores at your disposal for very low cost. Sure, the single-core performance is lower than an i5 3570's, but that didn't really matter if you were running VMs for simple duties like a Windows or Linux server. Minecraft server, anyone? As a side note, the little brother of this i5, the 3470S, was the obvious choice from a consumer standpoint back in those days. Thanks for the content, I always love your tests.
Office PC? I dunno. Maybe if you were working from home, but not in a business; having a ton of high-wattage CPUs in a single office is not fun. Been there already. But a home office or home server, for the price... sure!
@@martinkoyle An office PC needs more cores to open extra tabs ~
Well, the fact with FX was the multitasking capabilities...
An FX-6300 would perform mostly like a 3rd gen i3 for single tasks (not talking about single-threaded apps, but using a single app on the PC), while an FX 8320 would perform like a 3rd gen i5, and an FX 8350 like a high-end 3rd gen i5.
In some scenarios and cherry-picked apps, an FX 6300 would battle toe to toe against a 2nd or 3rd gen i5, pretty much like an FX 83xx would battle against an i7... but most of the time the performance was more on par with an i3 and i5 respectively.
The real improvement of the FX CPUs was the extra cores, which bring a lot of multitasking to the mix... on my FX 8350 I was able (through core affinity in the Windows Task Manager) to render photos with RawTherapee on 6 cores (making it effectively an overclocked FX 6300) while leaving 2 cores (effectively an A6 APU) for web browsing and even emulator gaming without stutters or problems... if I tried to do that on an i5, the performance penalty was too high...
Therefore FX chips were better for multitasking, which is also relevant when you play games while streaming and/or running comms software + some Chrome tabs... At this point it might seem like there's no value left in these CPUs and you can't do this kind of multitasking... but if you play indie and/or older titles you can still pile on a lot of multitasking, and the FX 8-core chips are, up until 2024, still kind of okay-ish for 1080p/60fps. They are 12-year-old designs, so you can't really ask that much nowadays, but at the right price they can still provide a decent platform.
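(For anyone who wants to script that core-affinity trick instead of clicking through Task Manager each time, here's a minimal sketch using the third-party psutil package; the process name and CPU numbers are just examples.)

```python
# pip install psutil
import psutil

# Pin a heavy renderer to CPUs 0-5 (three FX modules, effectively an FX-6300),
# leaving CPUs 6-7 (one module) free for browsing/emulation.
# "rawtherapee.exe" is an example process name; adjust for your system.
for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "rawtherapee.exe":
        proc.cpu_affinity([0, 1, 2, 3, 4, 5])
        print(f"Pinned PID {proc.pid} to CPUs 0-5")
```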
Next do the i7 vs an FX 9570 or 9590
Thank you for testing the OG NFS Most Wanted; it is a very interesting game from a hardware perspective. Getting high fps is, for some reason, EXTREMELY difficult, and with unlocked fps it has severe frametime issues. To put that into perspective, I have tested this game on my 5950X and could only muster ~100fps on this ancient game 😄 So it's always interesting to see how other, more period-correct hardware performs in it.
Btw, it would be nice if you could revisit this AMD FX with a moderate OC; I heard it really benefits from it.
Heh, I think during that time developers didn't know what to do: how to properly use extra threads while still keeping games playable on single-threaded machines. Unfortunately both suffered. As for overclocking, sure; a lot of people have commented asking as well. I'll have to find or put together a better heatsink, but yeah, definitely will.
@@jims_junk Nice, looking forward to your FX revisit then. In case you wanna see the NFS MW bench on the 5950X, here is my older test:
th-cam.com/video/ZL_jRHbkszg/w-d-xo.html
Can you include Cyberpunk in the tests next time? I saw interesting results regarding FX CPUs there.
The FX CPUs got price cuts real fast, a few months after release. The FX 6300 fell to $90-100 and the 8320 to $130, and their motherboards were also cheaper, at $45-70.
When they were released, FX chips were expensive. With reviews badmouthing them (unfairly) and picking on them especially in games, prices dropped by at least half. It took only 1 or 2 months for prices to fall.
@@opinadorrj4337 The FX 8150 came out costing $245 while the FX 8120 was $200. The FX 8320 came out costing $169 and the 8350 came out costing $200.
The FX 6100 came out costing $160 and the FX 6300 came out costing $130.
And it's true their prices dropped within a few months.
I still have an FX 8320 PC I sometimes still use. Good value, similar performance to the Intel i7 2000 series.
FX: 4 modules, 8 physical threads.
4 FPUs, 4 caches, 8 cores, 8 threads.
Could you benchmark my mom vs my dad?
I still have to decide who to plead for in court
WTF 💀😂
The FX 8xxx architecture is still capable of running relatively modern games at a sluggish but steady 30-40 FPS. On the other hand, Intel 4c/4t CPUs are now a stutter fest. Thus, AMD was able to win, somehow, in the end 😁
That's not true. If the FX can run some modern game, then a 4c/4t i5 will run it just as well, and an overclocked i5 will even beat it; especially at around 5GHz, the i5 ends up untouchable for the FX no matter how far you overclock it.
@@patrickc8007 Nope. Games like Cyberpunk 2077 (even a horrible mess like Starfield 😝) tend to run steadily on FX-83xx CPUs. Slowly, but steadily, without many 1%/0.1% drops. On Intel 4c/4t CPUs the average FPS may be fine, but they stutter heavily, with 1% and 0.1% lows dropping below 10. No amount of overclocking helps; there are simply not enough threads.
@@aleksazunjic9672 My brother finished CP2077 on a 4670K, which is 4c/4t, and didn't notice any of that. From what I saw in benchmarks it ran pretty much the same as on an 8350, or even slightly smoother. Both are irrelevant at this point though; it dropped to 35-40 fps in busy areas.
@@patrickc8007 Hey dude, back in the day I used to play a lot of games on hardware that was less than adequate 😁 If you really want to play and finish a game, you simply pretend not to notice the slowdowns and stutters. This is especially true for single-player titles. But objectively, if you measure FPS during gameplay, 4c/4t is simply not there anymore. Of course, I agree that both CPUs and technologies are obsolete, but it must be said that AMD did age better and "won" in the end, i.e. their multi-core technology finally became partially useful.
@@aleksazunjic9672 Benchmarks say something else. Those aren't real 8 cores, man; there's more to it: shared FPUs, latency, the memory controller. Even if a game uses 100% of the FX, you aren't fully benefiting from those 8 cores, since they share only 4 floating-point units, which are the most important resource in gaming. Even in pure rendering, an unlocked, overclocked i5 can match or even beat it in Cinebench multi-core using half the cores. The FX just closed the gap a little in newer games, but it still isn't better, and both have aged to the point where using them is simply a pain. There are much better cheap options these days, especially the AMD AM4 platform and the Ryzen 3000/5000 series; best bang for the buck.
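(Since 1%/0.1% lows keep coming up in this thread, here's a minimal sketch of how those figures are commonly derived from a frame-time log. The convention of averaging the slowest frames is an assumption, some tools take the percentile frame time instead, and the sample data is made up.)

```python
# Made-up frame-time log in milliseconds: mostly smooth, a few heavy spikes.
frame_times_ms = [16.7] * 980 + [40.0] * 15 + [120.0] * 5

def low_fps(frame_times, percent):
    """Average the slowest `percent`% of frames and report it as FPS."""
    worst = sorted(frame_times, reverse=True)
    n = max(1, int(len(worst) * percent / 100))
    return 1000.0 / (sum(worst[:n]) / n)

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
print(f"avg: {avg_fps:.0f} fps, "
      f"1% low: {low_fps(frame_times_ms, 1):.0f} fps, "
      f"0.1% low: {low_fps(frame_times_ms, 0.1):.0f} fps")
# Prints roughly: avg 57 fps, 1% low 12 fps, 0.1% low 8 fps.
# A decent average can hide exactly the stutter this thread is arguing about.
```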
You don't need to make more heat when overclocking this CPU. Often you can squeeze more MHz out of them at stock or below-stock voltage. Also, tweaking the FSB/HT and getting some really good RAM will have even more impact than an extra GHz of clock.
TL;DR:
Raise frequency, lower voltage - same power
Raise FSB speeds - no extra power draw
Tweak RAM timings and speeds - these really come alive with 2400MHz DDR3
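(A quick sanity check on the "raise frequency, lower voltage - same power" line: dynamic CPU power scales roughly with V² × f, so a small undervolt buys back most of the power cost of a clock bump. A minimal sketch; the clocks and voltages below are hypothetical.)

```python
# Dynamic CPU power scales roughly as P ~ C * V^2 * f (C fixed for a given chip).

def relative_power(f_old, v_old, f_new, v_new):
    """Ratio of new dynamic power to old."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Hypothetical example: stock 3.5 GHz @ 1.35 V vs. 4.0 GHz undervolted to 1.30 V.
ratio = relative_power(3.5, 1.35, 4.0, 1.30)
print(f"New power is ~{ratio:.2f}x the old")  # ~1.06x: 14% more clock, ~6% more power
```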
I had an FX 4100 before, and I was a noob overclocker back then; IIRC I pegged the CPU voltage to about 1.3-1.35 and stuck the clock at 4.3GHz. It was so glorious back then. I hope you get to try and OC that 8-core chip; it can probably do 4.5GHz easy.
The only reason I bought an FX 8-core back then was to play GTA 4 at 60fps, plus it was cheaper than Intel.
CPU-Z shows your i5-3570 running at 2.65V! I'm pretty sure that's a sensor glitch lol
Still have 2 8320’s and a 3570K (swapped in a 3770 years ago). Just got a 1070 GPU on eBay for $77 to test in them, expecting a big cpu bottleneck.
Can we see this with 3770? :D
I find it weird how my Z77 i7 3770K doesn't feel as smooth as my 9590, though.
I know what you mean. I used Ivy Bridge processors for a good while, overclocked them, and they were good at what they did. Even with an SSD, Ivy Bridge and Sandy Bridge really didn't seem to pop unless they were overclocked or I was using a HEDT version. Nehalem was even worse.
Before the worldwide flu I got lucky and found an FX-8120 and a 990FX-UD3 board for next to nothing, then later found an FX-8350 for pennies. Just for general usage (opening files, access times, boot-up and load times), those FX processors were like glass.
The only Intel platform I've used that seemed to have near-zero latency is LGA1200, mainly Comet Lake.
Back when I had an 8320, I had to OC it to 4.6GHz at 1.4-something vcore to get it to quit stuttering in newer titles. One I remember in particular was Rust. Anything below 4.6 and it was a stutter fest. Never got it any higher than that though; my lowly Hyper 212 couldn't keep up past 1.4-something vcore.
I threw an OptiPlex away because it was stripped (no front panel or side); that was an i5 3570 and it served me well, it never crashed when running MIDI software. I had a Samsung i3 2130M laptop that was always crashing. I replaced them with another OptiPlex, a 3020 with an i5 4440... I still have it and don't know what to do with it, because Win10 hits end of service next year, so it's just sitting there doing nothing. I also have an Asus G531GV laptop: i7 9750, 65W, 6 cores/12 threads, RTX 2060. It's a heavy laptop.
In this case the Win10 Task Manager is correct; the 8-core thing is a marketing gimmick in the end. By design, if a core shares resources like cache with another core, it's technically a thread and not a core. It's just a full hardware thread, instead of the "push other stuff onto the same core in the background" approach of Intel chips of the time. In the end, it's still just a thread. If it had its own dedicated cache and FPU, it would be a full-on core by definition. This is why the Win10 Task Manager says 4 cores and 8 threads... but the argument is still going on, and that's why CPU-Z still says 8 cores. Technically it's not wrong, but it's also technically not right.
Task Manager reads the FPUs.
@@amdintelxsniperx The counterargument I've seen: early Intel chips like the 386 had no FPU, yet each was still considered a CPU and a single core; if there had been such a thing as a dual-socket 386, it would have been 2 CPU cores. If you then wanted an FPU, you would likely add only one, to share. Either way, the CPUs were still considered whole cores, and sharing resources doesn't just turn two cores into one.
@@martinkoyle Only a fanboy would call someone else one. I've been building PCs since '97 and have a pile of Intel and AMD CPUs. Just because I'm using logic to explain why calling it a 4-core is inaccurate doesn't make me a fanboy troll.
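(For what it's worth, you can read both of the numbers this thread argues about from one machine. A minimal sketch using the third-party psutil package, which counts "physical" cores the same way Task Manager does:)

```python
# pip install psutil
import psutil

physical = psutil.cpu_count(logical=False)  # cores as the OS counts them
logical = psutil.cpu_count(logical=True)    # schedulable hardware threads

print(f"physical cores: {physical}, logical processors: {logical}")
# On an FX-8320 under Windows 10 this would likely print 4 and 8, mirroring
# Task Manager's 4c/8t view of the module design, while CPU-Z reports the
# 8 integer cores present in silicon.
```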
Can you test the i7 2600? or any Sandy Bridge CPU against other CPUs you have?
Was thinking of doing a Sandy Bridge vs Ivy Bridge comparison, since they're the same socket. I remember upgrading to Ivy from Sandy and noticed a huge difference.
Don't use an Nvidia GPU...
The FX probably can't handle the software scheduler 😂
AMD GPUs have a hardware scheduler...
You need extra cores for Nvidia.
That explains why games stuttered on Nvidia cards vs AMD cards. I tested last year, and even my R9 380X did better in Cyberpunk than the 1060 I tried. The 1060 was a god-awful stutter fest, while the 380X handled 30s/40s smoothly.
Are you applying the updates for Meltdown and Spectre to the i5? While not applying them is more representative of how they would have performed at the time, it's pretty obvious in hindsight that most of the performance gains Intel had over AMD at this time were the result of cutting corners and not properly handling cached data.
That's a good point, but no: I run them fully up to date, to see which is truly faster with all things considered. I could do a vid comparing the two with and without those fixes enabled.
@@jims_junkyes please, I'd love to see that
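(For anyone replicating a with/without-mitigations comparison, a minimal sketch of checking what's active, assuming a Linux test box; the kernel exposes per-vulnerability status under sysfs. On Windows, Microsoft's SpeculationControl PowerShell module reports similar information.)

```python
# Linux kernels (4.15+) list Spectre/Meltdown mitigation status under sysfs.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
for entry in sorted(vuln_dir.iterdir()):
    # Each file holds a one-line status, e.g. "Mitigation: PTI" or "Not affected".
    print(f"{entry.name}: {entry.read_text().strip()}")
```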
What program are you using to edit?
Movie Studio Platinum ...aka cheap Sony Vegas. Why do you ask?
@@jims_junk At 2:00 that zoom to the left was phenomenal, and I kinda wanna do something like it.
Lol, my i3 4170 can outperform an 8320. I actually still have my 8320 (the mobo is dead, but I still have it). Also, just so you know, you can overclock the FX to 4GHz without issue; it is the exact same processor as an 8350, just lower clocked.
2 core 4 threads? You capping😂😂
@@erviinkehhh look it up
@@erviinkehhh Look it up: an i3 4170 is about 13 to 17% faster than an FX8320. My i7 4790 blows it out of the water by a lot.
You could overclock it, and yeah, that'll speed it up a lot, but it'll never touch the 3770 i7.
It literally beats it in a whole bunch of these tests. Don't speak if you haven't bothered to look up what you're talking about.
you don't really understand
intel 🎉
I'm starting to wonder whether development has been asleep.
Or whether it's all marketing!!! 10 years later, not much more performance!
No doubling! Ridiculous, really!!! I use an FX 8320 myself and only have to switch because of Windows 11. Almost no pressure to upgrade.
8 cores 4 cores
8 cores 4 threads
4 FPUs, 4 caches, 8C, 8T 😉
Bruv, the FX8350 ain't 8 cores!! AMD got sued for lying to people!! Same with the FX6300, it ain't a 6-core CPU!!
The lawyers who helped that person win are idiots lol. These do have 8 cores, but only 4 integers.
There are definitely 8 cores in silicon; unfortunately, they're not _good_ cores, further hampered by the horrible layout.
Just google FX8350 die shots and you can literally count for yourself that there are 8 actual cores that share resources.
@@amdintelxsniperx *FPUs not ints
@@kingeling My bad, that's right lol
Where's the overclock? noob