I used a 2500k from 2014 to 2021 and never knew that it had a legendary status. When I built my 1st PC in 2014 it just sounded like it was a good overclockable CPU that I could still use in later years. It was still chugging along in summer of 2021 when I built my current setup. Now it just sits in my old tower collecting dust.
After 10 years for me, I've decided it's time for an upgrade. I've had my 2500K OC'd to 4.8 GHz (air cooled) for a decade and it's still performing identically to when I got it. I would say the 2500K is the best CPU ever made; unfortunately it was made 11 years ago. I just bought an i7-12700K and am seriously considering framing my 2500K on my wall.
@zssllayers1946 Depends on your luck, CPU cooler, motherboard, etc. But tbh 4.5 GHz for an i5 2500K is really the norm. I had 4.6 and managed a stable 4.8 (with no performance improvement though; I think my board isn't good enough to deliver that power).
@zssllayers1946 Hmm...Take a closer look at your set up and follow some of the online guides. CY's response to you is spot on. Luck is a factor but what is your cooling situation like?
In my opinion cheaping out on CPU is unnecessary, CPUs aren't overpriced like GPUs and there are a lot of good options depending on budget, as long as your budget for a CPU is above $100. Something like a modern i3 or Ryzen 3 3100/3300X is plenty good for 60fps in modern games and a 6C/12T i5/Zen 2 or above Ryzen is going to give you a good platform for now and the future. If your budget is below $100 for a CPU, then something like an old i7 or high core count Xeon could be good cheap options and likely much better than an Athlon/Pentium.
@@younglingslayer2896 Pointless frankly. I wouldn't upgrade if I was one broken knee away from bankruptcy. Sorry man, I know it sucks. Maybe some old system, but honestly, it wouldn't be worth it.
I know someone who still has an i7-2600K in his main PC. Paired with a GTX 980 Ti and a 4.2 GHz overclock, it still holds up reasonably well. By reasonably well I mean it manages to stay at or above 60fps in games like Star Wars Battlefront 2; he also plays Battlefield V, COD Warzone, and COD Cold War (zombies specifically). He seems quite happy with the performance of the CPU after nearly 10 years of service.
Literally since 2013 when I first got into PC gaming I've been exclusively a 1080p gamer with what I've played, what I've built, etc. etc. Haven't felt the need to spend the money to get into 1440p let alone 4K. Maybe someday I will but 1080p does the job for me.
The higher the resolution, the more expensive the hardware you need to feed it properly. I was using a 5700 XT at 1080p with my first build, and my second build since last year has been a 5900X/6800 XT at 1440p. I'd rather use a 4K-class card on a high-fps 27" 1440p IPS display than on a 4K one, though I do occasionally hook my machine up to my flat screen. Still, I'd personally rather have the higher fps.
Yeah, I swapped to 1440p and regret it... you can't "unsee" it... but the price to keep it up to date is HIGH... wish I had stayed oblivious and at 1080p... I do miss the 1680x1050 era though :D
My mate had the 2500K until last year. He overclocked it to 4.6 GHz and was able to play every Call of Duty each year (with an Nvidia 970 in the end). Legend CPU. He now rocks a 5800x though after finally upgrading.
Where were the 1% lows, Chris? When testing really old CPUs with current games and GPUs, the frame drops are probably the most important thing to measure. I know you don't always do benchmarking, but you've criticized other outlets' methods in the past, including Jayztwocents for the titles he selected, and here we are with a mix of current and older releases using built-in benchmark tools and no 1% lows.
@@bcal5962 He's right, average fps means nothing on barely capable CPUs. A game could be a pure stutter and freezing fest, which is in no way playable, and average fps doesn't show that.
Absolutely. The 1% lows render these newer games unplayable on these old CPUs. I've tested several games with a 2600K at 5 GHz and you will get these massive drops.
You can see the frametime graph; that IS the data you're looking for. Any spike on that line is a stutter, which the 1% lows represent as a number, but the actual frametime line is king when looking at game performance.
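For anyone unsure what a "1% low" number actually summarizes: it's just the average of the slowest 1% of frames in a capture, which is why it catches the stutters an average hides. A minimal Python sketch, assuming frametimes in milliseconds (as tools like PresentMon or CapFrameX export); the sample numbers are made up for illustration:

```python
def one_percent_low(frametimes_ms):
    """Average FPS of the slowest 1% of frames in a capture."""
    worst = sorted(frametimes_ms, reverse=True)  # largest frametimes = slowest frames
    n = max(1, len(worst) // 100)                # the worst 1% of samples
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# A run that looks fine on average but has a handful of 40 ms stutters:
frames = [10.0] * 990 + [40.0] * 10
avg_fps = 1000.0 / (sum(frames) / len(frames))
print(round(avg_fps))                  # ~97 fps average looks smooth...
print(round(one_percent_low(frames)))  # ...but the 1% low is 25 fps
```

The average barely moves, while the 1% low exposes exactly the stutter the frametime graph shows as spikes.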
Cool test. Not really surprised by the results. In my backup rig I still use Intel's 13-year-old Core 2 Quad Q9650 (4C/4T) OC'd to 4.1 GHz, coupled with 12 gigs of DDR2 1333, a 1050 Ti, and an SSD, and it's still able to run most games at 1080p/40-60fps. Granted, it's not a joyride, and a minority of modern games simply refuse to run due to missing instruction sets, but it's still pretty cool that you can actually game on a 20-buck CPU from 2008.
And of course there's the TSMC 20% price hike on CPUs and GPUs now... it just keeps getting better and better. And that's from TSMC, so what we see down the pipeline will probably be higher. It's getting tight out there, boys. Good content, and a good direction to go given the state of things now.
I must be tired. When I first read this comment I swear it said "Sandy Bridge was a Joke" and got that quick back-of-the-neck burn lol. Then I read it again and saw it said "Sandy Bridge was no Joke" and went phew lol. I'm still sporting a Sandy i7 2600K @ 5.1 GHz and feel I got my full money's worth out of this CPU; it owes me nothing. Heck, I play pretty much all of the new titles and it still chews through these new games without too many issues at all. I'm sure one of those new 8/16 Intels or AMD's own 8/16 would do much better, but I play on a 32" 1080p 60Hz screen, so I guess it doesn't have to sweat too much to churn out only 60FPS for me lol. If I unlock FPS it shoots up quite a bit in most every game. I think the hardest game I've played on it was Assassin's Creed Odyssey; that game would not go over 70FPS for me no matter what and it stuttered from time to time. Assassin's Creed Valhalla runs great though, smooth as can be. That's kind of weird but I'll take it lol.
I went from a 4th-gen i3 (2c/4t); the thing was bottleneck central in every game. Finally said F it and bought a 10th-gen i7, which is overkill and was out of budget. This 8c/16t never bottlenecks; that's a thing of the past now, and it's very smooth even at sub-30 fps. The 1% lows were the reason for buying this CPU; frequently dropping from 80 to 30-50fps on the 2-core i3 just annoys you. To anybody reading this: don't overspend like I did. For gaming, at most go for an i5 or Ryzen 5 (6 cores/12 threads); that's more than enough to game with 1% lows no less than 50 fps in the worst case. At minimum, grab the latest 4-core/8-thread CPU you can afford and you won't have a problem with stutter.
That's a known fact and unwritten rule by now. They really need to make a new from the ground up next gen engine to take advantage and be optimized for the PS5/XSX consoles and for the current and future PC tech.
I shifted to a Ryzen 5 3600 and gave the old i5 3470 PC to my kids. I bought them a used RX 570 for around 90 USD a year ago and it still runs modern games very well.
Before the pandemic hit and messed up the used PC market, I regularly found AE Ryzen 5 1600s for 35-45 used and B350 mobos for 30-35. It's an excellent and still somewhat relevant platform.
@@chrisvanalstine8136 The i5 is not 30% faster... and in modern-day games the FX-83xx will perform better. Also, to answer the question: it depends on the game; FX can be funky.
@@roughousegaming9606 I'm sorry, you're right, it's not 30%... it's 28%, and that's the 1% lows, not the 0.1%. The average is around 20%. Anyway, FX is never going to magically get good, whatever numbers you want to nitpick over.
@@chrisvanalstine8136 🥱 I never said FX was good; a straw man isn't necessary. And can you elaborate on the 28% faster claim? The original poster said Piledriver. We are long past the era of 4 cores being enough; single-core performance is nice, but not having enough resources is crippling, i.e. it's like running out of VRAM. Hence why the original poster was curious; an 8/8 CPU should fare much better.
@@roughousegaming9606 I didn't say you did. The 28% is the percentage calculated from a few benchmarks, and the FX 81xx/83xx/95xx is the "8 core", unless you mean the low-clocked Opteron stuff. Even with the i5 being 99% utilized it is still faster; single-core clock speed has nothing to do with it, it's the stronger floating-point units and overall stronger core architecture.
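Since the thread keeps bouncing between 20%, 28%, and 30%, here is the arithmetic in a quick Python sketch. The fps numbers are hypothetical, and part of why figures drift in threads like this is simply which chip gets used as the baseline:

```python
def percent_faster(a_fps, b_fps):
    """How much faster A is than B, with B as the baseline."""
    return (a_fps / b_fps - 1.0) * 100.0

# Made-up numbers: an i5 averaging 64 fps vs an FX averaging 50 fps.
print(round(percent_faster(64, 50), 1))  # 28.0 -> "the i5 is 28% faster"
print(round(percent_faster(50, 64), 1))  # -21.9 -> the FX is ~22% slower, not 28%
```

Flipping the baseline gives a different magnitude, so "X% faster" and "X% slower" are not interchangeable claims.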
I was so excited when I heard him say he was using the 6700 XT. As a 4.4 GHz 3770K owner, this is the sort of combination I always wanted to see tested. The only unfortunate thing is the lack of 1440p and 4K testing. The 2500K clearly bottlenecks the 6700 XT at 1080p when you see sub-80% GPU utilization. I hope he does a follow-up video at 1440p and higher. I think older 4c/8t parts like the 2600K/2700K, 3770K, and 4790K still have some life in them yet. This is especially true for the PCIe 3.0 x16 capable parts from Haswell and up.
If the cpu can’t do 1080p it can’t do any other resolution. The whole point of using 1080p with an OP GPU is to show the max fps capable with this cpu in these games. A viewer is sending me a 3770k so I’ll do that video soon. Basically you’ll see what the absolute max fps you can get out of a 3770k will be regardless of resolution.
This is great content. I'm really curious to see the same test with an i7-2600K, 2700K, or 3770K, especially paired with 2133 or 2400 MHz RAM. I bet they'd do substantially better, having 8 threads.
20-25% for the CPU. Not sure if that'd be enough. I'll probably snag a 2600k since I have the rest of the test bench already. 3770K's still command too high a price. The RAM would probably help Valhalla, but doesn't seem to be an issue for the other games tested.
@@TheGoodOldGamer My own experience going from a 3570K to a 3770K in my old rig showed that minimum FPS was vastly improved with the i7 in modern CPU-heavy games. The RAM speed made a much bigger difference than I expected as well. In games that hit the CPU hard, I got a 15-20% boost out of 2133 MHz RAM versus 1600. Those old i7s were starved for memory bandwidth with 1333 and 1600 RAM, at least in modern titles that hit the CPU hard.
@@BonusCrook I had the 3770 @ 4.2 GHz for a few years. It worked great with 2133, but it really struggled with Sony A7R II 42-megapixel files, usually 80 megabytes each. Regardless, it was definitely rock solid, until it died for some reason this year. I have upgraded to the Xeon 1680 v2, OC'd to 4.5 GHz.
For many AAA titles 4c/4t has not been enough since 2017-18, even with a "maxed out" OC; even if you got 60fps, it stuttered more often than lower-clocked 4c/8t parts. My i5 6400 was suffering in some games as early as late 2016, but that's probably more because of the low clocks than a lack of threads.

A few comments about the games tested:

CP2077 becomes even more CPU demanding in parts of the city with many NPCs than in the scene shown in this vid, and if you have a GPU that can run it with RT on, it in my experience becomes even more demanding on the CPU. My 3600 could not keep my 3080 above 65-80% utilized in some areas with RT on at 1440p and DLSS Quality active.

AC: Valhalla, in my experience, is actually a bit less CPU demanding than Odyssey, so it would have been interesting to also see how that game runs. My 3600 at least had more in-game issues with Odyssey than Valhalla.

Watch Dogs Legion is in my experience harder on the CPU in game than in the benchmark, most noticeable when driving at high speed in London. This was another instance where my 3600 had a bit of a hard time keeping a 3080 fully utilized.

HZD is one game whose benchmark, in my experience, doesn't really show how the game will actually run. It seems to mostly show the best-case scenario, and you will spend most of your time at a lower framerate while playing, at least on more balanced systems.

I really liked this video. The only thing I'd like added in future videos is 1% lows, as they help show stutters a bit more clearly.
I used an i5 2400 (3.4 GHz max turbo), 16 GB of RAM in dual channel (1333 MHz), an RX 480 4 GB, and an SSD until last summer (2020). I played on a 900p (1600x900) 19-inch TN panel with no kind of variable refresh. Most games played quite well. Obviously there were some AAA titles, like Ubisoft games, that were CPU bottlenecked, but outliers aside, it's doable. I locked all of my games to 60 FPS except Rainbow Six Siege. Mine was an old business PC build, so the motherboard was trash (the BIOS was very simple and locked down; I couldn't change the RAM timings, there were no RAM options at all, and even the boot order was locked; whoever made the motherboard assumed the person using the PC was an idiot). But that build can handle most games. Especially with games from 2018 and earlier you will be fine; 2019 is when some of those outliers began coming out that cause a significant CPU bottleneck. Interesting video. I prefer this type of content. Keep up the good work.
I was running a 2500K until near the end of 2017. The last thing that made me sick of it was when an update to Quake Champions gave it really bad stutters. Average fps was still about 110, but the stuttering was so bad that 1% lows were like 25. Even before that update it wasn't a very smooth experience, but it was okay. The upgrade was a Ryzen 1600X, and averages jumped from 110 to like 145 while using a 150fps limit, with an even bigger jump in 1% lows. Not sure how well it would run now, as the game got optimized in later updates, but I think some other games might have similar issues.
I think most people would write the 2500K off because you can get better CPUs around the same price that offer better performance. There is no reason to go back 10 years to ancient hardware when there are better chips for the same or lower prices. All that really needs to be said about the 2500K is: it's old, compared to current chips it sucks, and if you like playing games as though it's 2011 it will be perfect for you. The 2500K is on my list of "if you find one propping up a desk leg, it's capable of running old or small games, or big games if you don't much care about frame rate or especially frameTIME stability, but it's in no way something desirable to purchase in 2021."
Doom just doesn't give a fuck what hardware people have; you just click play and play. That's something other game developers should pay attention to and learn from. Great video man, keep them coming. Price/performance is and always was the thing.
I'm still running my 2500K. I have it at 4.4 GHz! Still running strong! I run MS Flight Sim with an old GTX 770 on medium settings without an issue! Also have 32 GB of RAM and a 1TB SSD. I do want to upgrade to an RTX 3060 Ti at some point, when you can get them for 399 again.
still rocking the 2500k with a 970 (gpu and ram upgrade about 5 years ago) and is a SOLID 1080p computer to this day. finally thinking about upgrading since i want to upgrade to 1440p. just hope the new build holds up as well as this one has.
I still have this legend, along with an i7 2600K in my own rig. A shame that Windows 11 will finally kill it off. Just to be clear, mine have always been run at stock speeds. The i7 2600K coupled with my RX 580 can play Red Dead 2 at between 45-60 fps on a mix of ultra, high, and some medium settings; an awesome result imho. I did try coupling an RX 6600 XT with the i7 and played Red Dead Online, but the CPU and system as a whole bottlenecked the graphics card so much that fps was no better than on my RX 580 4GB version.
My i5 2500 (3.914 GHz), P8Z77-V LX, 16 GB of 1600 MHz RAM, and 1660 Ti performs nicely: 92-98 percent GPU usage with 60-70 percent CPU usage in benchmarks. Ark: Survival Evolved (yes, Ark) runs smoothly on epic settings at 1080p capped at 35 fps, or on high settings at 1080p at 50-60 fps. The benchmarks I've seen aren't much different, so I'm sticking with this legendary CPU until it's completely dead; it does everything I need. ALSO, I would advise purchasing this processor! Get a used P8Z77 board and an i5 2500 for like 80-150 dollars and you're set. Find a used 1660 or 1070. You can play any game you want, it's cheap, and you're losing minimal fps!
Still using it. I see no reason to waste money changing it; I'm no competitive gamer anyway. I changed the motherboard once as it fried, and I'll buy another board (probably Z77 with M.2) soon, as the current one is also a bit wonky. People who complain that 45fps = bad are rich people; you can afford better stuff.
Even though I am like a kid in a candy store watching the amazing graphics of today's games (compared to the ANSI graphics I loved in the early '90s LOL), I don't game. I do however use the heck out of my Dell i5 2500K machine for digging through the internet and researching stuff, and I use it for minor video editing for family, using Vegas 14, with no problems. Now, pardon my excitement and ignorance, carefully blended (LOL), but I have had a couple dozen browser windows open, watched a YouTube video in 1080p or better, and transferred files from an SSD to a flash drive, all at once, with no slowdown, no problems. THAT is why I like the i5 2500K, and it was cheap! Keep in mind though, I'm not a tech, just a truck driver by trade, AND I came from the 286/386 days, so yeah, I'm happy! :)
Hello everyone, I'm still using this processor, and although I'm already looking for an upgrade, I'm generally satisfied with 16 GB of RAM and a GTX 1080 (which is probably being held back by the processor). My CPU is now clocked at 4.2 GHz; in the past I ran it up to 4.9 GHz, but I don't want to stress it too much anymore. Great processor.
I'll just never understand modern PC gamers' obsession with 60 fps. You should have been around in the '90s and early '00s; we were lucky to get 20 fps sometimes. And consoles were at 30 fps for years and years, and millions upon millions of gamers played them without complaining about it. Don't get me wrong, I can tell the difference between 30 fps and 60+ fps. I strongly prefer higher framerates, but to act like anything under 60 is unplayable? Millions of console gamers would say otherwise. Myself, it's a bit jarring at first if I have to play something at 30 fps, but it doesn't take long to adapt and ignore it.
As long as you cap your refresh rate to match it. 30-60 FPS @ 120Hz is a damn slide show in certain games. Also there was something magical about 15-30 FPS consoles on a CRT. That's why emus to this day can look horrible compared to that experience. [insert Digital Foundry did a video on it blah blah]
Great to hear this although pairing a readily obtainable $50 CPU with a relatively unobtainable $1000 GPU is an odd choice for a new direction of the channel. I think if people pay that much for a GPU they are not going that cheap on the CPU or motherboard.
I'm still rocking a 2500 and pushing the OC to 4.7 really easily. Have OC'd RAM and Nvidia GTX 970, they're still holding up. Considering a new GPU, run the CPU into the ground then do a new build around the GPU. I actually got this processor from a mate 5 years ago. It's been an absolute beast.
That's what I have in my PC with a Radeon 5700XT. I can run some VR games on Quest 1 and Cyberpunk. It's been a good time since around 2010. I just ordered a Ryzen 9 7900X along with a Gigabyte B650E AM5 motherboard. Windows 11, here I come.
It's 2021... the GPU market sucks and is a threat to the PC gaming industry. Having said that, I don't think CPUs have been the problem. CPUs like the 6c/12t Comet Lake i5 10400/10400F at around $150-ish have been readily available for at least the past year (the 4c/8t 10300 was close to $100 for some time). If people are still clinging to 10-year-old CPUs in 2021, it's not from a lack of quality affordable options; they are out there, and in 2021 even a person on a tight budget should be able to afford a 10400. The real problem right now is the GPU market; these videos should probably focus on various older GPUs rather than CPUs.
Great video... I'm still using my i5 2500K (had to replace it with another one because the first one wouldn't OC anymore, though this one tops out at 4.4) with an RX 5700, which has done well even though the GPU is bottlenecked. I think the 2500K should hold me over until the second generation of socket AM5.
I had the i7 2600K in 2012, I believe. 4 cores/8 threads. It was an awesome chip and overclocked really well too. Then in 2014 I got an i7 4930K (6c/12t, not 4c/8t as I first wrote) with quad-channel memory. Probably a better match for more modern games.
You're not alone bro, I'm digging through offers and spotted a desktop Fujitsu i5-2500k, 8gb DDR3 for my dad, general use, no gaming. Just youtube, facebook and stuff like that. Was doing some research on this Gen2 i5 and it seems it was a beast of its time, I think I'm gonna buy it for my dad. It seems, based off this video that it's also good enough for gaming, still. Of course, paired with a decent GPU and with the expectation that you will need to tweak graphical settings for that sweet spot where it's both playable and looking good enough. I think, if you're on a budget, you should buy one.
Yesterday I ordered R5 1600af to upgrade my 10 y.o. system to a new platform and possibly upgrade later on :) I've been running i5 2500k till now and that CPU is astounding. Still handles itself well and if I would not need to change the MB I would not even buy a new CPU.
(I have an X99 system for my gaming.) I bought a Dell OptiPlex 390 with a server BIOS just for fun, replaced the Celeron with a 2500K and the 2 GB of RAM with 2x8 GB of HyperX Fury, slapped in a Kingston 256 GB SSD, and repurposed a 500 GB Seagate HDD for storage. Added a humble Gigabyte GT 730 2GB GDDR5 DirectX 12 GPU and overclocked that a bit. It plays up to the 2016 version of Skyrim and Fortnite. The CPU hits 4389 MHz on rare occasions, so it had already been overclocked in the past. With a stock cooler meant for an i3, I think it hits 75 degrees Celsius max while gaming. I wonder what the most balanced GPU is for the i5 2500K. Oh, and all that is still on the server BIOS; you have to enable Turbo Boost for the CPU in that BIOS 😜 Oh, and the amount of cache on the CPU also affects frame rates.
I find it funny how much CPU innovation/progression has stalled in the last decade. I still use an i7-4770K and it does the job for just about every game. It came out in Q2 2013, has stayed very usable 8 years on, and probably will be for at least another 2 years, until the PS5 and Xbox Series X become the mainstream consoles with their 8 cores and 16 threads. You definitely could not have said this about an equivalent processor bought in 2005, working well into 2013 and expected to perform well until 2015; the thing would've been a potato by then. I would really like to see the 2600K or 2700K tested with the same games and setup.
I had a 3770K before, which did the job quite well, but I had some really bad stuttering in SotTR in the town of Paititi, and in KCD in the town of Rattay. In scenes with a lot of players or NPCs it became a mess; when I was riding through Rattay on my horse it dropped to 24fps and I could barely maneuver through the townsfolk. It was such a massive upgrade going to my used 9900K in December. Looking forward to a "hyper-threaded" follow-up! The DDR4 RAM support and newer architecture should really improve things. 🧡
DO NOT DO IT!!! Have you overclocked your CPU? If not, don't waste your money on a 9900K; overclock your CPU to 4.7 GHz instead. It should be easy enough if you have a good motherboard. Secondly, what GPU and monitor do you have, and what resolution do you play at?
@@gershonlapokimon4698 I appreciate you caring for my rig, but I can calm you down. I ran the i7 @4.3 GHz, as it wouldn't go any higher no matter the voltage I threw at it. I upgraded because the system became a bit unstable and my 2400 RAM would only run at 2133. I wanted to buy a 5800X, but I wasn't willing to spend 500 bucks at the time. I got the i9 and Z390 Ace cheaper than the R7. I play on a G-sync 144Hz 1440p monitor and upgraded to a 3070 as well, because the cooler on my 1080 was crap. I liked my old machine, but I would never go back, everything runs so much smoother. What PC do you have?
@@fabiusmaximuscunctator7390 3770K OC'd to 4.7 GHz, EVGA 3080 Ti, GA-Z68XP-UD4, playing at 4K. I have a 9700K machine as well; both machines perform the same at 4K.
@@gershonlapokimon4698 Really? I'm honestly questioning that as the i7 was holding back my 1080 in most games quite severely, despite playing on ultra. Your Ti is twice the performance and the Z68 only supports PCIe 2.0. There are some games that ran fine like Metro: Exodus or Doom, but those are exceptions. But if it's fine for you, who cares? 🤷♂️
@@fabiusmaximuscunctator7390 Z68 supports up to PCIe 3.0 with a 3770K; with a 2600K and below, yes, it's PCIe 2.0. At 1080p it will bottleneck and the CPU will be hammered, but at 1440p or 4K in most cases it will not. You can see similar systems paired with a 2080 Ti gaming at 4K, and compared with a 9900K the performance is about the same.
@@somerandompersonontheinter1835 I think he's trying to show the absolute cheapest solutions too, and that i5, for some reason, still sells pretty high for what it is on eBay.
@@somerandompersonontheinter1835 Well, that chip does have better IPC, but the extra threads on the 2600K, especially overclocked to 4.5, would definitely bring them close to each other. Maybe do a video comparing a 2600K @ 4.5 to the last-produced 4c/4t chip and see how much performance you'd lose by going with the supposedly cheaper 2600K.
@10:53: Action single-player games such as Assassin's Creed, Horizon Zero Dawn, Red Dead Redemption, Cyberpunk, etc., are perfectly playable at 30+ fps. Even a sporadic stutter here and there doesn't wreck the gameplay in those kinds of games, since you can save your game any time you want. With such old CPUs, I would recommend you include games where stutters can really wreck the game, such as the F1 games, where a stutter can cause you to crash your car and ruin an entire hard-fought race in a fraction of a second. I still remember that my brother, who loved F1 games, had serious stuttering issues with an i5-760 (by the way, that was the only Intel CPU I bought from 2001 onwards; the issue was fixed when I tried an FX-8350). You should also include some esports titles (Fortnite, Counter-Strike, etc.) where, again, a stutter can ruin hours of gameplay. Overall, I liked the idea in this video. I always like to see older hardware in action.
I played CP2077 on a 3570K at 4.4 GHz and a 980 Ti with nearly all settings maxed out. Solid 50+ fps at all times, most of the time over 60. As long as you tweak the settings a little by lowering some of the post-processing effects from Ultra to Medium, you will mostly not notice any difference from all-Ultra. I don't remember exactly which ones, but two settings were dropping the fps a huge amount, and both had to do with post-processing particle effects. Shadows were the next biggest fps hit.
I'm running the 2500K with an HD 7950 I built like 5 years ago, and it's still running every game I've wanted to play. What should I upgrade first in the future? I can only afford one thing at a time. I guess it'll still be quite a while before GPU prices normalize.
The 2500K only needs to be paired with the right GPU. That 6700 XT was mostly running between 50-70% in most games; that is a big CPU bottleneck right there, in my honest opinion. If you have a 2nd or 3rd gen i5 or i7, the best GPU you could pair with it is an RX 470/570 or a GTX 980/1060; that is the borderline max you could get away with on those CPUs without wasting GPU performance due to the CPU bottleneck.
The whole point of cpu testing is to eliminate the gpu bottleneck and run the cpu at 100%. This shows the absolute max fps the cpu can produce in these games regardless of resolution
Nice test. I would have thought 4 cores/4 threads wasn't gonna cut it in today's games; 6 cores/12 threads is where I would put the minimum. I'm glad you finally received the 6700 XT from Paul.
I needed something for light gaming on a controller and PS1/PS2 emulation... grabbed this thing for $9... it works surprisingly well paired with a GTX 1650, even in some bigger titles, and emulation with the Vulkan renderer is flawless; the CPU isn't even much of a bottleneck there.
With an old CPU, I highly recommend just seeing if your PC can do 60, and if it can't, locking it to 30FPS with a frame limiter. I own a GTX 1080 and plan on playing Cyberpunk, since I want to play that game and don't see myself getting a new GPU for years, so I will be limiting it to 30FPS. I also game at 1440p.
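A frame limiter's job is simpler than it sounds: after each frame is rendered, it waits out the remainder of a fixed per-frame budget (33.3 ms for 30 FPS), so frames arrive on an even cadence instead of bouncing between 35 and 60. A rough Python sketch of the idea; this is not any real limiter's code, and `render_frame` is a stand-in for the game's per-frame work:

```python
import time

def run_capped(render_frame, cap_fps=30, frames=60):
    """Pace calls to render_frame so they happen at most cap_fps per second."""
    budget = 1.0 / cap_fps
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        deadline += budget
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)           # burn the leftover frame time
        else:
            deadline = time.perf_counter()  # frame ran long; resync pacing

# Sixty paced frames at 30 FPS take roughly two seconds.
```

Real limiters (in-driver or in-game) do the same pacing with higher-precision waits, which is why a capped 30 feels steadier than an uncapped 35-60.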
I'm running a 5800x with 32 gigs of ram. M.2 ssds. and a 1650 super. Playing cyberpunk at 1440 with 70 to 90 percent resolution scale and all low settings but only seeing about 45 fps avg and lows in the low 30s. I need a graphics card so bad but I am not paying the crazy prices.
It's all about game selection and playing older games. I used an i7 920 with a 2070 Super and a 4K monitor for almost a year, and oh boy, when the CPU bottlenecked, it did it really hard: 4fps in StarCraft 2 co-op when things got way too hard for the CPU (damn zombie pathfinding), where a 3900X manages ~60fps. I should have upgraded my CPU way sooner; modern CPUs are closer to three times faster in single-threaded or low-thread-count performance, and even faster once the core count increases are counted in. Gaming is a bad CPU benchmark because it averages things across several different bottlenecks, so when the CPU bottleneck really hits, it has a smaller effect on average FPS but makes for a really terrible experience. In Path of Exile, the CPU bottleneck in loading models meant that models appeared tens of seconds after I reached a new area; while FPS was in the playable range, the CPU bottleneck made the game experience suffer. So modern games' FPS doesn't necessarily even measure the same thing between different CPUs because of loading bottlenecks. On the other hand, if your system feels fast enough for everything you do, Intel, Nvidia, and AMD all have such rapid improvements coming that no matter what you purchase, it will feel slow in comparison in the not-so-distant future. Add the current pricing situation and you feel that you want more.
My brother's 6600K is starting to impact his everyday use of the PC as we play some harder-to-run indie titles. If it had 8 threads it'd be fine, but he's just not digging the stuttering. The cost of a good CPU and motherboard is still reasonable enough that I wouldn't dissuade someone from a platform upgrade.
I'm wondering what the minimum change necessary would be to get these games running better. it could be interesting seeing the effects (if any) of things like faster ram, using windows 7 or linux, cracked versions of games without drm, or in-socket cpu upgrades as a last resort.
What if you ran it on medium or low settings? Also why not cap it at 30 FPS? I get it we are PC gamers but at the same time its a 10 year old cpu. Isn't that part of the deal with hardware that old?
Lower settings had no effect in these games. That’s why I used an OP GPU so that wouldn’t be an issue. Also MOST PC gamers consider anything under 60fps average unacceptable
If you have old hardware at hand, upgrade the GPU and monitor and play games at 1440p or 4K resolution on ultra settings, and you're good for at least 5 years, since games will be on par with what consoles can do, and the PS5 and Xbox Series X have mid-range GPUs. You'll be good to go with your old hardware for a long, long time, even with a 2500K OC'd to 4.5-4.7 GHz. And if you have or can get a 3770K and up, you're in the clear for a very long time.
I have a 2500K @ 4.3 GHz just like in the video, with an RX 470 and 16 GB of DDR3 memory. It runs like in these tests. My combo (CPU+GPU) is very well balanced; in every single game both ran close to 100% utilization. But they're obviously old, and if prices weren't so high I would upgrade. I'd add that if you lower your settings you can play at higher framerates, and that for so-called esports games this is still a very good system, considering I bought the CPU+mobo+RAM for ~$95. :D Overwatch 1080p medium 144 fps, Warzone low ~100 fps, BF5 low ~100 fps.
To be fair, it's impressive for a 10-year-old CPU. Imagine trying to get Battlefield 2 (2005) to run on a Pentium 133 (1995); that's the time jump we're talking about here! Obviously not something you'd buy now; any PC gamer should be looking at 6 cores / 12 threads as a minimum, and it shouldn't cost much: $100 / £80 should get you that. A Ryzen 1600 would be the starting point, but ideally invest in a Ryzen 3600 or 10400F for around $150-170.
I play everything I want with my i5 2500K OC'd to 4.2 GHz on an MSI P67A-GD65 Gaming board, 16 GB RAM, and an MSI GTX 1660 Super Gaming X. All games I play at 1080p, but some at 4K on my 43-inch UHD. I bought a new Gigabyte Eagle 3050 card (PCIe 4.0), but my board only has PCIe x16 2.0, and the new card gave 3 beeps from my BIOS and no picture on my monitor, so I went to the store and returned the 3050. This 2500K is a legendary CPU.
Considering that the most popular and most played games aren't AAA titles, I'd say the Intel Core i5 2500K IS still good enough for PC games today, but it ISN'T good enough for modern AAA PC games. Also, most kids getting these low-budget CPU setups would be pairing them with an RX 580/6500 XT/6600 XT, so the 30-60 fps range is acceptable for AAA games, though not ideal of course.
Pretty much any game pre-2016 is playable on an i5 2500K. I noticed problems with Rise of the Tomb Raider in the Soviet Installation with level of detail on ultra settings.
"It's in the game." I'd really like to see games listed and categorized according to CPU and/or GPU optimizations. And I agree with others, this is a good content direction. Anything to eliminate e-waste is a good thing, in spite of what Microsoft plans on doing with Windows.
My biggest PC gaming regret is buying a 2500K instead of a 2600K or 2700K. Sure, it was a more expensive buy, but at the time the importance of multithreading just wasn't apparent to most people yet. Those extra four threads really give you a nice 2-3 extra years of life compared to the four threads of the non-hyperthreaded 2500K.
I have a great idea for content direction: cover Linux gaming and how to get games to work on Linux. Linux gaming is going to become a huge deal in the next few years, and you would be getting in on the ground floor here.
Calling a 2500K overclocked to 4.4GHz "the best case scenario" is absurd. You're telling me you've owned a dozen of those and you aren't aware that they should be overclocked to AT LEAST 4.8GHz? C'mon, man, almost all of them will do 5.0GHz. What a slap in the face to the 2500K.
Any reason why you only talk about pairing this with an AMD card? Sorry if I'm missing something; it's been a while since I was in the PC building scene, and I was thinking about reviving my old rig with a cheap GTX 1080 I recently picked up.
When I finally replaced my 2500K system, I took the time to build a box and framed my 2500K on my wall. That's the legendary status that it deserves.
That's my plan too!!!
Retired it in 2020
Nice! 👍🏻
I will do the same with my 3770K later
@@deeloc5500 Don't! That's much better, like 2x-3x better. Give it to me hehe
I used a 2500k from 2014 to 2021 and never knew that it had a legendary status. When I built my 1st PC in 2014 it just sounded like it was a good overclockable CPU that I could still use in later years. It was still chugging along in summer of 2021 when I built my current setup. Now it just sits in my old tower collecting dust.
After 10 years for me, I've decided it's time for an upgrade. I've had my 2500k OC'd to 4.8ghz (air cooled) for a decade and it's still performing identically to when I got it. I would say the 2500K is the best cpu ever made, unfortunately it was made 11 years ago. I just bought an i7-12700K and am seriously considering framing my 2500K on my wall
Same here. Mine was oc’d to 4.5ghz. I just bought an Intel i7 12700k. The i5 now serves as a NAS running truenas.
@zssllayers1946 Depends on your luck, CPU cooler, motherboard, etc.
But tbh 4.5 GHz for the i5 2500K is really the norm. I had 4.6 and managed a stable 4.8 (with no performance improvement though; I think my board is not good enough to deliver that power).
@zssllayers1946 Hmm...Take a closer look at your set up and follow some of the online guides.
CY's response to you is spot on.
Luck is a factor but what is your cooling situation like?
In my opinion cheaping out on CPU is unnecessary, CPUs aren't overpriced like GPUs and there are a lot of good options depending on budget, as long as your budget for a CPU is above $100. Something like a modern i3 or Ryzen 3 3100/3300X is plenty good for 60fps in modern games and a 6C/12T i5/Zen 2 or above Ryzen is going to give you a good platform for now and the future. If your budget is below $100 for a CPU, then something like an old i7 or high core count Xeon could be good cheap options and likely much better than an Athlon/Pentium.
Fully agreed.
@@younglingslayer2896 Pointless frankly. I wouldn't upgrade if I was one broken knee away from bankruptcy.
Sorry man, I know it sucks.
Maybe some old system, but honestly, it wouldn't be worth it.
This is more about the ability to keep using what you already have instead of looking to buy new CPUs.
@@younglingslayer2896 Cheers.
$50 for RAM, $100 for a MB, and $100 for the CPU.
I know someone who still has an i7-2600K in his main PC, paired with a GTX 980ti it still holds up reasonably well with a 4.2Ghz overclock. By reasonably well I mean that it manages to stay at or above 60fps in games like Star Wars Battlefront 2, he also plays Battlefield V, COD Warzone, COD Cold War (zombies specifically). It seems he's still quite happy with the performance of the CPU after nearly 10 years of service.
Literally since 2013 when I first got into PC gaming I've been exclusively a 1080p gamer with what I've played, what I've built, etc. etc. Haven't felt the need to spend the money to get into 1440p let alone 4K. Maybe someday I will but 1080p does the job for me.
What about upscaled 4K from an OLED TV?
The higher the resolution, the more expensive the hardware you need to fuel it properly. I was using a 5700 XT @ 1080p with my first build, and my second build since last year has been a 5900X/6800 XT @ 1440p. I'd rather use a 4K-class card on a high-fps 27" 1440p IPS display than on a 4K screen, though I do occasionally hook my machine up to my flat screen. Still, I'd rather have the higher fps personally.
Yeah, I swapped to 1440p and regret it... you can't "unsee" it... but the price to keep it up to date is HIGH... wish I had stayed oblivious at 1080p... I do miss the 1680x1050 era though :D
@@LiLBitsDK Force yourself to game at 1080p for a month and the effect disappears. Of course, you have to not play at 1440p or look at it.
@@goa141no6 that's like trying to give up sugar! :-P
Excellent video. Excited to see an old 4-core/8-thread i7 for comparison.
My mate had the 2500K until last year. He overclocked it to 4.6 GHz and was able to play every Call of Duty each year (with an Nvidia 970 in the end). Legend CPU. He now rocks a 5800x though after finally upgrading.
Nice B-roll!
Thanks man :)
Where were the 1% lows, Chris? Testing really old CPUs with current games and GPUs is probably the scenario where it matters most to show what the frame drops might be. I know you don't always do benchmarking, but you have criticized other outlets' methods in the past, including Jayztwocents for the titles he selected, and here we are using a mix of current and older releases with built-in benchmark tools and no 1% lows.
Buzz off
@@bcal5962
He's right, average fps means nothing on barely capable CPUs. It could be a pure stutter-and-freezing fest in some games, which is in no way playable, and average fps doesn't show this.
Absolutely. The 1% lows render these newer games unplayable on these old CPUs. I've tested several games with a 2600K at 5 GHz and you will get these massive drops.
You can see the frametime graph; that IS the data you're looking for. Any spike on that line is stutter, which the 1% lows represent as a number, but the actual frametime line is king when looking at game performance.
@@imo098765
The line was a mess, so yes, it showed bad stutter.
Still using mine in 2024 with a GTX 970. 12 years of use; legendary status indeed.
Cool test, not really surprised by the results. I still use Intel's 13-year-old Core 2 Quad Q9650 (4C/4T) OC'd to 4.1 GHz in my backup rig, coupled with 12 gigs of DDR2 1333, a 1050 Ti, and an SSD, and it is still able to run most games at 1080p/40-60 fps. Granted, it's not a joyride, and a minority of modern games simply refuse to run due to a missing instruction set, but it's still pretty cool to see you can actually still game on a 20-buck CPU from 2008.
And of course the TSMC 20% price hike on CPUs and GPUs now... it just keeps getting better and better. And that's from TSMC, so what we see down the pipeline will probably be higher. It's getting tight out there, boys.
Good content, good direction to go, given the state of things now.
Won't matter once China takes Taiwan and we can't get squat in the West.
@@yurimodin7333 yup. Bye bye apple!
Not bad for a decade old CPU. Sandy Bridge was no joke.
And this is a midrange CPU with 4 cores and 4 threads. The Sandy Bridge-E i7 Extreme series with 6 cores and 12 threads would fare much better.
@@valentinvas6454 Pretty much.
I must be tired; when I first read this comment I swear it said "Sandy Bridge was a joke" and got that quick back-of-the-neck burn lol. Then I read it again and saw it said "Sandy Bridge was no joke" and went phew lol. I'm still sporting a Sandy i7 2600K @ 5.1 GHz and feel I got my full money's worth out of this CPU, 100% for sure; it owes me nothing.
Heck, I play pretty much all of the new titles and it still chews through these new games without too many issues at all. I am sure one of those new 8/16 Intels or AMD's own 8/16 would do much better, but I play on a 32-inch 1080p 60Hz screen, so I guess it does not have to sweat too much to churn out only 60 FPS for me lol. If I unlock FPS it shoots up quite well in most every game. I think the hardest game I have played on it was Assassin's Creed Odyssey; that game would not go over 70 FPS for me no matter what, and it stuttered from time to time. Assassin's Creed Valhalla runs great though, smooth as can be. That's kind of weird, but I will take it lol.
@@Rocky-bz8wr 5.1 GHz? I never thought Sandy Bridge reached those clocks. Is your CPU a golden sample running with some really good cooling? Awesome!
The most important thing in gaming is not average FPS, but the 0.1% or 0.01% minimum fps.
This. Microstutter kills the joy in any game.
Went from a 4th-gen i3 (2c/4t); the thing was bottleneck central in every game. Finally said F it and bought a 10th-gen i7, which is overkill and out of budget. This 8c/16t never bottlenecks; that's a thing of the past now, and even at sub-30 fps it's very smooth.
1% lows were the reason for buying this CPU. Frequently dropping from 80 to 30-50 fps on the 2-core i3 just annoys you.
To anybody reading this: don't overspend like I did. For gaming, at most go for an i5 or Ryzen 5 (6 cores/12 threads); that's more than enough to game with 1% lows no less than 50 fps in the worst case. Grab the latest 4-core/8-thread CPU you can afford and you won't have a problem with stutter.
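The point this thread keeps circling (averages hiding stutter) is easy to show with numbers. Here's a minimal Python sketch using made-up frame-time data, not from any real benchmark run, comparing average FPS against 1% lows:

```python
# Two hypothetical 100-frame runs (frame times in milliseconds):
# one perfectly smooth, one mostly fast but with a single long hitch.
smooth = [16.7] * 100               # steady ~60 fps
stutter = [12.0] * 99 + [400.0]     # fast frames plus one 400 ms freeze

def avg_fps(frame_times_ms):
    """Average FPS over the run: frames delivered / total seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def one_percent_low(frame_times_ms):
    """1% low: the slowest 1% of frames averaged, expressed as FPS."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    return 1000.0 / (sum(worst[:n]) / n)

for name, run in [("smooth", smooth), ("stutter", stutter)]:
    print(f"{name}: avg {avg_fps(run):.1f} fps, "
          f"1% low {one_percent_low(run):.1f} fps")
```

With these numbers, the stuttery run actually posts the *higher* average (about 63 fps vs 60), while its 1% low collapses to 2.5 fps; an average-only chart would rank it as the better experience.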
The magic sauce that ID software used for Doom Eternal is called Vulkan (and its low overhead instructions for the CPU).
Pretty sure there's more to it than just the Vulkan API.
I hope MS doesn't try to shove DX12U up on Doom 3.
@@jangelelcangry Why would they? Doom 3 is an old game. Even if they remastered it, it would likely use DX11.
@@Raivo_K I meant Eternal's sequel.
@@BonusCrook Yes, it's clever renderer design
Ubisoft is as shitty at optimizing as id Software is amazing at it :D
That's a known fact and unwritten rule by now.
They really need to make a new, from-the-ground-up next-gen engine to take advantage of and be optimized for the PS5/XSX consoles and for current and future PC tech.
I shifted to a Ryzen 5 3600 and gave the old i5 3470 PC to my kids. I bought them a used RX 570 for around 90 USD a year ago; it still runs modern games very well.
I'm still running a 2500K @ 4.8 GHz. I just don't see a reason to upgrade; the 2500K was way ahead of its time. A modern equivalent would be like 20 GHz.
Before the pandemic hit and messed up the used PC market, I regularly found Ryzen 5 1600 AF chips for $35-45 used and B350 mobos for $30-35. It's an excellent and still somewhat relevant platform.
Since Q3'11 the 2500k rules here!!!!
If possible, maybe test how the Piledriver FX-83xx aged in newer games. Can Vulkan and DX12 push those 8 threads well?
It is rough; a lot of games can't run over 40 fps. Vulkan helps some, but no miracles there. The i5 2500K is about 30% faster and it was struggling.
@@chrisvanalstine8136 The i5 is not 30% faster… and in modern-day games the FX-83xx will perform better. Also, to answer the question, it depends on the game; FX can be funky.
@@roughousegaming9606 I'm sorry, you are right, it's not 30%... it's 28%, and that's not the 0.1% lows, that's the 1%. The average is around 20%. Anyway, FX is never going to magically get good, whatever numbers you want to nitpick over.
@@chrisvanalstine8136 🥱 Never said FX was good; a straw man isn't necessary. And can you elaborate on the 28% faster? Note the original poster said Piledriver. We are long past the era of 4 cores being enough; single-core performance is nice, but not having enough resources is crippling, i.e., it would be like running out of VRAM. Hence why the original poster was curious; an 8c/8t CPU should fare much better.
@@roughousegaming9606 I didn't say you did. The 28% is the percentage calculated from a few benchmarks, and the FX 81xx/83xx/95xx is the "8 core" unless you mean the low-clocked Opteron stuff. Even with the i5 at 99% utilization it is still faster; single-core clock speed has nothing to do with it, it's the stronger floating-point units and overall stronger core architecture.
I think this is a good idea, Chris. There are a lot of people out there (including me) who love old hardware; keep it up!
I was so excited when I heard him say he was using the 6700 XT. This is the sort of combination I always wanted to see tested, as a 4.4 GHz 3770K owner. The only unfortunate thing is the lack of 1440p and 4K testing. The 2500K clearly bottlenecks the 6700 XT at 1080p when you see sub-80% GPU utilization. I hope he does a follow-up video at 1440p and higher. I think older 4c/8t parts like the 2600K/2700K, 3770K, and 4790K still have some life in them yet. This is especially true for PCIe 3.0 x16 capable parts from Haswell and up.
If the CPU can't do 1080p, it can't do any other resolution. The whole point of using 1080p with an OP GPU is to show the max fps this CPU is capable of in these games. A viewer is sending me a 3770K, so I'll do that video soon. Basically, you'll see the absolute max fps you can get out of a 3770K regardless of resolution.
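The logic of the reply above (a CPU-bound frame rate carries across resolutions) can be sketched as a toy min() bottleneck model; all FPS figures here are hypothetical, picked only to illustrate the idea:

```python
# Delivered FPS is capped by whichever side is slower. The CPU's ceiling
# is (roughly) resolution-independent; the GPU's ceiling falls as
# resolution rises. Testing at 1080p with an overpowered GPU therefore
# exposes the CPU's ceiling directly.
CPU_FPS = 55                                      # hypothetical CPU ceiling
GPU_FPS = {"1080p": 160, "1440p": 110, "4K": 60}  # hypothetical GPU ceilings

for res, gpu_fps in GPU_FPS.items():
    delivered = min(CPU_FPS, gpu_fps)
    limiter = "CPU" if CPU_FPS <= gpu_fps else "GPU"
    print(f"{res}: {delivered} fps ({limiter}-bound)")
```

With these numbers, the CPU is the limiter at every resolution, so the 55 fps measured at 1080p is also the best this system could do at 1440p or 4K.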
This is great content. I'm really curious to see the same test with an i7-2600k, 2700k, or 3770k, especially paired with 2133 or 2400 mhz RAM. I bet they'd do substantially better, having 8 threads.
20-25% for the CPU. Not sure if that'd be enough. I'll probably snag a 2600k since I have the rest of the test bench already. 3770K's still command too high a price. The RAM would probably help Valhalla, but doesn't seem to be an issue for the other games tested.
3770k probably holds up well these days
@@TheGoodOldGamer My own experience with going from a 3570k to a 3770k in my old rig showed that minimum FPS was vastly improved with the i7, in modern CPU-heavy games. The RAM speed made a much bigger difference than I expected as well. In games that hit the CPU hard, I got a 15-20% boost out of 2133 mhz RAM, versus 1600. Those old i7's were starved for memory bandwidth with 1333 and 1600 RAM, at least in modern titles that hit the CPU hard.
@@BonusCrook It does, especially with 2133 or 2400mhz RAM.
@@BonusCrook I had the 3770K @ 4.2 GHz for a few years. It worked great with 2133, but it was really struggling with Sony A7RII 42-megapixel files, usually 80 megabytes. Regardless, it was definitely rock solid, until it died for some reason this year. I have upgraded to a Xeon 1680 v2, OC'd to 4.5 GHz.
For many AAA titles, 4c/4t has not been enough since 2017-18, even with a "maxed out" OC; even if you got 60 fps, it stuttered more often than lower-clocked 4c/8t parts. My i5 6400 was suffering in some games as early as late 2016, but that's probably more because of its low clocks than a lack of threads.
A few comments about the games tested:
CP2077 becomes even more CPU-demanding in parts of the city with many NPCs than the scene shown in this vid, and if you have a GPU that can run it with RT on, in my experience it becomes even more demanding on the CPU. My 3600 could not keep my 3080 above 65-80% utilized in some areas with RT on at 1440p with DLSS Quality active.
AC: Valhalla, in my experience this game is actually a bit less CPU demanding than Odyssey, so it would have been interesting to also see how that game would run. My 3600 at least would have more issues in game with Odyssey than Valhalla.
Watch Dogs Legion is, in my experience, harder on the CPU in game than in the benchmark, most noticeable when driving at high speed in London. This was another instance where my 3600 had a bit of a hard time keeping a 3080 fully utilized.
HZD is one game that in my experience has a benchmark that doesn't really show how the game will actually run. It seems to mostly show the best case scenario in game and you will spend most of your time at a lower framerate while playing, at least on more balanced systems.
I really liked this video, the only thing I would like added in future videos is that you would also include 1% lows as they help to show stutters a bit more clearly.
I used an i5 2400 (3.4 GHz max turbo), 16 GB RAM in dual channel (1333 MHz), an RX 480 4 GB, and an SSD until last summer (2020). I played on a 900p (1600x900) 19-inch TN panel with no variable refresh of any kind. Most games played quite well. Obviously there were some AAA titles, like Ubisoft titles, that were CPU bottlenecked, but besides the outliers it's doable. I locked all of my games to 60 FPS except for Rainbow Six Siege. I had an old business PC build, so the motherboard was trash (the BIOS was very simple and locked down; I couldn't even change the RAM timings, there were no RAM options at all, and the boot order was locked; whoever made the motherboard assumed the person using the PC was an idiot, so they locked it all down).
But that build can handle most games. Especially if the games are from 2018 and earlier, you will be fine. 2019 is when some of those outliers began coming out that would cause a significant bottleneck with the CPU.
Interesting video. I prefer this type of content. Keep up the good work.
I was running a 2500K until near the end of 2017. The last thing that made me sick of it was when an update to Quake Champions gave it really bad stutters. Average fps was still about 110, but the stuttering was so bad that 1% lows were like 25. Even before that update it wasn't a very smooth experience, but it was okay. The upgrade was a Ryzen 1600X, and averages jumped from 110 to like 145 while using a 150 fps limit, with an even bigger jump in 1% lows. Not sure how well it would run now, as the game got optimized in later updates, but I think some other games might have similar issues.
I think most people would write the 2500K off as desirable, because you can get better CPUs around the same price that offer better performance. There is no reason to go back 10 years to ancient hardware when there are better chips for around the same or lower prices. All that really needs to be said for the 2500K is: it's old; compared to current chips it sucks, and if you like playing games as though it's 2011, it will be perfect for you. The 2500K is on my list of "if you find one propping up a desk leg, it's capable of running old or little games, or big games if you don't much care about frame rate or especially frameTIME stability, but it is in no way something desirable to purchase in 2021."
Buildzoid got his 2500K to 5 GHz, I think.
Another issue with Sandy Bridge CPUs is that those only run PCIE 2.0 IIRC.
@@jangelelcangry I believe Z68 has 3.0 already tho my P67 was indeed 2.0
Doom just doesn't give a fuck about what hardware people have; just click play and play. It's something other game developers should pay attention to and learn from.
Great video man, keep them coming. Price/performance is and always was the thing.
I'm still running my 2500K. I have it at 4.4 GHz, still running strong! I run MS Flight Sim with an old GTX 770 on medium settings without an issue. I also have 32 GB of RAM and a 1 TB SSD. I do want to upgrade to an RTX 3060 Ti at some point, when you can get them for $399 again.
I have an i5 2500K; it's now my son/wife's PC. It runs most of the latest games and it won't die!
Still rocking the 2500K with a 970 (GPU and RAM upgrade about 5 years ago), and it's a SOLID 1080p computer to this day. Finally thinking about upgrading since I want to move to 1440p. Just hope the new build holds up as well as this one has.
Awesome dude. You're a real one. Love to see people keep their older stuff running for a long time. I miss the 900 series.
Great video. Great new focus
I still have this legend, along with an i7 2600K in my own rig... a shame that Windows 11 will finally kill it off... just to be clear, mine have always been run at stock speeds... the i7 2600K coupled with my RX 580 can play Red Dead 2 at between 45-60 fps on ultra, high, and some medium settings... an awesome result IMHO... I did try coupling an RX 6600 XT with the i7 and played Red Dead Online, but the CPU and system as a whole bottlenecked the graphics card so much that fps was no better than on my RX 580 4GB version.
I have three Vetroo fans in my system; apart from being a bit loud, they're not bad.
My i5 2500 (3.914 GHz), P8Z77-V LX, 16 GB 1600 MHz, and 1660 Ti perform nicely: 92-98 percent GPU usage with 60-70 percent CPU usage in benchmarks. Ark: Survival Evolved (yes, Ark) runs on epic settings at 1080p capped at 35 fps smoothly, or on high settings at 1080p at 50-60 fps. The benchmarks I've seen aren't much different, so I'm sticking with this legendary CPU until it's completely dead; it does everything I need. ALSO, I would advise purchasing this processor! Get a used P8Z77 board and i5 2500 for like 80-150 dollars and you're set. Find a used 1660 or 1070. You can play any game you want and it's cheap! You're losing minimal fps!
My conclusion. It’s time to let the legend die.
I just upgraded my i5 2500K to a Ryzen 7 5800X a few weeks ago. Used it for literally 10 years. Best bang for the buck ever!
Still using it. I see no reason to waste money changing it; I'm no competitive gamer anyway.
Changed the motherboard once as it fried. I will buy another board (probably Z77 with M.2) soon, as the current one is also a bit wonky.
People who complain that 45 fps = bad are rich people. You can afford better stuff.
I have one of these laying around doing nothing right now with ram and motherboard. Guess it's time to give it something to do.
Even though I am like a kid in a candy store watching the amazing graphics of today's games (compared to the ANSI graphics I loved in the early 90's LOL), I don't game. I do however use the heck out of my Dell i5 2500K machine for digging through the internet, researching stuff, and minor video editing for family, using Vegas 14, with no problems. Now, pardon my excitement and ignorance, carefully blended (LOL), but I have had a couple dozen browser windows open, watching a YouTube video in 1080p or better, and transferring files from an SSD to a flash drive, all at once, with no slowdown, no problems. THAT is why I like the i5 2500K, and it was cheap! Keep in mind though, I'm not a tech, just a truck driver by trade, AND came from the 286/386 days, so yeah, I'm happy! :)
I just don't have the heart to replace mine.
Hello everyone, I'm still using this processor and although I'm already looking for an upgrade, I'm generally satisfied with 16 GB of ram and a GTX 1080 (which is probably being held back by the processor). My CPU is now clocked at 4.2 Ghz, in the past I used up to 4.9 Ghz, but I don't want to overload it too much anymore. Great processor.
I'll just never understand modern PC gamers' obsession with 60 fps. You should have been around in the '90s and early '00s. We were lucky to get 20 fps sometimes. And consoles have been at 30 fps for years and years, and millions upon millions of gamers have played them without complaining about it. Don't get me wrong, I can tell a difference between 30 fps and 60+ fps. I strongly prefer higher framerates, but to act like anything under 60 is unplayable? Well, millions of console gamers would say otherwise. Myself, it's a bit jarring at first if I have to play something at 30 fps, but it doesn't take long to adapt and ignore it.
As long as you cap your refresh rate to match it. 30-60 FPS @ 120Hz is a damn slide show in certain games.
Also, there was something magical about 15-30 FPS consoles on a CRT. That's why emulators to this day can look horrible compared to that experience. [insert Digital Foundry did a video on it blah blah]
Humble expectations. 60 fps is a lot better than 30 fps; why settle for 30 if for the same money you can get 60?
Great to hear this although pairing a readily obtainable $50 CPU with a relatively unobtainable $1000 GPU is an odd choice for a new direction of the channel. I think if people pay that much for a GPU they are not going that cheap on the CPU or motherboard.
The 6700xt is meant to be overkill (not a pairing suggestion). Good old gamer did it to show the max performance of the CPU.
I'm still rocking a 2500 and pushing the OC to 4.7 really easily. Have OC'd RAM and Nvidia GTX 970, they're still holding up. Considering a new GPU, run the CPU into the ground then do a new build around the GPU. I actually got this processor from a mate 5 years ago. It's been an absolute beast.
That's what I have in my PC with a Radeon 5700XT. I can run some VR games on Quest 1 and Cyberpunk. It's been a good time since around 2010. I just ordered a Ryzen 9 7900X along with a Gigabyte B650E AM5 motherboard. Windows 11, here I come.
It's 2021... the GPU market sucks and is a threat to the PC gaming industry... having said that, I don't think CPUs have been a problem.
CPUs like the 6c/12t Comet Lake i5 10400/10400F at around $150-ish have been readily available for at least the past year (the 4c/8t 10300 was close to $100 for some time).
If people are still clinging to 10-year-old CPUs in 2021, it's not from a lack of quality affordable options... they are out there, and in 2021 even a person on a tight budget should be able to afford a 10400 CPU.
The real problem right now is the GPU market... these videos should probably focus on various older GPUs rather than CPUs.
Liking the direction of the content, great video!
I'm rockin' a 2500K with a GTX 1650; it plays pretty much everything on ultra.
Great video... I'm still using my i5 2500K (had to replace it with another one because the first one wouldn't OC anymore, but this one runs at 4.4) with an RX 5700, which has done well even though the GPU is bottlenecked. I think the 2500K should hold me over until the second generation of socket AM5.
My 2500K is still in my second pc, what a beast it was back in the day.
Excellent content - eager for the next in the series.
I enjoyed this testing.
If you already have a 2500K lying around, it's great that it can run Cyberpunk even without the AVX2 instruction set.
I had the i7 2600k in 2012 I believe. 4 cores/8 threads. It was an awesome chip and overclocked really well too. Then in 2014 I got an i7 4930k -4c/8t- 6c/12t with quad channel memory. Probably a better match for more modern games.
4930K is 6c/12t
@@Kynareth6 Doh! Fixed. 😄
@@bgtubber yeah, today you can get 5900X for about the same price
Subbing just because of "we are going to see what we can do so you guys can play." Well said.
I like how people are talking about their retired 2500Ks while I'm here still dreaming of getting one.
You're not alone, bro. I'm digging through offers and spotted a Fujitsu desktop with an i5-2500k and 8 GB DDR3 for my dad, for general use, no gaming. Just YouTube, Facebook, and stuff like that.
I was doing some research on this 2nd-gen i5 and it seems it was a beast in its time; I think I'm gonna buy it for my dad.
It seems, based on this video, that it's also still good enough for gaming. Of course, paired with a decent GPU and with the expectation that you will need to tweak graphical settings for that sweet spot where it's both playable and looking good enough.
I think, if you're on a budget, you should buy one.
@@ojjagg9387 yessir I think that laptop is gonna be great for the budget
Ah, the Good ol Days when DDR3 was still a thing, oh wait, my system still has DDR3 Ram in!
Yesterday I ordered an R5 1600 AF to upgrade my 10-year-old system to a new platform and possibly upgrade later on :) I've been running an i5 2500K till now and that CPU is astounding. It still handles itself well, and if I did not need to change the MB I would not even buy a new CPU.
(I have an X99 system for my gaming.) I bought a Dell OptiPlex 390 with a server BIOS just for fun, replaced the Celeron with a 2500K and the 2 GB of RAM with 2x8 GB of HyperX Fury RAM, slapped a Kingston 256 GB SSD in it, and repurposed a 500 GB Seagate HDD for storage. Added a humble Gigabyte GT 730 2GB GDDR5 DirectX 12 GPU and overclocked that a bit. It plays up to the Skyrim 2016 version and Fortnite. The CPU hits 4389 MHz on rare occasions, so it had already been overclocked in the past. With a stock cooler for an i3, I think it hits 75 degrees Celsius max while gaming. I wonder what the most balanced GPU for the i5 2500K is. Oh, and that's all still on the server BIOS; you have to enable Turbo Boost for the CPU in that BIOS 😜 Oh, and the amount of L3 cache on the CPU also affects frame rates.
I still run an i5-2500K but I don't run the latest games (also running an Nvidia GPU (960)), so I'm hanging on for a bit longer.
I find it funny how much CPU innovation/progression has stalled in the last decade. I still use an i7-4770K and it does the job for just about every game. It came out in Q2 of 2013 and has stayed very usable 8 years out and probably will be for at least another 2 years until the PS5 and Xbox Series X become the mainstream consoles with their 8 cores and 16 threads.
You definitely could not have said this about an equivalent processor bought in 2005, working well into 2013 and expecting it to perform well until 2015. Thing would’ve been a potato by then.
I would really like to see the 2600K or 2700K tested with the same games and setup. If you’re testing
I had a 3770K before, which did the job quite well, but I had some really bad stuttering in SotTR in the town of Paititi, or in KCD in the town of Rattay. In scenes with a lot of players or NPCs it became a mess. When I was riding through Rattay on my horse it dropped to 24 fps and I could barely maneuver through the townsfolk. Going to my used 9900K in December was such a massive upgrade.
Looking forward to a "hyper threaded" follow up! The DDR4 RAM support and newer architecture should really improve things. 🧡
DO NOT DO IT!!!
Have you overclocked your CPU? If not, don't waste your money on a 9900K; just overclock your CPU to 4.7 GHz. It should be easy enough if you have a good motherboard.
Secondly, what GPU and monitor do you have, and what resolution do you play at?
@@gershonlapokimon4698 I appreciate you caring for my rig, but I can calm you down.
I ran the i7 @4.3 GHz, as it wouldn't go any higher no matter the voltage I threw at it. I upgraded because the system became a bit unstable and my 2400 RAM would only run at 2133.
I wanted to buy a 5800X, but I wasn't willing to spend 500 bucks at the time. I got the i9 and Z390 Ace cheaper than the R7. I play on a G-sync 144Hz 1440p monitor and upgraded to a 3070 as well, because the cooler on my 1080 was crap.
I liked my old machine, but I would never go back, everything runs so much smoother.
What PC do you have?
@@fabiusmaximuscunctator7390 3770K OC to 4.7Ghz - EVGA 3080 TI, GA-Z68XP-UD4.
Playing at 4K.
I have a 9700K machine as well; both machines perform the same at 4K.
@@gershonlapokimon4698 Really? I'm honestly questioning that as the i7 was holding back my 1080 in most games quite severely, despite playing on ultra. Your Ti is twice the performance and the Z68 only supports PCIe 2.0. There are some games that ran fine like Metro: Exodus or Doom, but those are exceptions. But if it's fine for you, who cares? 🤷♂️
@@fabiusmaximuscunctator7390 Z68 supports up to PCIe 3.0 with a 3770K; with a 2600K and below, yes, it's PCIe 2.0.
At 1080p it will bottleneck, the CPU will be hammered, but at 2K or 4K in most cases it will not.
You can see similar systems paired with a 2080 Ti gaming at 4K; compare with a 9900K and the performance is about the same.
2600K at 4.5 still runs great... you should test the difference that the hyperthreading makes
@@somerandompersonontheinter1835 I think he's trying to show the absolute cheapest solutions too, so that i5, for some reason, still sells pretty high for what it is on eBay.
@@somerandompersonontheinter1835 Well, that chip does have better IPC, but idk, the threading on the 2600K, especially overclocked to 4.5, would def make them close to each other. Maybe a video on the 2600K @ 4.5 compared to the last 4C/4T chip produced, to see how much performance you'd lose going with the supposedly cheaper 2600K.
Even my i7 920, 930, 960, and 980X are plenty for 1080p/1440p gaming aside from the last four AC: DRM edition
The i7 920 is probably the longest lasting CPU of all time ;)
quad core vs ht quad core vs 8 core Sandy Bridge LGA 2011 CPUs with quad channel memory.
@10:53: Action single-player games such as Assassin's Creed/Horizon Zero Dawn/Red Dead Redemption/Cyberpunk, etc., are perfectly playable at 30+fps. Even a sporadic stutter here and there doesn't wreck the gameplay in those kinds of games, since you can save your game any time you want.
With such old CPUs, I would recommend you include games where stutters can really wreck the game, such as the F1 games, where a stutter can cause you to crash your car and ruin your entire hard-fought race in a fraction of a second.
I still remember that my brother, who loved F1 games, had serious stuttering issues with an i5-760. (By the way, that was the only Intel CPU I bought from 2001 onward. The issue was fixed when I tried an FX-8350.)
Also, you should include some e-sports titles (Fortnite/Counter-Strike, etc.) where, again, a stutter can ruin hours of gameplay.
Overall, I liked the idea in this video. I always like to see older hardware in action.
I played CP 2077 on a 3570K at 4.4 GHz and a 980 Ti with nearly all settings maxed out. Solid 50+fps at all times, most of the time over 60. As long as you tweak the settings a little by lowering some of the post-processing effects from Ultra to Medium, you will mostly not notice a difference from all Ultra. I don't remember exactly which, but 2 settings were dipping the fps a huge amount, and both had to do with post-processing particle effects. Shadows were the next one dipping fps.
Classic CPU there. Got one with a Rampage IV Gene kicking about still
The shortly awaited 2500k video mentioned in naaf stream is here :D
I'm running the 2500K with an HD 7950 in a rig I built like 5 years ago, and it's still running every game I've wanted to play. What should I upgrade first in the future? I can only afford 1 thing at a time
I guess it'll still be quite a while before GPU prices normalize
The 2500K only needs to be paired with the right GPU. That 6700 XT was mostly running between 50-70% in most games, and that is a big CPU bottleneck right there. In my honest opinion, if you have a 2nd or 3rd gen i5 or i7, the best GPU you could pair with it is an RX 470/570 or a GTX 980/1060. That is the borderline max you can get away with on those CPUs without wasting GPU performance due to the CPU bottleneck.
The whole point of cpu testing is to eliminate the gpu bottleneck and run the cpu at 100%. This shows the absolute max fps the cpu can produce in these games regardless of resolution
Nice test. I would have thought that 4-core/4-thread wasn't gonna cut it in today's games; 6-core/12-thread is where I would put the minimum. I am glad you finally received the 6700 XT from Paul.
I needed something for light gaming on a controller/emulation of PS1/PS2... grabbed this thing for $9... it works surprisingly well paired with a GTX 1650, even in some better titles, and emulation with the Vulkan renderer is flawless; the CPU is not even much of a bottleneck there
With an old CPU, I highly recommend just seeing if your PC can do 60; if it can't, then lock it to 30 FPS using a frame limiter.
I own a GTX 1080 and plan on playing Cyberpunk, as I want to play that game and I don't see myself getting a new GPU for years, so I will be limiting it to 30 FPS. I also game at 1440p.
I'm still running i5 2500k & GTX 680. They run games really solid on lowest settings.
If you're OK with X79, another great option is the E5-1650, an unlocked Xeon
I'm running a 5800X with 32 gigs of RAM, M.2 SSDs, and a 1650 Super. Playing Cyberpunk at 1440p with 70 to 90 percent resolution scale and all low settings, but only seeing about 45 fps average, with lows in the low 30s. I need a graphics card so bad, but I am not paying the crazy prices.
It's all about game selection and playing older games. I used an i7 920 with a 2070 Super on a 4K monitor for almost a year. And oh boy, when it CPU-bottlenecked, it did it really hard: 4fps in StarCraft 2 co-op when things got way too hard for the CPU. Damn pathfinding of zombies. On a 3900X, ~60fps.
I should have upgraded my CPU way sooner. Modern CPUs are closer to three times faster in single-threaded or low-thread-count performance, and even faster once the core count increases are factored in.
Gaming is a bad CPU benchmark; it averages things across several different bottlenecks. So when the CPU bottleneck really hits, it will have a lesser effect on average FPS but will be a really terrible experience.
In Path of Exile, the CPU bottleneck of loading models meant that getting into a new area caused models to appear tens of seconds after arriving, and while FPS was in the playable range, the CPU bottleneck made the game experience suffer. So modern games' FPS doesn't necessarily even measure the same thing between different CPUs, because of the loading bottlenecks.
On the other hand, if your system feels fast enough for everything you do, Intel, Nvidia, and AMD all have such rapid improvements coming to their products that no matter what you purchase, it will feel slow in comparison in the not so distant future. Add the current pricing situation and you feel that you want more.
God damn, that 2500K is still holding on. That's actually amazing
Still using my god tier 5GHz 2500k @1.4v
My brother's 6600K is starting to impact his everyday use of the PC as we play some harder-to-run indie titles. If it had 8 threads it'd be fine, but he's just not digging the stuttering. The cost of a good CPU and motherboard is reasonable enough now that I wouldn't discourage someone from a platform upgrade.
I'm wondering what the minimum change necessary would be to get these games running better. It could be interesting seeing the effects (if any) of things like faster RAM, using Windows 7 or Linux, cracked versions of games without DRM, or in-socket CPU upgrades as a last resort.
What if you ran it on medium or low settings? Also, why not cap it at 30 FPS? I get it, we are PC gamers, but at the same time it's a 10-year-old CPU. Isn't that part of the deal with hardware that old?
Lower settings had no effect in these games. That’s why I used an OP GPU so that wouldn’t be an issue. Also MOST PC gamers consider anything under 60fps average unacceptable
If you have old hardware at hand, upgrade your GPU and monitor and play games at 2K or 4K resolution at ultra settings, and you're good for at least 5 years, since games will be on par with what consoles can do, and the PS5 and Xbox Series X have mid-range GPUs; you'll be good to go with your old hardware for a long, long time.
Even with a 2500K OC'd to 4.5-4.7GHz.
But if you can get or have a 3770K and up, you're in the clear for a very long time.
I have a 2500K @ 4.3 GHz just like in the video, plus an RX 470 + 16 GB DDR3 memory. It runs like in these tests. My combo (CPU+GPU) is very well balanced; in every single game they both ran close to 100% utilization. But it's obvious that they're old, and if prices were not that high I would upgrade things.
I would like to add that if you lower your settings you can play at higher framerates, and that for so-called esports games this is still a very good system, considering I bought the CPU+mobo+RAM for ~$95. :D
Overwatch 1080p medium 144 fps, Warzone low ~100 fps, BF5 low ~100 fps.
Please do the 3770K, I'd be curious to see how that does!
To be fair, it's impressive for a 10-year-old CPU. Imagine trying to get Battlefield 2 (2005) to run on a Pentium 133 (1995); that's the time jump we're talking about here! Obviously not something you'd buy now. Any PC gamer should be looking at 6 cores / 12 threads as a minimum, and it shouldn't cost much: $100 / £80 should get you that. A Ryzen 1600 would be the starting point, but ideally invest in a Ryzen 3600 or a 10400F for around $150-170.
I have one of them 2500Ks; paired with a 55rx it works fine
I play all I want with my i5 2500K OC'd to 4.2 GHz on an MSI P67A-GD65 Gaming mobo, 16GB RAM and an MSI GTX 1660 Super Gaming X. I play all games at 1080p, but some at 4K on my 43-inch UHD. I bought a new Gigabyte Eagle 3050 card (PCIe 4.0), but my mobo only has PCIe x16 2.0, and the new card gave 3 beeps from the BIOS and no screen on my monitor, so I went to the store and took the 3050 back. This 2500K is a legendary CPU.
I love how the Tomb Raider bench the 6700xt is only running at like 50-60 watts. Golf ball meet garden hose!
well they did cap it at 60fps.
@@WayStedYou I know. I just think it's amazing how modern parts run slightly older games so effortlessly at 1080p now.
Considering that the most popular and most-played games aren't the AAA titles anyway,
I'd say the Intel Core i5 2500K IS Still Good Enough For PC Games Today, but it ISN'T Good Enough For Modern AAA PC Games Today.
Also, most kids getting these low-budget CPU setups would be pairing them with an RX 580/6500 XT/6600 XT, so the 30-60fps range is acceptable for AAA games, but not ideal of course
Wrong, you can do AAA titles on a 2500K at higher resolutions with high-end GPUs
See: th-cam.com/users/AdiiSsearch?query=2600k
Pretty much any game pre 2016 is playable on an i5 2500K. I noticed problems with Rise of the Tomb Raider in the Soviet Installation on ultra settings for level of detail.
"It's in the game"
I'd really like to see games listed and categorized according to CPU and/or GPU optimizations. And I agree with others, this is a good content direction. Anything to eliminate e-waste is a good thing, in spite of what msft plans on doing with Windows.
I still have my 2600K it’s one of my favorite CPUs and it’s easy to oc
My biggest PC gaming regret is buying a 2500K instead of a 2600K or 2700K. Sure, it was a more expensive buy, but at the time the importance of multithreading just wasn't apparent to most people yet. Those extra four threads really give you a nice 2-3 extra years of life compared to the non-hyperthreaded 2500K.
I have a great idea for content direction. Cover Linux gaming and how to get games to work on Linux. Linux gaming is going to become a huge deal in the next few years and you would be getting in on ground zero here.
Calling a 2500K overclocked to 4.4GHz "the best case scenario" is absurd. You're telling me you've owned a dozen of those and you aren't aware that they should be overclocked to AT LEAST 4.8GHz? C'mon, man, almost all of them will do 5.0GHz. What a slap in the face to the 2500K.
I got the white Vetroo V5 and I love it. I think they're better than the CM Hyper 212 Evo.
Assassin's Creed Origins was the game that made me realise I needed to move on. Textures were not loading in cities.
How about an i5 7600K (4 cores, 4 threads), 16 GB RAM (3200) and the same GPU?
Any reason why you only talk about pairing this with an AMD card? Sorry if I'm missing something; it's been a while since I was in the PC building scene, and I was thinking about reviving my old rig with a cheap GTX 1080 I recently picked up.